- Selectivity assessment of DB-200 and DB-VRX open-tubular capillary columns. Journal of Chromatography A.
- Evaluation of the separation characteristics of application-specific (pesticides and dioxins) open-tubular columns for gas chromatography. Journal of Chromatography A.
- Evaluation of the separation characteristics of application-specific (volatile organic compounds) open-tubular columns for gas chromatography. Journal of Chromatography A.
- Selectivity assessment of popular stationary phases for open-tubular column gas chromatography. Journal of Chromatography A.
- An ECG storage and retrieval system embedded in client server HIS utilizing object-oriented DB. Journal of Medical Systems.
- Relative risk of elevated hearing threshold compared to ISO1999 normative populations for Royal Australian Air Force male personnel. Hearing Research.
- System constants for the bis(cyanopropylsiloxane)-co-methylsilarylene HP-88 and poly(siloxane) Rtx-440 stationary phases. Journal of Chromatography A.
- Automatic identification of various nuclei in the basal ganglia for Parkinson's disease neurosurgery. Conference Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society.
- MIPS Arabidopsis thaliana Database (MAtDB): an integrated biological knowledge resource based on the first complete plant genome. Nucleic Acids Research.
- A computerized surveillance system for asthma. International Journal of Health Care Quality Assurance.
- TEOAE recording protocols revised: data from adult subjects. International Journal of Audiology.
- MaXML: mouse annotation XML. In Silico Biology.
- AHD2.0: an updated version of Arabidopsis Hormone Database for plant systematic studies. Nucleic Acids Research.
- ApiEST-DB: analyzing clustered EST data of the apicomplexan parasites. Nucleic Acids Research.
- Energetic particle fluxes data base of "CORONAS-I" satellite observations. Advances in Space Research: the official journal of the Committee on Space Research (COSPAR).
- Automatic annotation of BIND molecular interactions from three-dimensional structures. Biopolymers.
- Analysis of low complex region peptides derived from mollusk shell matrix proteins using CID, high-energy collisional dissociation, and electron transfer dissociation on an LTQ-Orbitrap: implications for peptide-to-spectrum match. Proteomics.
- BiomarkerDigger: a versatile disease proteome database and analysis platform for the identification of plasma cancer biomarkers. Proteomics.
- Extension of the system constants database for open-tubular columns: system maps at low and intermediate temperatures for four new columns. Journal of Chromatography A.
- CDD: a curated Entrez database of conserved domain alignments. Nucleic Acids Research.
- The hepatitis C sequence database in Los Alamos. Nucleic Acids Research.
- Hypothalamic expression of ART, a novel gene related to agouti, is up-regulated in obese and diabetic mutant mice. Genes & Development.
- RiDs db: Repeats in diseases database. Bioinformation.
- A7DB: a relational database for mutational, physiological and pharmacological data related to the alpha7 nicotinic acetylcholine receptor. BMC Neuroscience.
- Assessment of the selectivity equivalence of DB-608 and DB-624 open-tubular columns for gas chromatography. Journal of Separation Science.
- Entrez Gene: gene-centered information at NCBI. Nucleic Acids Research.
- Organelle DB: a cross-species database of protein localization and function. Nucleic Acids Research.
- Classification of protein sequences by homology modeling and quantitative analysis of electrostatic similarity. Proteins.
- AH-DB: collecting protein structure pairs before and after binding. Nucleic Acids Research.
- Discrete breathers in protein structures. Physical Biology.
- Systems pharmacology of adverse event mitigation by drug combinations.
Science Translational Medicine.

Obesity and diabetes: Two-for-one strike at incretins. Nature Reviews Drug Discovery. A recent paper published in Science Translational Medicine has shown that an investigational diabetes drug that targets two members of the incretin family of hormone receptors could have better metabolic effects than approved diabetes drugs such as exenatide (Byetta; Amylin) and liraglutide (Victoza; Novo Nordisk). Exenatide and liraglutide are agonists at the glucagon-like peptide 1 (GLP1) receptor.

Thermosensitive Hydrogel PEG-PLGA-PEG Enhances Engraftment of Muscle-derived Stem Cells and Promotes Healing in Diabetic Wound. Molecular Therapy. Regenerating new tissue using cell transplantation has relied on successful cell engraftment in the host; however, cell engraftment into the diabetic skin wound is not as successful as in many other tissues. We used a biodegradable and biocompatible triblock co-polymer, poly(ethylene glycol-b-[dl-lactic acid-co-glycolic acid]-b-ethylene glycol) (PEG-PLGA-PEG), which forms a thermosensitive hydrogel, as a wound dressing and scaffold. We found that the thermosensitive hydrogel increased the engraftment of muscle-derived stem cells (MDSCs) by 20- to 30-fold until day 20, when the wound was completely closed in a db/db genetically diabetic mouse model. At day 9, 30% of the transplanted MDSCs were found to remain, and 15% remained at day 20 after transplantation. The increased engraftment resulted in enhanced wound healing, as indicated by the wound closure rate, epithelium migration, and collagen deposition. Using MDSCs stably expressing β-gal and immunofluorescence, we found that 25% of MDSCs differentiated into fibroblasts, 10% into myofibroblasts, and 10% into endothelial cells. We conclude that using the thermosensitive hydrogel as a scaffold increased the engraftment of MDSCs, leading to improved diabetic wound healing, possibly by retaining the cells at the wound site for longer.
Template Construction Grammar: From Visual Scene Description to Language Comprehension and Agrammatism. Neuroinformatics.

Summary: One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience.

Abstract: Background: Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results: This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. Comparing the results of a computational experiment on the experimental structure and on a set of its decoy models allows developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions: The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis.
Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database.

Implementation, availability and requirements. Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php. Operating system(s): platform independent. Programming language: Perl, BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI BLAST package, DSSP, Speedfill (Surfnet) and PSAIA. License: free. Any restrictions to use by non-academics: no.

Abstract: Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. In reality, however, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have had problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. The computational neuroscience community therefore needs to discuss how to define successful reproduction of simulation studies.
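The sensitivity to numerical details noted here can be demonstrated with a minimal sketch (an illustration of the general point, not a model from the paper): two forward-Euler integrations of the chaotic Lorenz system that differ only in step size end up on completely different trajectories, even though both stay on the attractor.

```python
# Illustrative sketch (not from the paper above): two forward-Euler
# integrations of the chaotic Lorenz system that differ only in step size
# decorrelate completely, even though both remain bounded on the attractor.

def lorenz_step(state, dt, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz system."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

dt = 0.001
a = b = (1.0, 1.0, 1.0)
max_divergence = 0.0
for i in range(30000):                               # 30 time units
    a = lorenz_step(a, dt)                           # one coarse step
    b = lorenz_step(lorenz_step(b, dt / 2), dt / 2)  # two half steps
    if i > 15000:                                    # compare after transients
        max_divergence = max(max_divergence, abs(a[0] - b[0]))

bounded = all(abs(v) < 100.0 for v in a + b)         # both stay on the attractor
```

Any model with chaotic or near-chaotic dynamics behaves this way, which is why bitwise replicability across simulators, step sizes, or floating-point environments is often unattainable, and why "successful reproduction" needs a statistical rather than an exact definition.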
Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing.

Abstract: This paper describes the NIF LinkOut Broker (NLB), built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information's (NCBI's) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach allows other resources (in addition to NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation.

Abstract: Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to duplication of effort and hinders model reuse.
Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages, or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the "NeuroIT Interoperability of Simulators" workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance.

Abstract: Cells are the basic units of biological structure and function; they make up tissues and our bodies. A single cell contains organelles and intracellular solutions, and it is separated from the extracellular liquid surrounding it by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charge and concentration give rise to electrical and chemical potentials, respectively, driving the transport of materials across the membrane. Here we look at the core elements of mathematical modeling of the dynamic behavior of single cells, as well as the basics of numerical simulation.

Abstract: Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms.
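The relationship between concentration differences and electrical potential sketched above is quantified by the Nernst equation, E = (RT/zF) ln([X]_out/[X]_in). A minimal sketch (the ion concentrations below are typical mammalian textbook values, assumed here for illustration):

```python
import math

# Nernst equation: equilibrium potential of an ion across the cell membrane.
# The concentrations are typical mammalian textbook values (an assumption for
# illustration, not data from the chapter above).

R = 8.314     # gas constant, J/(mol*K)
F = 96485.0   # Faraday constant, C/mol
T = 310.0     # body temperature, K

def nernst_mV(z, conc_out_mM, conc_in_mM):
    """Equilibrium (Nernst) potential in millivolts for an ion of valence z."""
    return 1000.0 * (R * T) / (z * F) * math.log(conc_out_mM / conc_in_mM)

E_K = nernst_mV(z=+1, conc_out_mM=5.0, conc_in_mM=140.0)    # potassium, ~ -89 mV
E_Na = nernst_mV(z=+1, conc_out_mM=145.0, conc_in_mM=12.0)  # sodium, ~ +67 mV
```

The opposite signs of the two potentials reflect the opposite concentration gradients, and their spread is what ion-channel dynamics exploit to generate membrane excitability.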
This article provides a guide to entering a new model into ModelDB.

Abstract: In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks: insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML for describing mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program with which users can simulate and/or create a model using graphical representations corresponding to ISML concepts, such as modules and edges. ISIDE also has a command-line interface for manipulating large-scale models, based on Python, a powerful scripting language. ISIDE exports ISML models to C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is part of ISIDE, is a simulator for models written in ISML. Insilico DB comprises three databases: a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at www.physiome.jp.

Abstract: Science requires that results be reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility in general requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup used to analyze the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases have started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results.
In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle together with state-of-the-art tools and standards.

Abstract: Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method, and make available computer scripts, which automate the process of citation entry. We use an Open Citation Project Perl module (PARSER) for parsing citation data, which is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article.

Abstract: The accurate simulation of a neuron's ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations, for, in order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (balanced truncation) and frequency-domain ($\mathcal{H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance.
We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model.

Abstract: Biomedical databases are a major resource of knowledge for research in the life sciences. Biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in the granularity of their data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front ends demands substantial bioinformatics skills. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians.

Abstract: NeuroML is an XML-based language for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells, positioned and synaptically connected in 3D, can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments, and the relationship to other standardisation initiatives in the wider computational neuroscience field.
We also present a list of currently available NeuroML resources, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function.

Abstract: We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or fail to explain) existing empirical results, and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of a model's operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise.

Abstract: We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience.
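The SED/SSR comparison described above can be pictured as a small data model (the class and field names here are hypothetical illustrations, not the actual BODB schema): a model "explains" a piece of empirical data when its summarized simulation result falls within the tolerance attached to the empirical summary.

```python
from dataclasses import dataclass

# Hypothetical sketch of the SED/SSR idea described for BODB; the class and
# field names are illustrative assumptions, not the actual BODB schema.

@dataclass
class SED:
    """Summary of Empirical Data: one quantified experimental observation."""
    description: str
    value: float      # e.g., a measured firing rate in Hz
    tolerance: float  # how close a simulation must come to count as a match

@dataclass
class SSR:
    """Summary of Simulation Results, tested against one SED."""
    model_name: str
    sed: SED
    value: float

def supports_model(ssr: SSR) -> bool:
    """A model explains an SED if its SSR falls within the SED's tolerance."""
    return abs(ssr.value - ssr.sed.value) <= ssr.sed.tolerance

observed = SED("CA1 place-cell peak rate", value=20.0, tolerance=5.0)
result = SSR("toy_ca1_model", sed=observed, value=17.5)
```

Keeping the SED, the SSR, and the comparison protocol as separate first-class records is what lets modelers and experimentalists share the same evidence base.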
Summary: A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to representing and integrating complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp. 741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, as well as synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu.
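To give a flavor of what such neurosimulators compute, here is a minimal leaky integrate-and-fire simulation (an illustrative sketch only: SNNAP itself models much richer dynamics, such as modulated ionic currents and synaptic plasticity, and all parameter values below are generic textbook choices, not SNNAP defaults):

```python
# Minimal leaky integrate-and-fire neuron, forward-Euler integration.
# Illustrative sketch only; SNNAP models far richer dynamics. All parameter
# values are generic textbook choices, not SNNAP defaults.

dt = 0.1          # time step, ms
t_end = 1000.0    # simulated duration, ms
tau_m = 10.0      # membrane time constant, ms
E_L = -65.0       # resting potential, mV
R_m = 10.0        # membrane resistance, MOhm
I_inj = 2.0       # injected current, nA (drive of R_m * I_inj = 20 mV)
V_th = -50.0      # spike threshold, mV
V_reset = -65.0   # post-spike reset potential, mV

V = E_L
spike_times = []
t = 0.0
while t < t_end:
    dV = (-(V - E_L) + R_m * I_inj) / tau_m   # leak plus injected drive
    V += dt * dV
    if V >= V_th:                             # threshold crossing -> spike
        spike_times.append(t)
        V = V_reset
    t += dt

rate_hz = len(spike_times) / (t_end / 1000.0)
```

For these parameters the analytic interspike interval is -tau_m * ln(1 - (V_th - E_L) / (R_m * I_inj)) ≈ 13.9 ms, i.e. about 72 Hz, which the simulated spike count approximates.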
Conclusion: ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and to build on them for further research. Its use can help the field of computational neuroscience enter a new era of expedited numerical experimentation.

Abstract: Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA-A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA-A reversal potential (E_GABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses.

Abstract: Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity.
Ideally, such interfaces should provide high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and of observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process.

Abstract: Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and the code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neurite types (pyramidal cell dendrites and olivary axonal projection terminals), and labeling methods (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida.
None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstruction systems.

The aim of the study was to elucidate the biophysical mechanisms capable of determining specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions arriving at the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductances of their dendritic membrane. This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high- and low-depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron's output associated with receiving a short series of presynaptic action potentials are determined not only by the arrival time of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli.
These findings open a novel aspect of the functioning of neurons and neuronal networks.

Abstract: Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single-neuron models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and to the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and in this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling.

Abstract: Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements, such as delay-line representations of time, that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives.
One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS-US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive, biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex.

On mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of dendritic morphology on the passive electrical transfer characteristics of these cells and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses distributed uniformly along the dendrites.
For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and effectiveness of somatopetal current transfer (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization as a whole and for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern consisted of periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals that decreased with rising intensity of activation. Under intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (two activity patterns, periodic doublets and continuous discharges) is considerably narrower than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, the latter cells displayed a variety of output discharge patterns of different complexities, including stochastic ones. A relatively short apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those of layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets.
As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites, and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in shaping the activity of layer 2/3 pyramidal neurons and, thereby, a higher efficiency of intracortical association. Abstract Phase precession is one of the best-known examples supporting the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using non-homogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of its involvement in spatial tasks.
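The integrate-and-fire-with-conductance-synapse formalism named in the phase-precession abstract can be sketched as a single Euler-stepped equation. All parameter values below are generic textbook numbers, not those of the published model:

```python
def lif_step(v, g_exc, dt=0.1, tau_m=20.0, e_leak=-70.0,
             e_exc=0.0, v_thresh=-54.0, v_reset=-60.0):
    """One Euler step (dt in ms) of a conductance-based leaky
    integrate-and-fire neuron. g_exc is the excitatory synaptic
    conductance expressed in units of the leak conductance.
    Returns (new_v, spiked)."""
    dv = (-(v - e_leak) - g_exc * (v - e_exc)) / tau_m
    v = v + dt * dv
    if v >= v_thresh:
        return v_reset, True
    return v, False

# Drive the cell with a constant synaptic conductance for 1 s of
# simulated time and count the spikes.
v, spikes = -70.0, 0
for _ in range(10000):
    v, spiked = lif_step(v, g_exc=0.5)
    spikes += spiked
```

A full phase-precession model would replace the constant `g_exc` with theta- and space-modulated Poisson input, as the abstract describes.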
Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed-rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in computational neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org. In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed morphological data when generating cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators, such as the NEURON simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any newly available data can be included in NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks.
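The kind of channel formulas the channels chapter above refers to can be illustrated with the textbook Hodgkin–Huxley delayed-rectifier activation variable n (rates in 1/ms, voltages in mV, shifted so rest sits near −65 mV). This is a generic sketch of the standard formalism, not a formula taken from the chapter:

```python
import math

def alpha_n(v):
    """HH delayed-rectifier K+ opening rate (1/ms), v in mV."""
    if abs(v + 55.0) < 1e-7:
        return 0.1                     # limit value at the 0/0 point
    return 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))

def beta_n(v):
    """HH delayed-rectifier K+ closing rate (1/ms)."""
    return 0.125 * math.exp(-(v + 65.0) / 80.0)

def n_inf(v):
    """Steady-state activation: where dn/dt = alpha*(1-n) - beta*n = 0."""
    a, b = alpha_n(v), beta_n(v)
    return a / (a + b)
```

At rest (−65 mV) this gives n_inf ≈ 0.32, and n_inf rises toward 1 with depolarization, which is what makes the conductance a delayed rectifier.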
Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which will subsequently continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionality to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from genomics, transcriptomics, proteomics, and metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems.
However, sharing such large amounts of frequently heterogeneous and distributed experimental data requires both standard data formats and public repositories. Standardization and a public storage system are also important for modeling, because models can then be shared irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks such as numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. It is often feasible to divide a 3D-reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available.
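For an unbranched cable, the direct Gaussian elimination described in the multicompartment abstract above reduces to the Thomas algorithm for tridiagonal systems (branched trees use the same elimination, ordered from the leaves toward the root). A minimal sketch, independent of NEURON's actual implementation:

```python
def solve_tridiagonal(a, b, c, d):
    """Thomas algorithm: solve a tridiagonal linear system, the core
    step of implicit time stepping for an unbranched compartmental
    cable. a: sub-diagonal (a[0] unused), b: diagonal, c: super-
    diagonal (c[-1] unused), d: right-hand side. Returns x as a list."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):              # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):     # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

Because the elimination touches each compartment once, the cost is linear in the number of compartments, which is why splitting a tree across processors preserves both accuracy and near-linear speedup.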
However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources, and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models.
However, the range of current strength needed for a depolarization block could easily be reached with random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block. We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, preventing conventional analysis methods from keeping pace. Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed, and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It can acquire data from common recording and simulation platforms and exchange data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable.
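The feature-extraction step the PANDORA abstract relies on (reducing a raw trace to database-ready measures such as firing rate) can be sketched in plain Python. PANDORA itself is a Matlab toolbox; the measures and values below are generic examples, not its API:

```python
def firing_rate(spike_times_ms, duration_ms):
    """Reduce a raw spike train to one database-ready feature:
    mean firing rate in Hz."""
    return 1000.0 * len(spike_times_ms) / duration_ms

def mean_isi(spike_times_ms):
    """Mean inter-spike interval in ms (None if fewer than 2 spikes)."""
    if len(spike_times_ms) < 2:
        return None
    isis = [b - a for a, b in zip(spike_times_ms, spike_times_ms[1:])]
    return sum(isis) / len(isis)

# One extracted row replaces megabytes of raw trace in the database.
row = {"firing_rate": firing_rate([10, 110, 210, 310], 1000.0),
       "mean_isi": mean_isi([10, 110, 210, 310])}
```

Storing only such rows is what yields the orders-of-magnitude savings in storage and query time that the abstract describes.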
PANDORA is free to use and modify because it is open-source ( http://software.incf.org/software/pandora/home ). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then examine how well we can use L = 1, 2, … of the time series for the state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known by the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop. Serotonin from the R cell enhances S cell excitability, S cell impulses cross an electrical synapse into the C interneuron, and the C interneuron excites the R cell via a glutamatergic synapse. The C interneuron has two unusual characteristics. First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R cell. A compartmental computational model was developed to test whether multiple independent spike initiation zones in the C interneuron are sufficient to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground.
Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made into understanding cerebellar cortical processing, but neural coding at the output stage of the cerebellum, in the deep cerebellar nuclei (DCN), remains poorly understood. The DCN are unlikely to be a simple relay nucleus, because Purkinje cell inhibition has to be turned into an excitatory output signal, and DCN neurons exhibit complex intrinsic properties. In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals. Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. Our simulations produced a good match with DCN current-clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow deinactivation of rebound currents.
Fast rebound bursts due to T-type calcium current and slow rebounds due to persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by HCN current. Our findings suggest that the active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Many software packages have tried to resolve this mismatched granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior. This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation of channel gating, mechanisms for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing, with the ultimate goal of creating a modular framework that will help researchers across different disciplines to collaborate effectively using a modern neural simulation platform.
Abstract We have developed a simulation tool within the NEURON simulator to assist in the organization, verification, and analysis of simulations. This tool, named the Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single-cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB. Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems. This is similar to how single cells function as systems of protein networks, where, for example, the ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from how it behaves alone. The dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. Interactions are made by different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts to produce heartbeats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics representing neural information.
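The SELECT-style querying that NQS brings to NEURON simulations (described above) can be imitated with any relational engine. A sketch using Python's built-in sqlite3, with made-up parameter names and values rather than NQS's actual schema:

```python
import sqlite3

# Store extracted simulation features, then query them with SQL,
# as NQS does inside NEURON. Column names and values are hypothetical.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE traces (cell TEXT, firing_rate REAL, "
            "spike_width REAL)")
con.executemany("INSERT INTO traces VALUES (?, ?, ?)",
                [("c1", 12.5, 1.1), ("c2", 55.0, 0.8), ("c3", 3.2, 1.4)])

# SELECT-style query: which cells fire above 10 Hz, fastest first?
fast = con.execute("SELECT cell FROM traces WHERE firing_rate > 10 "
                   "ORDER BY firing_rate DESC").fetchall()
# fast -> [('c2',), ('c1',)]
```

Relating outputs back to parameters, as the abstract describes, is then a matter of a JOIN between a parameter table and an output table.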
In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of the limbs can be performed. This chapter illustrates several mathematical techniques through examples from the modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data. Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation. For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice those of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits, as expected in mature animals.
Predictions were made for passive-cell current-clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron, and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting is unnecessary. Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron. Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials.
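The leaky-integrator representation used by the method above (presynaptic population activity filtered with a fixed kernel) can be sketched as a discrete exponential filter. The spike counts, time step, and time constant here are illustrative, not the paper's values:

```python
import math

def filtered_potential(spike_counts, dt=1.0, tau=10.0):
    """Leaky-integrator view of the subthreshold membrane potential:
    per-bin presynaptic population spike counts convolved with an
    exponential kernel exp(-t/tau). dt and tau in ms; amplitude
    units are arbitrary."""
    decay = math.exp(-dt / tau)
    v, trace = 0.0, []
    for c in spike_counts:
        v = v * decay + c   # decay the running sum, add new input
        trace.append(v)
    return trace
```

Inverting this relationship (recovering statistics of the population spike count from the filtered trace) is what lets the CuBIC machinery be applied to a single intracellular recording.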
Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet even for this neuronal class the precise input-output relationship is to date unknown. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites. We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature. To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns and the corresponding firing-frequency input-output function was logarithmic. The regular or irregular nature of the input synaptic intervals was only reflected in the regularity of output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input-output relationships.
Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input-output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats. In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database http://www.ebi.ac.uk/biomodels/ is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats.
BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems and to study the clustering of models based upon their annotations. Model deposition to the database is now recommended by several publishers of scientific journals. The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive? Schema theory is a computational framework that allows the simulation of perceptuomotor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We present first its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions.
We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performance of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world-knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performance of agrammatic aphasics measured using sentence-picture matching tasks. This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Mechanisms of very fast oscillations in networks of axons coupled by gap junctions. Journal of Computational Neuroscience. Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means of facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and models of different quality are used instead.
On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php. Operating system(s): Platform independent. Programming language: Perl/BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI BLAST package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist.
However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database.
This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages, or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies.
A single cell includes organelles and intracellular solutions, and it is separated from the outer environment of extracellular fluid surrounding the cell by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charge and concentration give rise, respectively, to electrical and chemical potentials, driving the transport of materials across the membrane. Here we look at the core of mathematical modeling associated with the dynamic behavior of single cells, as well as the basis of numerical simulations. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program in which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command-line interface, based on Python, a powerful scripting language, for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e.
a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp. Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases have started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method and make available computer scripts which automate the process of citation entry. We use an Open Citation Project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations.
To understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain ($\mathcal{H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in the granularity of their data, storage formats, database systems, supported data models and interfaces. In order to make full use of the available data resources, the high number of heterogeneous query methods and front ends demands advanced bioinformatics skills. Consequently, the manual inspection of database entries and citations is a time-consuming task for which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges.
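The time-domain reduction named in the model-reduction abstract above, Balanced Truncation, can be sketched for a generic stable linear system. This is a standard square-root implementation in NumPy/SciPy, not the authors' quasi-active neuron code; the example system is made up.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

def balanced_truncation(A, B, C, r):
    """Square-root balanced truncation of dx/dt = Ax + Bu, y = Cx."""
    # Controllability and observability Gramians via Lyapunov equations
    Wc = solve_continuous_lyapunov(A, -B @ B.T)
    Wo = solve_continuous_lyapunov(A.T, -C.T @ C)
    Lc = cholesky(Wc, lower=True)       # Wc = Lc @ Lc.T
    Lo = cholesky(Wo, lower=True)       # Wo = Lo @ Lo.T
    U, s, Vt = svd(Lo.T @ Lc)           # s: Hankel singular values
    S = np.diag(1.0 / np.sqrt(s[:r]))
    T = Lc @ Vt[:r].T @ S               # maps reduced coords to full coords
    Tinv = S @ U[:, :r].T @ Lo.T        # left inverse of T
    return Tinv @ A @ T, Tinv @ B, C @ T, s

# Made-up stable 6-state system, reduced to 3 states
rng = np.random.default_rng(0)
A = -np.eye(6) + 0.3 * rng.standard_normal((6, 6))
B = rng.standard_normal((6, 1))
C = rng.standard_normal((1, 6))
Ar, Br, Cr, hsv = balanced_truncation(A, B, C, 3)
# DC-gain error obeys the 2 * (sum of discarded Hankel values) bound
dc_err = abs(C @ np.linalg.solve(A, B) - Cr @ np.linalg.solve(Ar, Br))[0, 0]
```

The discarded Hankel singular values bound the worst-case transfer-function error, which is what makes the "reproduce the potential at a few sites only" trade-off quantitative.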
While originally developed to manage and query less structured data, information retrieval techniques have become increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched with a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is an XML-based language for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells, positioned and synaptically connected in 3D, can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments, and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded in empirical evidence, explain (or fail to explain) existing empirical results, and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations.
We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp. 741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)].
Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, as well as synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu. Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and to build on them for further research. Its use can help the field of computational neuroscience enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA-A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties.
Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA-A reversal potential (E_GABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology.
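Because such a reconstruction is just a tree of truncated cones, basic morphometric measures follow directly from it. A minimal sketch using an invented in-memory representation (not the Neuron_Morpho or SWC file format):

```python
import math
from dataclasses import dataclass, field

@dataclass
class Segment:
    """One truncated cone of a reconstructed neurite (hypothetical fields)."""
    length: float                       # cone height, micrometres
    r0: float                           # proximal radius
    r1: float                           # distal radius
    children: list = field(default_factory=list)

def total_length(seg):
    """Summed path length of the subtree rooted at seg."""
    return seg.length + sum(total_length(c) for c in seg.children)

def surface_area(seg):
    """Lateral surface of each truncated cone: pi * (r0 + r1) * slant."""
    slant = math.hypot(seg.length, seg.r0 - seg.r1)
    own = math.pi * (seg.r0 + seg.r1) * slant
    return own + sum(surface_area(c) for c in seg.children)

def n_branch_points(seg):
    """Count nodes where the binary tree bifurcates."""
    own = 1 if len(seg.children) > 1 else 0
    return own + sum(n_branch_points(c) for c in seg.children)

# Toy dendrite: one parent cone bifurcating into two daughter branches
tip1 = Segment(10.0, 0.5, 0.3)
tip2 = Segment(8.0, 0.5, 0.2)
root = Segment(20.0, 1.0, 0.5, children=[tip1, tip2])
print(total_length(root))      # prints 38.0
print(n_branch_points(root))   # prints 1
```

The same traversal pattern underlies the morphometric comparisons the abstract describes (length, surface, branch counts per arborization).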
Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and the code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic types (pyramidal cell dendrites and olivary axonal projection terminals), and labeling methods (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions arriving at the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductances of their dendritic membrane. This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand.
This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high- and low-depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single-neuron models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling.
Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS–US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive, biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network.
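The TD signal these models build on can be illustrated with a tabular TD(0) toy over a serial-compound (delay-line) time representation, which is exactly the kind of dedicated timing circuit the abstract above argues against, but is useful for seeing the prediction error migrate from the US to the CS. All parameters here are invented; this is not the authors' LSTM model.

```python
import numpy as np

def run_trials(n_trials, n_steps=10, cs_step=2, us_step=8,
               alpha=0.1, gamma=0.98):
    """Tabular TD(0) over a within-trial chain of time-step states."""
    V = np.zeros(n_steps + 1)                  # value of each time step
    deltas = np.zeros(n_steps)
    for _ in range(n_trials):
        deltas = np.zeros(n_steps)
        for t in range(cs_step - 1, n_steps):
            r = 1.0 if t == us_step else 0.0   # reward (US) at fixed delay
            delta = r + gamma * V[t + 1] - V[t]  # TD prediction error
            if t >= cs_step:                   # pre-CS baseline frozen at 0:
                V[t] += alpha * delta          # CS onset stays unpredicted
            deltas[t] = delta
    return deltas

early = run_trials(1)
late = run_trials(2000)
# Early in training the error peaks at the US; after learning it has
# moved back to CS onset, mimicking phasic dopamine responses.
print(int(early.argmax()), int(late.argmax()))  # prints: 8 1
```

Because the chain indexes elapsed time explicitly, the sketch also shows why such delay-line schemes scale poorly to multi-second intervals, motivating the recurrent-dynamics account in the abstract.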
The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. Using mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses distributed uniformly along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and function of effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this effectiveness to changes in the uniform membrane conductance) for the reconstructed dendritic arborization as a whole and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals that decreased with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials.
Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably more limited than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of output discharge patterns of different complexities, including stochastic ones. The relatively short length of the apical dendritic subtree of layer 2/3 neurons and, correspondingly, the smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those of layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites, and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promote the predominance of network interaction in shaping the activity of layer 2/3 pyramidal neurons and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta-modulated and space-modulated.
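Theta- and space-modulated Poisson input of this kind is commonly generated by thinning (Lewis–Shedler). A small sketch with invented rate, place-field, and running-speed parameters, not the published model's actual values:

```python
import numpy as np

def ec_input_spikes(duration=2.0, rate_max=80.0, theta_f=8.0,
                    field_center=1.0, field_width=0.2, speed=1.0, seed=0):
    """Inhomogeneous Poisson spike times via thinning.

    rate(t) = place-field Gaussian (position = speed * t) multiplied by
    a theta modulation. rate_max bounds the rate, so candidates drawn
    from a homogeneous process at rate_max are accepted with
    probability rate(t) / rate_max.
    """
    rng = np.random.default_rng(seed)

    def rate(t):
        pos = speed * t
        place = np.exp(-0.5 * ((pos - field_center) / field_width) ** 2)
        theta = 0.5 * (1.0 + np.cos(2.0 * np.pi * theta_f * t))
        return rate_max * place * theta

    spikes, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / rate_max)     # next candidate event
        if t >= duration:
            break
        if rng.uniform() < rate(t) / rate_max:   # thinning (accept/reject)
            spikes.append(t)
    return np.array(spikes)

spk = ec_input_spikes()
# spike times cluster around the place-field crossing at t ~ 1 s
```

Feeding such spike trains into conductance synapses of the PC/IC pair reproduces the kind of drive the model's functional block receives.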
The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using nonhomogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture that is subject to a clocking rhythm, independently of its involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in computational neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org.
In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed morphological data when generating cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to incorporate known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators such as the NEURON simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any newly available data can be included in NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which will subsequently continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionality to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation.
New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data needs both standard data formats and public repositories. Standardization and a public storage system are also important for modeling due to the possibility of sharing models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks like numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor.
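The single-processor baseline referred to above, direct Gaussian elimination on the compartmental matrix, reduces for an unbranched cable to the tridiagonal (Thomas) algorithm. A minimal sketch with toy coefficients; the matrix values are illustrative, not from any actual cable model:

```python
def thomas_solve(a, b, c, d):
    """Direct Gaussian elimination for a tridiagonal system, the
    single-processor baseline for compartmental cable equations.
    a: sub-diagonal (a[0] unused), b: diagonal, c: super-diagonal
    (c[-1] unused), d: right-hand side. Returns the solution x."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                     # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):            # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# 3-compartment toy system: diagonally dominant, as cable matrices are
print(thomas_solve([0, -1, -1], [4, 4, 4], [-1, -1, 0], [3, 2, 3]))
```

Splitting a tree at points with at most two external connections preserves this direct-elimination structure per subtree, which is what makes the parallelization exact.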
It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum.
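For contrast with the inverse method, the traditional CSD estimate that such approaches improve upon is commonly computed as the negative second spatial difference of the recorded potential. A minimal one-dimensional sketch, with an illustrative conductivity value:

```python
import numpy as np

def csd_second_difference(phi, h, sigma=0.3):
    """Traditional current-source-density estimate: the negative
    second spatial difference of the extracellular potential phi
    sampled at electrode spacing h (m), scaled by conductivity
    sigma (S/m). Returns CSD at interior electrodes only, which is
    exactly the boundary limitation inverse CSD methods address."""
    phi = np.asarray(phi, dtype=float)
    return -sigma * (phi[2:] - 2 * phi[1:-1] + phi[:-2]) / h**2

# Toy profile: a potential dip flanked by flat regions
phi = [0.0, 0.0, -1e-3, 0.0, 0.0]   # volts
print(csd_second_difference(phi, h=1e-4))
```

The dip produces a current sink at the center electrode flanked by sources, the canonical sink/source triplet.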
The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. However, the range of current strength needed for a depolarization block could be easily reached with a random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block. We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, impeding conventional analysis methods from keeping pace.
Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable. PANDORA is freely available to be used and modified because it is open-source ( http://software.incf.org/software/pandora/home ). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then we examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop. Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics.
First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test the sufficiency of multiple, independent spike initiation zones in the C-interneuron to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made to understand cerebellar cortical processing, but neural coding at the output stage of the cerebellum in the deep cerebellar nuclei (DCN) remains poorly understood. The DCN are unlikely to be just a relay nucleus because Purkinje cell inhibition has to be turned into an excitatory output signal, and DCN neurons exhibit complex intrinsic properties. In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals.
Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. Our simulations closely matched DCN current-clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow deinactivation of rebound currents. Fast rebound bursts due to T-type calcium current and slow rebounds due to persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by HCN current. Our findings suggest that active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior.
This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation of channel gating, the mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform. Abstract No Abstract Available Abstract We have developed a simulation tool within the NEURON simulator to assist in organization, verification, and analysis of simulations. This tool, named the Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single-cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB. Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems.
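A relational organization of simulation parameters and outputs in the spirit of NQS can be sketched with Python's built-in sqlite3; the table, columns, and values here are hypothetical illustrations, not NQS's actual schema.

```python
import sqlite3

# Illustrative relational layout: one row per simulation, storing the
# parameters used and a summary output (firing rate) extracted from it.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE sims (
    cell TEXT, g_na REAL, g_k REAL, firing_rate REAL)""")
rows = [("CA1a", 0.12, 0.036, 12.5),
        ("CA1b", 0.10, 0.036, 9.0),
        ("CA1c", 0.12, 0.045, 15.2)]
con.executemany("INSERT INTO sims VALUES (?, ?, ?, ?)", rows)

# A SELECT-based query relating outputs back to parameters
fast = con.execute(
    "SELECT cell, firing_rate FROM sims "
    "WHERE firing_rate > ? ORDER BY firing_rate DESC", (10.0,)).fetchall()
print(fast)   # cells firing above 10 Hz, with their rates
```

Once summary characteristics are in such a table, relating outputs to parameters becomes a one-line query rather than a pass over the raw traces.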
This is similar to how single cells function as systems of protein networks, where, for example, ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from how it behaves alone. The dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. Interactions are mediated by different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts to produce heart beats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics representing neural information. In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of the limbs can be performed. This chapter illustrates several mathematical techniques through examples from the modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data. Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation.
For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice that of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits as expected in mature animals. Predictions were made for passive-cell current clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron, and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting is obsolete. Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron.
Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials. Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet, the precise input-output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites. We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature.
To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns, and the corresponding firing frequency input-output function was logarithmic. The regular/irregular nature of the input synaptic intervals was only reflected in the regularity of output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input-output relationships. Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input-output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats.
In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database http://www.ebi.ac.uk/biomodels/ is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems, and to study the clustering of models based upon their annotations. Model deposition to the database today is advised by several publishers of scientific journals. The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive?
Schema theory is a computational framework that allows the simulation of perceptuomotor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We present first its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performances of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world-knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performances of agrammatic aphasics measured using sentence-picture matching tasks. This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Abstract Background The "inverse" problem is related to the determination of unknown causes on the basis of the observation of their effects. This is the opposite of the corresponding "direct" problem, which relates to the prediction of the effects generated by a complete description of some agencies.
The solution of an inverse problem entails the construction of a mathematical model and starts from a number of experimental data. In this respect, inverse problems are often ill-conditioned, as the experimental data available are often insufficient to unambiguously solve the mathematical model. Several approaches to solving inverse problems are possible, both computational and experimental, some of which are mentioned in this article. In this work, we will describe in detail an attempt to solve an inverse problem which arose in the study of an intracellular signaling pathway. Results Using a genetic algorithm to find a suboptimal solution to the optimization problem, we have estimated a set of unknown parameters describing a kinetic model of a signaling pathway in the neuronal cell. The model is composed of mass-action ordinary differential equations, where the kinetic parameters describe protein-protein interactions, protein synthesis and degradation. The algorithm has been implemented on a parallel platform. Several potential solutions of the problem have been computed, each solution being a set of model parameters. A subset of parameters has been selected on the basis of their small coefficient of variation across the ensemble of solutions. Conclusion Despite the lack of sufficiently reliable and homogeneous experimental data, the genetic algorithm approach has allowed us to estimate the approximate value of a number of model parameters in a kinetic model of a signaling pathway; these parameters have been assessed to be relevant for the reproduction of the available experimental data. Abstract Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown.
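The genetic-algorithm parameter estimation described in the inverse-problem abstract can be sketched on a toy one-parameter decay model standing in for the mass-action ODEs; the model, the tournament/mutation operators, and all settings are illustrative assumptions, not those of the actual study.

```python
import random, math

def simulate(k, t_pts):
    """Toy mass-action kinetics: single-species decay dx/dt = -k*x,
    x(0) = 1, standing in for the (unspecified) pathway ODEs."""
    return [math.exp(-k * t) for t in t_pts]

def fitness(k, t_pts, data):
    """Negative squared error between model output and the data."""
    return -sum((m - d) ** 2 for m, d in zip(simulate(k, t_pts), data))

def genetic_fit(t_pts, data, pop_size=40, gens=60, seed=1):
    """Minimal genetic algorithm: tournament selection of two plus
    Gaussian mutation over a one-parameter genome (rate constant k)."""
    rng = random.Random(seed)
    pop = [rng.uniform(0.0, 5.0) for _ in range(pop_size)]
    for _ in range(gens):
        new = []
        for _ in range(pop_size):
            a, b = rng.sample(pop, 2)        # tournament of two
            parent = a if fitness(a, t_pts, data) > fitness(b, t_pts, data) else b
            new.append(max(0.0, parent + rng.gauss(0.0, 0.1)))  # mutate
        pop = new
    return max(pop, key=lambda k: fitness(k, t_pts, data))

t_pts = [0.5 * i for i in range(10)]
data = simulate(2.0, t_pts)                  # synthetic "experimental" data
print(genetic_fit(t_pts, data))              # recovers a rate near 2.0
```

Running many seeded instances of such a search and keeping parameters with a small coefficient of variation across the ensemble is the selection criterion the abstract describes.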
Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms. First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axodendritic feedback on pyramidal cells is only effective in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase, the theta/gamma ratio. Moreover, electric fields differentially affect average theta frequency depending on specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. Abstract The brain is extraordinarily complex, containing 10¹¹ neurons linked by 10¹⁴ connections. We can improve our understanding of individual neurons and neuronal networks by describing their behavior in mathematical and computational models.
This chapter provides an introduction to neural modeling, laying the foundation for several basic models and surveying key topics. After some discussion of the motivations of modelers and the uses of neural models, we explore the properties of electrically excitable membranes. We describe in some detail the Hodgkin–Huxley model, the first neural model to describe biophysically the behavior of biological membranes. We explore how this model can be extended to describe a variety of excitable membrane behaviors, including axonal propagation, dendritic processing, and synaptic communication. This chapter also covers mathematical models that replicate basic neural behaviors through more abstract mechanisms. We briefly explore efforts to extend single-neuron models to the network level and provide several examples of insights gained through this process. Finally, we list common resources, including modeling environments and repositories, that provide the guidance and parameter sets necessary to begin building neural models. Abstract We have developed a program, NeuroText, to populate the neuroscience databases in SenseLab (http://senselab.med.yale.edu/senselab) by mining the natural-language text of neuroscience articles. NeuroText uses a two-step approach to identify relevant articles. The first step (preprocessing), aimed at 100% sensitivity, identifies abstracts containing database keywords. In the second step, potentially relevant abstracts identified in the first step are processed for specificity dictated by database architecture, and by neuroscience, lexical, and semantic contexts. NeuroText results were presented to the experts for validation using a dynamically generated interface that also allows expert-validated articles to be automatically deposited into the databases. Of the test set of 912 articles, 735 were rejected at the preprocessing step. For the remaining articles, the accuracy of predicting database-relevant articles was 85%.
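The Hodgkin–Huxley model discussed in the modeling chapter above can be sketched with its standard squid-axon parameters and simple forward-Euler integration; the stimulus amplitude and duration are arbitrary illustrative choices.

```python
import math

# Standard Hodgkin-Huxley squid-axon parameters: conductances in mS/cm^2,
# potentials in mV, capacitance in uF/cm^2, time in ms.
C_M, G_NA, G_K, G_L = 1.0, 120.0, 36.0, 0.3
E_NA, E_K, E_L = 50.0, -77.0, -54.4

def alpha_beta(v):
    """Rate constants (1/ms) for the m, h, and n gating variables."""
    am = 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
    bm = 4.0 * math.exp(-(v + 65.0) / 18.0)
    ah = 0.07 * math.exp(-(v + 65.0) / 20.0)
    bh = 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
    an = 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
    bn = 0.125 * math.exp(-(v + 65.0) / 80.0)
    return am, bm, ah, bh, an, bn

def run_hh(i_ext=10.0, t_stop=50.0, dt=0.01):
    v, m, h, n = -65.0, 0.053, 0.596, 0.317   # approximate resting state
    trace = []
    for _ in range(int(t_stop / dt)):
        am, bm, ah, bh, an, bn = alpha_beta(v)
        m += (am * (1 - m) - bm * m) * dt     # gating kinetics
        h += (ah * (1 - h) - bh * h) * dt
        n += (an * (1 - n) - bn * n) * dt
        i_na = G_NA * m**3 * h * (v - E_NA)   # transient sodium
        i_k = G_K * n**4 * (v - E_K)          # delayed rectifier
        i_l = G_L * (v - E_L)                 # passive leak
        v += (i_ext - i_na - i_k - i_l) / C_M * dt
        trace.append(v)
    return trace

trace = run_hh()
print(max(trace) > 0.0)   # sustained current elicits spikes overshooting 0 mV
```

The three currents here are exactly the passive leak, transient sodium, and delayed rectifier channels named earlier in the section.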
Twenty-two articles were erroneously identified. NeuroText deferred decisions on 29 articles to the expert. A comparison of NeuroText results versus the experts’ analyses revealed that the program failed to correctly identify articles’ relevance due to concepts that did not yet exist in the knowledge base or due to vaguely presented information in the abstracts. NeuroText uses two “evolution” techniques (supervised and unsupervised) that play an important role in the continual improvement of the retrieval results. Software that uses the NeuroText approach can facilitate the creation of curated, special-interest bibliography databases. Abstract Dendrites play an important role in neuronal function and connectivity. This chapter introduces the first section of the book, focusing on the morphological features of dendritic tree structures and the role of dendritic trees in the circuit. We provide an overview of quantitative procedures for data collection, analysis, and modeling of dendrite shape. Our main focus lies on the description of morphological complexity and how one can use this description to unravel neuronal function in dendritic trees and neural circuits. Abstract The chapter is organised in two parts. In the first part, the focus is on a combined power spectral and nonlinear behavioural analysis of a neural mass model of the thalamocortical circuitry. The objective is to study the effectiveness of such “multimodal” analytical techniques in model-based studies investigating the neural correlates of abnormal brain oscillations in Alzheimer’s disease (AD). The power spectral analysis presented here is a study of the “slowing” (decreasing dominant frequency of oscillation) within the alpha frequency band (8–13 Hz), a hallmark of electroencephalogram (EEG) dynamics in AD. Analysis of the nonlinear dynamical behaviour focuses on the bifurcating property of the model.
The results show that the alpha rhythmic content is maximal at close proximity to the bifurcation point—an observation made possible by the “multimodal” approach adopted herein. Furthermore, a slowing in alpha rhythm is observed for increasing inhibitory connectivity—a consistent feature of our research into neuropathological oscillations associated with AD. In the second part, we present a power spectral analysis of a model that implements multiple feedforward and feedback connectivities in the thalamo-cortico-thalamic circuitry, and is thus more advanced in terms of biological plausibility. This study looks at the effects of synaptic connectivity variation on the power spectra within the delta (1–3 Hz), theta (4–7 Hz), alpha (8–13 Hz) and beta (14–30 Hz) bands. An overall slowing of EEG with decreasing synaptic connectivity is observed, indicated by a decrease of power within the alpha and beta bands and an increase in power within the theta and delta bands. Thus, the model behaviour conforms to longitudinal studies in AD indicating an overall slowing of EEG. Abstract Neuronal processes grow under a variety of constraints, both immediate and evolutionary. Their pattern of growth provides insight into their function. This chapter begins by reviewing morphological metrics used in analyses and computational models. Molecular mechanisms underlying growth and plasticity are then discussed, followed by several types of modeling approaches. Computer simulation of morphology can be used to describe and reproduce the statistics of neuronal types or to evaluate growth and functional hypotheses. For instance, models in which branching is probabilistically determined by diameter produce realistic virtual dendrites of most neuronal types, though more complicated statistical models are required for other types. Virtual dendrites grown under environmental and/or functional constraints are also discussed, offering a broad perspective on dendritic morphology.
Abstract Chopper neurons in the cochlear nucleus are characterized by intrinsic oscillations with short average interspike intervals (ISIs) and relative level independence of their response (Pfeiffer, Exp Brain Res 1:220–235, 1966; Blackburn and Sachs, J Neurophysiol 62:1303–1329, 1989), properties which are unattained by models of single chopper neurons (e.g., Rothman and Manis, J Neurophysiol 89:3070–3082, 2003a). In order to achieve short ISIs, we optimized the time constants of the Rothman and Manis single-neuron model with genetic algorithms. Some parameters in the optimization, such as the temperature and the capacitance of the cell, turned out to be crucial for the required acceleration of their response. In order to achieve the relative level independence, we have simulated an interconnected network consisting of Rothman and Manis neurons. The results indicate that by stabilization of intrinsic oscillations, it is possible to simulate the physiologically observed level independence of ISIs. As previously reviewed and demonstrated (Bahmer and Langner, Biol Cybern 95:371–379, 2006a), chopper neurons show a preference for ISIs which are multiples of 0.4 ms. It was also demonstrated that the network consisting of two optimized Rothman and Manis neurons which activate each other with synaptic delays of 0.4 ms shows a preference for ISIs of 0.8 ms. Oscillations with various multiples of 0.4 ms as ISIs may be derived from neurons in a more complex network that is activated by simultaneous input of an onset neuron and several auditory nerve fibers. Abstract Recently, a class of two-dimensional integrate-and-fire models has been used to faithfully model spiking neurons. This class includes the Izhikevich model, the adaptive exponential integrate-and-fire model, and the quartic integrate-and-fire model. The bifurcation types for the individual neurons have been thoroughly analyzed by Touboul (SIAM J Appl Math 68(4):1045–1079, 2008).
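A genetic-algorithm parameter fit of the kind used in the chopper-neuron study above can be sketched generically. The quadratic fitness surrogate and the 0.4 ms target below are illustrative stand-ins for the study's actual objective (matching simulated ISIs), and the population size, selection rule, and mutation scale are our own assumptions.

```python
import random

random.seed(1)
TARGET = 0.4  # ms; illustrative target value standing in for the measured ISI

def fitness(tau):
    """Surrogate objective: how close a candidate time constant lands to the target."""
    return -(tau - TARGET) ** 2

def evolve(pop_size=30, generations=60, sigma=0.05):
    """Truncation-selection GA over a single scalar parameter."""
    pop = [random.uniform(0.05, 2.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)        # best candidates first
        parents = pop[: pop_size // 5]             # keep the top 20%
        children = [max(1e-3, random.choice(parents) + random.gauss(0.0, sigma))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children                   # elitism plus mutated offspring
    return max(pop, key=fitness)

best_tau = evolve()
```

On this one-dimensional surrogate the search converges to the target within a few dozen generations; a real application would replace `fitness` with a full model simulation scored against recorded ISIs.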
However, when the models are coupled together to form networks, the networks can display bifurcations that an uncoupled oscillator cannot. For example, the networks can transition from firing with a constant rate to burst firing. This paper introduces a technique to reduce a full network of this class of neurons to a mean field model, in the form of a system of switching ordinary differential equations. The reduction uses population density methods and a quasi-steady-state approximation to arrive at the mean field system. Reduced models are derived for networks with different topologies and different model neurons with biologically derived parameters. The mean field equations are able to qualitatively and quantitatively describe the bifurcations that the full networks display. Extensions and higher order approximations are discussed. Conclusions Our proposed database schema for managing heterogeneous data is a significant departure from conventional approaches. It is suitable only when the following conditions hold:
• The number of classes of entity is numerous, while the number of actual instances in most classes is expected to be very modest.
• The number (and nature) of the axes describing an arbitrary fact (as an N-ary association) varies greatly.
We believe that nervous system data is an appropriate problem domain to test such an approach. Abstract Stereotactic human brain atlases, either in print or electronic form, are useful not only in functional neurosurgery, but also in neuroradiology, human brain mapping, and neuroscience education. The existing atlases represent structures on 2D plates taken at variable, often large intervals, which limit their applications. To overcome this problem, we propose a hybrid interpolation approach to build high-resolution brain atlases from the existing ones. In this approach, all section regions of each object are grouped into two types of components: simple and complex.
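The schema conditions stated in the conclusions above, many entity classes with few instances each, and facts whose axes vary in number and kind, describe an entity-attribute-value style design in which a fact's axes are stored as rows rather than fixed columns. A minimal sqlite3 sketch follows; all table, column, and sample names are our own invention, not the authors' actual schema.

```python
import sqlite3

# Hypothetical EAV-style schema: entity classes, instances, and N-ary facts
# whose axes are stored as rows in fact_axis rather than as fixed columns.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE entity_class(class_id INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE entity(entity_id INTEGER PRIMARY KEY,
                    class_id INTEGER REFERENCES entity_class, label TEXT);
CREATE TABLE fact(fact_id INTEGER PRIMARY KEY);
CREATE TABLE fact_axis(fact_id INTEGER REFERENCES fact,
                       axis TEXT, entity_id INTEGER REFERENCES entity);
""")
conn.execute("INSERT INTO entity_class(name) VALUES ('neuron_type'), ('brain_region')")
conn.execute("INSERT INTO entity(class_id, label) "
             "VALUES (1, 'chopper'), (2, 'cochlear nucleus')")
conn.execute("INSERT INTO fact(fact_id) VALUES (1)")
conn.executemany("INSERT INTO fact_axis VALUES (?, ?, ?)",
                 [(1, "subject", 1), (1, "located_in", 2)])

# Query: which region does fact 1 attach to its 'located_in' axis?
rows = conn.execute("""
SELECT e.label FROM fact_axis fa
JOIN entity e ON e.entity_id = fa.entity_id
WHERE fa.fact_id = 1 AND fa.axis = 'located_in'
""").fetchall()
```

Adding a new axis to a fact is a row insert, not a schema migration, which is the property the authors' conditions call for.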
A NURBS-based method is designed for interpolation of the simple components, and a distance-map-based method for the complex components. Once all individual objects in the atlas are interpolated, the results are combined hierarchically in a bottom-up manner to produce the interpolation of the entire atlas. In the procedure, different knowledge-based and heuristic strategies are used to preserve various topological relationships. The proposed approach has been validated quantitatively and used for interpolation of two stereotactic brain atlases: the Talairach-Tournoux atlas and the Schaltenbrand-Wahren atlas. The interpolations produced are of high resolution and feature high accuracy, 3D consistency, smooth surface, and preserved topology. They potentially open new applications for electronic stereotactic brain atlases, such as atlas reformatting, accurate 3D display, and 3D nonlinear warping against normal and pathological scans. The proposed approach is also potentially useful in other applications that require interpolation and 3D modeling from sparse and/or variable inter-section interval data. An example of 3D modeling of an infarct from MR diffusion images is presented. Abstract Quantitative neuroanatomical data are important for the study of many areas of neuroscience, and the complexity of problems associated with neuronal structure requires that research from multiple groups across many disciplines be combined. However, existing neuron-tracing systems, simulation environments, and tools for the visualization and analysis of neuronal morphology data use a variety of data formats, making it difficult to exchange data in a readily usable way. The NeuroML project was initiated to address these issues, and here we describe an extensible markup language standard, MorphML, which defines a common data format for neuronal morphology data and associated metadata to facilitate data and model exchange, database creation, model publication, and data archiving.
We describe the elements of the standard in detail and outline the mappings between this format and those used by a number of popular applications for reconstruction, simulation, and visualization of neuronal morphology. Abstract A major part of biology has become a class of physical and mathematical sciences. We have started to feel, though still with some hesitation, that it will become possible to predict biological events that will happen in the future of one’s life and to control some of them if so desired, based upon the understanding of genomic information of individuals and physical and chemical principles governing physiological functions of living organisms at multiple scales and levels, from molecules to cells and organs. Abstract A half-center oscillator (HCO) is a common circuit building block of central pattern generator networks that produce rhythmic motor patterns in animals. Here we constructed an efficient relational database table with the resulting characteristics of Hill et al.’s (J Comput Neurosci 10:281–302, 2001) simple conductance-based HCO model. The model consists of two reciprocally inhibitory neurons and replicates the electrical activity of the oscillator interneurons of the leech heartbeat central pattern generator under a variety of experimental conditions. Our long-range goal is to understand how this basic circuit building block produces functional activity under a variety of parameter regimes and how different parameter regimes influence stability and modulatability. By using the latest developments in computer technology, we simulated and stored large amounts of data (on the order of terabytes). We systematically explored the parameter space of the HCO and corresponding isolated neuron models using a brute-force approach. We varied a set of selected parameters (maximal conductance of intrinsic and synaptic currents) in all combinations, resulting in about 10 million simulations.
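The brute-force sweep described above, varying maximal conductances in all combinations and labelling each simulation's activity, can be sketched with a toy classifier. The parameter grid and the threshold rule below are illustrative placeholders, not the study's actual conductance values or its simulation-based classification.

```python
from itertools import product

# Illustrative conductance grids (arbitrary units), standing in for the
# maximal intrinsic and synaptic conductances varied in the real sweep.
g_syn_values = [0.0, 0.5, 1.0, 1.5]
g_h_values = [0.0, 0.4, 0.8]
g_leak_values = [0.1, 0.2, 0.3]

def classify(g_syn, g_h, g_leak):
    """Toy stand-in for running one simulation and labelling its activity."""
    if g_syn == 0.0:
        return "isolated"
    return "bursting" if g_h > g_leak else "spiking"

counts = {}
for g_syn, g_h, g_leak in product(g_syn_values, g_h_values, g_leak_values):
    label = classify(g_syn, g_h, g_leak)
    counts[label] = counts.get(label, 0) + 1

n_models = len(g_syn_values) * len(g_h_values) * len(g_leak_values)
```

The real study runs the same all-combinations loop over millions of conductance tuples and stores each label in the relational table, so that prevalence questions become simple aggregate queries.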
We classified these HCO and isolated neuron model simulations by their activity characteristics into identifiable groups and quantified their prevalence. By querying the database, we compared the activity characteristics of the identified groups of our simulated HCO models with those of our simulated isolated neuron models and found that regularly bursting neurons compose only a small minority of functional HCO models; the vast majority was composed of spiking neurons. Abstract This paper describes how an emerging standard neural network modelling language can be used to configure a general-purpose neural multichip system by describing the process of writing and loading neural network models on the SpiNNaker neuromimetic hardware. It focuses on the implementation of a SpiNNaker module for PyNN, a simulator-independent language for neural network modelling. We successfully extend PyNN to deal with different non-standard (e.g. Izhikevich) cell types, rapidly switch between them and load applications on parallel hardware by orchestrating the software layers below it, so that they will be abstracted from the final user. Finally we run some simulations in PyNN and compare them against other simulators, successfully reproducing single neuron and network dynamics and validating the implementation. Abstract The present study examines the biophysical properties and functional implications of Ih in hippocampal area CA3 interneurons with somata in strata radiatum and lacunosum-moleculare. Characterization studies showed a small maximum h-conductance (2.6 ± 0.3 nS, n = 11), shallow voltage dependence with a hyperpolarized half-maximal activation (V1/2 = −91 mV), and kinetics characterized by double-exponential functions. The functional consequences of Ih were examined with regard to temporal summation and impedance measurements. For temporal summation experiments, 5-pulse mossy fiber input trains were activated.
Blocking Ih with 50 μM ZD7288 resulted in an increase in temporal summation, suggesting that Ih supports sensitivity of response amplitude to relative input timing. Impedance was assessed by applying sinusoidal current commands. From impedance measurements, we found that Ih did not confer theta-band resonance, but flattened the impedance–frequency relations instead. Double immunolabeling for hyperpolarization-activated cyclic nucleotide-gated proteins and glutamate decarboxylase 67 suggests that all four subunits are present in GABAergic interneurons from the strata considered for electrophysiological studies. Finally, a model of Ih was employed in computational analyses to confirm and elaborate upon the contributions of Ih to impedance and temporal summation. Abstract Modelling and simulation methods gain increasing importance for the understanding of biological systems. The growing number of available computational models makes support in maintenance and retrieval of those models essential to the community. This article discusses which model information is helpful for efficient retrieval and how existing similarity measures and ranking techniques can be used to enhance the retrieval process, i.e., model reuse. With the development of new tools and modelling formalisms, there also is an increasing demand for performing search independent of the models’ encoding. Therefore, the presented approach is not restricted to certain model storage formats. Instead, the model meta-information is used for retrieval and ranking of the search result. Meta-information includes general information about the model, its encoded species and reactions, but also information about the model behaviour and related simulation experiment descriptions. Abstract To understand the details of brain function, a large-scale system model that reflects anatomical and neurophysiological characteristics needs to be implemented.
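Encoding-independent retrieval of the kind discussed in the model-reuse abstract above can be sketched as keyword-set matching over model meta-information. The Jaccard score and the toy metadata records below are our own illustrative choices, not the similarity measure or ranking actually used by the authors.

```python
def jaccard(a, b):
    """Similarity of two keyword sets: |intersection| / |union|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical meta-information records, independent of model encoding
models = {
    "modelA": ["calcium", "purkinje", "dendrite"],
    "modelB": ["calcium", "mitochondria"],
    "modelC": ["hodgkin-huxley", "axon", "sodium"],
}

def rank(query_terms):
    """Rank stored models by metadata similarity to the query terms."""
    scored = [(jaccard(query_terms, kw), name) for name, kw in models.items()]
    return [name for score, name in sorted(scored, reverse=True) if score > 0]

result = rank(["calcium", "dendrite"])
```

Because only metadata keywords are compared, the same ranking works whether a model is stored in SBML, NeuroML, or any other format.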
Though numerous computational models of different brain areas have been proposed, their integration into a large-scale model has not yet been accomplished because these models were described in different programming languages and, mostly, because they used different data formats. This paper introduces a platform for collaborative brain system modeling (PLATO), where one can construct computational models using several programming languages and connect them at the I/O level with a common data format. As an example, a whole visual system model including eye movement, eye optics, retinal network and visual cortex is being developed. Preliminary results demonstrate that the integrated model successfully simulates the signal processing flow at the different stages of the visual system. Abstract Brain rhythms are the most prominent signal measured noninvasively in humans with magneto/electroencephalography (MEG/EEG). MEG/EEG-measured rhythms have been shown to be functionally relevant and signature changes are used as markers of disease states. Despite the importance of understanding the underlying neural mechanisms creating these rhythms, relatively little is known about their in vivo origin in humans. There are obvious challenges in linking the extracranially measured signals directly to neural activity with invasive studies in humans, and although animal models are well suited for such studies, the connection to human brain function under cognitively relevant tasks is often lacking. Biophysically principled computational neural modeling provides an attractive means to bridge this critical gap. Here, we describe a method for creating a computational neural model capturing the laminar structure of cortical columns and how this model can be used to make predictions on the cellular and circuit level mechanisms of brain oscillations measured with MEG/EEG.
Specifically, we describe how the model can be used to simulate current dipole activity, the common macroscopic signal inferred from MEG/EEG data. We detail the development and application of the model to study the spontaneous somatosensory mu-rhythm, containing mu-alpha (7–14 Hz) and mu-beta (15–29 Hz) components. We describe a novel prediction of the neural origin of the mu-rhythm that accurately reproduces many characteristic features of MEG data and accounts for changes in the rhythm with attention, detection, and healthy aging. While the details of the model are specific to the somatosensory system, the model design and application are based on general principles of cortical circuitry and MEG/EEG physics, and are thus amenable to the study of rhythms in other frequency bands and sensory systems. Abstract GABAergic interneurons in cortical circuits control the activation of principal cells and orchestrate network activity patterns, including oscillations at different frequency ranges. Recruitment of interneurons depends on integration of convergent synaptic inputs along the dendrosomatic axis; however, dendritic processing in these cells is still poorly understood. In this chapter, we summarise our results on the cable properties, electrotonic structure and dendritic processing in “basket cells” (BCs; Nörenberg et al. 2010), one of the most prevalent types of cortical interneurons mediating perisomatic inhibition. In order to investigate integrative properties, we have performed two-electrode whole-cell patch-clamp recordings, visualised and reconstructed the recorded interneurons and created passive single-cell models with biophysical properties derived from the experiments. Our results indicate that membrane properties, in particular membrane resistivity, are inhomogeneous along the somatodendritic axis of the cell. Derived values and the gradient of membrane resistivity are different from those obtained for excitatory principal cells.
The divergent passive membrane properties of BCs facilitate rapid signalling from proximal basal dendritic inputs but at the same time increase synapse-to-soma transfer for slow signals from the distal apical dendrites. Our results demonstrate that BCs possess distinct integrative properties. Future computational models investigating the diverse functions of neuronal circuits need to consider this diversity and incorporate realistic dendritic properties not only of excitatory principal cells but also of various types of inhibitory interneurons. Abstract New surgical and localization techniques allow for precise and personalized evaluation and treatment of intractable epilepsies. These techniques include the use of subdural and depth electrodes for localization, and the potential use of cell-targeted stimulation using optogenetics as part of treatment. Computer modeling of seizures, also individualized to the patient, will be important in order to make full use of the potential of these new techniques. This is because epilepsy is a complex dynamical disease involving multiple scales across both time and space. These complex dynamics make prediction extremely difficult. Cause and effect are not cleanly separable, as multiple embedded causal loops allow for many scales of unintended consequence. We demonstrate here a small model of sensory neocortex which can be used to look at the effects of microablations or microstimulation. We show that ablations in this network can either prevent spread or prevent occurrence of the seizure. In this example, focal electrical stimulation was not able to terminate a seizure but selective stimulation of inhibitory cells, a future possibility through use of optogenetics, was efficacious. Abstract The basal ganglia form a complex network of nuclei often assumed to perform selection, yet their individual roles and how they influence each other are still largely unclear.
In particular, the ties between the external and internal parts of the globus pallidus are paradoxical, as anatomical data suggest a potent inhibitory projection between them while electrophysiological recordings indicate that they have similar activities. Here we introduce a theoretical study that reconciles both views on the intrapallidal projection, by providing a plausible characterization of the relationship between the external and internal globus pallidus. Specifically, we developed a mean-field model of the whole basal ganglia, whose parameterization is optimized to best respect a collection of numerous anatomical and electrophysiological data. We first obtained models respecting all our constraints, hence anatomical and electrophysiological data on the intrapallidal projection are globally consistent. This model furthermore predicts that both aforementioned views about the intrapallidal projection may be reconciled when this projection is weakly inhibitory, thus making it possible to support similar neural activity in both nuclei and for the entire basal ganglia to select between actions. Second, we predict that afferent projections are substantially unbalanced towards the external segment, as it receives the strongest excitation from STN and the weakest inhibition from the striatum. Finally, our study strongly suggests that the intrapallidal connection pattern is not focused but diffuse, as this latter pattern is more efficient for the overall selection performed in the basal ganglia. Abstract Background The information coming from biomedical ontologies and computational pathway models is expanding continuously: research communities keep this process up and their advances are generally shared by means of dedicated resources published on the web. In fact, such models are shared to provide the characterization of molecular processes, while biomedical ontologies detail a semantic context to the majority of those pathways.
Recent advances in both fields pave the way for a scalable information integration based on aggregate knowledge repositories, but the lack of overall standard formats impedes this progress. Indeed, having different objectives and different abstraction levels, most of these resources "speak" different languages. Semantic web technologies are here explored as a means to address some of these problems. Methods Employing an extensible collection of interpreters, we developed OREMP (Ontology Reasoning Engine for Molecular Pathways), a system that abstracts the information from different resources and combines them together into a coherent ontology. Continuing this effort we present OREMPdb; once different pathways are fed into OREMP, species are linked to the external ontologies referred to and to the reactions in which they participate. Exploiting these links, the system builds species-sets, which encapsulate species that operate together. Composing all of the reactions together, the system computes all of the reaction paths from and to all of the species-sets. Results OREMP has been applied to the curated branch of BioModels (2011/04/15 release), which overall contains 326 models, 9244 reactions, and 5636 species. OREMPdb is the semantic dictionary created as a result, which is made of 7360 species-sets. For each one of these sets, OREMPdb links the original pathway and the original paper where this information first appeared. Abstract Conductance-based neuron models are frequently employed to study the dynamics of biological neural networks. For speed and ease of use, these models are often reduced in morphological complexity. Simplified dendritic branching structures may process inputs differently than full branching structures, however, and could thereby fail to reproduce important aspects of biological neural processing. It is not yet well understood which processing capabilities require detailed branching structures.
Therefore, we analyzed the processing capabilities of fully or partially branched reduced models. These models were created by collapsing the dendritic tree of a full morphological model of a globus pallidus (GP) neuron while preserving its total surface area and electrotonic length, as well as its passive and active parameters. Dendritic trees were either collapsed into single cables (unbranched models) or the full complement of branch points was preserved (branched models). Both reduction strategies allowed us to compare dynamics between all models using the same channel density settings. Full model responses to somatic inputs were generally preserved by both types of reduced model while dendritic input responses could be more closely preserved by branched than unbranched reduced models. However, features strongly influenced by local dendritic input resistance, such as active dendritic sodium spike generation and propagation, could not be accurately reproduced by any reduced model. Based on our analyses, we suggest that there are intrinsic differences in processing capabilities between unbranched and branched models. We also indicate suitable applications for different levels of reduction, including fast searches of full model parameter space. Summary Processing text from scientific literature has become a necessity due to the burgeoning amounts of information that are fast becoming available, stemming from advances in electronic information technology. We created a program, NeuroText (http://senselab.med.yale.edu/textmine/neurotext.pl), designed specifically to extract information relevant to neuroscience-specific databases, NeuronDB and CellPropDB (http://senselab.med.yale.edu/senselab/), housed at the Yale University School of Medicine. NeuroText extracts relevant information from the neuroscience literature in a two-step process: each step parses text at different levels of granularity.
NeuroText uses an expert-mediated knowledge base and combines the techniques of indexing, contextual parsing, semantic and lexical parsing, and supervised and unsupervised learning to extract information. The constraints, metadata elements, and rules for information extraction are stored in the knowledge base. NeuroText was created as a pilot project to process 3 years of publications in Journal of Neuroscience and was subsequently tested on 40,000 PubMed abstracts. We also present here a template to create a domain-nonspecific knowledge base that, when linked to a text-processing tool like NeuroText, can be used to extract knowledge in other fields of research. Abstract Background We present a software tool called SENB, which allows the geometric and biophysical neuronal properties in a simple computational model of a Hodgkin–Huxley (HH) axon to be changed. The aim of this work is to develop a didactic and easy-to-use computational tool in the NEURON simulation environment, which allows graphical visualization of both the passive and active conduction parameters and the geometric characteristics of a cylindrical axon with HH properties. Results The SENB software offers several advantages for teaching and learning electrophysiology. First, SENB offers ease and flexibility in determining the number of stimuli. Second, SENB allows immediate and simultaneous visualization, in the same window and time frame, of the evolution of the electrophysiological variables. Third, SENB calculates parameters such as time and space constants, stimulus frequency, cellular area and volume, sodium and potassium equilibrium potentials, and propagation velocity of the action potentials. Furthermore, it allows the user to see all this information immediately in the main window. Finally, with just one click SENB can save an image of the main window as evidence.
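Parameters like those SENB reports, for instance the membrane time constant and the space constant of a cylindrical axon, follow from standard cable theory. The specific membrane and axial values below are generic textbook-style assumptions, not SENB's defaults.

```python
import math

# Generic passive cable parameters (assumed values, not SENB's defaults)
R_M = 10000.0   # specific membrane resistance, ohm*cm^2
C_M = 1.0e-6    # specific membrane capacitance, F/cm^2
R_I = 100.0     # axial resistivity, ohm*cm
DIAM = 4.0e-4   # axon diameter, cm (4 um)

# Membrane time constant: tau = R_m * C_m (seconds)
tau_s = R_M * C_M

# Space constant of an infinite cylinder: lambda = sqrt(d * R_m / (4 * R_i))
lambda_cm = math.sqrt(DIAM * R_M / (4.0 * R_I))

tau_ms = tau_s * 1e3        # 10 ms for these values
lambda_um = lambda_cm * 1e4  # 1000 um for these values
```

Both constants depend only on the passive parameters and the diameter, which is why a tool like SENB can display them instantly as the user edits the geometry.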
Conclusions The SENB software is didactic and versatile, and can be used to improve and facilitate the teaching and learning of the underlying mechanisms in the electrical activity of an axon using the biophysical properties of the squid giant axon. Abstract Grid cells (GCs) in the medial entorhinal cortex (mEC) have the property of having their firing activity spatially tuned to a regular triangular lattice. Several theoretical models for grid field formation have been proposed, but most assume that place cells (PCs) are a product of the grid cell system. There is, however, an alternative possibility that is supported by various strands of experimental data. Here we present a novel model for the emergence of grid-like firing patterns that stands on two key hypotheses: (1) spatial information in GCs is provided from PC activity and (2) grid fields result from a combined synaptic plasticity mechanism involving inhibitory and excitatory neurons mediating the connections between PCs and GCs. Depending on the spatial location, each PC can contribute with excitatory or inhibitory inputs to GC activity. The nature and magnitude of the PC input is a function of the distance to the place field center, which is inferred from rate decoding. A biologically plausible learning rule drives the evolution of the connection strengths from PCs to a GC. In this model, PCs compete for GC activation, and the plasticity rule favors efficient packing of the space representation. This leads to grid-like firing patterns. In a new environment, GCs continuously recruit new PCs to cover the entire space. The model described here makes important predictions and can represent the feedforward connections from hippocampus CA1 to deeper mEC layers. Abstract Because of its highly branched dendrite, the Purkinje neuron requires significant computational resources if coupled electrical and biochemical activity are to be simulated.
To address this challenge, we developed a scheme for reducing the geometric complexity while preserving the essential features of activity in both the soma and a remote dendritic spine. We merged our previously published biochemical model of calcium dynamics and lipid signaling in the Purkinje neuron, developed in the Virtual Cell modeling and simulation environment, with an electrophysiological model based on a Purkinje neuron model available in NEURON. A novel reduction method was applied to the Purkinje neuron geometry to obtain a model with fewer compartments that is tractable in Virtual Cell. Most of the dendritic tree was subject to reduction, but we retained the neuron’s explicit electrical and geometric features along a specified path from spine to soma. Further, unlike previous simplification methods, the dendrites that branch off along the preserved explicit path are retained as reduced branches. We conserved axial resistivity and adjusted passive properties and active channel conductances for the reduction in surface area, and cytosolic calcium for the reduction in volume. Rallpacks are used to validate the reduction algorithm and show that it can be generalized to other complex neuronal geometries. For the Purkinje cell, we found that current injections at the soma were able to produce similar trains of action potentials and membrane potential propagation in the full and reduced models in NEURON; the reduced model produces identical spiking patterns in NEURON and Virtual Cell. Importantly, our reduced model can simulate communication between the soma and a distal spine; an alpha function applied at the spine to represent synaptic stimulation gave similar results in the full and reduced models for potential changes associated with both the spine and the soma. Finally, we combined phosphoinositol signaling and electrophysiology in the reduced model in Virtual Cell.
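A reduction that conserves membrane surface area, as in the scheme above, can be sketched for the simplest case of merging sister cylinders into one equivalent cylinder. The rule below preserves only combined surface area at the mean length and is purely illustrative; the published method additionally adjusts passive properties, conductances, and electrotonic length, and the example dimensions are invented.

```python
import math

def merge_cylinders(segments):
    """Collapse sister cylinders [(length_um, diam_um), ...] into a single
    cylinder that conserves total membrane surface area (illustrative rule:
    equivalent length is the mean length, diameter absorbs the area)."""
    total_area = sum(math.pi * d * length for length, d in segments)
    mean_length = sum(length for length, _ in segments) / len(segments)
    diam = total_area / (math.pi * mean_length)
    return mean_length, diam

segments = [(100.0, 2.0), (80.0, 1.0)]   # invented example dimensions (um)
L_eq, d_eq = merge_cylinders(segments)

area_before = sum(math.pi * d * length for length, d in segments)
area_after = math.pi * d_eq * L_eq
```

Conserving area matters because total membrane area sets the passive load and, with fixed channel densities, the total conductance of the reduced compartment.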
Thus, a strategy has been developed to combine electrophysiology and biochemistry as a step toward merging neuronal and systems biology modeling. Abstract The advent of techniques with the ability to scan massive changes in cellular makeup (genomics, proteomics, etc.) has revealed the compelling need for analytical methods to interpret and make sense of those changes. Computational models built on a sound physicochemical mechanistic basis are indispensable when integrating, interpreting, and simulating high-throughput experimental data. Another powerful role of computational models is predicting new behavior provided they are adequately validated. Mitochondrial energy transduction has been traditionally studied with thermodynamic models. More recently, kinetic or thermokinetic models have been proposed, leading the path toward an understanding of the control and regulation of mitochondrial energy metabolism and its interaction with cytoplasmic and other compartments. In this work, we outline the methods, step by step, that should be followed to build a computational model of mitochondrial energetics in isolation or integrated into a network of cellular processes. Depending on the question addressed by the modeler, the methodology explained herein can be applied with different levels of detail, from the mitochondrial energy-producing machinery in a network of cellular processes to the dynamics of a single enzyme during its catalytic cycle. Abstract The voltage and time dependence of ion channels can be regulated, notably by phosphorylation, interaction with phospholipids, and binding to auxiliary subunits. Many parameter variation studies have set conductance densities free while leaving kinetic channel properties fixed as the experimental constraints on the latter are usually better than on the former.
Because individual cells can tightly regulate their ion channel properties, we suggest that kinetic parameters may be profitably set free during model optimization in order to both improve matches to data and refine kinetic parameters. To this end, we analyzed the parameter optimization of reduced models of three electrophysiologically characterized and morphologically reconstructed globus pallidus neurons. We performed two automated searches with different types of free parameters. First, conductance density parameters were set free. Even the best resulting models exhibited unavoidable problems which were due to limitations in our channel kinetics. We next set channel kinetics free for the optimized density matches and obtained significantly improved model performance. Some kinetic parameters consistently shifted to similar new values in multiple runs across three models, suggesting the possibility for tailored improvements to channel models. These results suggest that optimized channel kinetics can improve model matches to experimental voltage traces, particularly for channels characterized under different experimental conditions than recorded data to be matched by a model. The resulting shifts in channel kinetics from the original template provide valuable guidance for future experimental efforts to determine the detailed kinetics of channel isoforms and possible modulated states in particular types of neurons. Abstract Electrical synapses continuously transfer signals bidirectionally from one cell to another, directly or indirectly via intermediate cells. Electrical synapses are common in many brain structures such as the inferior olive, the subcoeruleus nucleus and the neocortex, between neurons and between glial cells. In the cortex, interneurons have been shown to be electrically coupled and proposed to participate in large, continuous cortical syncytia, as opposed to smaller spatial domains of electrically coupled cells. 
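The strength of such electrical coupling is commonly quantified by a coupling coefficient. A minimal steady-state (DC) sketch for a network of isopotential cells, using only the standard conductance-matrix formulation (all names hypothetical; the published derivation treats the more general AC case):

```python
def coupling_voltages(g_leak, g_gap, inject=0):
    """Steady-state voltages in a network of isopotential cells coupled by
    gap junctions, for a unit current injected into one cell.

    g_leak: list of membrane (leak) conductances, one per cell
    g_gap:  dict {(i, j): conductance} for each gap junction
    Solves G.V = I by Gaussian elimination (no external dependencies).
    """
    n = len(g_leak)
    G = [[0.0] * n for _ in range(n)]
    for i, gl in enumerate(g_leak):
        G[i][i] += gl
    for (i, j), g in g_gap.items():
        G[i][i] += g; G[j][j] += g
        G[i][j] -= g; G[j][i] -= g
    I = [1.0 if k == inject else 0.0 for k in range(n)]
    A = [row[:] + [I[k]] for k, row in enumerate(G)]
    for col in range(n):                      # forward elimination w/ pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= f * A[col][c]
    V = [0.0] * n
    for r in range(n - 1, -1, -1):            # back substitution
        V[r] = (A[r][n] - sum(A[r][c] * V[c] for c in range(r + 1, n))) / A[r][r]
    return V
```

For two cells, the coupling coefficient V[1]/V[0] reduces to g_gap/(g_leak + g_gap); adding an intermediate cell attenuates it further, which is the intuition behind distinguishing direct from indirect coupling.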
However, to explore the significance of these findings, it is imperative to map the electrical synaptic microcircuits, in analogy with in vitro studies on monosynaptic and disynaptic chemical coupling. Since “walking” from cell to cell over large distances with a glass pipette is challenging, microinjection of (fluorescent) dyes diffusing through gap junctions remains, so far, the only method available to decipher such microcircuits, even though technical limitations exist. Based on circuit theory, we derive analytical descriptions of the AC electrical coupling in networks of isopotential cells. We then suggest an operative electrophysiological protocol to distinguish between direct electrical connections and connections involving one or more intermediate cells. This method allows inferring the number of intermediate cells, generalizing the conventional coupling coefficient, which provides limited information. We validate our method through computer simulations, theoretical and numerical methods, and electrophysiological paired recordings. Abstract Because electrical coupling among the neurons of the brain is much faster than chemical synaptic coupling, it is natural to hypothesize that gap junctions may play a crucial role in mechanisms underlying very fast oscillations (VFOs), i.e., oscillations at more than 80 Hz. There is now a substantial body of experimental and modeling literature supporting this hypothesis. A series of modeling papers, starting with work by Roger Traub and collaborators, have suggested that VFOs may arise from expanding waves propagating through an “axonal plexus”, a large random network of electrically coupled axons. Traub et al. also proposed a cellular automaton (CA) model to study the mechanisms of VFOs in the axonal plexus. In this model, the expanding waves take the appearance of topologically circular “target patterns”. Random external stimuli initiate each wave. We therefore call this kind of VFO “externally driven”. 
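The wave dynamics of such a cellular automaton can be sketched in a few lines (a toy illustration of an excitable automaton on a graph, not Traub et al.'s actual model; the parameter names are hypothetical):

```python
import random

def step(state, neighbors, p_fail=0.0, p_stim=0.0, refractory=3, rng=random):
    """One synchronous update of a toy excitable cellular automaton.

    state[i]: 0 = resting, 1 = firing, negative = refractory countdown.
    An axon fires if any neighbor fired last step and the spike crosses the
    gap junction (probability 1 - p_fail), or via random external stimulation
    (probability p_stim).
    """
    new = []
    for i, s in enumerate(state):
        if s == 1:
            new.append(-refractory)       # just fired: enter refractory period
        elif s < 0:
            new.append(s + 1)             # recover toward rest (0)
        else:
            excited = any(state[j] == 1 and rng.random() >= p_fail
                          for j in neighbors[i])
            new.append(1 if excited or rng.random() < p_stim else 0)
    return new
```

With p_fail = 0 and p_stim > 0, waves are initiated only by external stimuli ("externally driven"); with small nonzero p_fail on a graph containing cycles, a stray propagation failure can leave activity circulating ("reentrant").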
Using a computational model, we show that an axonal plexus can also exhibit a second, distinctly different kind of VFO in a wide parameter range. These VFOs arise from activity propagating around cycles in the network. Once triggered, they persist without any source of excitation. With idealized, regular connectivity, they take the appearance of spiral waves. We call these VFOs “reentrant”. The behavior of the axonal plexus depends on the reliability with which action potentials propagate from one axon to the next, which, in turn, depends on the somatic membrane potential V_s and the gap junction conductance g_gj. To study these dependencies, we impose a fixed value of V_s, then study the effects of varying V_s and g_gj. Not surprisingly, propagation becomes more reliable with rising V_s and g_gj. Externally driven VFOs occur when V_s and g_gj are so high that propagation never fails. For lower V_s or g_gj, propagation is nearly reliable, but fails in rare circumstances. Surprisingly, the parameter regime where this occurs is fairly large. Even a single propagation failure can trigger reentrant VFOs in this regime. Lowering V_s and g_gj further, one finds a third parameter regime in which propagation is unreliable, and no VFOs arise. We analyze these three parameter regimes by means of computations using model networks adapted from Traub et al., as well as much smaller model networks. The use of automated parameter searches to improve ion channel kinetics for neural modeling Journal of Computational Neuroscience Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. 
Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on the knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Pre-computed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. 
Programming language: Perl/BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI Blast package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have had problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that often it is futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. 
The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to create dynamically its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. 
However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell includes organelles and intracellular solutions, and it is separated from the outer environment of extracellular fluid surrounding it by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charges and concentrations give rise, respectively, to electrical and chemical potentials, driving the transport of materials across the membrane. Here we look at the core elements of mathematical modeling associated with the dynamic behaviors of single cells, as well as the bases of numerical simulation. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 
2008, 2009) is a software program on which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command-line interface for manipulating large-scale models, based on Python, a powerful scripting language. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the information necessary and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. 
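A minimal sketch of the kind of automated citation validation described above, building an NCBI E-utilities query from already-parsed citation fields (a Python stand-in for the Perl workflow; the esearch endpoint and PubMed field tags `[ta]`, `[dp]`, `[vi]`, `[pg]` are standard E-utilities conventions, but the helper itself is hypothetical):

```python
from urllib.parse import urlencode

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_query_url(citation):
    """Build an NCBI esearch URL from parsed citation fields.

    `citation` stands in for the output of a citation parser: a dict with
    any of the keys 'journal', 'year', 'volume', 'page'. Missing fields are
    simply dropped from the query.
    """
    term = " AND ".join(filter(None, [
        citation.get("journal") and citation["journal"] + "[ta]",
        citation.get("year") and citation["year"] + "[dp]",
        citation.get("volume") and citation["volume"] + "[vi]",
        citation.get("page") and citation["page"] + "[pg]",
    ]))
    return EUTILS + "?" + urlencode({"db": "pubmed", "term": term})
```

Fetching the resulting URL (e.g., with `urllib.request.urlopen`) returns PubMed IDs whose records can then supply the validated reference.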
We describe a method and make available computer scripts which automate the process of citation entry. We use an open citation project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. For, in order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (balanced truncation) and frequency-domain ( ${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. 
These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front-ends requires considerable bioinformatic skill. Consequently, the manual inspection of database entries and citations is a time-consuming task for which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques have become increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in life sciences. Enriched by numerous selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. 
Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or not) existing empirical results and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib ( The Handbook of Brain Theory and Neural Networks , pp. 
741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu . Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by allowing them to run the models as published and to build on them for further research. Its use can help the field of computational neuroscience enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. 
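As a readout, PPI is typically quantified from the amplitudes of the two population spikes. A minimal sketch (the percentage convention used here is one common choice, not necessarily the one used in the study):

```python
def paired_pulse_inhibition(ps1, ps2):
    """Percent paired-pulse inhibition from two population-spike amplitudes.

    ps1, ps2: amplitudes (e.g., in mV) of the population spikes evoked by
    the first (conditioning) and second (test) pulse. 100 = complete
    inhibition of the second spike, 0 = no inhibition, and negative values
    indicate paired-pulse facilitation.
    """
    if ps1 <= 0:
        raise ValueError("conditioning population spike must be positive")
    return 100.0 * (1.0 - ps2 / ps1)
```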
PPI reflects GABA_A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA_A reversal potential (E_GABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide a high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting the electrical excitation at the single-cell level and observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. 
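Such reconstructions are commonly stored as trees of tapering cylinders, for instance in the widely used SWC convention where each node records a 3D point, a radius, and a pointer to its parent. A minimal sketch of one morphometric measure computed from that representation (illustrative only, not tied to any particular reconstruction tool):

```python
import math

# Each row: (id, type, x, y, z, radius, parent_id) — the SWC column order,
# with parent_id == -1 marking the root of the tree.
def total_length(swc_rows):
    """Total neurite length of a reconstruction: the sum of Euclidean
    distances from each node to its parent node."""
    pos = {r[0]: (r[2], r[3], r[4]) for r in swc_rows}
    length = 0.0
    for node_id, _type, x, y, z, _radius, parent in swc_rows:
        if parent != -1:
            length += math.dist((x, y, z), pos[parent])
    return length
```

Other standard morphometrics (surface area, volume, branch counts, asymmetry indices) follow the same pattern of walking the parent pointers.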
In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic type (pyramidal cell dendrites and olivar axonal projection terminals), and labeling method (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions coming to the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductivities of their dendritic membrane. 
This work is a continuation of our previous studies that showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by their spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. 
Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS-US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive, biologically plausible framework, without recourse to delay lines or other special-purpose circuits. 
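The TD component of such models can be illustrated with a tabular TD(0) toy, in which the per-step prediction error plays the role of the phasic dopamine signal (a hypothetical stand-in for the LSTM-based network; parameter names and values are illustrative):

```python
def td_prediction_errors(rewards, values, gamma=0.98, alpha=0.1):
    """One forward pass of tabular TD(0) over a single trial.

    rewards: reward delivered at each time step of the trial
    values:  current value estimate per time step (updated in place)
    Returns the per-step prediction errors delta_t = r_t + gamma*V(t+1) - V(t),
    the dopamine-like teaching signal.
    """
    deltas = []
    for t in range(len(rewards)):
        v_next = values[t + 1] if t + 1 < len(values) else 0.0
        delta = rewards[t] + gamma * v_next - values[t]
        values[t] += alpha * delta
        deltas.append(delta)
    return deltas
```

After repeated trials with a fixed reward time, the prediction error at the reward shrinks as the reward becomes predicted, which is the core dopamine-like behavior such models build on.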
Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. Using mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of dendritic morphology on the passive electrical transfer characteristics of these cells and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses uniformly distributed along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and function of effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization in general and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron upon different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. 
The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and with interburst intervals that decreased as the intensity of activation rose. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, the latter cells displayed a variety of output discharge patterns of different complexities, including stochastic ones. A relatively short apical dendritic subtree in layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites, and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interactions in shaping the activity of layer 2/3 pyramidal neurons and, thereby, a higher efficiency of intracortical association. Abstract Phase precession is one of the most well-known examples within the temporal coding hypothesis.
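The doublet-versus-continuous-firing distinction described above amounts to comparing short intra-burst interspike intervals against longer inter-burst intervals. A simple classification sketch (the threshold and spike times below are invented for illustration):

```python
# Hypothetical sketch: group a spike train into doublets by thresholding
# interspike intervals (ISIs). Spikes closer than the threshold are paired.
def find_doublets(spike_times, isi_threshold):
    """Return (t1, t2) pairs whose ISI falls below the threshold."""
    doublets = []
    i = 0
    while i < len(spike_times) - 1:
        isi = spike_times[i + 1] - spike_times[i]
        if isi < isi_threshold:
            doublets.append((spike_times[i], spike_times[i + 1]))
            i += 2          # both spikes consumed by this doublet
        else:
            i += 1
    return doublets

# Periodic doublets: 5 ms within a burst, 100 ms between bursts.
train = [0.0, 5.0, 100.0, 105.0, 200.0, 205.0]
bursts = find_doublets(train, isi_threshold=20.0)
```

When the inter-burst interval shrinks toward the intra-burst interval, as under intense activation, the threshold-based distinction collapses and the train reads as continuous firing.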
Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using non-homogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent from rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of the involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties.
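As an instance of the channel formulas mentioned above, here is a sketch of the classic Hodgkin–Huxley sodium activation gate, using the standard squid-axon rate functions in the modern voltage convention (resting potential near -65 mV):

```python
import math

# Classic Hodgkin-Huxley rate functions for the sodium activation gate m.
# Note alpha_m has a removable singularity at exactly v = -40 mV; this
# sketch simply avoids evaluating it there.
def alpha_m(v):
    return 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))

def beta_m(v):
    return 4.0 * math.exp(-(v + 65.0) / 18.0)

def m_inf(v):
    """Steady-state activation: fraction of open gates at voltage v."""
    a, b = alpha_m(v), beta_m(v)
    return a / (a + b)

def tau_m(v):
    """Time constant (ms) of the approach to m_inf at voltage v."""
    return 1.0 / (alpha_m(v) + beta_m(v))
```

`m_inf` rises sigmoidally with depolarization, which is what gives the transient sodium channel its regenerative, spike-generating character.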
The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in Computational Neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org. In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed data of morphologies when generating those cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators, such as the NEURON simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new and available data can be included into NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today.
I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. New simulator architectures such as GENESIS 3 allow the use of standard well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data requires both standard data formats and public repositories. Standardization and a public storage system are also important for modeling, as they make it possible to share models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks like numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given.
Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques.
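For an unbranched cable, the direct Gaussian elimination exploited by the multisplit method above reduces to the classic tridiagonal (Thomas) algorithm; a minimal sketch follows. On a branched tree the same idea applies by eliminating leaf compartments first, which avoids fill-in and is what makes splitting subtrees across processors cheap. The coefficients below are invented.

```python
import numpy as np

# Direct Gaussian elimination on a tridiagonal compartment-coupling
# matrix (the Thomas algorithm), as arises for an unbranched cable.
def thomas_solve(lower, diag, upper, rhs):
    """Solve a tridiagonal system A x = rhs in O(n) operations."""
    n = len(diag)
    d = np.array(diag, dtype=float)   # work on copies
    r = np.array(rhs, dtype=float)
    for i in range(1, n):             # forward elimination
        w = lower[i - 1] / d[i - 1]
        d[i] -= w * upper[i - 1]
        r[i] -= w * r[i - 1]
    x = np.zeros(n)
    x[-1] = r[-1] / d[-1]
    for i in range(n - 2, -1, -1):    # back substitution
        x[i] = (r[i] - upper[i] * x[i + 1]) / d[i]
    return x

# Toy 3-compartment system (coefficients invented):
x = thomas_solve([-1.0, -1.0], [4.0, 4.0, 4.0], [-1.0, -1.0], [1.0, 2.0, 3.0])
```

The result is identical to dense Gaussian elimination, which mirrors the abstract's point that accuracy is unchanged by the tree-structured ordering.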
We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. However, the range of current strength needed for a depolarization block could easily be reached with random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block.
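The forward-modeling idea behind inverse CSD can be sketched in one dimension: assume sources at the electrode positions, build the forward matrix mapping source amplitudes to potentials, then invert it. The point-source forward model and all constants below are simplifications for illustration, not the published iCSD kernels.

```python
import numpy as np

# 1D sketch of inverse CSD: potentials phi = F @ csd, so csd is
# recovered by solving the linear system. The forward model here
# (1 / (4*pi*sigma*r), smoothed near r = 0) is an assumed stand-in.
z = np.arange(8) * 0.1              # electrode depths (mm)
sigma, h = 0.3, 0.05                # conductivity and smoothing scale (assumed)
dist = np.sqrt((z[:, None] - z[None, :]) ** 2 + h ** 2)
F = 1.0 / (4 * np.pi * sigma * dist)

true_csd = np.zeros(8)
true_csd[2], true_csd[5] = 1.0, -1.0    # a dipole-like source pair
phi = F @ true_csd                      # "recorded" potentials
est_csd = np.linalg.solve(F, phi)       # inverse step recovers the sources
```

Because the assumptions enter only through `F`, system-specific information (geometry, conductivity) can be swapped in by changing the forward matrix, which is the flexibility the abstract emphasizes.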
We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, impeding conventional analysis methods from keeping pace. Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed, and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable. PANDORA is free to use and modify because it is open-source ( http://software.incf.org/software/pandora/home ). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then we examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop.
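The database-extraction step described for PANDORA above can be sketched as reducing each raw recording to a row of scalar characteristics, which is then queried instead of the raw trace. The schema and numbers below are invented for illustration.

```python
# Hypothetical sketch: reduce each raw recording to a row of extracted
# characteristics (here just spike count and firing rate), which is far
# cheaper to store and query than the raw voltage trace.
def extract_row(trial_id, spike_times, duration_s):
    return {
        "trial": trial_id,
        "n_spikes": len(spike_times),
        "firing_rate_hz": len(spike_times) / duration_s,
    }

recordings = {
    "a": [0.1, 0.5, 0.9, 1.3],   # spike times (s) over a 2 s trial
    "b": [0.2, 1.8],
}
table = [extract_row(k, v, 2.0) for k, v in sorted(recordings.items())]

# Query the reduced table instead of the raw data:
fast = [row["trial"] for row in table if row["firing_rate_hz"] > 1.5]
```

The storage saving comes from the same observation the abstract makes: a few extracted numbers per trial stand in for millions of raw samples.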
Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics. First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test whether multiple, independent spike initiation zones in the C-interneuron are sufficient to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made toward understanding cerebellar cortical processing, but neural coding at the output stage of the cerebellum, in the deep cerebellar nuclei (DCN), remains poorly understood. The DCN are unlikely to be just a relay nucleus because Purkinje cell inhibition has to be turned into an excitatory output signal, and DCN neurons exhibit complex intrinsic properties.
In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals. Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. Our simulations closely matched the DCN current-clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV, allowing deinactivation of rebound currents. Fast rebound bursts due to the T-type calcium current and slow rebounds due to the persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by the HCN current. Our findings suggest that the active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches.
This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior. This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation channel gating, the mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform. Abstract No Abstract Available Abstract We have developed a simulation tool within the NEURON simulator to assist in the organization, verification, and analysis of simulations. This tool, named the Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single-cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB.
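The flavor of NQS-style relational queries over simulation parameters and outputs can be sketched with Python's built-in sqlite3 module. The schema and values below are invented, and NQS itself is a NEURON tool, not SQLite; this only illustrates the SELECT-based workflow the abstract describes.

```python
import sqlite3

# Sketch: relate simulation parameters (a conductance) to outputs (a
# firing rate) with SQL SELECT queries. Schema and values are invented.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sims (id INTEGER, g_na REAL, rate_hz REAL)")
con.executemany(
    "INSERT INTO sims VALUES (?, ?, ?)",
    [(1, 0.10, 12.0), (2, 0.12, 18.5), (3, 0.08, 4.0)],
)

# Which parameter settings produced high firing rates?
rows = con.execute(
    "SELECT id, g_na FROM sims WHERE rate_hz > 10 ORDER BY rate_hz DESC"
).fetchall()
```

The same query-then-analyze pattern extends naturally to organizing outputs of large parameter sweeps, which is the use case the abstract highlights.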
Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems. This is similar to how single cells function as systems of protein networks, in which, for example, the ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from how it behaves alone. The dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. Interactions occur through different mechanisms. Cardiac cells, which form cardiac tissue and the heart, interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts to produce heart beats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics representing neural information. In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of the limbs can be performed. This chapter illustrates several mathematical techniques through examples from the modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data.
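The response statistics compared in this kind of study (peak amplitude, time-to-peak, half-height width) can be computed from a standard double-exponential synaptic conductance. The time constants and amplitude below are illustrative, not the fitted values from the study.

```python
import numpy as np

# A double-exponential synaptic conductance and the three response
# statistics: peak, time-to-peak, and half-height width. Parameters
# (tau_rise, tau_decay, g_max) are invented for illustration.
tau_rise, tau_decay, g_max = 0.5, 3.0, 0.9     # ms, ms, nS
t = np.arange(0.0, 30.0, 0.01)                 # ms
g = np.exp(-t / tau_decay) - np.exp(-t / tau_rise)
g = g_max * g / g.max()                        # normalise so peak == g_max

peak = g.max()
time_to_peak = t[g.argmax()]
above_half = t[g >= 0.5 * peak]
half_width = above_half[-1] - above_half[0]
```

Comparing these scalar statistics between simulated and recorded responses is what constrains the underlying conductance and kinetic parameters.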
Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation. For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice that of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits, as expected in mature animals. Predictions were made for passive-cell current clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron, and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting is unnecessary.
Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron. Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials. Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet the precise input-output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites.
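The leaky-integrator assumption above (membrane potential represented as presynaptic population activity filtered with a fixed kernel) can be sketched as a convolution. The rates, kernel shape, and bin size below are invented.

```python
import numpy as np

# Sketch of the leaky-integrator filtering assumption: the subthreshold
# membrane potential is the presynaptic population spike count convolved
# with an exponential kernel. All parameters are illustrative.
rng = np.random.default_rng(0)
dt, tau = 1.0, 10.0                      # ms
counts = rng.poisson(2.0, size=1000)     # population spike count per bin
t_k = np.arange(0, 100, dt)
kernel = np.exp(-t_k / tau) * dt / tau   # normalised exponential kernel
v = np.convolve(counts, kernel)[: len(counts)]  # "membrane potential"
```

The filtering smooths the count series while preserving its cumulant structure up to the kernel, which is what allows a CuBIC-style analysis to be carried over from spike counts to membrane potentials.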
We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature. To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns and the corresponding firing frequency input-output function was logarithmic. The regular or irregular nature of the input synaptic intervals was only reflected in the regularity of output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input-output relationships. Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input-output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware.
To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats. In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database http://www.ebi.ac.uk/biomodels/ is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems, and to study the clustering of models based upon their annotations. Model deposition to the database is today recommended by several publishers of scientific journals.
The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive? Schema theory is a computational framework that allows the simulation of perceptuomotor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We first present its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performance of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world-knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performance of agrammatic aphasics measured using sentence-picture matching tasks.
This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Abstract Background The "inverse" problem is related to the determination of unknown causes on the basis of the observation of their effects. This is the opposite of the corresponding "direct" problem, which relates to the prediction of the effects generated by a complete description of some agencies. The solution of an inverse problem entails the construction of a mathematical model and starts from a number of experimental data. In this respect, inverse problems are often ill-conditioned, as the experimental data available are often insufficient to unambiguously solve the mathematical model. Several approaches to solving inverse problems are possible, both computational and experimental, some of which are mentioned in this article. In this work, we describe in detail an attempt to solve an inverse problem which arose in the study of an intracellular signaling pathway. Results Using a genetic algorithm to find a suboptimal solution to the optimization problem, we have estimated a set of unknown parameters describing a kinetic model of a signaling pathway in the neuronal cell. The model is composed of mass-action ordinary differential equations, in which the kinetic parameters describe protein-protein interactions, protein synthesis, and degradation. The algorithm has been implemented on a parallel platform. Several potential solutions of the problem have been computed, each solution being a set of model parameters. A subset of parameters has been selected on the basis of their small coefficient of variation across the ensemble of solutions.
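A genetic-algorithm parameter search of the kind described above can be sketched on a toy one-parameter kinetic model. The model (a single exponential decay), population size, mutation scale, and all other settings below are invented; the real study fits many coupled mass-action parameters in parallel.

```python
import numpy as np

# Toy genetic-algorithm sketch: recover the decay rate k of
# x(t) = exp(-k t) from synthetic "data". All settings are invented.
rng = np.random.default_rng(1)
t = np.linspace(0, 5, 50)
k_true = 0.8
data = np.exp(-k_true * t)

def fitness(k):
    return -np.sum((np.exp(-k * t) - data) ** 2)   # higher is better

pop = rng.uniform(0.0, 3.0, size=40)               # initial candidate rates
for gen in range(60):
    scores = np.array([fitness(k) for k in pop])
    parents = pop[np.argsort(scores)][-10:]        # elitism: keep the 10 best
    children = rng.choice(parents, size=30) + rng.normal(0, 0.05, 30)
    pop = np.concatenate([parents, children])

k_est = pop[np.argmax([fitness(k) for k in pop])]
```

Running such a search many times and keeping parameters with a small coefficient of variation across the ensemble is the selection criterion the Results section describes.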
Conclusion Despite the lack of sufficiently reliable and homogeneous experimental data, the genetic algorithm approach has allowed us to estimate the approximate values of a number of model parameters in a kinetic model of a signaling pathway; these parameters have been assessed to be relevant for the reproduction of the available experimental data. Abstract Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown. Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms. First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axo-dendritic feedback on pyramidal cells is effective only in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase, the theta/gamma ratio. 
Moreover, electric fields differentially affect average theta frequency depending on the specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. Abstract The brain is extraordinarily complex, containing 10^11 neurons linked by 10^14 connections. We can improve our understanding of individual neurons and neuronal networks by describing their behavior in mathematical and computational models. This chapter provides an introduction to neural modeling, laying the foundation for several basic models and surveying key topics. After some discussion of the motivations of modelers and the uses of neural models, we explore the properties of electrically excitable membranes. We describe in some detail the Hodgkin–Huxley model, the first neural model to describe biophysically the behavior of biological membranes. We explore how this model can be extended to describe a variety of excitable membrane behaviors, including axonal propagation, dendritic processing, and synaptic communication. This chapter also covers mathematical models that replicate basic neural behaviors through more abstract mechanisms. We briefly explore efforts to extend single-neuron models to the network level and provide several examples of insights gained through this process. Finally, we list common resources, including modeling environments and repositories, that provide the guidance and parameter sets necessary to begin building neural models. Abstract We have developed a program, NeuroText, to populate the neuroscience databases in SenseLab (http://senselab.med.yale.edu/senselab) by mining the natural-language text of neuroscience articles. NeuroText uses a two-step approach to identify relevant articles. The first step (preprocessing), aimed at 100% sensitivity, identifies abstracts containing database keywords. 
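A high-recall keyword prefilter of the kind used in NeuroText's first step can be sketched in a few lines. The keyword list below is hypothetical; the real keywords come from the SenseLab knowledge base, and false positives are removed by the context-aware second step.

```python
import re

# Hypothetical database keyword list standing in for the NeuronDB/CellPropDB terms.
KEYWORDS = {"receptor", "channel", "neuron", "current", "dendrite"}

def preprocess(abstract: str) -> bool:
    """Step 1: keep any abstract containing at least one database keyword.

    Tuned for sensitivity, not specificity -- matching a single keyword
    is enough to pass the abstract on to the second, stricter step.
    """
    tokens = set(re.findall(r"[a-z]+", abstract.lower()))
    return not KEYWORDS.isdisjoint(tokens)

abstracts = [
    "Potassium channel kinetics in pyramidal neuron dendrites.",
    "A database schema for clinical billing records.",
]
kept = [a for a in abstracts if preprocess(a)]
```

In this toy run only the first abstract survives; the second contains none of the keywords and is rejected, mirroring the 735-of-912 rejection rate reported above.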
In the second step, potentially relevant abstracts identified in the first step are processed for specificity as dictated by database architecture and by neuroscience, lexical, and semantic contexts. NeuroText results were presented to the experts for validation using a dynamically generated interface that also allows expert-validated articles to be automatically deposited into the databases. Of the test set of 912 articles, 735 were rejected at the preprocessing step. For the remaining articles, the accuracy of predicting database-relevant articles was 85%. Twenty-two articles were erroneously identified. NeuroText deferred decisions on 29 articles to the expert. A comparison of NeuroText results versus the experts’ analyses revealed that the program failed to correctly identify articles’ relevance due to concepts that did not yet exist in the knowledge base or due to vaguely presented information in the abstracts. NeuroText uses two “evolution” techniques (supervised and unsupervised) that play an important role in the continual improvement of the retrieval results. Software that uses the NeuroText approach can facilitate the creation of curated, special-interest bibliography databases. Abstract Dendrites play an important role in neuronal function and connectivity. This chapter introduces the first section of the book, focusing on the morphological features of dendritic tree structures and the role of dendritic trees in the circuit. We provide an overview of quantitative procedures for data collection, analysis, and modeling of dendrite shape. Our main focus lies on the description of morphological complexity and how one can use this description to unravel neuronal function in dendritic trees and neural circuits. Abstract The chapter is organised in two parts: in the first part, the focus is on a combined power-spectral and nonlinear behavioural analysis of a neural mass model of the thalamocortical circuitry. 
The objective is to study the effectiveness of such “multimodal” analytical techniques in model-based studies investigating the neural correlates of abnormal brain oscillations in Alzheimer’s disease (AD). The power spectral analysis presented here is a study of the “slowing” (decreasing dominant frequency of oscillation) within the alpha frequency band (8–13 Hz), a hallmark of electroencephalogram (EEG) dynamics in AD. Analysis of the nonlinear dynamical behaviour focuses on the bifurcating property of the model. The results show that the alpha rhythmic content is maximal in close proximity to the bifurcation point—an observation made possible by the “multimodal” approach adopted herein. Furthermore, a slowing in alpha rhythm is observed for increasing inhibitory connectivity—a consistent feature of our research into neuropathological oscillations associated with AD. In the second part, we present a power spectral analysis of a model that implements multiple feedforward and feedback connectivities in the thalamo-cortico-thalamic circuitry, and is thus more advanced in terms of biological plausibility. This study looks at the effects of synaptic connectivity variation on the power spectra within the delta (1–3 Hz), theta (4–7 Hz), alpha (8–13 Hz), and beta (14–30 Hz) bands. An overall slowing of the EEG with decreasing synaptic connectivity is observed, indicated by a decrease of power within the alpha and beta bands and an increase in power within the theta and delta bands. Thus, the model behaviour conforms to longitudinal studies in AD indicating an overall slowing of the EEG. Abstract Neuronal processes grow under a variety of constraints, both immediate and evolutionary. Their pattern of growth provides insight into their function. This chapter begins by reviewing morphological metrics used in analyses and computational models. Molecular mechanisms underlying growth and plasticity are then discussed, followed by several types of modeling approaches. 
Computer simulation of morphology can be used to describe and reproduce the statistics of neuronal types or to evaluate growth and functional hypotheses. For instance, models in which branching is probabilistically determined by diameter produce realistic virtual dendrites of most neuronal types, though more complicated statistical models are required for other types. Virtual dendrites grown under environmental and/or functional constraints are also discussed, offering a broad perspective on dendritic morphology. Abstract Chopper neurons in the cochlear nucleus are characterized by intrinsic oscillations with short average interspike intervals (ISIs) and relative level independence of their response (Pfeiffer, Exp Brain Res 1:220–235, 1966; Blackburn and Sachs, J Neurophysiol 62:1303–1329, 1989), properties which are unattained by models of single chopper neurons (e.g., Rothman and Manis, J Neurophysiol 89:3070–3082, 2003a). In order to achieve short ISIs, we optimized the time constants of the Rothman and Manis single-neuron model with genetic algorithms. Some parameters in the optimization, such as the temperature and the capacitance of the cell, turned out to be crucial for the required acceleration of the response. In order to achieve the relative level independence, we simulated an interconnected network consisting of Rothman and Manis neurons. The results indicate that, by stabilization of intrinsic oscillations, it is possible to simulate the physiologically observed level independence of ISIs. As previously reviewed and demonstrated (Bahmer and Langner, Biol Cybern 95:371–379, 2006a), chopper neurons show a preference for ISIs which are multiples of 0.4 ms. It was also demonstrated that a network consisting of two optimized Rothman and Manis neurons which activate each other with synaptic delays of 0.4 ms shows a preference for ISIs of 0.8 ms. 
Oscillations with various multiples of 0.4 ms as ISIs may be derived from neurons in a more complex network that is activated by simultaneous input from an onset neuron and several auditory nerve fibers. Abstract Recently, a class of two-dimensional integrate-and-fire models has been used to faithfully model spiking neurons. This class includes the Izhikevich model, the adaptive exponential integrate-and-fire model, and the quartic integrate-and-fire model. The bifurcation types for the individual neurons have been thoroughly analyzed by Touboul (SIAM J Appl Math 68(4):1045–1079, 2008). However, when the models are coupled together to form networks, the networks can display bifurcations that an uncoupled oscillator cannot. For example, the networks can transition from firing with a constant rate to burst firing. This paper introduces a technique to reduce a full network of this class of neurons to a mean-field model, in the form of a system of switching ordinary differential equations. The reduction uses population density methods and a quasi-steady-state approximation to arrive at the mean-field system. Reduced models are derived for networks with different topologies and different model neurons with biologically derived parameters. The mean-field equations are able to qualitatively and quantitatively describe the bifurcations that the full networks display. Extensions and higher-order approximations are discussed. Conclusions Our proposed database schema for managing heterogeneous data is a significant departure from conventional approaches. It is suitable only when the following conditions hold:
• The number of classes of entity is numerous, while the number of actual instances in most classes is expected to be very modest.
• The number (and nature) of the axes describing an arbitrary fact (as an N-ary association) varies greatly.
We believe that nervous system data is an appropriate problem domain to test such an approach. 
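The two conditions above point towards an entity-attribute-value style of schema: one generic entity table, one fact table, and a bridge table that attaches an arbitrary number of named axes to each fact. The sketch below (table names, columns, and example rows are all hypothetical, not taken from the paper) shows the idea with SQLite.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE entity (id INTEGER PRIMARY KEY, class TEXT, name TEXT);
CREATE TABLE fact   (id INTEGER PRIMARY KEY, kind TEXT);
-- Each row attaches one axis (role -> entity) to a fact, so a fact
-- may have any number and kind of axes: an N-ary association.
CREATE TABLE fact_axis (fact_id INTEGER REFERENCES fact(id),
                        role TEXT,
                        entity_id INTEGER REFERENCES entity(id));
""")
con.execute("INSERT INTO entity VALUES (1, 'NeuronType', 'CA1 pyramidal')")
con.execute("INSERT INTO entity VALUES (2, 'Channel', 'HCN1')")
con.execute("INSERT INTO fact VALUES (1, 'expresses')")
con.executemany("INSERT INTO fact_axis VALUES (?, ?, ?)",
                [(1, 'subject', 1), (1, 'object', 2)])

# Reassemble the fact with all of its axes.
rows = con.execute("""
SELECT f.kind, a.role, e.name
FROM fact f JOIN fact_axis a ON a.fact_id = f.id
            JOIN entity e ON e.id = a.entity_id
ORDER BY a.role
""").fetchall()
```

New entity classes and new axis roles cost no schema changes, which is the trade-off the authors describe: flexibility for many sparsely populated classes, at the price of more involved queries.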
Abstract Stereotactic human brain atlases, whether in print or electronic form, are useful not only in functional neurosurgery, but also in neuroradiology, human brain mapping, and neuroscience education. The existing atlases represent structures on 2D plates taken at variable, often large intervals, which limits their applications. To overcome this problem, we propose a hybrid interpolation approach to build high-resolution brain atlases from the existing ones. In this approach, all section regions of each object are grouped into two types of components: simple and complex. A NURBS-based method is designed for interpolation of the simple components, and a distance-map-based method for the complex components. Once all individual objects in the atlas are interpolated, the results are combined hierarchically in a bottom-up manner to produce the interpolation of the entire atlas. In the procedure, different knowledge-based and heuristic strategies are used to preserve various topological relationships. The proposed approach has been validated quantitatively and used for interpolation of two stereotactic brain atlases: the Talairach-Tournoux atlas and the Schaltenbrand-Wahren atlas. The interpolations produced are of high resolution and feature high accuracy, 3D consistency, smooth surfaces, and preserved topology. They potentially open new applications for electronic stereotactic brain atlases, such as atlas reformatting, accurate 3D display, and 3D nonlinear warping against normal and pathological scans. The proposed approach is also potentially useful in other applications that require interpolation and 3D modeling from sparse data and/or data with variable inter-section intervals. An example of 3D modeling of an infarct from MR diffusion images is presented. 
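The distance-map component of such an interpolation can be illustrated on synthetic data: build a signed distance map for each of two neighboring sections, blend the maps, and threshold at zero to obtain the intermediate contour. This toy version (brute-force distance computation, concentric discs as stand-in shapes) only illustrates the complex-component path; the NURBS interpolation of simple components and the topology-preserving strategies are not shown.

```python
import numpy as np

def signed_distance(mask):
    """Brute-force signed Euclidean distance map of a small binary mask:
    negative inside the shape, positive outside."""
    ys, xs = np.indices(mask.shape)
    pts = np.stack([ys.ravel(), xs.ravel()], axis=1).astype(float)
    inside, outside = pts[mask.ravel()], pts[~mask.ravel()]
    d_out = np.sqrt(((pts[:, None] - outside[None]) ** 2).sum(-1)).min(1)
    d_in = np.sqrt(((pts[:, None] - inside[None]) ** 2).sum(-1)).min(1)
    return np.where(mask.ravel(), -d_out, d_in).reshape(mask.shape)

def interpolate_section(mask_a, mask_b, t):
    """Intermediate section at fraction t in [0, 1]: linear blend of the
    two signed distance maps, thresholded at zero."""
    d = (1 - t) * signed_distance(mask_a) + t * signed_distance(mask_b)
    return d < 0

# Two concentric discs standing in for the same structure on two
# neighboring atlas plates; the interpolated region should lie between them.
yy, xx = np.indices((32, 32))
r2 = (yy - 16) ** 2 + (xx - 16) ** 2
small, large = r2 < 4 ** 2, r2 < 10 ** 2
mid = interpolate_section(small, large, 0.5)
```

On real sections one would use a fast distance transform rather than the O(n^2) brute force used here, but the blend-and-threshold step is the same.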
Abstract Quantitative neuroanatomical data are important for the study of many areas of neuroscience, and the complexity of problems associated with neuronal structure requires that research from multiple groups across many disciplines be combined. However, existing neuron-tracing systems, simulation environments, and tools for the visualization and analysis of neuronal morphology data use a variety of data formats, making it difficult to exchange data in a readily usable way. The NeuroML project was initiated to address these issues, and here we describe an extensible markup language standard, MorphML, which defines a common data format for neuronal morphology data and associated metadata to facilitate data and model exchange, database creation, model publication, and data archiving. We describe the elements of the standard in detail and outline the mappings between this format and those used by a number of popular applications for reconstruction, simulation, and visualization of neuronal morphology. Abstract A major part of biology has become a class of physical and mathematical sciences. We have started to feel, though still with some suspicion, that it will become possible to predict biological events that will happen in the future of one’s life, and to control some of them if so desired, based upon the understanding of the genomic information of individuals and the physical and chemical principles governing physiological functions of living organisms at multiple scales and levels, from molecules to cells and organs. Abstract A half-center oscillator (HCO) is a common circuit building block of central pattern generator networks that produce rhythmic motor patterns in animals. Here we constructed an efficient relational database table with the resulting characteristics of the Hill et al. (J Comput Neurosci 10:281–302, 2001) simple conductance-based HCO model. 
The model consists of two reciprocally inhibitory neurons and replicates the electrical activity of the oscillator interneurons of the leech heartbeat central pattern generator under a variety of experimental conditions. Our long-range goal is to understand how this basic circuit building block produces functional activity under a variety of parameter regimes and how different parameter regimes influence stability and modulatability. By using the latest developments in computer technology, we simulated and stored large amounts of data (on the order of terabytes). We systematically explored the parameter space of the HCO and corresponding isolated neuron models using a brute-force approach. We varied a set of selected parameters (maximal conductances of intrinsic and synaptic currents) in all combinations, resulting in about 10 million simulations. We classified these HCO and isolated neuron model simulations by their activity characteristics into identifiable groups and quantified their prevalence. By querying the database, we compared the activity characteristics of the identified groups of our simulated HCO models with those of our simulated isolated neuron models and found that regularly bursting neurons compose only a small minority of functional HCO models; the vast majority was composed of spiking neurons. Abstract This paper describes how an emerging standard neural network modelling language can be used to configure a general-purpose neural multichip system, by describing the process of writing and loading neural network models on the SpiNNaker neuromimetic hardware. It focuses on the implementation of a SpiNNaker module for PyNN, a simulator-independent language for neural network modelling. We successfully extend PyNN to deal with different non-standard (e.g. Izhikevich) cell types, rapidly switch between them, and load applications on parallel hardware by orchestrating the software layers below it, so that they are abstracted away from the final user. 
Finally, we run simulations in PyNN and compare them against other simulators, successfully reproducing single-neuron and network dynamics and validating the implementation. Abstract The present study examines the biophysical properties and functional implications of I_h in hippocampal area CA3 interneurons with somata in strata radiatum and lacunosum-moleculare. Characterization studies showed a small maximum h-conductance (2.6 ± 0.3 nS, n = 11), shallow voltage dependence with a hyperpolarized half-maximal activation (V_1/2 = −91 mV), and kinetics characterized by double-exponential functions. The functional consequences of I_h were examined with regard to temporal summation and impedance measurements. For temporal summation experiments, 5-pulse mossy fiber input trains were activated. Blocking I_h with 50 μM ZD7288 resulted in an increase in temporal summation, suggesting that I_h supports sensitivity of response amplitude to relative input timing. Impedance was assessed by applying sinusoidal current commands. From impedance measurements, we found that I_h did not confer theta-band resonance, but instead flattened the impedance–frequency relations. Double immunolabeling for hyperpolarization-activated cyclic nucleotide-gated proteins and glutamate decarboxylase 67 suggests that all four subunits are present in GABAergic interneurons from the strata considered for electrophysiological studies. Finally, a model of I_h was employed in computational analyses to confirm and elaborate upon the contributions of I_h to impedance and temporal summation. Abstract Modelling and simulation methods gain increasing importance for the understanding of biological systems. The growing number of available computational models makes support in the maintenance and retrieval of those models essential to the community. 
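Format-independent model retrieval of the kind just introduced can be reduced to a very small sketch: represent each model by a bag of metadata terms and rank by set overlap. The model identifiers and annotation terms below are made up for illustration; a real system would draw them from the models' annotations, encoded species and reactions, and simulation descriptions.

```python
# Hypothetical meta-information for three models, keyed by made-up identifiers.
models = {
    "MODEL-A": {"calcium", "oscillation", "hepatocyte"},
    "MODEL-B": {"mapk", "cascade", "signaling"},
    "MODEL-C": {"calcium", "signaling", "neuron"},
}

def jaccard(a, b):
    """Set-overlap similarity in [0, 1]."""
    return len(a & b) / len(a | b)

def search(query_terms):
    """Rank all models by similarity of their metadata to the query,
    without ever touching the models' storage format."""
    q = set(query_terms)
    return sorted(models, key=lambda m: jaccard(q, models[m]), reverse=True)

ranking = search({"calcium", "signaling"})
```

Any similarity measure over metadata (ontology-aware distances, term weighting) can be dropped in for the Jaccard score without changing the retrieval pipeline.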
This article discusses which model information is helpful for efficient retrieval and how existing similarity measures and ranking techniques can be used to enhance the retrieval process, i.e. model reuse. With the development of new tools and modelling formalisms, there is also an increasing demand for performing searches independent of the models’ encoding. Therefore, the presented approach is not restricted to certain model storage formats. Instead, the model meta-information is used for retrieval and ranking of the search result. Meta-information includes general information about the model, its encoded species and reactions, but also information about the model behaviour and related simulation experiment descriptions. Abstract To understand the details of brain function, a large-scale system model that reflects anatomical and neurophysiological characteristics needs to be implemented. Though numerous computational models of different brain areas have been proposed, their integration for the development of a large-scale model has not yet been accomplished, because these models were described in different programming languages and, above all, because they used different data formats. This paper introduces a platform for collaborative brain system modeling (PLATO) where one can construct computational models using several programming languages and connect them at the I/O level with a common data format. As an example, a whole visual system model including eye movement, eye optics, retinal network, and visual cortex is being developed. Preliminary results demonstrate that the integrated model successfully simulates the signal processing flow at the different stages of the visual system. Abstract Brain rhythms are the most prominent signal measured noninvasively in humans with magneto/electroencephalography (MEG/EEG). MEG/EEG-measured rhythms have been shown to be functionally relevant, and signature changes are used as markers of disease states. 
Despite the importance of understanding the underlying neural mechanisms creating these rhythms, relatively little is known about their in vivo origin in humans. There are obvious challenges in linking the extracranially measured signals directly to neural activity with invasive studies in humans, and although animal models are well suited for such studies, the connection to human brain function under cognitively relevant tasks is often lacking. Biophysically principled computational neural modeling provides an attractive means to bridge this critical gap. Here, we describe a method for creating a computational neural model capturing the laminar structure of cortical columns, and how this model can be used to make predictions on the cellular- and circuit-level mechanisms of brain oscillations measured with MEG/EEG. Specifically, we describe how the model can be used to simulate current dipole activity, the common macroscopic signal inferred from MEG/EEG data. We detail the development and application of the model to study the spontaneous somatosensory mu-rhythm, containing mu-alpha (7–14 Hz) and mu-beta (15–29 Hz) components. We describe a novel prediction on the neural origin of the mu-rhythm that accurately reproduces many characteristic features of MEG data and accounts for changes in the rhythm with attention, detection, and healthy aging. While the details of the model are specific to the somatosensory system, the model design and application are based on general principles of cortical circuitry and MEG/EEG physics, and are thus amenable to the study of rhythms in other frequency bands and sensory systems. Abstract GABAergic interneurons in cortical circuits control the activation of principal cells and orchestrate network activity patterns, including oscillations at different frequency ranges. 
Recruitment of interneurons depends on the integration of convergent synaptic inputs along the dendrosomatic axis; however, dendritic processing in these cells is still poorly understood. In this chapter, we summarise our results on the cable properties, electrotonic structure, and dendritic processing in “basket cells” (BCs; Nörenberg et al. 2010), one of the most prevalent types of cortical interneurons mediating perisomatic inhibition. In order to investigate integrative properties, we performed two-electrode whole-cell patch-clamp recordings, visualised and reconstructed the recorded interneurons, and created passive single-cell models with biophysical properties derived from the experiments. Our results indicate that membrane properties, in particular membrane resistivity, are inhomogeneous along the somatodendritic axis of the cell. The derived values and the gradient of membrane resistivity are different from those obtained for excitatory principal cells. The divergent passive membrane properties of BCs facilitate rapid signalling from proximal basal dendritic inputs but at the same time increase synapse-to-soma transfer for slow signals from the distal apical dendrites. Our results demonstrate that BCs possess distinct integrative properties. Future computational models investigating the diverse functions of neuronal circuits need to consider this diversity and incorporate realistic dendritic properties not only of excitatory principal cells but also of various types of inhibitory interneurons. Abstract New surgical and localization techniques allow for precise and personalized evaluation and treatment of intractable epilepsies. These techniques include the use of subdural and depth electrodes for localization, and the potential use of cell-targeted stimulation using optogenetics as part of treatment. Computer modeling of seizures, also individualized to the patient, will be important in order to make full use of the potential of these new techniques. 
This is because epilepsy is a complex dynamical disease involving multiple scales across both time and space. These complex dynamics make prediction extremely difficult. Cause and effect are not cleanly separable, as multiple embedded causal loops allow for many scales of unintended consequence. We demonstrate here a small model of sensory neocortex which can be used to look at the effects of microablations or microstimulation. We show that ablations in this network can either prevent spread or prevent occurrence of the seizure. In this example, focal electrical stimulation was not able to terminate a seizure, but selective stimulation of inhibitory cells, a future possibility through the use of optogenetics, was efficacious. Abstract The basal ganglia form a complex network of nuclei often assumed to perform selection, yet their individual roles and how they influence each other is still largely unclear. In particular, the ties between the external and internal parts of the globus pallidus are paradoxical, as anatomical data suggest a potent inhibitory projection between them while electrophysiological recordings indicate that they have similar activities. Here we introduce a theoretical study that reconciles both views on the intrapallidal projection, by providing a plausible characterization of the relationship between the external and internal globus pallidus. Specifically, we developed a mean-field model of the whole basal ganglia, whose parameterization is optimized to best respect a collection of numerous anatomical and electrophysiological data. We first obtained models respecting all our constraints; hence, anatomical and electrophysiological data on the intrapallidal projection are globally consistent. 
This model furthermore predicts that both aforementioned views of the intrapallidal projection may be reconciled when this projection is weakly inhibitory, thus making it possible to support similar neural activity in both nuclei and for the entire basal ganglia to select between actions. Second, we predict that afferent projections are substantially unbalanced towards the external segment, as it receives the strongest excitation from the STN and the weakest inhibition from the striatum. Finally, our study strongly suggests that the intrapallidal connection pattern is not focused but diffuse, as the latter pattern is more efficient for the overall selection performed in the basal ganglia. Abstract Background The information coming from biomedical ontologies and computational pathway models is expanding continuously: research communities keep this process up, and their advances are generally shared by means of dedicated resources published on the web. In fact, such models are shared to provide the characterization of molecular processes, while biomedical ontologies detail a semantic context for the majority of those pathways. Recent advances in both fields pave the way for a scalable information integration based on aggregate knowledge repositories, but the lack of overall standard formats impedes this progress. Indeed, having different objectives and different abstraction levels, most of these resources "speak" different languages. Semantic web technologies are here explored as a means to address some of these problems. Methods Employing an extensible collection of interpreters, we developed OREMP (Ontology Reasoning Engine for Molecular Pathways), a system that abstracts the information from different resources and combines them together into a coherent ontology. Continuing this effort, we present OREMPdb; once different pathways are fed into OREMP, species are linked to the external ontologies referenced and to the reactions in which they participate. 
Exploiting these links, the system builds species sets, which encapsulate species that operate together. Composing all of the reactions together, the system computes all of the reaction paths from and to all of the species sets. Results OREMP has been applied to the curated branch of BioModels (2011/04/15 release), which overall contains 326 models, 9244 reactions, and 5636 species. OREMPdb is the semantic dictionary created as a result, which is made of 7360 species sets. For each one of these sets, OREMPdb links to the original pathway and to the original paper where this information first appeared. Abstract Conductance-based neuron models are frequently employed to study the dynamics of biological neural networks. For speed and ease of use, these models are often reduced in morphological complexity. Simplified dendritic branching structures may process inputs differently than full branching structures, however, and could thereby fail to reproduce important aspects of biological neural processing. It is not yet well understood which processing capabilities require detailed branching structures. Therefore, we analyzed the processing capabilities of fully or partially branched reduced models. These models were created by collapsing the dendritic tree of a full morphological model of a globus pallidus (GP) neuron while preserving its total surface area and electrotonic length, as well as its passive and active parameters. Dendritic trees were either collapsed into single cables (unbranched models) or the full complement of branch points was preserved (branched models). Both reduction strategies allowed us to compare dynamics between all models using the same channel density settings. Full model responses to somatic inputs were generally preserved by both types of reduced model, while dendritic input responses could be more closely preserved by branched than by unbranched reduced models. 
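The collapse described here, reducing parallel dendritic branches to a single cable while preserving total membrane area and electrotonic length, can be illustrated with a toy calculation. Taking electrotonic length proportional to l / sqrt(d) (the constant sqrt(4*Rm/Ri) cancels out of the algebra), two constraints fix the equivalent cylinder's length and diameter. This is a generic sketch of the idea, not the specific reduction procedure used in the study.

```python
import math

def collapse(branches):
    """Collapse parallel dendritic branches into one equivalent cylinder.

    branches: list of (length_um, diameter_um) tuples.
    Returns (length, diameter) of a cylinder preserving the total
    membrane area and the area-weighted mean electrotonic length,
    with electrotonic length taken as l / sqrt(d).
    """
    areas = [math.pi * d * l for l, d in branches]
    total_area = sum(areas)
    # Area-weighted mean electrotonic length of the branches.
    L = sum(a * l / math.sqrt(d) for a, (l, d) in zip(areas, branches)) / total_area
    # Solve pi * d_eq * l_eq = total_area and l_eq / sqrt(d_eq) = L.
    d_eq = (total_area / (math.pi * L)) ** (2 / 3)
    l_eq = L * math.sqrt(d_eq)
    return l_eq, d_eq

length, diam = collapse([(200.0, 2.0), (150.0, 1.5)])
```

By construction, pi * diam * length equals the summed branch areas and length / sqrt(diam) equals the mean electrotonic length, so passive load and electrotonic extent are both preserved; local input resistance on the collapsed cable, however, differs from the original branches, consistent with the limitation noted in the abstract.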
However, features strongly influenced by local dendritic input resistance, such as active dendritic sodium spike generation and propagation, could not be accurately reproduced by any reduced model. Based on our analyses, we suggest that there are intrinsic differences in processing capabilities between unbranched and branched models. We also indicate suitable applications for different levels of reduction, including fast searches of full model parameter space. Summary Processing text from the scientific literature has become a necessity due to the burgeoning amounts of information that are fast becoming available, stemming from advances in electronic information technology. We created a program, NeuroText (http://senselab.med.yale.edu/textmine/neurotext.pl), designed specifically to extract information relevant to neuroscience-specific databases, NeuronDB and CellPropDB (http://senselab.med.yale.edu/senselab/), housed at the Yale University School of Medicine. NeuroText extracts relevant information from the neuroscience literature in a two-step process: each step parses text at a different level of granularity. NeuroText uses an expert-mediated knowledge base and combines the techniques of indexing, contextual parsing, semantic and lexical parsing, and supervised and unsupervised learning to extract information. The constraints, metadata elements, and rules for information extraction are stored in the knowledge base. NeuroText was created as a pilot project to process 3 years of publications in the Journal of Neuroscience and was subsequently tested on 40,000 PubMed abstracts. We also present here a template for creating a domain-nonspecific knowledge base that, when linked to a text-processing tool like NeuroText, can be used to extract knowledge in other fields of research. Abstract Background We present a software tool called SENB, which allows the geometric and biophysical neuronal properties in a simple computational model of a Hodgkin-Huxley (HH) axon to be changed. 
The aim of this work is to develop a didactic and easy-to-use computational tool in the NEURON simulation environment, which allows graphical visualization of both the passive and active conduction parameters and the geometric characteristics of a cylindrical axon with HH properties. Results The SENB software offers several advantages for teaching and learning electrophysiology. First, SENB offers ease and flexibility in determining the number of stimuli. Second, SENB allows immediate and simultaneous visualization, in the same window and time frame, of the evolution of the electrophysiological variables. Third, SENB calculates parameters such as time and space constants, stimulus frequency, cellular area and volume, sodium and potassium equilibrium potentials, and propagation velocity of the action potentials. Furthermore, it allows the user to see all this information immediately in the main window. Finally, with just one click SENB can save an image of the main window as evidence. Conclusions The SENB software is didactic and versatile, and can be used to improve and facilitate the teaching and learning of the underlying mechanisms in the electrical activity of an axon using the biophysical properties of the squid giant axon. Abstract Grid cells (GCs) in the medial entorhinal cortex (mEC) have the property of having their firing activity spatially tuned to a regular triangular lattice. Several theoretical models for grid field formation have been proposed, but most assume that place cells (PCs) are a product of the grid cell system. There is, however, an alternative possibility that is supported by various strands of experimental data. Here we present a novel model for the emergence of grid-like firing patterns that stands on two key hypotheses: (1) spatial information in GCs is provided from PC activity and (2) grid fields result from a combined synaptic plasticity mechanism involving inhibitory and excitatory neurons mediating the connections between PCs and GCs.
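The derived quantities that SENB reports, as summarized above (time and space constants, equilibrium potentials), follow from standard cable and Nernst relations; a minimal sketch with illustrative squid-axon values (assumed ballpark numbers, not SENB's implementation):

```python
import math

# Illustrative squid-axon parameters (assumed, not SENB's defaults)
RM, RI, CM = 2000.0, 35.4, 1.0   # ohm*cm^2, ohm*cm, uF/cm^2
DIAM_CM = 0.05                    # 500 um axon diameter

tau_ms = RM * CM * 1e-3                           # time constant Rm*Cm -> ms
lam_cm = math.sqrt((DIAM_CM / 4.0) * (RM / RI))   # space constant, cm

def nernst_mV(z, c_out_mM, c_in_mM, T=279.45):
    """Nernst equilibrium potential (mV) at temperature T in kelvin."""
    R, F = 8.314, 96485.0
    return 1e3 * (R * T) / (z * F) * math.log(c_out_mM / c_in_mM)

E_K = nernst_mV(1, 20.0, 400.0)    # squid K+ out/in concentrations (mM)
E_Na = nernst_mV(1, 440.0, 50.0)   # squid Na+ out/in concentrations (mM)
```

With these values the time constant comes out to 2 ms, and the potassium and sodium potentials land near their familiar squid-axon signs (negative and positive, respectively).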
Depending on the spatial location, each PC can contribute excitatory or inhibitory inputs to GC activity. The nature and magnitude of the PC input is a function of the distance to the place field center, which is inferred from rate decoding. A biologically plausible learning rule drives the evolution of the connection strengths from PCs to a GC. In this model, PCs compete for GC activation, and the plasticity rule favors efficient packing of the space representation. This leads to grid-like firing patterns. In a new environment, GCs continuously recruit new PCs to cover the entire space. The model described here makes important predictions and can represent the feedforward connections from hippocampus CA1 to deeper mEC layers. Abstract Because of its highly branched dendrite, the Purkinje neuron requires significant computational resources if coupled electrical and biochemical activity are to be simulated. To address this challenge, we developed a scheme for reducing the geometric complexity while preserving the essential features of activity in both the soma and a remote dendritic spine. We merged our previously published biochemical model of calcium dynamics and lipid signaling in the Purkinje neuron, developed in the Virtual Cell modeling and simulation environment, with an electrophysiological model based on a Purkinje neuron model available in NEURON. A novel reduction method was applied to the Purkinje neuron geometry to obtain a model with fewer compartments that is tractable in Virtual Cell. Most of the dendritic tree was subject to reduction, but we retained the neuron’s explicit electrical and geometric features along a specified path from spine to soma. Further, unlike previous simplification methods, the dendrites that branch off along the preserved explicit path are retained as reduced branches.
We conserved axial resistivity and adjusted passive properties and active channel conductances for the reduction in surface area, and cytosolic calcium for the reduction in volume. Rallpacks are used to validate the reduction algorithm and show that it can be generalized to other complex neuronal geometries. For the Purkinje cell, we found that current injections at the soma were able to produce similar trains of action potentials and membrane potential propagation in the full and reduced models in NEURON; the reduced model produces identical spiking patterns in NEURON and Virtual Cell. Importantly, our reduced model can simulate communication between the soma and a distal spine; an alpha function applied at the spine to represent synaptic stimulation gave similar results in the full and reduced models for potential changes associated with both the spine and the soma. Finally, we combined phosphoinositol signaling and electrophysiology in the reduced model in Virtual Cell. Thus, a strategy has been developed to combine electrophysiology and biochemistry as a step toward merging neuronal and systems biology modeling. Abstract The advent of techniques with the ability to scan massive changes in cellular makeup (genomics, proteomics, etc.) has revealed the compelling need for analytical methods to interpret and make sense of those changes. Computational models built on a sound physicochemical mechanistic basis are indispensable when integrating, interpreting, and simulating high-throughput experimental data. Another powerful role of computational models is predicting new behavior, provided they are adequately validated. Mitochondrial energy transduction has been traditionally studied with thermodynamic models. More recently, kinetic or thermokinetic models have been proposed, leading the way toward an understanding of the control and regulation of mitochondrial energy metabolism and its interaction with cytoplasmic and other compartments.
In this work, we outline, step by step, the methods that should be followed to build a computational model of mitochondrial energetics, either in isolation or integrated into a network of cellular processes. Depending on the question addressed by the modeler, the methodology explained herein can be applied with different levels of detail, from the mitochondrial energy-producing machinery in a network of cellular processes to the dynamics of a single enzyme during its catalytic cycle. Abstract The voltage and time dependence of ion channels can be regulated, notably by phosphorylation, interaction with phospholipids, and binding to auxiliary subunits. Many parameter variation studies have set conductance densities free while leaving kinetic channel properties fixed, as the experimental constraints on the latter are usually better than on the former. Because individual cells can tightly regulate their ion channel properties, we suggest that kinetic parameters may be profitably set free during model optimization in order to both improve matches to data and refine kinetic parameters. To this end, we analyzed the parameter optimization of reduced models of three electrophysiologically characterized and morphologically reconstructed globus pallidus neurons. We performed two automated searches with different types of free parameters. First, conductance density parameters were set free. Even the best resulting models exhibited unavoidable problems which were due to limitations in our channel kinetics. We next set channel kinetics free for the optimized density matches and obtained significantly improved model performance. Some kinetic parameters consistently shifted to similar new values in multiple runs across three models, suggesting the possibility for tailored improvements to channel models.
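The two-stage search described above (conductance densities freed first, then channel kinetics freed for the best density matches) can be illustrated on a toy objective; the "model", the target data, and the crude grid-based optimizer below are all invented for illustration and are not the authors' fitting pipeline:

```python
import math

def act(v, g, vhalf, slope=5.0):
    """Toy 'recorded trace': a Boltzmann activation curve scaled by density g."""
    return g / (1.0 + math.exp(-(v - vhalf) / slope))

VS = list(range(-80, 41, 5))
TARGET = [act(v, 1.2, -30.0) for v in VS]   # invented 'experimental' data

def err(g, vhalf):
    """Sum-of-squares mismatch between model and target."""
    return sum((act(v, g, vhalf) - t) ** 2 for v, t in zip(VS, TARGET))

def line_search(f, lo, hi, n=400):
    """Crude 1-D grid minimization (stands in for a real optimizer)."""
    xs = [lo + (hi - lo) * i / n for i in range(n + 1)]
    return min(xs, key=f)

# Stage 1: conductance density free, kinetics fixed at a 'template' vhalf
g1 = line_search(lambda g: err(g, -20.0), 0.0, 3.0)
e1 = err(g1, -20.0)

# Stage 2: kinetics freed for the optimized density, then density refit
vh2 = line_search(lambda vh: err(g1, vh), -60.0, 0.0)
g2 = line_search(lambda g: err(g, vh2), 0.0, 3.0)
e2 = err(g2, vh2)
```

Because the template kinetics are deliberately wrong, stage 1 is stuck with a residual error no density value can remove; freeing the kinetic parameter in stage 2 lets the fit recover both the true half-activation and the true density.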
These results suggest that optimized channel kinetics can improve model matches to experimental voltage traces, particularly for channels characterized under different experimental conditions than recorded data to be matched by a model. The resulting shifts in channel kinetics from the original template provide valuable guidance for future experimental efforts to determine the detailed kinetics of channel isoforms and possible modulated states in particular types of neurons. Central synapses release a resource-efficient amount of glutamate Nature Neuroscience Why synapses release a certain amount of neurotransmitter is poorly understood. We combined patch-clamp electrophysiology with computer simulations to estimate how much glutamate is discharged at two distinct central synapses of the rat. We found that, regardless of some uncertainty over synaptic microenvironment, synapses generate the maximal current per released glutamate molecule while maximizing signal information content. Our result suggests that synapses operate on a principle of resource optimization. A cross-platform freeware tool for digital reconstruction of neuronal arborizations from image stacks Neuroinformatics Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. 
The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. Programming language: Perl/BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI Blast package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge.
Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that often it is futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past 2 decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. 
A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages, and the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies.
A single cell contains organelles and intracellular solution, and it is separated from the extracellular fluid surrounding it by its cell membrane (plasma membrane), which maintains differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charge and concentration give rise to electrical and chemical potentials, respectively, which drive the transport of materials across the membrane. Here we look at the core elements of mathematical modeling associated with the dynamic behavior of single cells, as well as the basics of numerical simulation. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program with which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a Python-based command line interface for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e.
a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method and make available computer scripts which automate the process of citation entry. We use an open citation project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations.
To understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain ( ${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and frontends requires high bioinformatic skills. Consequently, the manual inspection of database entries and citations is a time-consuming task for which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges.
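The reduction strategy sketched above operates on a linearized (quasi-active) system. As a simplified illustration of the idea (using modal truncation, a simpler relative of the Balanced Truncation and ${\cal H}_2$ methods named above, on an invented linear system), keeping only the slowest modes preserves slow behavior such as the steady-state gain:

```python
import numpy as np

# Invented linear stand-in for a quasi-active cable model: x' = Ax + bu, y = cx,
# with three slow modes, three fast modes, and a weak slow-fast coupling.
A = np.diag([-0.5, -1.0, -2.0, -50.0, -80.0, -120.0])
A[0, 3] = A[3, 0] = 0.2
b = np.ones(6)
c = np.array([1.0, 0.5, 0.25, 0.1, 0.1, 0.1])

k = 3                                   # number of retained (slowest) modes
w, V = np.linalg.eig(A)
order = np.argsort(-w.real)             # least-negative eigenvalues first
Vk = V[:, order[:k]]
Wk = np.linalg.inv(V)[order[:k], :]     # matching left eigenvectors
Ar, br, cr = np.diag(w[order[:k]]), Wk @ b, c @ Vk

# The reduced model should reproduce slow behavior, e.g. the steady-state gain
g_full = float(-c @ np.linalg.solve(A, b))
g_red = float(np.real(-cr @ np.linalg.solve(Ar, br)))
```

Here the dropped fast modes contribute little to the low-frequency response, so the 3-state model tracks the 6-state gain closely; balanced truncation makes that trade-off optimally with respect to the system's controllability and observability rather than by eigenvalue speed alone.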
While originally developed to manage and query less structured data, information retrieval techniques have become increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or not) existing empirical results and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations.
We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib ( The Handbook of Brain Theory and Neural Networks , pp. 741–745, 2003); Arbib and Grethe ( Computing the Brain: A Guide to Neuroinformatics , 2001); Ascoli ( Computational Neuroanatomy: Principles and Methods , 2002); Bower and Bolouri ( Computational Modeling of Genetic and Biochemical Networks , 2001); Hines et al. ( J. Comput. Neurosci. 17 , 7–11, 2004); Shepherd et al. ( Trends Neurosci. 21 , 460–468, 1998); Sivakumaran et al. ( Bioinformatics 19 , 408–415, 2003); Smolen et al. ( Neuron 26 , 567–580, 2000); Vadigepalli et al. ( OMICS 7 , 235–252, 2003)]. 
Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the simulator for neural networks and action potentials (SNNAP) [Ziv ( J. Neurophysiol. 71 , 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu . Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and to build on them for further research. Its use can aid the field of computational neuroscience to enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties.
Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in GABA A reversal potential (E GABA ) and by alterations in intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide a high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal to elicit/detect the electrical excitation at the single-cell level and to observe the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have been typically correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). Such a form of the reconstruction files is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology.
Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic type (pyramidal cell dendrites and olivary axonal projection terminals), and labeling method (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. Attributing physical and biological impacts to anthropogenic climate change Nature Significant changes in physical and biological systems are occurring on all continents and in most oceans, with a concentration of available data in Europe and North America. Most of these changes are in the direction expected with warming temperature. Here we show that these changes in natural systems since at least 1970 are occurring in regions of observed temperature increases, and that these temperature increases at continental scales cannot be explained by natural climate variations alone.
Given the conclusions from the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report that most of the observed increase in global average temperatures since the mid-twentieth century is very likely to be due to the observed increase in anthropogenic greenhouse gas concentrations, and furthermore that it is likely that there has been significant anthropogenic warming over the past 50 years averaged over each continent except Antarctica, we conclude that anthropogenic climate change is having a significant impact on physical and biological systems globally and in some continents. Stochastic ion channel gating in dendritic neurons: morphology dependence and probabilistic synaptic activation of dendritic spikes. PLoS computational biology Neuronal activity is mediated through changes in the probability of stochastic transitions between open and closed states of ion channels. While differences in morphology define neuronal cell types and may underlie neurological disorders, very little is known about influences of stochastic ion channel gating in neurons with complex morphology. We introduce and validate new computational tools that enable efficient generation and simulation of models containing stochastic ion channels distributed across dendritic and axonal membranes. Comparison of five morphologically distinct neuronal cell types reveals that when all simulated neurons contain identical densities of stochastic ion channels, the amplitude of stochastic membrane potential fluctuations differs between cell types and depends on sub-cellular location. For typical neurons, the amplitude of membrane potential fluctuations depends on channel kinetics as well as open probability. 
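Stochastic gating of the kind described above can be illustrated with a toy population of two-state channels; the rates, channel counts, and time step below are invented for illustration and are not the paper's channel models:

```python
import numpy as np

def simulate_channels(n_ch, alpha, beta, dt, n_steps, seed=0):
    """Population of two-state (closed <-> open) channels, advanced with
    per-step binomial switching: P(open) = alpha*dt, P(close) = beta*dt."""
    rng = np.random.default_rng(seed)
    n_open = 0
    frac = np.empty(n_steps)
    for i in range(n_steps):
        opened = rng.binomial(n_ch - n_open, alpha * dt)
        closed = rng.binomial(n_open, beta * dt)
        n_open += opened - closed
        frac[i] = n_open / n_ch
    return frac

# The open fraction settles near alpha/(alpha+beta) = 0.25; the fluctuations
# around that value shrink as the channel count grows.
frac = simulate_channels(n_ch=1000, alpha=1.0, beta=3.0, dt=0.01, n_steps=5000)
```

Rerunning with fewer channels makes the membrane-potential-relevant fluctuations visibly larger, which is the population-size dependence the abstract's comparison across cell types and compartments turns on.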
Using a detailed model of a hippocampal CA1 pyramidal neuron, we show that when intrinsic ion channels gate stochastically, the probability of initiation of dendritic or somatic spikes by dendritic synaptic input varies continuously between zero and one, whereas when ion channels gate deterministically, the probability is either zero or one. At physiological firing rates, stochastic gating of dendritic ion channels almost completely accounts for probabilistic somatic and dendritic spikes generated by the fully stochastic model. These results suggest that the consequences of stochastic ion channel gating differ globally between neuronal cell types and locally between neuronal compartments. Whereas dendritic neurons are often assumed to behave deterministically, our simulations suggest that a direct consequence of stochastic gating of intrinsic ion channels is that spike output may instead be a probabilistic function of patterns of synaptic input to dendrites. Signal integration on the dendrites of a pyramidal neuron model Cognitive Neurodynamics Summary This chapter constitutes mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was a part of the Annual Computational Neuroscience Meeting CNS*2007 that took place in July 2007 in Toronto, Canada (http://www.cnsorg.org). The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Selected discussed topics, including a review of some current and potential applications of Computational Intelligence (CI) in electrophysiology, database and electrophysiological data exchange platforms, languages, and formats, as well as exemplary analysis problems, are presented in this chapter.
The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers, whose interest will be sparked by its contents. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. We are particularly interested in the slower timescale dynamics that determine overall excitability, and set the stage for transient episodes of persistent oscillations, working memory, or seizures. In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley-type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics on the stability of persistent activity. We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions. Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in neonatal brain or spinal cord, up states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potentials (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma.
Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from the EAP data sampled by an extracellular probe at multiple independent recording locations. We used a dipole for the model source because there is extensive evidence that it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell-probe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole moment optimization where the dipole location is fixed, and a second, nonlinear, global optimization of the source location. We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis. We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among the various model error contributions to the residual, we especially address the error in probe geometry and the extent to which it biases estimates of dipole parameters. This dipole characterization method can be applied to any recording technique that can take multiple independent measurements of the same single units.
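The linear step of the dipole characterization described above (moment optimization at a fixed location) reduces to Tikhonov-regularized least squares. A minimal sketch, assuming a hypothetical 16-site probe with random lead fields in place of FEM-computed ones:

```python
import numpy as np

def fit_dipole_moment(lead_field, eap, lam=1e-3):
    """Tikhonov-regularized least-squares fit of a dipole moment at a fixed location.

    lead_field: (n_sites, 3) matrix mapping a dipole moment to the potential
    at each recording site; eap: (n_sites,) measured amplitudes.  Minimizes
    ||L m - y||^2 + lam * ||m||^2, i.e. model error plus dipole size.
    """
    L = np.asarray(lead_field, dtype=float)
    y = np.asarray(eap, dtype=float)
    m = np.linalg.solve(L.T @ L + lam * np.eye(L.shape[1]), L.T @ y)
    resid = y - L @ m
    explained = 1.0 - (resid @ resid) / (y @ y)  # fraction of power explained
    return m, explained

# Hypothetical probe: random lead fields stand in for FEM-computed ones;
# recover a known moment from noisy simulated measurements.
rng = np.random.default_rng(1)
L = rng.normal(size=(16, 3))
true_m = np.array([1.0, -0.5, 0.2])
y = L @ true_m + 0.01 * rng.normal(size=16)
m_hat, r2 = fit_dipole_moment(L, y)
```

In the full method this linear solve is repeated on a grid of candidate source locations, with the nonlinear location search wrapped around it.
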
Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics. We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodium-potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. These models reveal several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy. Abstract This paper introduces dyadic brain modeling – the simultaneous, computational modeling of the brains of two interacting agents – to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the ‘dyad’).
The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists and modelers that will help speed the emergence of what may be called comparative neuroprimatology. Abstract The phase response curve (PRC) reflects the dynamics of the interplay between diverse intrinsic conductances that lead to spike generation. PRCs measure the spike time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddle-node) dynamics, while type II (subcritical Hopf) dynamics yield a biphasic PRC with both negative and positive lobes. Previous computational work hypothesized that cholinergic modulation of the M-type potassium current can switch a neuron from type II to type I dynamics. We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with downregulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential specific K-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to downregulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent IAHP is downregulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified at low ACh levels, while the PRC became type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model.
This work suggests that cholinergic modulation of slow potassium currents may shift neuronal response properties between “resonator” and “integrator” modes. Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double-precision values by each subtree at each time step. Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For compute-bound simulations, load balance results in almost ideal runtime scaling. Application of the cell-splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing. Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts. It has recently been reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. Ca2+-activated K+ currents (IK(Ca)) of cultured human cardiac fibroblasts were characterized in this study. In whole-cell configuration, depolarizing pulses evoked IK(Ca) with outward rectification in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance Ca2+-activated K+ (BKCa) channel with a single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BKCa channels. The dynamic Luo–Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts.
In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; however, they were electrically inert in the absence of gap-junctional coupling. The simulation predicts that changes in gap-junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. IK(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes. This study demonstrates that a BKCa channel is functionally expressed in human cardiac fibroblasts. The activity of these BKCa channels present in human cardiac fibroblasts may contribute to the functional activities of heart cells through the transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of their solutions and the transitions between them as parameters are varied. To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multicompartment models. We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation—tonic spiking, intermediate bursting and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model.
The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multicompartment conductance-based model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli. Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement. Results We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb.
Conclusion Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement, and underlies the concentration-independence of odor quality representations. Abstract Mathematical neuronal models are normally expressed using differential equations. The Parker–Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker–Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin–Huxley-type neuron, comparing the results with those obtained using the Runge–Kutta and Bulirsch–Stoer methods. Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC–granule cell (GC) circuit with and without gap junctions between GoCs.
Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change the oscillation frequency or the mean firing rate of either GoCs or GCs. Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs. The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley-type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours or days of supercomputing time. Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. In consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics and intrinsic noise.
We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data. Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes arising from development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models. Recent studies show that both the conductance and the kinetics of these receptors change in a voltage-dependent manner in some parts of the brain. Therefore, several models have been introduced to simulate their current. However, on the one hand, kinetic models—which are able to simulate these voltage-dependent phenomena—are computationally expensive for modeling of large neural networks. On the other hand, classic exponential models, which are computationally less expensive, are not able to simulate the voltage dependency of these receptors accurately. In this study, we have modified these classic models to endow them with voltage-dependent conductance and time constants. Temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of the NMDA receptor’s behavior using only three to four differential equations, which is significantly fewer than in previous kinetic models. Consequently, our model is both fast and physiologically plausible, and is therefore a suitable candidate for the modeling of large neural networks.
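The central voltage dependence that any NMDA receptor model must capture is the Mg2+ block. A minimal sketch using the classic dual-exponential conductance gated by the well-known Jahr–Stevens block term (all parameter values are illustrative, and this is not the modified model of the text, which would additionally make the time constants voltage-dependent):

```python
import numpy as np

def nmda_current(t, v, g_max=1.0, tau_rise=2.0, tau_decay=100.0,
                 mg_mM=1.0, e_rev=0.0):
    """Dual-exponential NMDA conductance with the Jahr-Stevens Mg2+ block.

    t: ms since receptor activation; v: membrane potential in mV.
    Parameter values are illustrative, not taken from the paper.
    """
    # fraction of receptors not blocked by Mg2+ at this voltage
    unblocked = 1.0 / (1.0 + (mg_mM / 3.57) * np.exp(-0.062 * v))
    g = g_max * unblocked * (np.exp(-t / tau_decay) - np.exp(-t / tau_rise))
    return g * (v - e_rev)  # inward (negative) below the reversal potential

# Depolarization relieves the Mg2+ block: for the same synaptic event, the
# current at -20 mV is several times larger in magnitude than at -70 mV.
i_hyper = nmda_current(10.0, -70.0)
i_depol = nmda_current(10.0, -20.0)
```

This two-exponential form costs two state variables per synapse when expressed as ODEs, which is the efficiency the abstract's three-to-four-equation model builds on.
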
Abstract Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons. This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal cell firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that there is an obvious rhythmic gating effect of the γ-oscillating interneuron network on the pyramidal neuron’s signal transmission. This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and appropriate inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of the coherence and oscillation frequency of network activities. Abstract Background Propagation of simulated action potentials (APs) was previously studied in short single chains and in two-dimensional sheets of myocardial cells. The present study was undertaken to examine propagation in a long single chain of cells of various lengths, and with varying numbers of gap-junction (gj) channels, and to compare propagation velocity with cable properties such as the length constant (λ). Methods and Results Simulations were carried out using the PSpice program as previously described. When the electric field (EF) mechanism was dominant (0, 1, and 10 gj channels), the longer the chain length, the faster the overall velocity (θov).
There seems to be no simple explanation for this phenomenon. In contrast, when the local-circuit current mechanism was dominant (100 gj channels or more), θov was slightly slowed with lengthening of the chain. Increasing the number of gj channels produced an increase in θov and caused the firing order to become more uniform. The end effect was more pronounced at longer chain lengths and at greater numbers of gj channels. When there were no or only few gj channels (namely 0, 10, or 30), the voltage change (ΔVm) in the two cells (#50 and #52) contiguous to the cell injected with current (#51) was nearly zero, i.e., there was a sharp discontinuity in voltage between the adjacent cells. When there were many gj channels (e.g., 300, 1000, 3000), there was an exponential decay of voltage on either side of the injected cell, with the length constant (λ) increasing at higher numbers of gj channels. The effect of increasing the number of gj channels on increasing λ was relatively small compared to the larger effect on θov. θov became very non-physiological at 300 gj channels or higher. Conclusion Thus, when there were only 0, 1, or 10 gj channels, θov increased with increasing chain length, whereas at 100 gj channels or higher, θov did not increase with chain length. When there were only 0, 10, or 30 gj channels, there was a very sharp decrease in ΔVm in the two contiguous cells on either side of the injected cell, whereas at 300, 1000, or 3000 gj channels, the voltage decay was exponential along the length of the chain. The effect of increasing the number of gj channels on the spread of current was relatively small compared to the large effect on θov. Abstract This article provides a demonstration of an analytical technique that can be used to investigate the causes of perceptual phenomena. The technique is based on the concept of the ideal observer, an optimal signal classifier that makes decisions that maximize the probability of a correct response.
To demonstrate the technique, an analysis was conducted to investigate the role of the auditory periphery in the production of temporal masking effects. The ideal observer classified output from four models of the periphery. Since the ideal observer is the best of all possible observers, if it demonstrates masking effects, then all other observers must as well. If it does not demonstrate masking effects, then nothing about the periphery requires masking to occur, and masking must therefore arise somewhere else. The ideal observer exhibited several forward masking effects but did not exhibit backward masking, implying that the periphery has a causal role in forward but not backward masking. A general discussion of the strengths of the technique and supplementary equations are also included. Abstract Understanding the human brain and its function in health and disease represents one of the greatest scientific challenges of our time. In the post-genomic era, an overwhelming accumulation of new data, at all levels of exploration from DNA to human brain imaging, has been acquired. This accumulation of facts has not given rise to a corresponding increase in the understanding of integrated functions in this vast area of research, which involves a large number of fields extending from genetics to psychology. Neuroinformatics (NI) is uniquely placed at the intersection between neuroscience and information technology, and emerges as an area of critical importance for facilitating future conceptual development in neuroscience by creating databases which transcend different organizational levels and allow for the development of computational models from the subcellular to the global brain level. Abstract This paper studied synaptic and dendritic integration under different spatial distributions of synapses on the dendrites of a biophysically detailed layer 5 pyramidal neuron model.
It has been observed that temporally synchronous and spatially clustered synaptic inputs make dendrites perform a highly nonlinear integration. The effect of the clustering degree of the synaptic distribution on neuronal responsiveness was investigated by changing the number of top apical dendrites on which active synapses are allocated. The neuron shows maximum responsiveness to synaptic inputs with an intermediate clustering degree of spatial distribution, indicating complex interactions among dendrites in the presence of nonlinear synaptic and dendritic integration.
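The clustering-degree effect described in this abstract can be reproduced qualitatively with a toy two-layer model in which each dendritic branch applies a saturating sigmoid to its summed synaptic input (all numbers below are invented for illustration, not taken from the paper):

```python
import numpy as np

def somatic_response(n_syn, n_branches, n_active, theta=8.0, k=1.0):
    """Toy two-layer dendrite: each branch passes its summed synaptic drive
    through a sigmoid nonlinearity, and the soma sums the branch outputs.

    n_active sets the clustering degree: 1 = fully clustered on one branch,
    n_branches = fully distributed.  Parameters are illustrative only.
    """
    drive = np.zeros(n_branches)
    drive[:n_active] = n_syn / n_active          # split synapses evenly
    branch_out = 1.0 / (1.0 + np.exp(-(drive - theta) / k))
    return branch_out.sum()

# 60 synapses over 10 branches: fully clustered input saturates one branch,
# fully distributed input leaves every branch below threshold, and an
# intermediate clustering degree recruits several supralinear branches.
responses = {d: somatic_response(60, 10, d) for d in (1, 2, 3, 5, 6, 10)}
best = max(responses, key=responses.get)
```

With these assumed parameters the summed output peaks at an intermediate clustering degree, mirroring the non-monotonic responsiveness the abstract reports.
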
Journal of molecular biology Prescribing patterns for upper respiratory tract infections in general practice in France and in the Netherlands. European journal of public health CAPS-DB: a structural classification of helix-capping motifs. Nucleic acids research Structure-function relationships between spectral-domain OCT and standard achromatic perimetry. Investigative ophthalmology & visual science ECG denoising using angular velocity as a state and an observation in an Extended Kalman Filter framework. Conference proceedings : ... Annual International Conference of the IEEE Engineering in Medicine and Biology Society. IEEE Engineering in Medicine and Biology Society. Conference Metformin suppresses hepatic gluconeogenesis through induction of SIRT1 and GCN5. The Journal of endocrinology Assessing the reliability of sequence similarities detected through hydrophobic cluster analysis. Proteins Telephone-quality pathological speech classification using empirical mode decomposition. Conference proceedings : ... Annual International Conference of the IEEE Engineering in Medicine and Biology Society. IEEE Engineering in Medicine and Biology Society. Conference STRING 8--a global view on proteins and their functional interactions in 630 organisms. Nucleic acids research Hearing thresholds for U.S. Marines: comparison of aviation, combat arms, and other personnel. Aviation, space, and environmental medicine The edge-driven dual-bootstrap iterative closest point algorithm for registration of multimodal fluorescein angiogram sequence. IEEE transactions on medical imaging A hybrid approach to the simultaneous eliminating of power-line interference and associated ringing artifacts in electrocardiograms. Biomedical engineering online Exposing the cancer genome atlas as a SPARQL endpoint. Journal of biomedical informatics Phase stability of auditory steady state responses in newborn infants. Ear and hearing BALSA: Bayesian algorithm for local sequence alignment. 
Nucleic acids research Strand-specific RNA-seq reveals widespread occurrence of novel cis-natural antisense transcripts in rice. BMC genomics Rule mining and classification in a situation assessment application: a belief-theoretic approach for handling data imperfections. IEEE transactions on systems, man, and cybernetics. Part B, Cybernetics : a publication of the IEEE Systems, Man, and Cybernetics Society Structure-function relationships of the variable domains of monoclonal antibodies approved for cancer treatment. Critical reviews in oncology/hematology Analysis of the isolated SecA DEAD motor suggests a mechanism for chemical-mechanical coupling. Journal of molecular biology The STRING database in 2011: functional interaction networks of proteins, globally integrated and scored. Nucleic acids research [Analysis of volatile oils of Ligusticum chuanxiong Hort. from different geographical origins by comprehensive two-dimensional gas chromatography/time-of-flight mass spectrometry]. Se pu = Chinese journal of chromatography / Zhongguo hua xue hui The Rice Proteogenomics Database OryzaPG-DB: Development, Expansion, and New Features. Frontiers in plant science A spatial-temporal system for dynamic cadastral management. Journal of environmental management ProMate: a structure based prediction program to identify the location of protein-protein binding sites. Journal of molecular biology Quantification of blood vessel calibre in retinal images of multi-ethnic school children using a model based approach. Computerized medical imaging and graphics : the official journal of the Computerized Medical Imaging Society Frequency doubling technology perimetry in normal children. American journal of ophthalmology JPEG quality transcoding using neural networks trained with a perceptual error measure. Neural computation SM2PH-db: an interactive system for the integrated analysis of phenotypic consequences of missense mutations in proteins involved in human genetic diseases. 
Human mutation 3D QSAR on a library of heterocyclic diamidine derivatives with antiparasitic activity. Bioorganic & medicinal chemistry The favorable price evolution between bare metal stents and drug eluting stents increases the cost effectiveness of drug eluting stents. International journal of cardiology Deep brain stimulation for dystonia: a meta-analysis. Neuromodulation : journal of the International Neuromodulation Society REPROVIS-DB: a benchmark system for ligand-based virtual screening derived from reproducible prospective applications. Journal of chemical information and modeling Effects of long-term exposure to traffic-related air pollution on respiratory and cardiovascular mortality in the Netherlands: the NLCS-AIR study. Research report (Health Effects Institute) ECG denoising and compression using a modified extended Kalman filter structure. IEEE transactions on bio-medical engineering Adaptive wavelet Wiener filtering of ECG signals. IEEE transactions on bio-medical engineering STRING v9.1: protein-protein interaction networks, with increased coverage and integration. Nucleic acids research The proteome of seed development in the model legume Lotus japonicus. Plant physiology Effect of cigarette smoking on noise-induced hearing loss in workers exposed to occupational noise in China. Noise & health OMIA (Online Mendelian Inheritance in Animals): an enhanced platform and integration into the Entrez search interface at NCBI. Nucleic acids research IUPHAR-DB: new receptors and tools for easy searching and visualization of pharmacological data. Nucleic acids research METALGEN.DB: metabolism linked to the genome of Escherichia coli, a graphics-oriented database. Computer applications in the biosciences : CABIOS Constraints within major histocompatibility complex class I restricted peptides: presentation and consequences for T-cell recognition. 
Proceedings of the National Academy of Sciences of the United States of America Risk factors for sensorineural hearing loss in extremely premature infants. Journal of paediatrics and child health Comparison of methods for assessing the impact of different disturbances and nutrient conditions upon functional characteristics of grassland communities. Annals of botany HIV-1 integrase resistance among antiretroviral treatment naive and experienced patients from Northwestern Poland. BMC infectious diseases Subthalamic nucleus deep brain stimulation: summary and meta-analysis of outcomes. Movement disorders : official journal of the Movement Disorder Society Interspecies extrapolation based on the RepDose database--a probabilistic approach. Toxicology letters A texture-based classification of crackles and squawks using lacunarity. IEEE transactions on bio-medical engineering MMDB: Entrez's 3D-structure database. Nucleic acids research A knowledge framework for computational molecular-disease relationships in cancer. Proceedings / AMIA ... Annual Symposium. AMIA Symposium Time-frequency modelling and discrimination of noise in the electrocardiogram. Physiological measurement Occupational noise exposure assessment using O*NET and its application to a study of hearing loss in the US general population. Occupational and environmental medicine The repertoires of ubiquitinating and deubiquitinating enzymes in eukaryotic genomes. Molecular biology and evolution A single-lead ECG enhancement algorithm using a regularized data-driven filter. IEEE transactions on bio-medical engineering Use of independent component analysis for reducing CPR artefacts in human emergency ECGs. Resuscitation Differences in metabolomic profiles of male db/db and s/s, leptin receptor mutant mice. Physiological genomics Electrostatic potential calculation for biomolecules--creating a database of pre-calculated values reported on a per residue basis for all PDB protein structures. 
Genetics and molecular research : GMR An evaluation of rating scales utilized for deep brain stimulation for dystonia. Journal of neurology CKAAPs DB: a conserved key amino acid positions database. Nucleic acids research Differential association of beta2-microglobulin mutants with MHC class I heavy chains and structural analysis demonstrate allele-specific interactions. Molecular immunology IMGT, the international ImMunoGeneTics database. Nucleic acids research IMGT, the international ImMunoGeneTics database. Nucleic acids research ARTADE2DB: improved statistical inferences for Arabidopsis gene functions and structure predictions by dynamic structure-based dynamic expression (DSDE) analyses. Plant & cell physiology T cell receptor/peptide/MHC molecular characterization and standardized pMHC contact sites in IMGT/3Dstructure-DB. In silico biology Do patient's get angrier following STN, GPi, and thalamic deep brain stimulation. NeuroImage A neural network investigation of the crucial facets of urban sustainability. Substance use & misuse DEPPDB - DNA electrostatic potential properties database. Electrostatic properties of genome DNA elements. Journal of bioinformatics and computational biology The detection of various opiates and benzodiazepines by comprehensive two-dimensional gas chromatography/time-of-flight mass spectrometry. Rapid communications in mass spectrometry : RCM Hearing status among aircraft maintenance personnel in a commercial airline company. Noise & health Finding biomarkers in non-model species: literature mining of transcription factors involved in bovine embryo development. BioData mining Structural and functional relationships in glaucoma using standard automated perimetry and the Humphrey Matrix. Korean journal of ophthalmology : KJO The creation of a database of odorous compounds focused on molecular rigidity and analysis of the molecular features of the compounds in the database. 
Neuronvisio: A Graphical User Interface with 3D Capabilities for NEURON. Frontiers in neuroinformatics. The NEURON simulation environment is a commonly used tool for performing electrical simulations of neurons and neuronal networks. The NEURON User Interface, based on the now-discontinued InterViews library, provides only limited facilities for exploring models and plotting their simulation results. Other limitations include the inability to generate a three-dimensional visualization and the lack of a standard means of saving simulation results or storing the model geometry with the results.
Neuronvisio (http://neuronvisio.org) aims to address these deficiencies through a set of well-designed Python APIs and an improved UI that lets users explore and interact with the model. Neuronvisio also facilitates access to previously published models, allowing users to browse, download, and locally run NEURON models stored in ModelDB. It uses the matplotlib library to plot simulation results and the HDF standard format to store them. Neuronvisio can be viewed as an extension of NEURON that supports typical user workflows such as model browsing, selection, download, compilation, and simulation. The 3D viewer simplifies the exploration of complex model structures, while matplotlib permits the plotting of high-quality graphs. The newly introduced ability to save numerical results lets users perform additional analysis on their previous simulations. The what and where of adding channel noise to the Hodgkin-Huxley equations. PLoS computational biology. Conductance-based equations for electrically active cells form one of the most widely studied mathematical frameworks in computational biology. This framework, as expressed through a set of differential equations by Hodgkin and Huxley, synthesizes the impact of ionic currents on a cell's voltage--and the highly nonlinear impact of that voltage back on the currents themselves--into the rapid push and pull of the action potential. Later studies confirmed that these cellular dynamics are orchestrated by individual ion channels, whose conformational changes regulate the conductance of each ionic current.
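The channel-level stochasticity just described can be illustrated with a toy Markov-chain simulation (my own sketch, not the paper's MATLAB code): N independent two-state channels with assumed opening rate alpha and closing rate beta. As N grows, fluctuations in the open fraction shrink toward the deterministic steady state alpha/(alpha+beta).

```python
import random
import statistics

def open_fraction_trace(n_channels, alpha=0.5, beta=0.5, dt=0.01,
                        steps=2000, seed=0):
    """Simulate n_channels independent two-state (closed/open) channels.

    Per time step dt, each closed channel opens with probability alpha*dt
    and each open channel closes with probability beta*dt; returns the
    open-fraction time series.
    """
    rng = random.Random(seed)
    n_open = 0
    trace = []
    for _ in range(steps):
        opened = sum(rng.random() < alpha * dt for _ in range(n_channels - n_open))
        closed = sum(rng.random() < beta * dt for _ in range(n_open))
        n_open += opened - closed
        trace.append(n_open / n_channels)
    return trace

# Discard a burn-in period, then compare fluctuation size across N.
small = open_fraction_trace(10)[500:]
large = open_fraction_trace(1000)[500:]
```

At steady state the binomial estimate of the open-fraction variance is p(1-p)/N with p = alpha/(alpha+beta), so the N = 1000 ensemble fluctuates far less than the N = 10 one while both share the same mean.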
Thus, kinetic equations familiar from physical chemistry are the natural setting for describing conductances; for small-to-moderate numbers of channels, these will predict fluctuations in conductances and stochasticity in the resulting action potentials. At first glance, the kinetic equations provide a far more complex (and higher-dimensional) description than the original Hodgkin-Huxley equations or their counterparts. This has prompted more than a decade of efforts to capture channel fluctuations with noise terms added to the equations of Hodgkin-Huxley type. Many of these approaches, while intuitively appealing, produce quantitative errors when compared to kinetic equations; others, as only very recently demonstrated, are both accurate and relatively simple. We review what works, what doesn't, and why, seeking to build a bridge to well-established results for the deterministic equations of Hodgkin-Huxley type as well as to more modern models of ion channel dynamics. As such, we hope that this review will speed emerging studies of how channel noise modulates electrophysiological dynamics and function. We supply user-friendly MATLAB simulation code of these stochastic versions of the Hodgkin-Huxley equations on the ModelDB website (accession number 138950) and http://www.amath.washington.edu/~etsb/tutorials.html.
Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means of facilitating interoperability among simulators in computational neuroscience. Abstract Background: Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead.
On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results: This paper describes a database and related software tools that allow a given structure-based method to be tested on models of a protein representing different levels of accuracy. Comparing the results of a computational experiment on the experimental structure with those on a set of its decoy models allows developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions: The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements: Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php. Operating system(s): platform independent. Programming language: Perl/BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI BLAST package, DSSP, Speedfill (Surfnet) and PSAIA. License: free. Any restrictions to use by non-academics: no. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist.
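The determinism of digital simulation can be made concrete with a toy example (a seeded random walk standing in for any stochastic model; nothing here is from the chapter): with a fixed RNG seed the run is bit-identical, and a hash of parameters plus output gives a cheap replication check.

```python
import hashlib
import json
import random

def toy_simulation(seed, steps=100):
    """A stand-in stochastic 'model': a seeded Gaussian random walk."""
    rng = random.Random(seed)
    x, trace = 0.0, []
    for _ in range(steps):
        x += rng.gauss(0.0, 1.0)
        trace.append(x)
    return trace

def result_digest(seed, trace):
    """Hash of parameters plus results; equal digests mean exact replication."""
    payload = json.dumps({"seed": seed, "trace": trace}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()
```

Within one interpreter this replicates exactly; the harder problem the abstract points to is that different simulators, numerical schemes, or floating-point environments generally will not match bit-for-bit, which is why provenance (recording seeds, versions, and setups) matters more than byte equality.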
However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have encountered problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information's (NCBI's) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database.
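A minimal sketch of the brokering pattern just described: accumulate resource links keyed by NCBI identifier, then render a per-identifier page. The class, field names, and example values are hypothetical; this is not the actual NLB code or the Entrez LinkOut schema.

```python
from collections import defaultdict

class LinkBroker:
    """Hypothetical NLB-like broker: resource links keyed by NCBI identifier."""

    def __init__(self):
        self._links = defaultdict(list)   # ncbi_id -> [(resource, url), ...]

    def add_link(self, ncbi_id, resource, url):
        self._links[ncbi_id].append((resource, url))

    def links_for(self, ncbi_id):
        return list(self._links[ncbi_id])

    def render_page(self, ncbi_id):
        """Plain-text stand-in for a dynamically generated per-identifier page."""
        lines = ["Related neuroscience information for NCBI id %s:" % ncbi_id]
        lines += ["- %s: %s" % (res, url) for res, url in self._links[ncbi_id]]
        return "\n".join(lines)

broker = LinkBroker()
broker.add_link("12345", "ModelDB", "https://example.org/modeldb/1")    # made-up id/URLs
broker.add_link("12345", "NeuronDB", "https://example.org/neurondb/1")
```

The same keyed store supports both delivery routes in the abstract: pushing link batches to a central service, or serving a page per identifier on demand.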
This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages, and the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the "NeuroIT Interoperability of Simulators" workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies.
A single cell includes organelles and intracellular solutions, and is separated from the extracellular fluid surrounding it by its cell membrane (plasma membrane), which maintains differences in the concentrations of ions and molecules, including enzymes. Differences in ionic charge and concentration give rise, respectively, to electrical and chemical potentials, which drive the transport of materials across the membrane. Here we look at the core mathematical models of single-cell dynamics, as well as the basics of numerical simulation. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program in which users can simulate and/or create models with graphical representations corresponding to ISML concepts such as modules and edges. ISIDE also has a command-line interface, based on Python, a powerful scripting language, for manipulating large-scale models. ISIDE exports ISML models to C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e.
database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp. Abstract Science requires that results be reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup needed to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases have started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method, and make available computer scripts, which automate the process of citation entry. We use an open citation project Perl module (PARSER) for parsing citation data, which is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron's ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations.
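One route this abstract takes to taming such dimensionality, time-domain balanced truncation, can be sketched on a toy stable linear system (a passive cable-like chain of my own devising, not the paper's cell models; assumes NumPy and SciPy). The pointwise error of the reduced transfer function is bounded by twice the sum of the discarded Hankel singular values.

```python
import numpy as np
from scipy.linalg import cholesky, solve_continuous_lyapunov, svd

def balanced_truncation(A, B, C, r):
    """Square-root balanced truncation of a stable LTI system to order r."""
    Wc = solve_continuous_lyapunov(A, -B @ B.T)    # controllability Gramian
    Wo = solve_continuous_lyapunov(A.T, -C.T @ C)  # observability Gramian
    Lc = cholesky(Wc, lower=True)
    Lo = cholesky(Wo, lower=True)
    U, s, Vt = svd(Lo.T @ Lc)                      # s: Hankel singular values
    scale = np.diag(s[:r] ** -0.5)
    T = Lc @ Vt[:r].T @ scale                      # n x r projection
    Tinv = scale @ U[:, :r].T @ Lo.T               # r x n projection
    return Tinv @ A @ T, Tinv @ B, C @ T, s

def transfer(A, B, C, w):
    """Frequency response C (jwI - A)^{-1} B of a SISO system at frequency w."""
    n = A.shape[0]
    return (C @ np.linalg.solve(1j * w * np.eye(n) - A, B)).item()

# Toy 6-compartment passive chain: input current at one end, voltage at the other.
n = 6
A = -2.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
B = np.zeros((n, 1)); B[0, 0] = 1.0
C = np.zeros((1, n)); C[0, -1] = 1.0
Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=3)
```

For this symmetric chain the order-3 model reproduces the end-to-end response closely; the discarded Hankel singular values quantify exactly how much can be lost.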
In order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the spatial scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (balanced truncation) and frequency-domain ($\mathcal{H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. Biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. Making full use of the available data resources, given the high number of heterogeneous query methods and front ends, requires considerable bioinformatic skill. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges.
While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or not) existing empirical results and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations.
We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib ( The Handbook of Brain Theory and Neural Networks , pp. 741–745, 2003); Arbib and Grethe ( Computing the Brain: A Guide to Neuroinformatics , 2001); Ascoli ( Computational Neuroanatomy: Principles and Methods , 2002); Bower and Bolouri ( Computational Modeling of Genetic and Biochemical Networks , 2001); Hines et al. ( J. Comput. Neurosci. 17 , 7–11, 2004); Shepherd et al. ( Trends Neurosci. 21 , 460–468, 1998); Sivakumaran et al. ( Bioinformatics 19 , 408–415, 2003); Smolen et al. ( Neuron 26 , 567–580, 2000); Vadigepalli et al. ( OMICS 7 , 235–252, 2003)]. 
Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv ( J. Neurophysiol. 71 , 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu . Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and build on them for further research. Its use can aid the field of computational neuroscience to enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties.
Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA A reversal potential (E GABA ) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide a high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal to elicit/detect the electrical excitation at the single-cell level and to observe the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have been typically correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of the reconstruction files is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology.
Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic type (pyramidal cell dendrites and olivar axonal projection terminals), and labeling method (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of this study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions coming to the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductivities of their dendritic membrane. This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand.
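The reconstruction abstracts above represent traced neurons as binary trees of branching cylinders. Archives of such reconstructions commonly use the plain-text SWC format (an assumption here, since the abstracts do not name a format); a minimal sketch of reading such a file and summing total cable length:

```python
import math

def swc_total_length(lines):
    """Total cable length of an SWC-style morphology: sum of the distances
    from each sample point to its parent (parent id -1 marks the root).
    Assumes, per SWC convention, that parents precede their children."""
    pts = {}
    total = 0.0
    for line in lines:
        line = line.strip()
        if not line or line.startswith('#'):
            continue
        # SWC columns: id, type, x, y, z, radius, parent_id
        nid, _t, x, y, z, _r, parent = line.split()
        p = int(parent)
        pts[int(nid)] = (float(x), float(y), float(z))
        if p != -1:
            total += math.dist(pts[int(nid)], pts[p])
    return total

# Tiny hand-made morphology: a soma point and two 10-um segments.
swc = """# toy neuron
1 1 0 0 0 5 -1
2 3 10 0 0 1 1
3 3 10 10 0 1 2
"""
length = swc_total_length(swc.splitlines())
```

From such parsed trees, the morphometric measures compared between Neurolucida and Neuron_Morpho (path lengths, branch counts, asymmetries) follow by straightforward traversal.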
This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by their spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in Biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling.
Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS-US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network.
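The dopamine-modeling abstract above builds on the temporal-difference (TD) algorithm. Stripped of the LSTM network, the core TD(0) update can be sketched on a tabular chain of time steps between CS and US; the chain length, learning rate, and discount below are illustrative choices, not the published model's parameters:

```python
import numpy as np

def train_td(n_steps=10, gamma=0.98, alpha=0.1, episodes=500):
    """Tabular TD(0) over a chain of time steps: the CS marks step 0 and
    the reward (US) arrives at the final step of each episode."""
    V = np.zeros(n_steps + 1)                  # V[n_steps]: terminal state
    for _ in range(episodes):
        for t in range(n_steps):
            r = 1.0 if t == n_steps - 1 else 0.0   # US on the last step
            delta = r + gamma * V[t + 1] - V[t]    # TD prediction error
            V[t] += alpha * delta
    return V

V = train_td()
```

With training, the value estimate converges to `gamma ** (n_steps - 1 - t)`, so the prediction-error signal migrates from the time of reward back toward the CS, which is the dopamine-like behavior the abstract refers to.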
The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. On mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborization possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses uniformly distributed along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and function of effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization in general and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron upon different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals decreasing with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials.
Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably more limited than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of output discharge patterns of different complexities, including stochastic ones. A relatively short length of the apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites and a reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in the processes of formation of the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the most well-known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated.
The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using nonhomogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent from rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of the involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in Computational Neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org .
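The phase-precession abstract above combines integrate-and-fire dynamics, conductance synapses, and Poisson input. A single-cell sketch of those ingredients, with hypothetical parameter values (not those of the published model):

```python
import numpy as np

rng = np.random.default_rng(0)

def run_lif(rate_hz, t_stop=1.0, dt=1e-4):
    """Leaky integrate-and-fire cell with one conductance-based excitatory
    synapse driven by a Poisson spike train of the given rate (Hz).
    All parameters are illustrative, in SI units."""
    C, gL, EL, Ee = 200e-12, 10e-9, -70e-3, 0.0
    Vth, Vreset = -50e-3, -65e-3
    tau_syn, w = 5e-3, 1.5e-9            # synaptic decay (s) and weight (S)
    v, g, spikes = EL, 0.0, 0
    for _ in range(int(t_stop / dt)):
        if rng.random() < rate_hz * dt:  # Poisson input spike this step
            g += w
        g -= g * dt / tau_syn            # exponential synaptic decay
        v += dt * (-gL * (v - EL) - g * (v - Ee)) / C
        if v >= Vth:                     # threshold crossing: emit and reset
            spikes += 1
            v = Vreset
    return spikes

quiet = run_lif(0.0)      # no input drive
driven = run_lif(800.0)   # strong Poisson drive
```

A nonhomogeneous Poisson input, as in the model, would simply make `rate_hz` a function of time (theta- and space-modulated) evaluated inside the loop.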
In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed data on morphologies when generating those cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators such as the NEURON simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new and available data can be included into NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation.
New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising developments of the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data requires both standard data formats and public repositories. Standardization and a public storage system are also important for modeling, since they make it possible to share models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks like numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor.
It is often feasible to divide a 3D-reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources, and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum.
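The iCSD abstract above is benchmarked against the traditional CSD approach, which estimates current source density as the negative second spatial derivative of the recorded potential along the electrode axis. That baseline is easy to sketch; the contact spacing and tissue conductivity below are illustrative values:

```python
import numpy as np

def csd_second_difference(phi, h, sigma):
    """Traditional CSD estimate on a linear electrode array:
    CSD_j ~ -sigma * (phi[j-1] - 2*phi[j] + phi[j+1]) / h**2.
    Returns estimates for interior contacts only (boundaries are lost,
    one of the limitations iCSD addresses)."""
    phi = np.asarray(phi, dtype=float)
    return -sigma * (phi[:-2] - 2.0 * phi[1:-1] + phi[2:]) / h ** 2

# A quadratic potential profile has constant curvature, hence constant CSD.
j = np.arange(8.0)                 # 8 contacts
phi = 1e-3 * j ** 2                # potential (V) at each contact
csd = csd_second_difference(phi, h=100e-6, sigma=0.3)
```

Note the two dropped boundary contacts: the inverse (forward-model) formulation instead builds a matrix mapping assumed CSD distributions to potentials and inverts it, which is what allows iCSD to estimate CSD at the grid edges.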
The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. However, the range of current strength needed for a depolarization block could be easily reached with a random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block. We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, impeding conventional analysis methods from keeping pace.
Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable. PANDORA is free to use and modify because it is open source ( http://software.incf.org/software/pandora/home ). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then we examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop. Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics.
First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test the sufficiency of multiple, independent spike initiation zones in the C-interneuron to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made into understanding cerebellar cortical processing, but neural coding at the output stage of the cerebellum, in the deep cerebellar nuclei (DCN), remains poorly understood. The DCN are unlikely to be just a relay nucleus because Purkinje cell inhibition has to be turned into an excitatory output signal, and DCN neurons exhibit complex intrinsic properties. In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals.
Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. We obtained a good match of our simulations with DCN current clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow deinactivation of rebound currents. Fast rebound bursts due to the T-type calcium current and slow rebounds due to the persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by the HCN current. Our findings suggest that active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior.
This open source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multi-compartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation of channel gating, mechanisms for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform. Abstract We have developed a simulation tool within the NEURON simulator to assist in the organization, verification, and analysis of simulations. This tool, named the Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single-cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB. Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems.
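The NQS workflow described above, storing simulation parameters in relational tables and retrieving them with SELECT-style queries, can be sketched with Python's built-in sqlite3. The table layout, column names, and values below are invented for illustration (NQS itself runs inside NEURON).

```python
import sqlite3

# In-memory table of simulation runs: parameters alongside a summary output.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE sims (
    run_id INTEGER PRIMARY KEY,
    gna REAL, gk REAL,          -- varied channel conductances
    mean_rate REAL              -- summary measure from each run
)""")
runs = [(1, 120.0, 36.0, 14.2),
        (2, 120.0, 20.0, 31.5),
        (3, 100.0, 36.0, 11.0)]
db.executemany("INSERT INTO sims VALUES (?, ?, ?, ?)", runs)

# SELECT-style query: which parameter sets produced firing above 12 Hz?
fast = db.execute(
    "SELECT run_id, gna, gk FROM sims WHERE mean_rate > 12 ORDER BY mean_rate"
).fetchall()
```

The same pattern of relating outputs back to parameters scales naturally from single-cell to network simulations, which is the use case NQS targets.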
This is similar to how single cells function as systems of protein networks, where, for example, the ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from how it behaves alone. The dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. These interactions arise through different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts to produce heartbeats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics representing neural information. In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of the limbs can be performed. This chapter illustrates several mathematical techniques through examples from the modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data. Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation.
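Reconstructions from NeuroMorpho.Org are distributed in the plain-text SWC format, one node per line with columns id, type, x, y, z, radius, and parent id (-1 marks the root). A minimal sketch of reading such a file and computing total cable length; the three-node morphology is invented for illustration.

```python
import math

# A tiny morphology in SWC format (comments start with '#'):
swc_text = """\
# toy neuron
1 1 0 0 0 5 -1
2 3 10 0 0 1 1
3 3 10 10 0 1 2
"""

def total_cable_length(swc):
    """Sum the distances from each node to its parent node."""
    nodes = {}
    length = 0.0
    for line in swc.splitlines():
        if not line.strip() or line.startswith("#"):
            continue
        node_id, _type, x, y, z, _radius, parent = line.split()
        point = (float(x), float(y), float(z))
        nodes[int(node_id)] = point
        if int(parent) != -1:            # root has no parent segment
            length += math.dist(point, nodes[int(parent)])
    return length

# two 10-unit segments: soma -> node 2 -> node 3
total = total_cable_length(swc_text)
```

This relies on the SWC convention that a parent node always precedes its children in the file, which valid reconstructions obey.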
For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice that of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits as expected in mature animals. Predictions were made for passive-cell current-clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron, and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting becomes unnecessary. Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron.
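The core idea above, treating the subthreshold membrane potential as presynaptic population activity filtered with a fixed kernel, can be sketched as follows. This forward model only generates such a trace; the actual method works backwards from its cumulants. Independent Poisson inputs and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, tau = 0.001, 0.01            # 1 ms bins, 10 ms membrane kernel
n_neurons, n_bins = 1000, 2000

# Presynaptic population spike count per bin (independent Poisson here;
# correlated spiking would add jointly timed events across neurons).
pop_count = rng.poisson(lam=n_neurons * 5.0 * dt, size=n_bins)

# Subthreshold potential ~ population activity convolved with a fixed
# exponential kernel, as for a leaky integrator neuron.
t = np.arange(0, 5 * tau, dt)
kernel = np.exp(-t / tau)
vm = np.convolve(pop_count, kernel)[:n_bins] * dt
```

Because the kernel is fixed and known, moments of `vm` map back onto cumulants of the population spike count, which is what makes inference of correlation order possible without spike sorting.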
Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials. Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet the precise input-output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites. We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature.
To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns and the corresponding firing frequency input-output function was logarithmic. The regular/irregular nature of the input synaptic intervals was only reflected in the regularity of output inter-burst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input-output relationships. Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input-output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats.
In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database http://www.ebi.ac.uk/biomodels/ is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems, and to study the clustering of models based upon their annotations. Model deposition to the database is today recommended by several publishers of scientific journals. The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive?
Schema theory is a computational framework that allows the simulation of perceptuomotor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We first present its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performance of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world-knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performance of agrammatic aphasics measured using sentence-picture matching tasks. This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Abstract Background The "inverse" problem is related to the determination of unknown causes on the basis of the observation of their effects. This is the opposite of the corresponding "direct" problem, which relates to the prediction of the effects generated by a complete description of some agencies.
The solution of an inverse problem entails the construction of a mathematical model and starts from a set of experimental data. In this respect, inverse problems are often ill-conditioned, as the experimental data available are often insufficient to unambiguously solve the mathematical model. Several approaches to solving inverse problems are possible, both computational and experimental, some of which are mentioned in this article. In this work, we describe in detail an attempt to solve an inverse problem which arose in the study of an intracellular signaling pathway. Results Using a genetic algorithm to find the suboptimal solution to the optimization problem, we have estimated a set of unknown parameters describing a kinetic model of a signaling pathway in the neuronal cell. The model is composed of mass-action ordinary differential equations, where the kinetic parameters describe protein-protein interactions, protein synthesis, and degradation. The algorithm has been implemented on a parallel platform. Several potential solutions of the problem have been computed, each solution being a set of model parameters. A subset of parameters has been selected on the basis of their small coefficient of variation across the ensemble of solutions. Conclusion Despite the lack of sufficiently reliable and homogeneous experimental data, the genetic algorithm approach has allowed us to estimate the approximate values of a number of model parameters in a kinetic model of a signaling pathway; these parameters have been assessed to be relevant for the reproduction of the available experimental data. Abstract Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown.
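The genetic-algorithm parameter search described above can be illustrated in miniature: a one-parameter exponential-decay model stands in for the mass-action ODE system, and a population of candidate parameter values is evolved by selection and mutation against synthetic "measurements". The population size, mutation scale, and true parameter value are all invented for illustration.

```python
import math
import random

random.seed(1)

# Synthetic observations from x(t) = exp(-k t) with true k = 0.5,
# standing in for experimental time-course data.
times = [0.0, 1.0, 2.0, 4.0, 8.0]
data = [math.exp(-0.5 * t) for t in times]

def cost(k):
    """Sum of squared errors between model prediction and data."""
    return sum((math.exp(-k * t) - d) ** 2 for t, d in zip(times, data))

# Minimal genetic algorithm: keep the fitter half, mutate it to refill.
pop = [random.uniform(0.0, 2.0) for _ in range(20)]
for _ in range(60):
    pop.sort(key=cost)
    survivors = pop[:10]                 # elitist selection
    pop = survivors + [k + random.gauss(0.0, 0.05) for k in survivors]

best = min(pop, key=cost)                # should approach k = 0.5
```

Running many independent searches and keeping only parameters with a small coefficient of variation across the resulting ensemble is the selection criterion the study describes.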
Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms. First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axo-dendritic feedback on pyramidal cells is only effective in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase, the theta/gamma ratio. Moreover, electric fields differentially affect average theta frequency depending on specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. Abstract The brain is extraordinarily complex, containing 10^11 neurons linked by 10^14 connections. We can improve our understanding of individual neurons and neuronal networks by describing their behavior in mathematical and computational models.
This chapter provides an introduction to neural modeling, laying the foundation for several basic models and surveying key topics. After some discussion of the motivations of modelers and the uses of neural models, we explore the properties of electrically excitable membranes. We describe in some detail the Hodgkin–Huxley model, the first neural model to describe biophysically the behavior of biological membranes. We explore how this model can be extended to describe a variety of excitable membrane behaviors, including axonal propagation, dendritic processing, and synaptic communication. This chapter also covers mathematical models that replicate basic neural behaviors through more abstract mechanisms. We briefly explore efforts to extend single-neuron models to the network level and provide several examples of insights gained through this process. Finally, we list common resources, including modeling environments and repositories, that provide the guidance and parameter sets necessary to begin building neural models. Abstract We have developed a program, NeuroText, to populate the neuroscience databases in SenseLab (http://senselab.med.yale.edu/senselab) by mining the natural language text of neuroscience articles. NeuroText uses a two-step approach to identify relevant articles. The first step (preprocessing), aimed at 100% sensitivity, identifies abstracts containing database keywords. In the second step, potentially relevant abstracts identified in the first step are processed for specificity dictated by database architecture, and by neuroscience, lexical, and semantic contexts. NeuroText results were presented to experts for validation using a dynamically generated interface that also allows expert-validated articles to be automatically deposited into the databases. Of the test set of 912 articles, 735 were rejected at the preprocessing step. For the remaining articles, the accuracy of predicting database-relevant articles was 85%.
Twenty-two articles were erroneously identified. NeuroText deferred decisions on 29 articles to the experts. A comparison of NeuroText results with the experts’ analyses revealed that the program failed to correctly identify articles’ relevance due to concepts that did not yet exist in the knowledge base or due to vaguely presented information in the abstracts. NeuroText uses two “evolution” techniques (supervised and unsupervised) that play an important role in the continual improvement of the retrieval results. Software that uses the NeuroText approach can facilitate the creation of curated, special-interest bibliography databases. Abstract Dendrites play an important role in neuronal function and connectivity. This chapter introduces the first section of the book, focusing on the morphological features of dendritic tree structures and the role of dendritic trees in the circuit. We provide an overview of quantitative procedures for data collection, analysis, and modeling of dendrite shape. Our main focus lies on the description of morphological complexity and how one can use this description to unravel neuronal function in dendritic trees and neural circuits. Abstract The chapter is organised in two parts: in the first part, the focus is on a combined power spectral and nonlinear behavioural analysis of a neural mass model of the thalamocortical circuitry. The objective is to study the effectiveness of such “multimodal” analytical techniques in model-based studies investigating the neural correlates of abnormal brain oscillations in Alzheimer’s disease (AD). The power spectral analysis presented here is a study of the “slowing” (decreasing dominant frequency of oscillation) within the alpha frequency band (8–13 Hz), a hallmark of electroencephalogram (EEG) dynamics in AD. Analysis of the nonlinear dynamical behaviour focuses on the bifurcating property of the model.
The results show that the alpha rhythmic content is maximal in close proximity to the bifurcation point—an observation made possible by the “multimodal” approach adopted herein. Furthermore, a slowing in alpha rhythm is observed with increasing inhibitory connectivity—a consistent feature of our research into neuropathological oscillations associated with AD. In the second part, we present a power spectral analysis of a model that implements multiple feedforward and feedback connectivities in the thalamo-cortico-thalamic circuitry, and is thus more advanced in terms of biological plausibility. This study looks at the effects of synaptic connectivity variation on the power spectra within the delta (1–3 Hz), theta (4–7 Hz), alpha (8–13 Hz) and beta (14–30 Hz) bands. An overall slowing of the EEG with decreasing synaptic connectivity is observed, indicated by a decrease of power within the alpha and beta bands and an increase of power within the theta and delta bands. Thus, the model behaviour conforms to longitudinal studies in AD indicating an overall slowing of the EEG. Abstract Neuronal processes grow under a variety of constraints, both immediate and evolutionary. Their pattern of growth provides insight into their function. This chapter begins by reviewing morphological metrics used in analyses and computational models. Molecular mechanisms underlying growth and plasticity are then discussed, followed by several types of modeling approaches. Computer simulation of morphology can be used to describe and reproduce the statistics of neuronal types or to evaluate growth and functional hypotheses. For instance, models in which branching is probabilistically determined by diameter produce realistic virtual dendrites of most neuronal types, though more complicated statistical models are required for other types. Virtual dendrites grown under environmental and/or functional constraints are also discussed, offering a broad perspective on dendritic morphology.
Abstract Chopper neurons in the cochlear nucleus are characterized by intrinsic oscillations with short average interspike intervals (ISIs) and relative level independence of their response (Pfeiffer, Exp Brain Res 1:220–235, 1966; Blackburn and Sachs, J Neurophysiol 62:1303–1329, 1989), properties which are unattained by models of single chopper neurons (e.g., Rothman and Manis, J Neurophysiol 89:3070–3082, 2003a). In order to achieve short ISIs, we optimized the time constants of the Rothman and Manis single-neuron model with genetic algorithms. Some parameters in the optimization, such as the temperature and the capacitance of the cell, turned out to be crucial for the required acceleration of the response. In order to achieve the relative level independence, we simulated an interconnected network consisting of Rothman and Manis neurons. The results indicate that by stabilization of intrinsic oscillations, it is possible to simulate the physiologically observed level independence of ISIs. As previously reviewed and demonstrated (Bahmer and Langner, Biol Cybern 95:371–379, 2006a), chopper neurons show a preference for ISIs which are multiples of 0.4 ms. It was also demonstrated that a network consisting of two optimized Rothman and Manis neurons which activate each other with synaptic delays of 0.4 ms shows a preference for ISIs of 0.8 ms. Oscillations with various multiples of 0.4 ms as ISIs may be derived from neurons in a more complex network that is activated by simultaneous input from an onset neuron and several auditory nerve fibers. Abstract Recently, a class of two-dimensional integrate-and-fire models has been used to faithfully model spiking neurons. This class includes the Izhikevich model, the adaptive exponential integrate-and-fire model, and the quartic integrate-and-fire model. The bifurcation types for the individual neurons have been thoroughly analyzed by Touboul (SIAM J Appl Math 68(4):1045–1079, 2008).
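One member of this class, the Izhikevich model, reduces to two coupled equations plus a reset rule. A minimal Euler-integrated sketch follows; the regular-spiking parameters (a, b, c, d) and the coarse 0.5 ms step follow common usage of the model, while the drive current is an arbitrary illustrative choice.

```python
def izhikevich(I=10.0, a=0.02, b=0.2, c=-65.0, d=8.0, t_end=1000.0, dt=0.5):
    """Izhikevich two-variable spiking model (v in mV, t in ms):
    v' = 0.04 v^2 + 5 v + 140 - u + I,  u' = a (b v - u),
    with reset v -> c, u -> u + d whenever v reaches the 30 mV cutoff."""
    v, u = -65.0, b * -65.0
    spikes = []
    t = 0.0
    while t < t_end:
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                # spike cutoff and reset
            spikes.append(t)
            v, u = c, u + d
        t += dt
    return spikes

spikes = izhikevich()                # tonic spiking at this drive level
```

At this drive the quadratic nullcline and the linear recovery nullcline have no intersection, so the model cannot settle into a rest state and fires repetitively, which is the kind of single-cell regime whose network-level bifurcations the paper then studies.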
However, when the models are coupled together to form networks, the networks can display bifurcations that an uncoupled oscillator cannot. For example, the networks can transition from firing at a constant rate to burst firing. This paper introduces a technique to reduce a full network of this class of neurons to a mean field model, in the form of a system of switching ordinary differential equations. The reduction uses population density methods and a quasi-steady-state approximation to arrive at the mean field system. Reduced models are derived for networks with different topologies and different model neurons with biologically derived parameters. The mean field equations are able to qualitatively and quantitatively describe the bifurcations that the full networks display. Extensions and higher-order approximations are discussed. Conclusions Our proposed database schema for managing heterogeneous data is a significant departure from conventional approaches. It is suitable only when the following conditions hold: • The number of classes of entity is large, while the number of actual instances in most classes is expected to be very modest. • The number (and nature) of the axes describing an arbitrary fact (as an N-ary association) varies greatly. We believe that nervous system data is an appropriate problem domain to test such an approach. Abstract Stereotactic human brain atlases, whether in print or electronic form, are useful not only in functional neurosurgery, but also in neuroradiology, human brain mapping, and neuroscience education. The existing atlases represent structures on 2D plates taken at variable, often large intervals, which limits their applications. To overcome this problem, we propose a hybrid interpolation approach to build high-resolution brain atlases from the existing ones. In this approach, all section regions of each object are grouped into two types of components: simple and complex.
A NURBS-based method is designed for interpolation of the simple components, and a distance-map-based method for the complex components. Once all individual objects in the atlas are interpolated, the results are combined hierarchically in a bottom-up manner to produce the interpolation of the entire atlas. In the procedure, different knowledge-based and heuristic strategies are used to preserve various topological relationships. The proposed approach has been validated quantitatively and used for interpolation of two stereotactic brain atlases: the Talairach-Tournoux atlas and the Schaltenbrand-Wahren atlas. The interpolations produced are of high resolution and feature high accuracy, 3D consistency, smooth surfaces, and preserved topology. They potentially open new applications for electronic stereotactic brain atlases, such as atlas reformatting, accurate 3D display, and 3D nonlinear warping against normal and pathological scans. The proposed approach is also potentially useful in other applications that require interpolation and 3D modeling from sparse data and/or data with variable inter-section intervals. An example of 3D modeling of an infarct from MR diffusion images is presented. Abstract Quantitative neuroanatomical data are important for the study of many areas of neuroscience, and the complexity of problems associated with neuronal structure requires that research from multiple groups across many disciplines be combined. However, existing neuron-tracing systems, simulation environments, and tools for the visualization and analysis of neuronal morphology data use a variety of data formats, making it difficult to exchange data in a readily usable way. The NeuroML project was initiated to address these issues, and here we describe an extensible markup language standard, MorphML, which defines a common data format for neuronal morphology data and associated metadata to facilitate data and model exchange, database creation, model publication, and data archiving.
We describe the elements of the standard in detail and outline the mappings between this format and those used by a number of popular applications for the reconstruction, simulation, and visualization of neuronal morphology. Abstract A major part of biology has become a class of physical and mathematical sciences. We have started to feel, though still a little suspiciously, that it will become possible to predict biological events that will happen in the future of one’s life, and to control some of them if so desired, based upon an understanding of the genomic information of individuals and the physical and chemical principles governing the physiological functions of living organisms at multiple scales and levels, from molecules to cells and organs. Abstract A half-center oscillator (HCO) is a common circuit building block of central pattern generator networks that produce rhythmic motor patterns in animals. Here we constructed an efficient relational database table with the resulting characteristics of the Hill et al. (J Comput Neurosci 10:281–302, 2001) simple conductance-based HCO model. The model consists of two reciprocally inhibitory neurons and replicates the electrical activity of the oscillator interneurons of the leech heartbeat central pattern generator under a variety of experimental conditions. Our long-range goal is to understand how this basic circuit building block produces functional activity under a variety of parameter regimes and how different parameter regimes influence stability and modulatability. By using the latest developments in computer technology, we simulated and stored large amounts of data (on the order of terabytes). We systematically explored the parameter space of the HCO and corresponding isolated-neuron models using a brute-force approach. We varied a set of selected parameters (maximal conductances of intrinsic and synaptic currents) in all combinations, resulting in about 10 million simulations.
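The all-combinations, brute-force sweep described above can be sketched with itertools.product, which enumerates the Cartesian product of the chosen parameter levels. The conductance names and level values below are placeholders for illustration, not the actual parameter sets of the study.

```python
import itertools

# A few maximal-conductance parameters, each sampled at a few levels
# (placeholder names and values):
levels = {
    "g_h":    [0.0, 2.0, 4.0],
    "g_CaS":  [0.0, 1.0, 2.0, 3.0],
    "g_Syn":  [0.0, 30.0, 60.0],
}

# Every combination of levels, one dict of parameters per simulation run.
grid = [dict(zip(levels, combo))
        for combo in itertools.product(*levels.values())]

n_runs = len(grid)    # 3 * 4 * 3 combinations
```

Because the run count is the product of the level counts, a handful of parameters at a handful of levels each easily reaches millions of combinations, which is why the study pairs the sweep with a database for storage and querying.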
We classified these HCO and isolated-neuron model simulations by their activity characteristics into identifiable groups and quantified their prevalence. By querying the database, we compared the activity characteristics of the identified groups of our simulated HCO models with those of our simulated isolated-neuron models and found that regularly bursting neurons compose only a small minority of functional HCO models; the vast majority was composed of spiking neurons. Abstract This paper describes how an emerging standard neural network modelling language can be used to configure a general-purpose neural multichip system, by describing the process of writing and loading neural network models on the SpiNNaker neuromimetic hardware. It focuses on the implementation of a SpiNNaker module for PyNN, a simulator-independent language for neural network modelling. We successfully extend PyNN to deal with different non-standard (e.g. Izhikevich) cell types, rapidly switch between them, and load applications on parallel hardware by orchestrating the software layers below it, so that they are abstracted from the final user. Finally, we run some simulations in PyNN and compare them against other simulators, successfully reproducing single-neuron and network dynamics and validating the implementation. Abstract The present study examines the biophysical properties and functional implications of I_h in hippocampal area CA3 interneurons with somata in strata radiatum and lacunosum-moleculare. Characterization studies showed a small maximum h-conductance (2.6 ± 0.3 nS, n = 11), shallow voltage dependence with a hyperpolarized half-maximal activation (V_1/2 = −91 mV), and kinetics characterized by double-exponential functions. The functional consequences of I_h were examined with regard to temporal summation and impedance measurements. For temporal summation experiments, 5-pulse mossy fiber input trains were activated.
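The shallow, hyperpolarized voltage dependence reported above (half-maximal activation near −91 mV) is conventionally summarized by a Boltzmann function. A sketch follows; V_1/2 is taken from the measurement above, while the slope factor is an illustrative assumption, not the study's fitted value.

```python
import math

def ih_activation(v_mv, v_half=-91.0, slope=8.0):
    """Steady-state activation of a hyperpolarization-activated current:
    a Boltzmann curve that approaches 1 at hyperpolarized potentials.
    v_half from the measurements above; slope is an assumed value."""
    return 1.0 / (1.0 + math.exp((v_mv - v_half) / slope))

at_half = ih_activation(-91.0)       # exactly half-activated at V_1/2
more_hyp = ih_activation(-120.0)     # activation grows with hyperpolarization
```

The sign convention (positive slope factor in the exponent) is what makes activation increase with hyperpolarization, the defining property of h-currents.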
Blocking I h with 50 μM ZD7288 resulted in an increase in temporal summation, suggesting that I h supports sensitivity of response amplitude to relative input timing. Impedance was assessed by applying sinusoidal current commands. From impedance measurements, we found that I h did not confer theta-band resonance, but instead flattened the impedance–frequency relations. Double immunolabeling for hyperpolarization-activated cyclic nucleotide-gated proteins and glutamate decarboxylase 67 suggests that all four subunits are present in GABAergic interneurons from the strata considered for electrophysiological studies. Finally, a model of I h was employed in computational analyses to confirm and elaborate upon the contributions of I h to impedance and temporal summation. Abstract Modelling and simulation methods are gaining importance for the understanding of biological systems. The growing number of available computational models makes community support for the maintenance and retrieval of those models essential. This article discusses which model information is helpful for efficient retrieval and how existing similarity measures and ranking techniques can be used to enhance the retrieval process, i.e., model reuse. With the development of new tools and modelling formalisms, there is also an increasing demand for search that is independent of the models' encoding. Therefore, the presented approach is not restricted to particular model storage formats. Instead, the model meta-information is used for retrieval and ranking of the search results. Meta-information includes general information about the model and its encoded species and reactions, but also information about the model's behaviour and related simulation experiment descriptions. Abstract To understand the details of brain function, a large-scale system model that reflects anatomical and neurophysiological characteristics needs to be implemented.
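Format-independent retrieval over meta-information, as discussed above, can be sketched with a simple set-overlap ranking: each stored model contributes only its metadata terms, never its encoding. Model names, fields, and the Jaccard measure are illustrative choices, not the paper's actual ranking scheme:

```python
# Hedged sketch (model names and metadata fields are invented for illustration):
# rank stored models against a query using Jaccard similarity of their
# meta-information keywords, independent of each model's storage format.
models = {
    "modelA": {"species": {"ca2+", "ip3"}, "keywords": {"oscillation", "calcium"}},
    "modelB": {"species": {"na+", "k+"}, "keywords": {"action potential", "axon"}},
    "modelC": {"species": {"ca2+"}, "keywords": {"calcium", "buffering"}},
}

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

def rank(query_terms):
    scores = []
    for name, meta in models.items():
        terms = meta["species"] | meta["keywords"]
        scores.append((jaccard(query_terms, terms), name))
    return sorted(scores, reverse=True)

print(rank({"ca2+", "calcium"}))  # best-matching models first
```

A real system would also weight behavioural annotations and linked simulation descriptions; the point here is only that ranking needs metadata terms, not the model files themselves.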
Though numerous computational models of different brain areas have been proposed, their integration into a large-scale model has not yet been accomplished, because these models were written in different programming languages and, above all, used different data formats. This paper introduces a platform for collaborative brain system modeling (PLATO), on which one can construct computational models in several programming languages and connect them at the I/O level with a common data format. As an example, a whole visual system model including eye movement, eye optics, retinal network, and visual cortex is being developed. Preliminary results demonstrate that the integrated model successfully simulates the signal-processing flow at the different stages of the visual system. Abstract Brain rhythms are the most prominent signal measured noninvasively in humans with magneto-/electroencephalography (MEG/EEG). MEG/EEG-measured rhythms have been shown to be functionally relevant, and signature changes are used as markers of disease states. Despite the importance of understanding the underlying neural mechanisms creating these rhythms, relatively little is known about their in vivo origin in humans. There are obvious challenges in linking the extracranially measured signals directly to neural activity with invasive studies in humans, and although animal models are well suited for such studies, the connection to human brain function under cognitively relevant tasks is often lacking. Biophysically principled computational neural modeling provides an attractive means to bridge this critical gap. Here, we describe a method for creating a computational neural model capturing the laminar structure of cortical columns, and how this model can be used to make predictions about the cellular- and circuit-level mechanisms of brain oscillations measured with MEG/EEG.
Specifically, we describe how the model can be used to simulate current dipole activity, the common macroscopic signal inferred from MEG/EEG data. We detail the development and application of the model to study the spontaneous somatosensory mu-rhythm, containing mu-alpha (7–14 Hz) and mu-beta (15–29 Hz) components. We describe a novel prediction of the neural origin of the mu-rhythm that accurately reproduces many characteristic features of MEG data and accounts for changes in the rhythm with attention, detection, and healthy aging. While the details of the model are specific to the somatosensory system, the model design and application are based on general principles of cortical circuitry and MEG/EEG physics, and are thus amenable to the study of rhythms in other frequency bands and sensory systems. Abstract GABAergic interneurons in cortical circuits control the activation of principal cells and orchestrate network activity patterns, including oscillations at different frequency ranges. Recruitment of interneurons depends on the integration of convergent synaptic inputs along the dendrosomatic axis; however, dendritic processing in these cells is still poorly understood. In this chapter, we summarise our results on the cable properties, electrotonic structure, and dendritic processing of "basket cells" (BCs; Nörenberg et al. 2010), one of the most prevalent types of cortical interneurons mediating perisomatic inhibition. In order to investigate integrative properties, we performed two-electrode whole-cell patch-clamp recordings, visualised and reconstructed the recorded interneurons, and created passive single-cell models with biophysical properties derived from the experiments. Our results indicate that membrane properties, in particular membrane resistivity, are inhomogeneous along the somatodendritic axis of the cell. The derived values and gradient of membrane resistivity differ from those obtained for excitatory principal cells.
The divergent passive membrane properties of BCs facilitate rapid signalling from proximal basal dendritic inputs, but at the same time increase synapse-to-soma transfer for slow signals from the distal apical dendrites. Our results demonstrate that BCs possess distinct integrative properties. Future computational models investigating the diverse functions of neuronal circuits need to consider this diversity and incorporate realistic dendritic properties not only of excitatory principal cells but also of the various types of inhibitory interneurons. Abstract New surgical and localization techniques allow for precise and personalized evaluation and treatment of intractable epilepsies. These techniques include the use of subdural and depth electrodes for localization, and the potential use of cell-targeted stimulation with optogenetics as part of treatment. Computer modeling of seizures, also individualized to the patient, will be important in order to make full use of the potential of these new techniques. This is because epilepsy is a complex dynamical disease involving multiple scales across both time and space. These complex dynamics make prediction extremely difficult. Cause and effect are not cleanly separable, as multiple embedded causal loops allow for many scales of unintended consequence. We demonstrate here a small model of sensory neocortex which can be used to examine the effects of microablations or microstimulation. We show that ablations in this network can either prevent the spread or prevent the occurrence of the seizure. In this example, focal electrical stimulation was not able to terminate a seizure, but selective stimulation of inhibitory cells, a future possibility through the use of optogenetics, was efficacious. Abstract The basal ganglia form a complex network of nuclei often assumed to perform selection, yet their individual roles and how they influence each other are still largely unclear.
In particular, the ties between the external and internal parts of the globus pallidus are paradoxical, as anatomical data suggest a potent inhibitory projection between them, while electrophysiological recordings indicate that they have similar activities. Here we introduce a theoretical study that reconciles both views of the intrapallidal projection by providing a plausible characterization of the relationship between the external and internal globus pallidus. Specifically, we developed a mean-field model of the whole basal ganglia, whose parameterization is optimized to best respect a large collection of anatomical and electrophysiological data. We first obtained models respecting all our constraints; hence the anatomical and electrophysiological data on the intrapallidal projection are globally consistent. This model furthermore predicts that both aforementioned views of the intrapallidal projection may be reconciled when this projection is weakly inhibitory, making it possible to support similar neural activity in both nuclei and for the entire basal ganglia to select between actions. Second, we predict that afferent projections are substantially unbalanced towards the external segment, as it receives the strongest excitation from the STN and the weakest inhibition from the striatum. Finally, our study strongly suggests that the intrapallidal connection pattern is not focused but diffuse, as the latter pattern is more efficient for the overall selection performed in the basal ganglia. Abstract Background The information coming from biomedical ontologies and computational pathway models is expanding continuously: research communities keep this process up, and their advances are generally shared by means of dedicated resources published on the web. In fact, such models are shared to provide the characterization of molecular processes, while biomedical ontologies provide a semantic context for the majority of those pathways.
Recent advances in both fields pave the way for scalable information integration based on aggregate knowledge repositories, but the lack of overall standard formats impedes this progress. Indeed, having different objectives and different abstraction levels, most of these resources "speak" different languages. Semantic web technologies are explored here as a means to address some of these problems. Methods Employing an extensible collection of interpreters, we developed OREMP (Ontology Reasoning Engine for Molecular Pathways), a system that abstracts the information from different resources and combines it into a coherent ontology. Continuing this effort, we present OREMPdb: once different pathways are fed into OREMP, species are linked to the external ontologies they reference and to the reactions in which they participate. Exploiting these links, the system builds species-sets, which encapsulate species that operate together. Composing all of the reactions together, the system computes all of the reaction paths from and to all of the species-sets. Results OREMP has been applied to the curated branch of BioModels (2011/04/15 release), which overall contains 326 models, 9,244 reactions, and 5,636 species. OREMPdb is the semantic dictionary created as a result, comprising 7,360 species-sets. For each of these sets, OREMPdb links the original pathway and the original paper where this information first appeared. Abstract Conductance-based neuron models are frequently employed to study the dynamics of biological neural networks. For speed and ease of use, these models are often reduced in morphological complexity. Simplified dendritic branching structures may process inputs differently than full branching structures, however, and could thereby fail to reproduce important aspects of biological neural processing. It is not yet well understood which processing capabilities require detailed branching structures.
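The reaction-path computation described in the OREMPdb passage above can be sketched as a breadth-first search over a directed graph where each reaction links its substrates to its products. Species and reaction names here are invented placeholders, and real species-sets would group multiple species per node:

```python
from collections import deque

# Illustrative sketch (species and reaction names are invented): reaction paths
# between species found by breadth-first search over a directed graph in which
# each reaction links its substrate set to its product set.
reactions = {
    "R1": ({"A"}, {"B"}),        # A -> B
    "R2": ({"B"}, {"C", "D"}),   # B -> C + D
    "R3": ({"C"}, {"E"}),        # C -> E
}

def reaction_path(source, target):
    """Return a list of reaction ids leading from source to target, or None."""
    queue = deque([(source, [])])
    seen = {source}
    while queue:
        species, path = queue.popleft()
        if species == target:
            return path
        for rid, (subs, prods) in reactions.items():
            if species in subs:
                for p in prods:
                    if p not in seen:
                        seen.add(p)
                        queue.append((p, path + [rid]))
    return None

print(reaction_path("A", "E"))  # prints ['R1', 'R2', 'R3']
```

BFS guarantees the shortest path in number of reactions; enumerating all paths between all species-sets, as OREMPdb does, repeats such traversals from every set.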
Therefore, we analyzed the processing capabilities of fully or partially branched reduced models. These models were created by collapsing the dendritic tree of a full morphological model of a globus pallidus (GP) neuron while preserving its total surface area and electrotonic length, as well as its passive and active parameters. Dendritic trees were either collapsed into single cables (unbranched models) or the full complement of branch points was preserved (branched models). Both reduction strategies allowed us to compare dynamics between all models using the same channel density settings. Full model responses to somatic inputs were generally preserved by both types of reduced model, while dendritic input responses could be more closely preserved by branched than by unbranched reduced models. However, features strongly influenced by local dendritic input resistance, such as active dendritic sodium spike generation and propagation, could not be accurately reproduced by any reduced model. Based on our analyses, we suggest that there are intrinsic differences in processing capabilities between unbranched and branched models. We also indicate suitable applications for different levels of reduction, including fast searches of the full model's parameter space. Summary Processing text from the scientific literature has become a necessity due to the burgeoning amount of information that is fast becoming available, stemming from advances in electronic information technology. We created a program, NeuroText (http://senselab.med.yale.edu/textmine/neurotext.pl), designed specifically to extract information relevant to neuroscience-specific databases, NeuronDB and CellPropDB (http://senselab.med.yale.edu/senselab/), housed at the Yale University School of Medicine. NeuroText extracts relevant information from the neuroscience literature in a two-step process: each step parses text at a different level of granularity.
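The collapse-while-preserving step described above can be illustrated by solving for a single equivalent cylinder whose total membrane area and electrotonic length match those of a dendritic path. This is one of several possible collapsing rules, and the Rm/Ri values are illustrative assumptions, not the GP model's parameters:

```python
import math

# Hedged sketch (one of several possible collapsing rules; Rm, Ri values are
# illustrative): collapse a set of dendritic sections into a single equivalent
# cylinder that preserves total membrane surface area and electrotonic length.
RM = 20000.0   # ohm*cm^2, membrane resistivity (assumed)
RI = 200.0     # ohm*cm, axial resistivity (assumed)

def space_constant(diam_um):
    d_cm = diam_um * 1e-4
    return math.sqrt(RM * d_cm / (4.0 * RI))  # lambda, in cm

def collapse(sections):
    """sections: list of (length_um, diam_um) along one dendritic path."""
    area = sum(math.pi * d * l for l, d in sections)                # um^2
    eL = sum((l * 1e-4) / space_constant(d) for l, d in sections)   # dimensionless
    # Solve pi*d*l = area and (l*1e-4)/lambda(d) = eL for the equivalent cable:
    k = eL * 1e4 * math.sqrt(RM * 1e-4 / (4.0 * RI))   # l = k * sqrt(d), in um
    d_eq = (area / (math.pi * k)) ** (2.0 / 3.0)
    l_eq = k * math.sqrt(d_eq)
    return l_eq, d_eq

l_eq, d_eq = collapse([(100.0, 2.0), (80.0, 1.5), (60.0, 1.0)])
print(l_eq, d_eq)  # equivalent length (um) and diameter (um)
```

Because the space constant grows with the square root of diameter, matching both constraints fixes a unique (length, diameter) pair; the test below verifies that both quantities are conserved.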
NeuroText uses an expert-mediated knowledge base and combines the techniques of indexing, contextual parsing, semantic and lexical parsing, and supervised and unsupervised learning to extract information. The constraints, metadata elements, and rules for information extraction are stored in the knowledge base. NeuroText was created as a pilot project to process 3 years of publications in the Journal of Neuroscience and was subsequently tested on 40,000 PubMed abstracts. We also present a template for creating a domain-nonspecific knowledge base that, when linked to a text-processing tool like NeuroText, can be used to extract knowledge in other fields of research. Abstract Background We present a software tool called SENB, which allows the geometric and biophysical neuronal properties in a simple computational model of a Hodgkin–Huxley (HH) axon to be changed. The aim of this work is to develop a didactic and easy-to-use computational tool in the NEURON simulation environment, which allows graphical visualization of both the passive and active conduction parameters and the geometric characteristics of a cylindrical axon with HH properties. Results The SENB software offers several advantages for teaching and learning electrophysiology. First, SENB offers ease and flexibility in determining the number of stimuli. Second, SENB allows immediate and simultaneous visualization, in the same window and time frame, of the evolution of the electrophysiological variables. Third, SENB calculates parameters such as the time and space constants, stimulus frequency, cellular area and volume, sodium and potassium equilibrium potentials, and propagation velocity of the action potentials. Furthermore, it allows the user to see all this information immediately in the main window. Finally, with just one click SENB can save an image of the main window as evidence.
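The derived quantities listed above (time and space constants, equilibrium potentials) follow from standard cable and Nernst formulas. The parameter values below are textbook-style assumptions for a squid-like axon, not SENB's actual defaults:

```python
import math

# Illustrative calculations of the kinds of quantities SENB reports (parameter
# values are textbook-style assumptions for a squid-like axon, not SENB defaults).
RM, RI, CM = 2000.0, 35.4, 1.0          # ohm*cm^2, ohm*cm, uF/cm^2
DIAM_UM = 500.0                          # axon diameter, um
R, T, F = 8.314, 279.45, 96485.0         # gas constant, 6.3 C in kelvin, Faraday

tau_m = RM * CM * 1e-6                   # s; membrane time constant = Rm * Cm
d_cm = DIAM_UM * 1e-4
lam = math.sqrt(RM * d_cm / (4.0 * RI))  # cm; space constant

def nernst(conc_out, conc_in, z=1):
    """Equilibrium potential in mV for ion concentrations in mM."""
    return 1000.0 * (R * T) / (z * F) * math.log(conc_out / conc_in)

e_na = nernst(440.0, 50.0)   # squid axon Na+ (approx. +52 mV)
e_k = nernst(20.0, 400.0)    # squid axon K+ (approx. -72 mV)
print(tau_m, lam, e_na, e_k)
```

These are the closed-form relations a teaching tool can evaluate instantly as the user edits geometry or conductances.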
Conclusions The SENB software is didactic and versatile, and can be used to improve and facilitate the teaching and learning of the mechanisms underlying the electrical activity of an axon using the biophysical properties of the squid giant axon. Abstract Grid cells (GCs) in the medial entorhinal cortex (mEC) have the property of having their firing activity spatially tuned to a regular triangular lattice. Several theoretical models of grid field formation have been proposed, but most assume that place cells (PCs) are a product of the grid cell system. There is, however, an alternative possibility that is supported by various strands of experimental data. Here we present a novel model for the emergence of grid-like firing patterns that rests on two key hypotheses: (1) spatial information in GCs is provided by PC activity, and (2) grid fields result from a combined synaptic plasticity mechanism involving inhibitory and excitatory neurons mediating the connections between PCs and GCs. Depending on the spatial location, each PC can contribute excitatory or inhibitory inputs to GC activity. The nature and magnitude of the PC input are a function of the distance to the place field center, which is inferred from rate decoding. A biologically plausible learning rule drives the evolution of the connection strengths from PCs to a GC. In this model, PCs compete for GC activation, and the plasticity rule favors efficient packing of the space representation. This leads to grid-like firing patterns. In a new environment, GCs continuously recruit new PCs to cover the entire space. The model described here makes important predictions and can represent the feedforward connections from hippocampal CA1 to the deeper mEC layers. Abstract Because of its highly branched dendrite, the Purkinje neuron requires significant computational resources if coupled electrical and biochemical activity are to be simulated.
To address this challenge, we developed a scheme for reducing the geometric complexity while preserving the essential features of activity in both the soma and a remote dendritic spine. We merged our previously published biochemical model of calcium dynamics and lipid signaling in the Purkinje neuron, developed in the Virtual Cell modeling and simulation environment, with an electrophysiological model based on a Purkinje neuron model available in NEURON. A novel reduction method was applied to the Purkinje neuron geometry to obtain a model with fewer compartments that is tractable in Virtual Cell. Most of the dendritic tree was subject to reduction, but we retained the neuron's explicit electrical and geometric features along a specified path from spine to soma. Further, unlike previous simplification methods, the dendrites that branch off along the preserved explicit path are retained as reduced branches. We conserved axial resistivity and adjusted passive properties and active channel conductances for the reduction in surface area, and cytosolic calcium for the reduction in volume. Rallpacks were used to validate the reduction algorithm and show that it can be generalized to other complex neuronal geometries. For the Purkinje cell, we found that current injections at the soma produced similar trains of action potentials and membrane potential propagation in the full and reduced models in NEURON; the reduced model produces identical spiking patterns in NEURON and Virtual Cell. Importantly, our reduced model can simulate communication between the soma and a distal spine; an alpha function applied at the spine to represent synaptic stimulation gave similar results in the full and reduced models for potential changes associated with both the spine and the soma. Finally, we combined phosphoinositol signaling and electrophysiology in the reduced model in Virtual Cell.
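The adjustment step mentioned above (scaling for reduced surface area and volume) can be sketched as follows. This is a hedged interpretation: the factors and channel names are illustrative, and the assumption is that total membrane current and total molecule numbers should be conserved across the reduction:

```python
# Hedged sketch of the scaling step (factors and channel names are illustrative):
# when compartments are merged, specific membrane conductances are scaled by the
# surface-area ratio and cytosolic concentrations by the volume ratio, so that
# total current and total molecule numbers are conserved.
def scale_reduced_compartment(g_specific, ca_conc, area_full, area_red,
                              vol_full, vol_red):
    g_scaled = {name: g * (area_full / area_red) for name, g in g_specific.items()}
    ca_scaled = ca_conc * (vol_full / vol_red)
    return g_scaled, ca_scaled

g, ca = scale_reduced_compartment(
    g_specific={"pas": 1e-4, "CaP": 4e-3},  # S/cm^2, assumed channel set
    ca_conc=1.0,                            # relative cytosolic calcium
    area_full=5000.0, area_red=1250.0,      # um^2: 4-fold area reduction
    vol_full=800.0, vol_red=400.0,          # um^3: 2-fold volume reduction
)
print(g, ca)  # conductances scaled x4, calcium scaled x2
```

The design choice is that the reduced compartment should inject the same total current and hold the same calcium load as the compartments it replaces, even though its specific densities change.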
Thus, a strategy has been developed to combine electrophysiology and biochemistry as a step toward merging neuronal and systems biology modeling. Abstract The advent of techniques with the ability to scan massive changes in cellular makeup (genomics, proteomics, etc.) has revealed the compelling need for analytical methods to interpret and make sense of those changes. Computational models built on a sound physicochemical and mechanistic basis are indispensable when integrating, interpreting, and simulating high-throughput experimental data. Another powerful role of computational models is predicting new behavior, provided they are adequately validated. Mitochondrial energy transduction has traditionally been studied with thermodynamic models. More recently, kinetic or thermokinetic models have been proposed, leading the path toward an understanding of the control and regulation of mitochondrial energy metabolism and its interaction with the cytoplasm and other compartments. In this work, we outline, step by step, the methods that should be followed to build a computational model of mitochondrial energetics in isolation or integrated into a network of cellular processes. Depending on the question addressed by the modeler, the methodology explained herein can be applied with different levels of detail, from the mitochondrial energy-producing machinery in a network of cellular processes to the dynamics of a single enzyme during its catalytic cycle. Abstract The voltage and time dependence of ion channels can be regulated, notably by phosphorylation, interaction with phospholipids, and binding to auxiliary subunits. Many parameter variation studies have set conductance densities free while leaving kinetic channel properties fixed, as the experimental constraints on the latter are usually better than on the former.
Because individual cells can tightly regulate their ion channel properties, we suggest that kinetic parameters may be profitably set free during model optimization in order both to improve matches to data and to refine kinetic parameters. To this end, we analyzed the parameter optimization of reduced models of three electrophysiologically characterized and morphologically reconstructed globus pallidus neurons. We performed two automated searches with different types of free parameters. First, conductance density parameters were set free. Even the best resulting models exhibited unavoidable problems due to limitations in our channel kinetics. We next set the channel kinetics free for the optimized density matches and obtained significantly improved model performance. Some kinetic parameters consistently shifted to similar new values in multiple runs across all three models, suggesting the possibility of tailored improvements to channel models. These results suggest that optimized channel kinetics can improve model matches to experimental voltage traces, particularly for channels characterized under different experimental conditions than the recorded data to be matched by a model. The resulting shifts in channel kinetics from the original template provide valuable guidance for future experimental efforts to determine the detailed kinetics of channel isoforms and possible modulated states in particular types of neurons. Abstract Electrical synapses continuously transfer signals bidirectionally from one cell to another, directly or indirectly via intermediate cells. Electrical synapses are common in many brain structures, such as the inferior olive, the subcoeruleus nucleus and the neocortex, both between neurons and between glial cells. In the cortex, interneurons have been shown to be electrically coupled and have been proposed to participate in large, continuous cortical syncytia, as opposed to smaller spatial domains of electrically coupled cells.
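Freeing kinetic parameters, as proposed above, amounts to letting quantities like a channel's half-activation voltage and slope move during the search. A minimal random-search sketch (target values, ranges, and the Boltzmann form are illustrative, not the study's optimizer):

```python
import math
import random

# Hedged sketch (target values and parameter ranges are invented): free a
# channel's half-activation voltage V1/2 and slope k, and search for the pair
# that best matches a "measured" steady-state activation curve.
random.seed(0)

def boltzmann(v, v_half, k):
    return 1.0 / (1.0 + math.exp(-(v - v_half) / k))

voltages = range(-80, 1, 10)
target = [boltzmann(v, -45.0, 8.0) for v in voltages]   # pretend data

def cost(v_half, k):
    return sum((boltzmann(v, v_half, k) - t) ** 2 for v, t in zip(voltages, target))

samples = [(random.uniform(-70.0, -20.0), random.uniform(2.0, 15.0))
           for _ in range(2000)]
best = min((cost(vh, kk), vh, kk) for vh, kk in samples)
print(best)  # (error, v_half, k), with v_half near -45 and k near 8
```

Real optimizations use evolutionary or gradient-based searches over many kinetic and density parameters simultaneously; the sketch only shows why a kinetics-free search can pull parameters toward values the fixed template missed.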
However, to explore the significance of these findings it is imperative to map the electrical synaptic microcircuits, in analogy with in vitro studies on monosynaptic and disynaptic chemical coupling. Since "walking" from cell to cell over large distances with a glass pipette is challenging, microinjection of (fluorescent) dyes diffusing through gap junctions remains so far the only method available to decipher such microcircuits, even though technical limitations exist. Based on circuit theory, we derive analytical descriptions of the AC electrical coupling in networks of isopotential cells. We then suggest an operative electrophysiological protocol to distinguish between direct electrical connections and connections involving one or more intermediate cells. This method allows inferring the number of intermediate cells, generalizing the conventional coupling coefficient, which provides limited information. We validate our method through computer simulations, theoretical and numerical methods, and electrophysiological paired recordings. Summary This chapter constitutes the mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was part of the Annual Computational Neuroscience Meeting CNS*2007, which took place in July 2007 in Toronto, Canada (http://www.cnsorg.org).
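The coupling-coefficient idea in the passage above can be made concrete with a DC resistor-network example: three isopotential cells in a chain, each with a membrane conductance to ground and gap-junction conductances between neighbours. Parameter values are illustrative, and the full method in the paper works with AC impedances rather than this steady-state case:

```python
# Hedged sketch (parameter values are illustrative): steady-state coupling
# coefficients in a chain of three isopotential cells, each with a membrane
# conductance to ground and gap-junction conductances between neighbours.
# Injecting current into cell 1, V2/V1 and V3/V1 show how coupling through an
# intermediate cell attenuates relative to a direct connection.

def solve3(a, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination (no pivoting needed)."""
    m = [row[:] + [rhs] for row, rhs in zip(a, b)]
    for i in range(3):
        p = m[i][i]
        m[i] = [x / p for x in m[i]]
        for j in range(3):
            if j != i:
                f = m[j][i]
                m[j] = [x - f * y for x, y in zip(m[j], m[i])]
    return [m[k][3] for k in range(3)]

g_m, g_gap = 5.0, 2.0    # nS: membrane and gap-junction conductances (assumed)
# Node equations G * V = I for the chain 1 - 2 - 3:
G = [
    [g_m + g_gap, -g_gap, 0.0],
    [-g_gap, g_m + 2 * g_gap, -g_gap],
    [0.0, -g_gap, g_m + g_gap],
]
v1, v2, v3 = solve3(G, [1.0, 0.0, 0.0])  # 1 nA injected into cell 1
print(v2 / v1, v3 / v1)  # direct vs second-order coupling coefficients
```

The second-order coefficient V3/V1 is markedly smaller than V2/V1, which is why a single conventional coupling coefficient cannot by itself distinguish a weak direct connection from a stronger connection routed through an intermediate cell.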
The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Selected topics of discussion, including a review of some current and potential applications of Computational Intelligence (CI) in electrophysiology, database and electrophysiological data exchange platforms, languages, and formats, as well as exemplary analysis problems, are presented in this chapter. The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers, whose interest may be sparked by its contents. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. We are particularly interested in the slower-timescale dynamics that determine overall excitability and set the stage for transient episodes of persistent oscillations, working memory, or seizures. In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley-type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics in the stability of persistent activity. We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions.
Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in the neonatal brain or spinal cord, up states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potential (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from the EAP data sampled by an extracellular probe at multiple independent recording locations. We used a dipole for the model source because there is extensive evidence that it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell-probe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole moment optimization, where the dipole location is fixed, and a second, nonlinear, global optimization of the source location. We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis.
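The linear half of the separation described above, fitting the dipole moment with the location fixed, reduces to ordinary least squares through the lead-field matrix. The lead-field values below are made up; real ones would come from a finite element model of the probe and tissue:

```python
# Hedged sketch (lead-field values are made up): with the dipole location
# fixed, the dipole moment p minimizing ||V - L p||^2 solves the normal
# equations (L^T L) p = L^T V, a small linear problem per grid point.

def matmul_t(L, V):                      # L^T V
    return [sum(L[i][j] * V[i] for i in range(len(L))) for j in range(len(L[0]))]

def normal_matrix(L):                    # L^T L
    n = len(L[0])
    return [[sum(L[i][r] * L[i][c] for i in range(len(L))) for c in range(n)]
            for r in range(n)]

def solve(a, b):                         # Gauss-Jordan for a small SPD system
    n = len(b)
    m = [row[:] + [rhs] for row, rhs in zip(a, b)]
    for i in range(n):
        piv = m[i][i]
        m[i] = [x / piv for x in m[i]]
        for j in range(n):
            if j != i:
                f = m[j][i]
                m[j] = [x - f * y for x, y in zip(m[j], m[i])]
    return [m[k][n] for k in range(n)]

# 4 recording sites, 2 moment components; V generated from a known moment.
L = [[1.0, 0.2], [0.5, 1.0], [0.3, 0.4], [0.8, 0.1]]
p_true = [2.0, -1.0]
V = [sum(Lij * pj for Lij, pj in zip(row, p_true)) for row in L]
p_fit = solve(normal_matrix(L), matmul_t(L, V))
print(p_fit)  # recovers approximately [2.0, -1.0]
```

Because this inner problem is linear and cheap, the expensive outer search over candidate source locations only needs to re-solve these small normal equations at each grid point.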
We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among the various model error contributions to the residual, we address especially the error in probe geometry, and the extent to which it biases estimates of the dipole parameters. This dipole characterization method can be applied to any recording technique that has the capability of taking multiple independent measurements of the same single units. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics. We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodium-potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations, similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. These models yield several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy.
Abstract This paper introduces dyadic brain modeling – the simultaneous computational modeling of the brains of two interacting agents – to explore ways in which our understanding of macaque brain circuitry can ground new models of the brain mechanisms involved in ape interaction. Specifically, we assess a range of data on the gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the 'dyad'). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists, and modelers that will help speed the emergence of what may be called comparative neuroprimatology. Abstract The phase response curve (PRC) reflects the dynamics of the interplay between the diverse intrinsic conductances that lead to spike generation. PRCs measure the spike-time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddle-node) dynamics, while type II (subcritical Hopf) dynamics yield a biphasic PRC with both negative and positive lobes. Previous computational work hypothesized that cholinergic modulation of the M-type potassium current can switch a neuron with type II dynamics to type I dynamics.
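The PRC measurement procedure itself can be sketched on a minimal stand-in model: perturb a tonically firing neuron at different phases and record how much the next spike shifts. A leaky integrate-and-fire cell is used here purely for illustration (it yields a purely positive, type I-like PRC); the paper's models are conductance-based:

```python
import math

# Hedged sketch (a minimal stand-in model, not one used in the paper):
# measure a PRC by perturbing a tonically firing leaky integrate-and-fire
# neuron at different phases and recording the resulting spike-time shift.
TAU, V_TH, I_DRIVE, DT = 20.0, 1.0, 0.08, 0.01   # ms, arbitrary units

def time_to_spike(kick_time=None, kick=0.05):
    v, t = 0.0, 0.0
    while v < V_TH:
        if kick_time is not None and abs(t - kick_time) < DT / 2:
            v += kick                      # brief depolarizing perturbation
        v += DT * (-v / TAU + I_DRIVE)     # forward-Euler membrane update
        t += DT
    return t

period = time_to_spike()
prc = [(phase, period - time_to_spike(kick_time=phase * period))
       for phase in (0.1, 0.3, 0.5, 0.7, 0.9)]
print(prc)
# For this model every depolarizing kick advances the next spike (type I:
# purely positive PRC); a type II neuron would also show a negative lobe.
```

Repeating the same protocol on a model with an M-type potassium current, at different modulation levels, is how a switch between biphasic (type II) and purely positive (type I) PRCs would show up.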
We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with downregulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential specific K-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to downregulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent I_AHP is downregulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified at low ACh levels, while the PRC became type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may shift neuronal response properties between “resonator” and “integrator.” Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double precision values by each subtree at each time step. Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For compute-bound simulations, load balance results in almost ideal runtime scaling. Application of the cell splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing. Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts. 
It has been recently reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. Ca2+-activated K+ currents (I_K(Ca)) of cultured human cardiac fibroblasts were characterized in this study. In whole-cell configuration, depolarizing pulses evoked an outwardly rectifying I_K(Ca) in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance, Ca2+-activated K+ (BKCa) channel with a single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BKCa channels. The dynamic Luo-Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; however, they were electrically inert in the absence of gap-junctional coupling. The simulation predicts that changes in gap-junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. I_K(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes. This study demonstrates that a BKCa channel is functionally expressed in human cardiac fibroblasts. The activity of these BKCa channels present in human cardiac fibroblasts may contribute to the functional activities of heart cells through transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of their solutions and the transitions between them as parameters are varied. To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multicompartment models. 
We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation: tonic spiking, intermediate bursting, and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multicompartment conductance-based model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli. Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement. 
Results We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require a built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. Conclusion Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement, and underlies the concentration-independence of odor quality representations. Abstract Mathematical neuronal models are normally expressed using differential equations. The Parker-Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker-Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin-Huxley-type neuron, comparing the results with those obtained using the Runge-Kutta and Bulirsch-Stoer methods. 
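The idea behind the Parker-Sochacki method is to build the Maclaurin-series coefficients of the solution iteratively, using Cauchy products for polynomial nonlinearities, so that the order (and hence the error) can be adapted at a fixed step size. A minimal sketch for the scalar test problem dy/dt = -y^2 (exact solution y = 1/(1+t); an assumed toy example rather than a neuronal model):

```python
def ps_step(y0, dt, order=12):
    # One Parker-Sochacki step for dy/dt = -y**2.
    # The Maclaurin coefficients a[n] are built iteratively; the Cauchy
    # product handles the polynomial nonlinearity, and `order` can be
    # raised per step for error control without shrinking dt.
    a = [y0]
    for n in range(order):
        conv = sum(a[i] * a[n - i] for i in range(n + 1))  # (y*y) coefficient
        a.append(-conv / (n + 1))
    return sum(c * dt**k for k, c in enumerate(a))

def integrate(y0=1.0, t_end=1.0, dt=0.1, order=12):
    y = y0
    for _ in range(int(round(t_end / dt))):
        y = ps_step(y, dt, order)
    return y
```

At order 12 the result matches the exact value 1/(1+t) essentially to machine precision, while lowering the order degrades accuracy gracefully, which is the adaptivity the method exploits.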
Benchmark simulations demonstrate an improved speed/accuracy tradeoff for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC-granule cell (GC) circuit with and without gap junctions between GoCs. Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change oscillation frequency or the mean firing rate of either GoCs or GCs. Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs. The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley-type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours/days of supercomputing time. 
Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. In consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics and intrinsic noise. We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data. Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes from development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models. Recent studies show that both the conductance and the kinetics of these receptors change voltage-dependently in some parts of the brain. Therefore, several models have been introduced to simulate their current. However, on the one hand, kinetic models, which are able to simulate these voltage-dependent phenomena, are computationally expensive for the modeling of large neural networks. On the other hand, classic exponential models, which are computationally less expensive, are not able to simulate the voltage dependency of these receptors accurately. 
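For reference, the classic exponential scheme that the abstract contrasts with kinetic models can be written as a double-exponential gating term combined with the Jahr-Stevens magnesium block. The sketch below is this textbook baseline with typical, assumed parameter values; it does not include the modifications discussed above (voltage-dependent time constants, temperature sensitivity, desensitization).

```python
import numpy as np

def mg_block(v, mg=1.0):
    # Jahr-Stevens voltage dependence: fraction of NMDA channels not
    # blocked by Mg2+ at membrane potential v (mV), [Mg2+] in mM
    return 1.0 / (1.0 + (mg / 3.57) * np.exp(-0.062 * v))

def nmda_current(t_spike, t, v, gmax=1.0, e_rev=0.0,
                 tau_rise=2.0, tau_decay=100.0):
    # Current (arbitrary units) at time t (ms) after a presynaptic spike:
    # double-exponential conductance time course scaled by the Mg2+ block
    # and the driving force. Parameter values are typical assumptions.
    s = np.exp(-(t - t_spike) / tau_decay) - np.exp(-(t - t_spike) / tau_rise)
    return gmax * s * mg_block(v) * (v - e_rev)
```

The block term reproduces the hallmark voltage dependence: near rest most channels are blocked, so the current magnitude grows with depolarization even though the driving force shrinks.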
In this study, we have modified these classic models to endow them with voltage-dependent conductance and time constants. Temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of NMDA receptor behavior using only three to four differential equations, significantly fewer than in previous kinetic models. Consequently, our model is both fast and physiologically plausible, and is therefore a suitable candidate for the modeling of large neural networks. Abstract Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons. This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal cell firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that there is an obvious rhythmic gating effect of the γ-oscillating interneuron network on the pyramidal neuron’s signal transmission. This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and appropriate inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of coherence and oscillation frequency of network activities. 
Abstract Background Propagation of simulated action potentials (APs) was previously studied in short single chains and in two-dimensional sheets of myocardial cells [1-3]. The present study was undertaken to examine propagation in a long single chain of cells of various lengths, and with varying numbers of gap-junction (gj) channels, and to compare propagation velocity with cable properties such as the length constant (λ). Methods and Results Simulations were carried out using the PSpice program as previously described. When the electric field (EF) mechanism was dominant (0, 1, and 10 gj channels), the longer the chain length, the faster the overall velocity (θov). There seems to be no simple explanation for this phenomenon. In contrast, when the local-circuit current mechanism was dominant (100 gj channels or more), θov was slightly slowed with lengthening of the chain. Increasing the number of gj channels produced an increase in θov and caused the firing order to become more uniform. The end effect was more pronounced at longer chain lengths and at greater numbers of gj channels. When there were no or only a few gj channels (namely, 0, 10, or 30), the voltage change (ΔVm) in the two cells (#50 & #52) contiguous to the cell injected with current (#51) was nearly zero, i.e., there was a sharp discontinuity in voltage between the adjacent cells. When there were many gj channels (e.g., 300, 1000, 3000), there was an exponential decay of voltage on either side of the injected cell, with the length constant (λ) increasing at higher numbers of gj channels. The effect of increasing the number of gj channels on increasing λ was relatively small compared to the larger effect on θov. θov became very non-physiological at 300 gj channels or higher. Conclusion Thus, when there were only 0, 1, or 10 gj channels, θov increased with increase in chain length, whereas at 100 gj channels or higher, θov did not increase with chain length. 
When there were only 0, 10, or 30 gj channels, there was a very sharp decrease in ΔVm in the two contiguous cells on either side of the injected cell, whereas at 300, 1000, or 3000 gj channels, the voltage decay was exponential along the length of the chain. The effect of increasing the number of gj channels on the spread of current was relatively small compared to the large effect on θov. Abstract This article provides a demonstration of an analytical technique that can be used to investigate the causes of perceptual phenomena. The technique is based on the concept of the ideal observer, an optimal signal classifier that makes decisions that maximize the probability of a correct response. To demonstrate the technique, an analysis was conducted to investigate the role of the auditory periphery in the production of temporal masking effects. The ideal observer classified output from four models of the periphery. Since the ideal observer is the best of all possible observers, if it demonstrates masking effects, then all other observers must as well. If it does not demonstrate masking effects, then nothing about the periphery requires masking to occur, and therefore masking would occur somewhere else. The ideal observer exhibited several forward masking effects but did not exhibit backward masking, implying that the periphery has a causal role in forward but not backward masking. A general discussion of the strengths of the technique and supplementary equations are also included. Abstract Understanding the human brain and its function in health and disease represents one of the greatest scientific challenges of our time. In the post-genomic era, an overwhelming accumulation of new data, at all levels of exploration from DNA to human brain imaging, has been acquired. 
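The exponential voltage decay reported above for well-coupled chains can be turned into an estimate of the length constant λ by a log-linear fit. The sketch below uses synthetic, noiseless data in place of the PSpice output, with an assumed λ of 3 cell lengths.

```python
import numpy as np

def length_constant(x, v):
    # Log-linear least-squares fit of v = v0 * exp(-x / lam);
    # the fitted slope of log(v) versus x is -1/lam
    slope, _ = np.polyfit(x, np.log(v), 1)
    return -1.0 / slope

x = np.arange(1, 11, dtype=float)   # distance from the injected cell (cells)
lam_true = 3.0                      # assumed length constant (cell lengths)
v = 10.0 * np.exp(-x / lam_true)    # voltage deflection (mV), noiseless
```

On noiseless data the fit recovers λ exactly; with simulated or experimental noise the same log-linear regression gives the usual least-squares estimate.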
This accumulation of facts has not given rise to a corresponding increase in the understanding of integrated functions in this vast area of research, which involves a large number of fields extending from genetics to psychology. Neuroinformatics (NI) is uniquely placed at the intersection between neuroscience and information technology, and emerges as an area of critical importance for facilitating the future conceptual development of neuroscience by creating databases that transcend different organizational levels and allow for the development of different computational models from the subcellular to the global brain level. Abstract This paper studies synaptic and dendritic integration under different spatial distributions of synapses on the dendrites of a biophysically detailed layer 5 pyramidal neuron model. It has been observed that temporally synchronous and spatially clustered synaptic inputs make dendrites perform a highly nonlinear integration. The effect of the clustering degree of the synaptic distribution on neuronal responsiveness is investigated by changing the number of top apical dendrites to which active synapses are allocated. The neuron shows maximum responsiveness to synaptic inputs with an intermediate clustering degree of spatial distribution, indicating complex interactions among dendrites in the presence of nonlinear synaptic and dendritic integration. Abstract This paper describes a pilot query interface that has been constructed to help us explore a “concept-based” approach for searching the Neuroscience Information Framework (NIF). The query interface is concept-based in the sense that the search terms submitted through the interface are selected from a standardized vocabulary of terms (concepts) that are structured in the form of an ontology. The NIF contains three primary resources: the NIF Resource Registry, the NIF Document Archive, and the NIF Database Mediator. 
These NIF resources are very different in their nature and therefore pose challenges when designing a single interface from which searches can be automatically launched against all three resources simultaneously. The paper first discusses briefly several background issues involving the use of standardized biomedical vocabularies in biomedical information retrieval, and then presents a detailed example that illustrates how the pilot concept-based query interface operates. The paper concludes by discussing certain lessons learned in the development of the current version of the interface. Abstract Simulations of orientation selectivity in visual cortex have shown that layer 4 complex cells lacking orientation tuning are ideal for providing global inhibition that scales with contrast in order to produce simple cells with contrast-invariant orientation tuning (Lauritzen and Miller in J Neurosci 23:10201–10213, 2003). Inhibitory cortical cells have been shown to be electrically coupled by gap junctions (Fukuda and Kosaka in J Neurosci 120:5–20, 2003). Such coupling promotes, among other effects, spike synchronization and coordination of postsynaptic IPSPs (Beierlein et al. in Nat Neurosci 3:904–910, 2000; Galarreta and Hestrin in Nat Rev Neurosci 2:425–433, 2001). Consequently, it was expected (Miller in Cereb Cortex 13:73–82, 2003) that electrical coupling would promote nonspecific functional responses consistent with the complex inhibitory cells seen in layer 4, which provide broad inhibition in response to stimuli of all orientations (Miller et al. in Curr Opin Neurobiol 11:488–497, 2001). This was tested using a mechanistic modeling approach. The orientation selectivity model of Lauritzen and Miller (J Neurosci 23:10201–10213, 2003) was reproduced with and without electrical coupling between complex inhibitory neurons. 
Although extensive coupling promotes uniform firing in complex cells, there were no detectable improvements in contrast-invariant orientation selectivity unless there were coincident changes in complex cell firing rates to offset the untuned excitatory component that grows with contrast. Thus, changes in firing rates (with or without coupling) could improve contrast-invariant orientation tuning of simple cells, whereas synchronization of complex inhibitory neurons alone could not. Abstract Coral polyps contract when electrically stimulated, and a wave of contraction travels from the site of stimulation at a constant speed. Models of coral nerve networks were optimized to match one of three different experimentally observed behaviors. To search for model parameters that reproduce the experimental observations, we applied genetic algorithms to increasingly more complex models of a coral nerve net. In a first stage of optimization, individual neurons responded with spikes to multiple, but not single, pulses of activation. In a second stage, we used these neurons as the starting point for the optimization of a two-dimensional nerve net. This strategy yielded a network with parameters that reproduced the experimentally observed spread of excitation. Abstract Spike-wave discharges are a distinctive feature of epileptic seizures. So far, they have not been reported in spatially extended neural field models. We study a space-independent version of the Amari neural field model with two competing inhibitory populations. We show that this competition leads to robust spike-wave dynamics if the inhibitory populations operate on different timescales. The spike-wave oscillations present fold/homoclinic-type bursting. From this result we predict parameters of the extended Amari system where spike-wave oscillations produce a spatially homogeneous pattern. We propose this mechanism as a prototype of macroscopic epileptic spike-wave discharges. 
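The staged genetic-algorithm search used above for the coral nerve-net models can be sketched generically: evolve a population of parameter vectors by selection, crossover and mutation until the model output matches a target behavior. In this sketch the "model" is a toy exponential response rather than a nerve-net simulation, and all GA settings (population size, elite fraction, mutation width) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

t = np.linspace(0.0, 5.0, 50)
target = 2.0 * np.exp(-0.5 * t)          # observed behavior to reproduce

def fitness(params):
    # Negative squared error between model response and target
    a, b = params
    return -np.sum((a * np.exp(-b * t) - target) ** 2)

def evolve(pop_size=60, n_gen=80, sigma=0.1):
    pop = rng.uniform(0.0, 3.0, size=(pop_size, 2))
    for _ in range(n_gen):
        scores = np.array([fitness(p) for p in pop])
        elite = pop[np.argsort(scores)[-pop_size // 4:]]    # selection
        parents = elite[rng.integers(len(elite), size=(pop_size, 2))]
        children = parents.mean(axis=1)                     # blend crossover
        children += rng.normal(0.0, sigma, children.shape)  # mutation
        pop = children
    return pop[np.argmax([fitness(p) for p in pop])]
```

Run repeatedly with different targets, this kind of search recovers parameters that reproduce the observed behavior without any gradient information, which is what makes it suitable for simulator-in-the-loop optimization.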
To our knowledge this is the first example of robust spike-wave patterns in a spatially extended neural field model. Abstract Cortical gamma frequency (30–80 Hz) oscillations have been suggested to underlie many aspects of cognitive function. In this paper we compare the f-I curves obtained with gamma-frequency-modulated stimuli and with Poisson synaptic input at the distal dendrites of a layer V pyramidal neuron model. The results show that gamma-frequency distal input amplifies the sensitivity of the neural response to basal input, and enhances gain modulation of the neuron. Abstract Inwardly rectifying potassium (KIR) currents in medium spiny (MS) neurons of the nucleus accumbens inactivate significantly in ~40% of the neurons but not in the rest, which may lead to differences in input processing by these two groups. Using a 189-compartment computational model of the MS neuron, we investigate the influence of this property using injected current as well as spatiotemporally distributed synaptic inputs. Our study demonstrates that KIR current inactivation facilitates depolarization, firing frequency and firing onset in these neurons. These effects may be attributed to the higher input resistance of the cell as well as a more depolarized resting/down-state potential induced by the inactivation of this current. In view of the reports that dendritic intracellular calcium levels depend closely on burst strength and spike onset time, our findings suggest that inactivation of KIR currents may offer a means of modulating both excitability and synaptic plasticity in MS neurons. Low-dimensional, morphologically accurate models of subthreshold membrane potential Journal of Computational Neuroscience Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. 
Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means of facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a nonredundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. 
Programming language: Perl/BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI Blast package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. 
The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. 
However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function; they make up tissues and our bodies. A single cell contains organelles and intracellular solutions, and is separated from the extracellular liquid surrounding it by its cell membrane (plasma membrane), which maintains differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charge and concentration give rise, respectively, to electrical and chemical potentials, which drive the transport of materials across the membrane. Here we look at the core elements of mathematical modeling associated with the dynamic behavior of single cells, as well as the basics of numerical simulation. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 
2008, 2009) is a software program on which users can simulate and/or create a model with graphical representations corresponding to ISML concepts such as modules and edges. ISIDE also has a command-line interface, based on the Python scripting language, for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB comprises three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp. Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases have started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors.
We describe a method and make available computer scripts which automate the process of citation entry. We use an Open Citation Project Perl module (PARSER) for parsing citation data, which is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. To understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain ( ${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model.
Dynamical changes in neurons during seizures determine tonic to clonic shift. Journal of Computational Neuroscience. Summary This chapter constitutes mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was part of the Annual Computational Neuroscience Meeting CNS*2007, which took place in July 2007 in Toronto, Canada (http://www.cnsorg.org). The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Selected topics discussed, including a review of some current and potential applications of Computational Intelligence (CI) in electrophysiology, database and electrophysiological data exchange platforms, languages, and formats, as well as exemplary analysis problems, are presented in this chapter. The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers, whose interest will be sparked by its contents. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. We are particularly interested in the slower timescale dynamics that determine overall excitability, and set the stage for transient episodes of persistent oscillations, working memory, or seizures. In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics on the stability of persistent activity.
We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions. Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that whether the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in the neonatal brain or spinal cord, up states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potential (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from EAP data sampled by an extracellular probe at multiple independent recording locations. We used a dipole for the model source because there is extensive evidence that it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell-probe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole moment optimization, where the dipole location is fixed, and a second, nonlinear, global optimization of the source location. We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size.
The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis. We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among the various model error contributions to the residual, we address especially the error in probe geometry, and the extent to which it biases estimates of dipole parameters. This dipole characterization method can be applied to any recording technique that is capable of taking multiple independent measurements of the same single units. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics. We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodium-potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. These models reveal several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy.
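The linear step of the dipole characterization described above, fitting the dipole moment for a fixed candidate location with Tikhonov regularization penalizing dipole size, reduces to ridge-regularized least squares. A minimal sketch, with a random lead-field matrix standing in for the FEM-computed one:

```python
import numpy as np

def fit_dipole_moment(L, v, alpha):
    """Tikhonov-regularized dipole moment fit.

    L     : (n_sites, 3) lead-field matrix mapping moment -> potentials
    v     : (n_sites,) measured EAP amplitudes at the recording sites
    alpha : regularization weight penalizing dipole size
    Solves argmin_m ||L m - v||^2 + alpha ||m||^2 in closed form.
    """
    A = L.T @ L + alpha * np.eye(L.shape[1])
    return np.linalg.solve(A, L.T @ v)

# Synthetic example: known moment, noisy measurements
rng = np.random.default_rng(1)
L = rng.standard_normal((40, 3))       # stand-in for FEM lead fields
m_true = np.array([1.0, -0.5, 0.25])
v = L @ m_true + 0.01 * rng.standard_normal(40)
m_hat = fit_dipole_moment(L, v, alpha=1e-3)
print(np.round(m_hat, 2))
```

In the full method this closed-form fit is evaluated on a grid of candidate locations, and the nonlinear outer search selects the location whose regularized residual is smallest.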
Abstract This paper introduces dyadic brain modeling – the simultaneous, computational modeling of the brains of two interacting agents – to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the ‘dyad’). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists and modelers that will help speed the emergence of what may be called comparative neuroprimatology. Abstract The phase response curve (PRC) reflects the dynamics of the interplay between the diverse intrinsic conductances that lead to spike generation. PRCs measure the spike time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddle-node) dynamics, while type II (subcritical Hopf) dynamics yield a biphasic PRC with both negative and positive lobes. Previous computational work hypothesized that cholinergic modulation of the M-type potassium current can switch a neuron from type II to type I dynamics.
We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with down-regulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential specific K-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to down-regulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent I_AHP is down-regulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified at low ACh levels, while the PRC becomes type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may shift neuronal responses between “resonator” and “integrator” modes. Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double-precision values by each subtree at each time step. Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For compute-bound simulations, load balance results in almost ideal runtime scaling. Application of the cell splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing. Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts.
It has recently been reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. The Ca2+-activated K+ currents (I_K(Ca)) of cultured human cardiac fibroblasts were characterized in this study. In whole-cell configuration, depolarizing pulses evoked an outwardly rectifying I_K(Ca) in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance Ca2+-activated K+ (BK_Ca) channel with a single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BK_Ca channels. The dynamic Luo–Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; however, they were electrically inert with no gap-junctional coupling. The simulation predicts that changes in gap-junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. I_K(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes. This study demonstrates that a BK_Ca channel is functionally expressed in human cardiac fibroblasts. The activity of these BK_Ca channels in human cardiac fibroblasts may contribute to the functional activities of heart cells through transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of their solutions and the transitions between them as parameters are varied. To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multi-compartment models.
We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation—tonic spiking, intermediate bursting and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multi-compartment conductance-based model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli. Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons, modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement.
Results We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. Conclusion Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement, and underlies the concentration independence of odor quality representations. Abstract Mathematical neuronal models are normally expressed using differential equations. The Parker–Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker–Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin–Huxley type neuron, comparing the results with those obtained using the Runge–Kutta and Bulirsch–Stoer methods.
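The Parker–Sochacki idea, building the Maclaurin series of the solution term by term and truncating adaptively once the newest term's contribution falls below tolerance, can be illustrated on the polynomial ODE y′ = −y², whose series coefficients follow from a Cauchy product. This is a toy sketch of the principle, not the paper's neuronal implementation:

```python
def ps_step(y0, h, tol=1e-12, max_order=30):
    """One Parker-Sochacki step for y' = -y^2.

    Series coefficients satisfy (n+1) * c[n+1] = -sum_{j=0}^{n} c[j] * c[n-j]
    (Cauchy product of the series with itself). The order grows until the
    newest term's contribution at t = h drops below `tol`.
    """
    c = [y0]                       # Maclaurin coefficients of y(t)
    y = y0
    for n in range(max_order):
        cauchy = sum(c[j] * c[n - j] for j in range(n + 1))
        c.append(-cauchy / (n + 1))
        term = c[-1] * h ** (n + 1)
        y += term
        if abs(term) < tol:        # adaptive order: stop when converged
            break
    return y

# y' = -y^2 with y(0) = 1 has exact solution y(t) = 1 / (1 + t)
y = ps_step(1.0, 0.1)
print(y)  # ~0.9090909090909091
```

Because the order (not the step size) adapts, the step length can stay fixed while accuracy is controlled locally, which is the property the abstract highlights.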
Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC–granule cell (GC) circuit with and without gap junctions between GoCs. Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change the oscillation frequency or the mean firing rate of either GoCs or GCs. Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs. The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours or days of supercomputing time.
Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. In consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics and intrinsic noise. We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data. Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes from development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models. Recent studies show that both the conductance and the kinetics of these receptors change voltage-dependently in some parts of the brain. Therefore, several models have been introduced to simulate their current. However, on the one hand, kinetic models—which are able to simulate these voltage-dependent phenomena—are computationally expensive for the modeling of large neural networks. On the other hand, classic exponential models, which are computationally less expensive, are not able to accurately simulate the voltage dependence of these receptors.
In this study, we have modified these classic models to endow them with voltage-dependent conductance and time constants. The temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of NMDA receptor behavior using only three to four differential equations, significantly fewer than in previous kinetic models. Consequently, our model is both fast and physiologically plausible, and is therefore a suitable candidate for the modeling of large neural networks. Abstract Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons. This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal cell firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that there is an obvious rhythmic gating effect of the γ-oscillating interneuron networks on the pyramidal neuron’s signal transmission. This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and appropriate inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of coherence and oscillation frequency of network activities.
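A common way to give a classic exponential synapse model the NMDA voltage dependence discussed above is to multiply a double-exponential conductance by the standard Jahr–Stevens magnesium-block factor. The time constants and amplitudes below are illustrative placeholders, not the fitted values of the modified model in the paper:

```python
import math

def mg_block(v, mg=1.0):
    """Jahr-Stevens voltage-dependent Mg2+ block of the NMDA channel.
    v in mV, mg in mM; returns the unblocked fraction B(V) in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-0.062 * v) * mg / 3.57)

def nmda_current(t, v, g_max=1.0, tau_rise=2.0, tau_decay=100.0, e_rev=0.0):
    """NMDA current (arbitrary units) for a presynaptic spike at t = 0 (ms):
    double-exponential conductance scaled by the Mg2+ block at voltage v."""
    if t < 0:
        return 0.0
    g = g_max * (math.exp(-t / tau_decay) - math.exp(-t / tau_rise))
    return g * mg_block(v) * (v - e_rev)

# Block is strong near rest and relieved with depolarization
print(mg_block(-70.0), mg_block(0.0))
```

The sigmoidal block term is what the cheap exponential models lack; folding it (and voltage-dependent time constants) into the conductance recovers the voltage dependence without integrating a full kinetic scheme.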
Abstract Background Propagation of simulated action potentials (APs) was previously studied in short single chains and in two-dimensional sheets of myocardial cells [1–3]. The present study was undertaken to examine propagation in a long single chain of cells of various lengths, and with varying numbers of gap-junction (gj) channels, and to compare propagation velocity with cable properties such as the length constant (λ). Methods and Results Simulations were carried out using the PSpice program as previously described. When the electric field (EF) mechanism was dominant (0, 1, and 10 gj channels), the longer the chain length, the faster the overall velocity (θ_ov). There seems to be no simple explanation for this phenomenon. In contrast, when the local-circuit current mechanism was dominant (100 gj channels or more), θ_ov was slightly slowed with lengthening of the chain. Increasing the number of gj channels produced an increase in θ_ov and caused the firing order to become more uniform. The end effect was more pronounced at longer chain lengths and at greater numbers of gj channels. When there were no or only few gj channels (namely, 0, 10, or 30), the voltage change (ΔV_m) in the two cells (#50 and #52) contiguous to the cell injected with current (#51) was nearly zero, i.e., there was a sharp discontinuity in voltage between the adjacent cells. When there were many gj channels (e.g., 300, 1000, 3000), there was an exponential decay of voltage on either side of the injected cell, with the length constant (λ) increasing at higher numbers of gj channels. The effect of increasing the number of gj channels on increasing λ was relatively small compared to the larger effect on θ_ov. θ_ov became very non-physiological at 300 gj channels or higher. Conclusion Thus, when there were only 0, 1, or 10 gj channels, θ_ov increased with increase in chain length, whereas at 100 gj channels or higher, θ_ov did not increase with chain length.
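When the voltage falls off exponentially on either side of the injected cell, as in the well-coupled case described above, the length constant λ can be recovered from a log-linear fit of voltage against distance. A minimal sketch using synthetic data, not the PSpice simulations of the study:

```python
import numpy as np

def length_constant(x, v):
    """Estimate the cable length constant lambda from V(x) = V0 * exp(-x / lam)
    by least-squares fitting log(V) against distance x."""
    slope, _ = np.polyfit(x, np.log(v), 1)
    return -1.0 / slope

# Synthetic steady-state decay with lambda = 4 cell lengths
x = np.arange(1, 10, dtype=float)        # distance from injected cell
v = 10.0 * np.exp(-x / 4.0)              # mV
print(round(length_constant(x, v), 2))   # -> 4.0
```

The same fit applied to the simulated voltage profiles would quantify how λ grows with gj-channel count, for comparison against the much larger growth of θ_ov.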
When there were only 0, 10, or 30 gj channels, there was a very sharp decrease in ΔV_m in the two cells contiguous to the injected cell, whereas at 300, 1000, or 3000 gj channels, the voltage decay was exponential along the length of the chain. The effect of increasing the number of gj channels on the spread of current was relatively small compared to the large effect on θ_ov. Abstract This article provides a demonstration of an analytical technique that can be used to investigate the causes of perceptual phenomena. The technique is based on the concept of the ideal observer, an optimal signal classifier that makes decisions that maximize the probability of a correct response. To demonstrate the technique, an analysis was conducted to investigate the role of the auditory periphery in the production of temporal masking effects. The ideal observer classified output from four models of the periphery. Since the ideal observer is the best of all possible observers, if it demonstrates masking effects, then all other observers must as well. If it does not demonstrate masking effects, then nothing about the periphery requires masking to occur, and therefore masking must arise somewhere else. The ideal observer exhibited several forward masking effects but did not exhibit backward masking, implying that the periphery has a causal role in forward but not backward masking. A general discussion of the strengths of the technique and supplementary equations are also included. Abstract Understanding the human brain and its function in health and disease represents one of the greatest scientific challenges of our time. In the post-genomic era, an overwhelming accumulation of new data, at all levels of exploration from DNA to human brain imaging, has been acquired.
This accumulation of facts has not given rise to a corresponding increase in the understanding of integrated functions in this vast area of research, which involves a large number of fields extending from genetics to psychology. Neuroinformatics (NI) is uniquely placed at the intersection between neuroscience and information technology, and emerges as an area of critical importance to facilitate the future conceptual development of neuroscience by creating databases which transcend different organizational levels and allow for the development of different computational models from the subcellular to the global brain level. Abstract This paper studied synaptic and dendritic integration with different spatial distributions of synapses on the dendrites of a biophysically detailed layer 5 pyramidal neuron model. It has been observed that temporally synchronous and spatially clustered synaptic inputs make dendrites perform a highly nonlinear integration. The effect of the clustering degree of the synaptic distribution on neuronal responsiveness is investigated by changing the number of top apical dendrites to which active synapses are allocated. The neuron shows maximum responsiveness to synaptic inputs with an intermediate clustering degree of spatial distribution, indicating complex interactions among dendrites with the existence of nonlinear synaptic and dendritic integration. Abstract This paper describes a pilot query interface that has been constructed to help us explore a “concept-based” approach for searching the Neuroscience Information Framework (NIF). The query interface is concept-based in the sense that the search terms submitted through the interface are selected from a standardized vocabulary of terms (concepts) that are structured in the form of an ontology. The NIF contains three primary resources: the NIF Resource Registry, the NIF Document Archive, and the NIF Database Mediator.
These NIF resources are very different in their nature and therefore pose challenges when designing a single interface from which searches can be automatically launched against all three resources simultaneously. The paper first briefly discusses several background issues involving the use of standardized biomedical vocabularies in biomedical information retrieval, and then presents a detailed example that illustrates how the pilot concept-based query interface operates. The paper concludes by discussing certain lessons learned in the development of the current version of the interface. Abstract Simulations of orientation selectivity in visual cortex have shown that layer 4 complex cells lacking orientation tuning are ideal for providing global inhibition that scales with contrast in order to produce simple cells with contrast-invariant orientation tuning (Lauritzen and Miller in J Neurosci 23:10201–10213, 2003). Inhibitory cortical cells have been shown to be electrically coupled by gap junctions (Fukuda and Kosaka in J Neurosci 120:5–20, 2003). Such coupling promotes, among other effects, spike synchronization and coordination of postsynaptic IPSPs (Beierlein et al. in Nat Neurosci 3:904–910, 2000; Galarreta and Hestrin in Nat Rev Neurosci 2:425–433, 2001). Consequently, it was expected (Miller in Cereb Cortex 13:73–82, 2003) that electrical coupling would promote nonspecific functional responses consistent with the complex inhibitory cells seen in layer 4 which provide broad inhibition in response to stimuli of all orientations (Miller et al. in Curr Opin Neurobiol 11:488–497, 2001). This was tested using a mechanistic modeling approach. The orientation selectivity model of Lauritzen and Miller (J Neurosci 23:10201–10213, 2003) was reproduced with and without electrical coupling between complex inhibitory neurons.
Although extensive coupling promotes uniform firing in complex cells, there were no detectable improvements in contrast-invariant orientation selectivity unless there were coincident changes in complex-cell firing rates to offset the untuned excitatory component that grows with contrast. Thus, changes in firing rates (with or without coupling), rather than synchronization of complex inhibitory neurons alone, could improve the contrast-invariant orientation tuning of simple cells.

Abstract Coral polyps contract when electrically stimulated, and a wave of contraction travels from the site of stimulation at a constant speed. Models of coral nerve networks were optimized to match one of three different experimentally observed behaviors. To search for model parameters that reproduce the experimental observations, we applied genetic algorithms to increasingly complex models of a coral nerve net. In a first stage of optimization, individual neurons responded with spikes to multiple, but not single, pulses of activation. In a second stage, we used these neurons as the starting point for the optimization of a two-dimensional nerve net. This strategy yielded a network with parameters that reproduced the experimentally observed spread of excitation.

Abstract Spike-wave discharges are a distinctive feature of epileptic seizures. So far, they have not been reported in spatially extended neural field models. We study a space-independent version of the Amari neural field model with two competing inhibitory populations. We show that this competition leads to robust spike-wave dynamics if the inhibitory populations operate on different timescales. The spike-wave oscillations present a fold/homoclinic type of bursting. From this result we predict parameters of the extended Amari system where spike-wave oscillations produce a spatially homogeneous pattern. We propose this mechanism as a prototype of macroscopic epileptic spike-wave discharges.
To our knowledge this is the first example of robust spike-wave patterns in a spatially extended neural field model.

Abstract Cortical gamma-frequency (30–80 Hz) oscillations have been suggested to underlie many aspects of cognitive function. In this paper we compare the f–I curves modulated by a gamma-frequency-modulated stimulus and Poisson synaptic input at the distal dendrites of a layer V pyramidal neuron model. The results show that gamma-frequency distal input amplifies the sensitivity of the neural response to basal input and enhances gain modulation of the neuron.

Abstract Inward rectifying potassium (KIR) currents in medium spiny (MS) neurons of the nucleus accumbens inactivate significantly in ~40% of the neurons but not in the rest, which may lead to differences in input processing between these two groups. Using a 189-compartment computational model of the MS neuron, we investigate the influence of this property using injected current as well as spatiotemporally distributed synaptic inputs. Our study demonstrates that KIR current inactivation facilitates depolarization, firing frequency, and firing onset in these neurons. These effects may be attributed to the higher input resistance of the cell as well as a more depolarized resting/down-state potential induced by the inactivation of this current. In view of reports that dendritic intracellular calcium levels depend closely on burst strength and spike onset time, our findings suggest that inactivation of KIR currents may offer a means of modulating both excitability and synaptic plasticity in MS neurons.

Abstract Epileptic seizures in diabetic hyperglycemia (DH) are not uncommon. This study aimed to determine the acute behavioral, pathological, and electrophysiological effects of status epilepticus (SE) on diabetic animals.
Adult male Sprague-Dawley rats were first divided into groups with and without streptozotocin (STZ)-induced diabetes, and then into treatment groups given normal saline (NS) (STZ-only and NS-only) or a lithium-pilocarpine injection to induce status epilepticus (STZ+SE and NS+SE). Seizure susceptibility, severity, and mortality were evaluated. Serial Morris water maze tests and hippocampal histopathology were examined before and 24 h after SE. Tetanic stimulation-induced long-term potentiation (LTP) in hippocampal slices was recorded in a multielectrode dish system. We also used a simulation model to evaluate intracellular adenosine triphosphate (ATP) and neuroexcitability. The STZ+SE group had a significantly higher percentage of severe seizures and SE-related death, and worse learning and memory performance, than the other three groups 24 h after SE. The STZ+SE group, followed by the NS+SE group, showed the most severe neuronal loss and mossy fiber sprouting in the hippocampal CA3 area. In addition, LTP was markedly attenuated in the STZ+SE group, followed by the NS+SE group. In the simulation, increased intracellular ATP concentration promoted action potential firing. The finding that rats with DH had more brain damage after SE than rats without diabetes suggests the importance of intensively treating hyperglycemia and seizures in diabetic patients with epilepsy.

Neuroinformatics is a multifaceted field. It is as broad as the field of neuroscience. The various domains of NI may also share common features such as databases, data mining systems, and data modeling tools. NI projects are often coordinated by user groups or research organizations. Large-scale infrastructure supporting NI development is also a vital aspect of the field.

Abstract Channelrhodopsin-2 (ChR2) is a class of light-sensitive proteins that offer the ability to use light stimulation to regulate neural activity with millisecond precision.
To address the limitations in the efficacy of wild-type ChR2 (ChRwt) in achieving this objective, new variants of ChR2 that exhibit fast monoexponential photocurrent decay characteristics have recently been developed and validated. In this paper, we investigate whether the framework of a four-state transition rate model, primarily developed to mimic the biexponential photocurrent decay kinetics of ChRwt, as opposed to the lower-complexity three-state model, is warranted to mimic the monoexponential photocurrent decay kinetics of the newly developed fast ChR2 variants ChETA (Gunaydin et al., Nature Neurosci. 13:387–392, 2010) and ChRET/TC (Berndt et al., Proc. Natl. Acad. Sci. 108:7595–7600, 2011). We begin by estimating the parameters of the three-state and four-state models from experimental data on the photocurrent kinetics of ChRwt, ChETA, and ChRET/TC. We then incorporate these models into a fast-spiking interneuron model (Wang and Buzsaki, J. Neurosci. 16:6402–6413, 1996) and a hippocampal pyramidal cell model (Golomb et al., J. Neurophysiol. 96:1912–1926, 2006) and investigate the extent to which the experimentally observed neural response to various optostimulation protocols can be captured by these models. We demonstrate that for all ChR2 variants investigated, the four-state model implementation is better able to capture neural responses consistent with experiments across a wide range of optostimulation protocols. We conclude by analytically investigating the conditions under which the characteristic specific to the three-state model, namely the monoexponential photocurrent decay of the newly developed ChR2 variants, can occur in the framework of the four-state model.

Abstract In cerebellar Purkinje cells, the β4 subunit of voltage-dependent Na+ channels has been proposed to serve as an open-channel blocker giving rise to a “resurgent” Na+ current (INaR) upon membrane repolarization.
Notably, the β4 subunit was recently identified as a novel substrate of the β-secretase BACE1, a key enzyme of the amyloidogenic pathway in Alzheimer's disease. Here, we asked whether BACE1-mediated cleavage of the β4 subunit has an impact on INaR and, consequently, on the firing properties of Purkinje cells. In cerebellar tissue of BACE1−/− mice, mRNA levels of Na+ channel α-subunits 1.1, 1.2, and 1.6 and of β-subunits 1–4 remained unchanged, but processing of the β4 peptide was profoundly altered. Patch-clamp recordings from acutely isolated Purkinje cells of BACE1−/− and WT mice did not reveal any differences in steady-state properties or in current densities of transient, persistent, and resurgent Na+ currents. However, INaR was found to decay significantly faster in BACE1-deficient Purkinje cells than in WT cells. In modeling studies, the altered time course of INaR decay could be replicated when we decreased the efficiency of open-channel block. In current-clamp recordings, BACE1−/− Purkinje cells displayed a lower spontaneous firing rate than normal cells. Computer simulations supported the hypothesis that the accelerated decay kinetics of INaR are responsible for the slower firing rate. Our study elucidates a novel function of BACE1 in the regulation of neuronal excitability that serves to tune the firing pattern of Purkinje cells and presumably other neurons endowed with INaR.

Abstract The role of cortical feedback in the thalamocortical processing loop has been extensively investigated over the last decades. With the exception of several cases, these studies focused on the cortical feedback exerted onto thalamocortical relay (TC) cells of the dorsal lateral geniculate nucleus (LGN). In a previous physiological study, we showed in the cat visual system that cessation of cortical input, despite a decrease in the spontaneous activity of TC cells, increased the spontaneous firing of their recurrent inhibitory interneurons located in the perigeniculate nucleus (PGN).
To identify mechanisms underlying such functional changes, we conducted a modeling study in NEURON on several networks of point neurons with varied model parameters, such as membrane properties, synaptic weights, and axonal delays. We considered six network topologies of the retino-geniculo-cortical system. All models were robust against changes of axonal delays except for the delay between the LGN feedforward interneuron and the TC cell. The best representation of the physiological results was obtained with models containing reciprocally connected PGN cells driven by the cortex and with relatively slow decay of intracellular calcium. This strongly indicates that the thalamic reticular nucleus plays an essential role in the cortical influence over thalamocortical relay cells, while the thalamic feedforward interneurons are not essential in this process. Further, we suggest that the dependence of PGN cell activity on the rate of calcium removal may be one of the key factors determining an individual cell's response to elimination of cortical input.

Abstract The nucleus accumbens (NAc), a critical structure of the brain reward circuit, is implicated in normal goal-directed behaviour and learning as well as in pathological conditions like schizophrenia and addiction. Its major cellular substrates, the medium spiny (MS) neurons, possess a wide variety of dendritic active conductances that may modulate excitatory postsynaptic potentials (EPSPs) and cell excitability. We examine this issue using a biophysically detailed 189-compartment stylized model of the NAc MS neuron, incorporating all the known active conductances. We find that, of all the active channels, inward rectifying K+ (KIR) channels play the primary role in modulating the resting membrane potential (RMP) and EPSPs in the down-state of the neuron. Reduction in the conductance of KIR channels evokes facilitatory effects on EPSPs, accompanied by rises in local input resistance and membrane time constant.
At depolarized membrane potentials closer to up-state levels, the slowly inactivating A-type potassium channel (KAs) conductance also plays a strong role in determining synaptic potential parameters and cell excitability. We discuss the implications of our results for the regulation of accumbal MS neuron biophysics and synaptic integration by intrinsic factors and extrinsic agents such as dopamine.

Abstract The computer-assisted three-dimensional reconstruction of neuronal morphology is becoming an increasingly popular technique for quantifying the arborization patterns of dendrites and axons. The resulting digital files are suitable for comprehensive morphometric analyses as well as for building anatomically realistic compartmental models of membrane biophysics and neuronal electrophysiology. The digital tracings acquired in one lab for a specific purpose can often be reused by a different research group to address a completely unrelated scientific question, if the original investigators are willing to share the data. Since reconstructing neuronal morphology is a labor-intensive process, data sharing and reanalysis are particularly advantageous for the neuroscience and biomedical communities. Here we present numerous “success stories” in which digital reconstructions of neuronal morphology were shared and reused, leading to additional, independent discoveries and publications, and thus amplifying the impact of the “source” study for which the data set was first collected. In particular, we overview four main applications of this kind of data: comparative morphometric analyses, statistical estimation of potential synaptic connectivity, morphologically accurate electrophysiological simulations, and computational models of neuronal shape and development.

Abstract This chapter describes a novel computational approach to modeling cortex dynamics that integrates gene–protein regulatory networks with a neural network model.
The interaction of genes and proteins in neurons affects the dynamics of the whole neural network. We adopted an exploratory approach of investigating many randomly generated gene regulatory matrices, out of which we kept those that generated interesting dynamics. This naïve brute-force approach served to explore the potential application of computational neurogenetic models in relation to gene-knockout neurogenetics experiments. The knockout of a hypothetical gene for fast inhibition in our artificial genome led to interesting neural activity. Although the artificial gene/protein network was altered by the single gene knockout, the dynamics of the spiking neural network (SNN), in terms of spiking activity, were most of the time very similar to the results obtained with the complete gene/protein network. However, from time to time the neurons spontaneously and temporarily synchronized their spiking into coherent global oscillations. In our model, fluctuations in the values of neuronal parameters lead to the spontaneous development of seizure-like global synchronizations. These very same fluctuations also lead to termination of the seizure-like neural activity and maintenance of the interictal normal periods of activity. Based on our model, we suggest the hypothesis that parameter changes due to gene–protein dynamics should also be included as a serious factor determining transitions in neural dynamics, especially when the cause of disease is known to be genetic.

Abstract The local field potential (LFP) is among the most important experimental measures when probing neural population activity, but a proper understanding of the link between the underlying neural activity and the LFP signal is still missing. Here we investigate this link by mathematically modeling the contributions to the LFP from a single layer-5 pyramidal neuron and a single layer-4 stellate neuron receiving synaptic input.
An intrinsic dendritic low-pass filtering effect of the LFP signal, previously demonstrated for extracellular signatures of action potentials, is seen to strongly affect the LFP power spectra, even for frequencies as low as 10 Hz for the example pyramidal neuron. Further, the LFP signal is found to depend sensitively on both the recording position and the position of the synaptic input: the LFP power spectra recorded close to the active synapse are typically less low-pass filtered than spectra recorded farther away. Some recording positions display striking band-pass characteristics of the LFP. The frequency dependence of the properties of the current dipole moment set up by the synaptic input current is found to qualitatively account for several salient features of the observed LFP. Two approximate schemes for calculating the LFP, the dipole approximation and the two-monopole approximation, are tested and found to be potentially useful for translating results from large-scale neural network models into predictions for results from electroencephalographic (EEG) or electrocorticographic (ECoG) recordings.

Abstract Dopaminergic (DA) neurons of the mammalian midbrain exhibit unusually low firing frequencies in vitro. Furthermore, injection of depolarizing current induces depolarization block before high frequencies are achieved. The maximum steady and transient rates are about 10 and 20 Hz, respectively, despite the ability of these neurons to generate bursts at higher frequencies in vivo. We use a three-compartment model calibrated to reproduce DA neuron responses to several pharmacological manipulations to uncover the mechanisms of frequency limitation. The model exhibits a slow oscillatory potential (SOP) dependent on the interplay between the L-type Ca2+ current and the small-conductance K+ (SK) current that is unmasked by fast Na+ current block. Contrary to previous theoretical work, the SOP does not pace the steady spiking frequency in our model.
The main currents that determine the spontaneous firing frequency are the subthreshold L-type Ca2+ and A-type K+ currents. The model identifies the channel densities of the fast Na+ and delayed rectifier K+ currents as critical parameters limiting the maximal steady frequency evoked by a depolarizing pulse. We hypothesize that the low maximal steady frequencies result from a low safety factor for action potential generation. In the model, the rate of Ca2+ accumulation in the distal dendrites controls the transient initial frequency in response to a depolarizing pulse. Similar results are obtained when the same model parameters are used in a multicompartmental model with a realistic reconstructed morphology, indicating that the salient contributions of the dendritic architecture have been captured by the simpler model.

Abstract Background As interest in adopting the Semantic Web in the biomedical domain continues to grow, Semantic Web technology has been evolving and maturing. A variety of technological approaches, including triple-store technologies, SPARQL endpoints, Linked Data, and the Vocabulary of Interlinked Datasets, have emerged in recent years. In addition to data warehouse construction, these technological approaches can be used to support dynamic query federation. As a community effort, the BioRDF task force, within the Semantic Web for Health Care and Life Sciences Interest Group, is exploring how these emerging approaches can be utilized to execute distributed queries across different neuroscience data sources. Methods and results We have created two health care and life science knowledge bases. We have explored a variety of Semantic Web approaches to describe, map, and dynamically query multiple datasets. We have demonstrated several federation approaches that integrate diverse types of information about neurons and receptors that play an important role in basic, clinical, and translational neuroscience research.
In particular, we have created a prototype receptor explorer which uses OWL mappings to provide an integrated list of receptors and executes individual queries against different SPARQL endpoints. We have also employed the AIDA Toolkit, which is directed at groups of knowledge workers who cooperatively search, annotate, interpret, and enrich large collections of heterogeneous documents from diverse locations. We have explored a tool called "FeDeRate", which enables a global SPARQL query to be decomposed into subqueries against remote databases offering either SPARQL or SQL query interfaces. Finally, we have explored how to use the Vocabulary of Interlinked Datasets (voiD) to create metadata describing datasets exposed as Linked Data URIs or SPARQL endpoints. Conclusion We have demonstrated the use of a set of novel and state-of-the-art Semantic Web technologies in support of a neuroscience query federation scenario. We have identified both the strengths and weaknesses of these technologies. While the Semantic Web offers a global data model including the use of Uniform Resource Identifiers (URIs), the proliferation of semantically equivalent URIs hinders large-scale data integration. Our work helps direct research and tool development, which will be of benefit to this community.

Abstract Injury to neural tissue renders voltage-gated Na+ (Nav) channels leaky. Even mild axonal trauma initiates Na+ loading, leading to secondary Ca2+ loading and white matter degeneration. The nodal isoform is Nav1.6, and for Nav1.6-expressing HEK cells, traumatic whole-cell stretch causes an immediate tetrodotoxin-sensitive Na+ leak. In stretch-damaged oocyte patches, Nav1.6 current undergoes damage-intensity-dependent hyperpolarizing (left) shifts, but whether left-shift underlies the injured-axon Nav leak is uncertain.
Nav1.6 inactivation (availability) is kinetically limited by (coupled to) Nav activation, yielding coupled left-shift (CLS) of the two processes: CLS should move the steady-state Nav1.6 "window conductance" closer to typical firing thresholds. Here we simulated excitability and ion homeostasis in free-running nodes of Ranvier to assess whether hallmark injured-axon behaviors (Na+ loading, ectopic excitation, propagation block) would occur with Nav-CLS. Intact/traumatized axolemma ratios were varied, and for some simulations Na/K pumps were included, with varied inside/outside volumes. We simulated saltatory propagation with one mid-axon node variously traumatized. While dissipating the [Na+] gradient and hyperactivating the Na/K pump, Nav-CLS generated neuropathic-pain-like ectopic bursts. Depending on CLS magnitude, the fraction of Nav channels affected, and pump intensity, tonic firing, burst firing, or nodal inexcitability occurred, with [Na+] and [K+] fluctuating. Severe CLS-induced inexcitability did not preclude Na+ loading; in fact, the steady-state Na+ leaks elicited large pump currents. At a mid-axon node, mild CLS perturbed normal anterograde propagation, and severe CLS blocked saltatory propagation. These results suggest that in damaged excitable cells, Nav-CLS could initiate cellular deterioration with attendant hyper- or hypoexcitability. Healthy-cell versions of Nav-CLS, however, could contribute to physiological rhythmic firing.

Abstract Lateral inhibition of cells surrounding an excited area is a key property of sensory systems, sharpening the preferential tuning of individual cells in the presence of closely related input signals. In the olfactory pathway, a dendrodendritic synaptic microcircuit between mitral and granule cells in the olfactory bulb has been proposed to mediate this type of interaction through granule cell inhibition of surrounding mitral cells.
However, it is becoming evident that odor inputs result in broad activation of the olfactory bulb, with interactions that go beyond neighboring cells. Using a realistic modeling approach, we show how backpropagating action potentials in the long lateral dendrites of mitral cells, together with granule cell actions on mitral cells within narrow columns forming glomerular units, can provide a mechanism to activate strong local inhibition between arbitrarily distant mitral cells. The simulations predict a new role for the dendrodendritic synapses in the multicolumnar organization of the granule cells. This new paradigm gives insight into the functional significance of the patterns of connectivity revealed by recent viral tracing studies. Together they suggest a functional wiring of the olfactory bulb that could greatly expand the computational roles of the mitral–granule cell network.

Abstract Spinal motor neurons have voltage-gated ion channels localized in their dendrites that generate plateau potentials. The physical separation of the ion channels for spiking from the plateau-generating channels can result in nonlinear bistable firing patterns. The physical separation and geometry of the dendrites result in asymmetric coupling between dendrites and soma that has not been addressed in reduced models of nonlinear phenomena in motor neurons. We measured the voltage attenuation properties of six anatomically reconstructed and type-identified cat spinal motor neurons to characterize the asymmetric coupling between the dendrites and soma. We showed that the voltage attenuation at any distance from the soma was direction-dependent and could be described as a function of the input resistance at the soma. An analytical solution for the lumped cable parameters in a two-compartment model was derived based on this finding.
This is the first two-compartment modeling approach to derive lumped cable parameters directly from the geometrical and passive electrical properties of anatomically reconstructed neurons.

Abstract Models of temporary information storage in neuronal populations are dominated by mechanisms directly dependent on synaptic plasticity. There are nevertheless other mechanisms available that are well suited for creating short-term memories. Here we present a model of working memory that relies on the modulation of the intrinsic excitability properties of neurons, instead of synaptic plasticity, to retain novel information for periods of seconds to minutes. We show that it is possible to use this mechanism effectively to store the serial order in a sequence of patterns of activity. For this we introduce a functional class of neurons, named gate interneurons, which can store information in their membrane dynamics and can literally act as gates routing the flow of activations in the principal neuron population. The presented model exhibits properties that are in close agreement with experimental results in working memory. Namely, the recall process plays an important role in stabilizing and prolonging the memory trace. This means that the stored information is correctly maintained as long as it is being used. Moreover, the working memory model is adequate for storing completely new information in time windows compatible with the notion of “one-shot” learning (hundreds of milliseconds).

Abstract For the analysis of neuronal cooperativity, simultaneously recorded extracellular signals from neighboring neurons need to be sorted reliably by a spike sorting method. Many algorithms have been developed to this end; however, to date, none of them manages to fulfill a set of demanding requirements. In particular, it is desirable to have an algorithm that operates online, detects and classifies overlapping spikes in real time, and adapts to nonstationary data.
Here, we present a combined spike detection and classification algorithm which explicitly addresses these issues. Our approach makes use of linear filters to find a new representation of the data and to optimally enhance the signal-to-noise ratio. We introduce a method called “Deconfusion” which decorrelates the filter outputs and provides source separation. Finally, a set of well-defined thresholds is applied, leading to simultaneous spike detection and spike classification. By incorporating direct feedback, the algorithm adapts to nonstationary data and is therefore well suited for acute recordings. We evaluate our method on simulated and experimental data, including simultaneous intra/extracellular recordings made in slices of rat cortex and recordings from the prefrontal cortex of awake behaving macaques. We compare the results to existing spike detection as well as spike sorting methods. We conclude that our algorithm meets all of the mentioned requirements and outperforms other methods under realistic signal-to-noise ratios and in the presence of overlapping spikes.

Abstract Avian nucleus isthmi pars parvocellularis (Ipc) neurons are reciprocally connected with the layer 10 (L10) neurons in the optic tectum and respond with oscillatory bursts to visual stimulation. Our in vitro experiments show that both neuron types respond with regular spiking to somatic current injection and that the feedforward and feedback synaptic connections are excitatory, but of different strength and time course. To elucidate the mechanisms of oscillatory bursting in this network of regularly spiking neurons, we investigated an experimentally constrained model of coupled leaky integrate-and-fire neurons with spike-rate adaptation. The model reproduces the observed Ipc oscillatory bursting in response to simulated visual stimulation.
A scan through the model parameter volume reveals that Ipc oscillatory burst generation can be caused by strong and brief feedforward synaptic conductance changes. The mechanism is sensitive to the parameter values of spike-rate adaptation. In conclusion, we show that a network of regular-spiking neurons with feedforward excitation and spike-rate adaptation can generate oscillatory bursting in response to a constant input.

Abstract Electrical stimulation of the central nervous system creates both orthodromically propagating action potentials, by stimulation of local cells and passing axons, and antidromically propagating action potentials, by stimulation of presynaptic axons and terminals. Our aim was to understand how antidromic action potentials navigate through complex arborizations, such as those of thalamic and basal ganglia afferents, which are sites of electrical activation during deep brain stimulation (DBS). We developed computational models to study the propagation of antidromic action potentials past the bifurcation in branched axons. In both unmyelinated and myelinated branched axons, when the diameter of each axon branch remained under a specific threshold (set by the antidromic geometric ratio), antidromic propagation occurred robustly; action potentials traveled both antidromically into the primary segment and “re-orthodromically” into the terminal secondary segment. Propagation occurred across a broad range of stimulation frequencies, axon segment geometries, and concentrations of extracellular potassium, but was strongly dependent on the geometry of the node of Ranvier at the axonal bifurcation. Thus, antidromic activation of axon terminals can, through axon collaterals, lead to widespread activation or inhibition of targets remote from the site of stimulation. These effects should be included when interpreting the results of functional imaging or evoked potential studies on the mechanisms of action of DBS.
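The Ipc model described above couples regular-spiking leaky integrate-and-fire (LIF) neurons with spike-rate adaptation. A minimal single-neuron sketch of that cell type illustrates the mechanism; all parameter values here are illustrative assumptions, not those of the published model:

```python
def simulate(i_ext, t_max=1.0, dt=1e-4):
    """Leaky integrate-and-fire neuron with spike-rate adaptation.

    i_ext: constant input current (amperes). Returns spike times (s).
    Parameter values below are illustrative, not from the Ipc model.
    """
    tau_m, tau_a = 0.02, 0.2                        # membrane / adaptation time constants (s)
    v_rest, v_th, v_reset = -65e-3, -50e-3, -65e-3  # rest, threshold, reset potentials (V)
    e_k = -80e-3                                    # adaptation reversal potential (V)
    r_m = 1e8                                       # membrane resistance (ohm)
    dg = 2e-9                                       # adaptation increment per spike (S)
    v, g_a = v_rest, 0.0
    spike_times = []
    for step in range(int(t_max / dt)):
        # Forward-Euler step of:
        # tau_m dv/dt = -(v - v_rest) - r_m * g_a * (v - e_k) + r_m * i_ext
        v += dt * (-(v - v_rest) - r_m * g_a * (v - e_k) + r_m * i_ext) / tau_m
        g_a += dt * (-g_a / tau_a)   # adaptation conductance decays between spikes
        if v >= v_th:                # threshold crossing: record spike and reset
            spike_times.append(step * dt)
            v = v_reset
            g_a += dg                # each spike strengthens adaptation
    return spike_times
```

Because the adaptation conductance g_a jumps at every spike and decays slowly between spikes, successive interspike intervals lengthen under constant drive until firing settles at a lower, adapted rate.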
Abstract The response of an oscillator to perturbations is described by its phase-response curve (PRC), which is related to the type of bifurcation leading from rest to tonic spiking. In a recent experimental study, we showed that the type of PRC in cortical pyramidal neurons can be switched by cholinergic neuromodulation from type II (biphasic) to type I (monophasic). We explored how intrinsic mechanisms affected by acetylcholine influence the PRC using three different types of neuronal models: a theta neuron, single-compartment neurons, and a multicompartment neuron. In all of these models, a decrease in the amount of a spike-frequency adaptation current was a necessary and sufficient condition for the shape of the PRC to change from biphasic (type II) to purely positive (type I).

Abstract Small-conductance (SK) calcium-activated potassium channels are found in many tissues throughout the body and open in response to elevations in intracellular calcium. In hippocampal neurons, SK channels are spatially colocalized with L-type calcium channels. Due to the restriction of calcium transients into microdomains, only a limited number of L-type Ca2+ channels can activate SK and, thus, stochastic gating becomes relevant. Using a stochastic model with calcium microdomains, we predict that intracellular Ca2+ fluctuations resulting from Ca2+ channel gating can increase SK2 subthreshold activity by 1–2 orders of magnitude. This effectively reduces the value of the Hill coefficient. To explain the underlying mechanism, we show how short, high-amplitude calcium pulses associated with the stochastic gating of calcium channels are much more effective at activating SK2 channels than the steady calcium signal produced by a deterministic simulation.
This stochastic amplification results from two factors: first, a supralinear rise in the SK2 channel’s steady-state activation curve at low calcium levels and, second, a momentary reduction in the channel’s time constant during the calcium pulse, causing the channel to approach its steady-state activation value much faster than it decays. Stochastic amplification can potentially explain subthreshold SK2 activation in unified models of both sub- and suprathreshold regimes. Furthermore, we expect it to be a general phenomenon relevant to many proteins that are activated nonlinearly by stochastic ligand release. Abstract A tonic-clonic seizure transitions from high-frequency asynchronous activity to low-frequency coherent oscillations, yet the mechanism of transition remains unknown. We propose a shift in network synchrony due to changes in cellular response. Here we use phase-response curves (PRCs) from Morris-Lecar (ML) model neurons with synaptic depression and gradually decrease the input current to cells within a network simulation. This method effectively decreases firing rates, resulting in a shift to greater network synchrony and illustrating a possible mechanism of the transition phenomenon. PRCs are measured from the ML conductance-based model cell with a range of input currents within the limit cycle. A large network of 3000 excitatory neurons is simulated with a network topology generated from second-order statistics which allows a range of population synchrony. The population synchrony of the oscillating cells is measured with the Kuramoto order parameter, which reveals a transition from tonic to clonic phase exhibited by our model network. The cellular response shift mechanism for the tonic-clonic seizure transition reproduces the population behavior closely when compared to EEG data. 
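The Kuramoto order parameter used above to quantify population synchrony has a compact definition. A minimal sketch, assuming oscillator phases have already been extracted from the spike trains (the phase-extraction step itself is not shown):

```python
import numpy as np

def kuramoto_order(phases):
    """Kuramoto order parameter R = |<exp(i*theta)>| over oscillator
    phases (radians). R near 0: asynchronous; R near 1: synchronous."""
    return np.abs(np.mean(np.exp(1j * np.asarray(phases))))

rng = np.random.default_rng(0)
r_async = kuramoto_order(rng.uniform(0, 2 * np.pi, 3000))  # near 0
r_sync = kuramoto_order(np.full(3000, 1.3))                # exactly 1
```

Tracking R over time in a sliding window is one simple way to visualize the tonic-to-clonic transition the abstract describes.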
On the mechanisms underlying the depolarization block in the spiking dynamics of CA1 pyramidal neurons. Journal of Computational Neuroscience. Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. 
Pre-computed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php. Operating system(s): Platform independent. Programming language: Perl/BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI Blast package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have encountered problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. 
Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to duplication of effort and hinders model reuse. 
Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell includes organelles and intracellular solutions, and it is separated from the outer environment of extracellular fluid surrounding it by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charges and concentrations cause, respectively, electrical and chemical potentials, driving transport of materials across the membrane. Here we look at the core elements of mathematical modeling associated with the dynamic behavior of single cells, as well as the bases of numerical simulation. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. 
This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program with which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command line interface, based on Python, a powerful scripting language, for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp. Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases have started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. 
In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method, and make available computer scripts, which automate the process of citation entry. We use an open citation project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. For, in order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain ( ${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. 
We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front ends requires strong bioinformatic skills. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to the relevant literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. 
We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded in empirical evidence, explain (or fail to explain) existing empirical results, and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of a model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. 
Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp. 741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu. 
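As a toy illustration of the kind of neuron-and-synapse dynamics such a neurosimulator integrates, the sketch below simulates a leaky integrate-and-fire pair with one excitatory synapse. The equations and parameters are generic textbook placeholders, not SNNAP's actual formulation:

```python
import numpy as np

# Two leaky integrate-and-fire neurons; neuron 0 is driven by a constant
# current and excites neuron 1 through an exponentially decaying synapse.
# All parameters are illustrative (mV, ms).
dt, T = 0.1, 200.0
tau_m, v_rest, v_th, v_reset = 20.0, -70.0, -54.0, -70.0
w, tau_syn = 8.0, 5.0                   # synaptic weight and decay time

v = np.array([v_rest, v_rest])          # membrane potentials
syn = 0.0                               # synaptic drive onto neuron 1
spikes = [[], []]
for step in range(int(T / dt)):
    t = step * dt
    i_ext = np.array([20.0, 0.0])       # only neuron 0 receives input
    syn -= dt * syn / tau_syn           # synaptic decay
    v += (v_rest - v + i_ext + np.array([0.0, syn])) * dt / tau_m
    for n in range(2):
        if v[n] >= v_th:                # threshold crossing: spike + reset
            v[n] = v_reset
            spikes[n].append(t)
            if n == 0:
                syn += w                # neuron 0 excites neuron 1
```

With these values neuron 0 fires tonically (roughly every 32 ms) while neuron 1 shows only subthreshold EPSPs, which is the sort of behavior one would then explore interactively in a tool like SNNAP.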
Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by allowing them to run the models as published and to build on them for further research. Its use can help the field of computational neuroscience enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABAA receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABAA reversal potential (EGABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. 
Ideally, such interfaces should provide high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and the code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic types (pyramidal cell dendrites and olivary axonal projection terminals), and labeling methods (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. 
None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions arriving at the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductances of their dendritic membrane. This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high- and low-depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. 
These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single-neuron models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. 
One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS-US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive, biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. On mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses distributed uniformly along the dendrites. 
For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and the function of effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization in general and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals decreasing with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably more limited than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of patterns of output discharges of different complexities, including stochastic ones. The relatively short length of the apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets. 
As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites, and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promote the predominance of network interaction in shaping the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using nonhomogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of its involvement in spatial tasks. 
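The entorhinal inputs in such a model are nonhomogeneous Poisson processes. A minimal sketch of theta-modulated Poisson spike generation by thinning (the rates, modulation depth, and theta frequency below are illustrative, not the model's actual parameters):

```python
import numpy as np

def theta_poisson(rate_mean, mod_depth, f_theta, T, rng):
    """Spike times (s) of a nonhomogeneous Poisson process whose rate is
    sinusoidally modulated at the theta frequency. Generation by thinning:
    draw candidate events at the peak rate, then keep each candidate with
    probability rate(t) / rate_max."""
    rate_max = rate_mean * (1 + mod_depth)
    t, spikes = 0.0, []
    while t < T:
        t += rng.exponential(1.0 / rate_max)        # candidate event time
        rate_t = rate_mean * (1 + mod_depth * np.cos(2 * np.pi * f_theta * t))
        if rng.uniform() < rate_t / rate_max:       # thinning step
            spikes.append(t)
    return np.array(spikes)

rng = np.random.default_rng(1)
s = theta_poisson(rate_mean=20.0, mod_depth=0.8, f_theta=8.0, T=10.0, rng=rng)
```

Feeding such trains into an integrate-and-fire place cell and interneuron, and additionally gating the rate by the animal's position, reproduces the space- and theta-modulated EC drive the abstract describes.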
Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in computational neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org. In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed morphological data when generating those cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators, such as the NEURON simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new and available data can be included into NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. 
Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from genomics, transcriptomics, proteomics, and metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. 
However, sharing such large amounts of frequently heterogeneous and distributed experimental data requires both standard data formats and public repositories. Standardization and a public storage system are also important for modeling, since they make it possible to share models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks such as numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed, and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. 
However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing and enter a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. 
However, the range of current strength needed for a depolarization block could easily be reached with random background activity from only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block. We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. A Neuroinformatics of Brain Modeling and its Implementation in the Brain Operation Database BODB Neuroinformatics Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. 
However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php. Operating system(s): platform independent. Programming language: Perl, BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI BLAST package, DSSP, Speedfill (Surfnet) and PSAIA. License: free. Any restrictions to use by non-academics: no. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. 
Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. In reality, however, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have encountered problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. 
A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages, or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. 
A single cell includes organelles and intracellular solutions, and it is separated from the extracellular fluid surrounding it by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charges and concentrations cause, respectively, electrical and chemical potentials, driving the transport of materials across the membrane. Here we look at the core mathematical modeling of the dynamic behaviors of single cells, as well as the foundations of numerical simulation. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e., insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program with which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command-line interface, based on the Python scripting language, for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e., 
a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp. Abstract Science requires that results be reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases have started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method, and make available computer scripts, which automate the process of citation entry. We use an open citation project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. 
In order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain ( ${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front ends requires strong bioinformatics skills. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. 
While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is an XML-based language for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments, and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or fail to explain) existing empirical results, and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. 
We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Efficient fitting of conductance-based model neurons from somatic current clamp Journal of Computational Neuroscience Summary This chapter constitutes mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was part of the Annual Computational Neuroscience Meeting CNS*2007, which took place in July 2007 in Toronto, Canada (http://www.cnsorg.org). The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Selected topics discussed, including a review of some current and potential applications of Computational Intelligence (CI) in electrophysiology, database and electrophysiological data exchange platforms, languages, and formats, as well as exemplary analysis problems, are presented in this chapter. The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers whose interest will be sparked by its contents. 
Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. We are particularly interested in the slower-timescale dynamics that determine overall excitability and set the stage for transient episodes of persistent oscillations, working memory, or seizures. In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley-type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics in the stability of persistent activity. We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions. Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in the neonatal brain or spinal cord, up states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potential (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from EAP data sampled by an extracellular probe at multiple independent recording locations. 
We used a dipole for the model source because there is extensive evidence that it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell-probe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole-moment optimization with the dipole location fixed, and a second, nonlinear, global optimization of the source location. We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis. We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among the various model error contributions to the residual, we address especially the error in probe geometry and the extent to which it biases estimates of the dipole parameters. This dipole characterization method can be applied to any recording technique capable of taking multiple independent measurements of the same single units. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics. 
We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodium–potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. These models yield several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy. Abstract This paper introduces dyadic brain modeling – the simultaneous, computational modeling of the brains of two interacting agents – to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the ‘dyad’). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists and modelers that will help speed the emergence of what may be called comparative neuroprimatology. 
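The competition described in the companion-paper abstracts above, between activity-driven potassium efflux, glial buffering, and pump clearance, can be caricatured in a single balance equation for extracellular [K+]. The toy sketch below is not the published model; `k_out_steady_state` and every parameter value are invented for illustration, but it conveys the qualitative point that weakening glial clearance shifts the steady-state extracellular potassium upward, the regime those models associate with seizure-like activity.

```python
def k_out_steady_state(g_glia, t_max=200.0, dt=0.01):
    """Toy extracellular-[K+] balance: fixed activity-driven efflux,
    glial buffering toward the bath level, and a saturating pump term.
    All parameter values are invented for illustration (units: mM, s)."""
    k_bath, j_efflux, rho, k_half = 4.0, 0.6, 1.0, 8.0
    k_o = k_bath
    for _ in range(int(t_max / dt)):
        pump = rho * k_o**2 / (k_o**2 + k_half**2)   # saturating clearance
        k_o += dt * (j_efflux - g_glia * (k_o - k_bath) - pump)
    return k_o

normal = k_out_steady_state(g_glia=0.5)     # intact glial buffering
impaired = k_out_steady_state(g_glia=0.05)  # weakened glial buffering
print(round(normal, 2), round(impaired, 2))
```

With intact buffering the balance settles just above the bath concentration, while the impaired case settles several millimolar higher; in the full conductance-based models it is this slow drift in [K+]o that reshapes neuronal excitability on the seizure timescale.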
Abstract The phase response curve (PRC) reflects the dynamics of the interplay between the diverse intrinsic conductances that lead to spike generation. PRCs measure the spike time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddle-node) dynamics, while type II (subcritical Hopf) dynamics yield a biphasic PRC with both negative and positive lobes. Previous computational work hypothesized that cholinergic modulation of the M-type potassium current can switch a neuron with type II dynamics to type I dynamics. We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with downregulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential specific K-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to downregulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent I_AHP is downregulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified at low ACh levels, while the PRC became type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may shift neuronal responses between “resonator” and “integrator” modes. Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double-precision values by each subtree at each time step. 
Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For compute-bound simulations, load balance results in almost ideal runtime scaling. Application of the cell splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing. Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts. It has been recently reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. Ca2+-activated K+ currents (I K(Ca)) of cultured human cardiac fibroblasts were characterized in this study. In whole-cell configuration, depolarizing pulses evoked I K(Ca) with outward rectification in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance, Ca2+-activated K+ (BKCa) channel with a single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BKCa channels. The dynamic Luo–Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; however, they were electrically inert with no gap-junctional coupling. The simulation predicts that changes in gap-junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. I K(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes.
This study demonstrates that a BKCa channel is functionally expressed in human cardiac fibroblasts. The activity of these BKCa channels present in human cardiac fibroblasts may contribute to the functional activities of heart cells through transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of their solutions and the transitions between them as parameters are varied. To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multicompartment models. We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation—tonic spiking, intermediate bursting and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multicompartment conductance-based model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli.
Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement. Results We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. Conclusion Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement, and underlies the concentration-independence of odor quality representations.
Abstract Mathematical neuronal models are normally expressed using differential equations. The Parker–Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker–Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin–Huxley type neuron, comparing the results with those obtained using the Runge–Kutta and Bulirsch–Stoer methods. Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC–granule cell (GC) circuit with and without gap junctions between GoCs. Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change the oscillation frequency or the mean firing rate of either GoCs or GCs.
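The adaptive-order idea behind the Parker–Sochacki method described above can be illustrated on the simplest polynomial ODE. This sketch (a minimal example of our own, not the published neuronal implementation) builds the Maclaurin coefficients of the local solution of dy/dt = −y term by term, stopping when the next term falls below a tolerance, so the order adapts at each step while the step size stays fixed.

```python
import math

def ps_step(y, h, tol=1e-15, max_order=30):
    """One Parker-Sochacki step for dy/dt = -y.
    The power-series recurrence a[k+1] = -a[k] / (k+1) is evaluated at
    t = h and summed until the next term drops below tol (adaptive order)."""
    a = y                    # a[0]: current value, first series coefficient
    total = a
    hk = 1.0                 # running power h**k
    for k in range(max_order):
        a = -a / (k + 1)     # next series coefficient
        hk *= h
        term = a * hk
        total += term
        if abs(term) < tol:  # local error control by truncating the series
            break
    return total

# Integrate y(0) = 1 out to t = 1 with a fixed step; compare with exp(-1).
y, h = 1.0, 0.1
for _ in range(10):
    y = ps_step(y, h)
err = abs(y - math.exp(-1.0))   # essentially machine precision
```

Because each step sums the exact Taylor series of the local solution to the requested tolerance, the global error is limited by round-off rather than by a fixed truncation order, which is the property the abstract exploits for neuronal models.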
Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs. The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours or days of supercomputing time. Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. In consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics and intrinsic noise. We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data.
Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes arising from development or neurodegeneration. A biologically constrained model of the whole basal ganglia addressing the paradoxes of connections and selection. Journal of Computational Neuroscience. Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively.
Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. Programming language: Perl/BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI Blast package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have encountered problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages.
Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to duplication of effort and hinders model reuse.
Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell includes organelles and intracellular solutions, and it is separated from the outer environment of extracellular liquid surrounding the cell by its cell membrane (plasma membrane), generating differences in concentrations of ions and molecules, including enzymes. The differences in ionic charges and concentrations cause, respectively, electrical and chemical potentials, driving transport of materials across the membrane. Here we look at the core mathematical modeling of the dynamic behaviors of single cells, as well as the basis of numerical simulations. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms.
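The electrical and chemical potentials mentioned in the single-cell modeling abstract above are related by the Nernst equation, E = (RT/zF) ln([out]/[in]). A quick sketch with typical mammalian K+ concentrations (illustrative textbook values, not figures taken from this chapter):

```python
import math

R = 8.314       # gas constant, J/(mol*K)
F = 96485.0     # Faraday constant, C/mol

def nernst_mV(c_out, c_in, z=1, T=310.0):
    """Equilibrium (Nernst) potential in millivolts for an ion of valence z,
    given extracellular and intracellular concentrations in the same units."""
    return 1000.0 * (R * T) / (z * F) * math.log(c_out / c_in)

# Typical K+ gradient: ~5 mM outside, ~140 mM inside, at body temperature.
E_K = nernst_mV(5.0, 140.0)   # about -89 mV, close to the resting potential
```

The sign and magnitude follow directly from the concentration ratio: because K+ is far more concentrated inside the cell, its equilibrium potential is strongly negative, which is why the K+ gradient dominates the resting membrane potential.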
This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program with which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command-line interface, based on Python, a powerful scripting language, for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results.
In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method and make available computer scripts which automate the process of citation entry. We use an Open Citation Project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. For, in order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain (H2 approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance.
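The time-domain (Balanced Truncation) reduction named above can be sketched for a generic stable linear state-space system. This is the textbook square-root algorithm applied to a toy 4-state system of our own construction, not the quasi-active cable model itself:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def balanced_truncation(A, B, C, r):
    """Square-root balanced truncation of a stable system (A, B, C).
    Returns the r-state reduced system and the Hankel singular values."""
    # Controllability/observability Gramians: A Wc + Wc A' = -B B', etc.
    Wc = solve_continuous_lyapunov(A, -B @ B.T)
    Wo = solve_continuous_lyapunov(A.T, -C.T @ C)
    Lc = np.linalg.cholesky(Wc)
    Lo = np.linalg.cholesky(Wo)
    U, hsv, Vt = np.linalg.svd(Lo.T @ Lc)
    S = np.diag(hsv ** -0.5)
    T, Tinv = Lc @ Vt.T @ S, S @ U.T @ Lo.T   # balancing transformation
    Ab, Bb, Cb = Tinv @ A @ T, Tinv @ B, C @ T
    return Ab[:r, :r], Bb[:r, :], Cb[:, :r], hsv

# Toy system: two slow modes dominate, two fast modes are discarded.
A = np.diag([-1.0, -2.0, -30.0, -40.0])
B = np.ones((4, 1))
C = np.ones((1, 4))
Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=2)

# The DC-gain error respects the classical 2*sum(discarded HSV) bound.
dc_full = (-C @ np.linalg.solve(A, B)).item()
dc_red = (-Cr @ np.linalg.solve(Ar, Br)).item()
err = abs(dc_full - dc_red)
bound = 2.0 * hsv[2:].sum()
```

In the balanced coordinates both Gramians equal the diagonal of Hankel singular values, so truncating the states with the smallest values discards the least controllable-and-observable dynamics, which is what makes the method attractive for compressing quasi-active dendritic trees.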
We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. Biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front-ends demands high bioinformatic skills. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field.
We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or not) existing empirical results and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. 
Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib ( The Handbook of Brain Theory and Neural Networks , pp. 741–745, 2003); Arbib and Grethe ( Computing the Brain: A Guide to Neuroinformatics , 2001); Ascoli ( Computational Neuroanatomy: Principles and Methods , 2002); Bower and Bolouri ( Computational Modeling of Genetic and Biochemical Networks , 2001); Hines et al. ( J. Comput. Neurosci. 17 , 7–11, 2004); Shepherd et al. ( Trends Neurosci. 21 , 460–468, 1998); Sivakumaran et al. ( Bioinformatics 19 , 408–415, 2003); Smolen et al. ( Neuron 26 , 567–580, 2000); Vadigepalli et al. ( OMICS 7 , 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv ( J. Neurophysiol. 71 , 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu .
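As a generic illustration of the kind of model such neurosimulators solve, here is a minimal leaky integrate-and-fire neuron in plain Python (our own sketch, not SNNAP itself, which is a Java application driven through its GUI):

```python
import numpy as np

def lif_spike_times(I_ext=20.0, t_stop=200.0, dt=0.01,
                    tau=10.0, v_rest=-65.0, v_thresh=-50.0, v_reset=-65.0):
    """Leaky integrate-and-fire neuron, forward Euler:
        tau * dV/dt = -(V - v_rest) + I_ext
    with threshold-and-reset spiking. Times in ms, voltages in mV,
    I_ext expressed as the steady-state depolarization it produces (mV)."""
    v = v_rest
    spikes = []
    for step in range(int(t_stop / dt)):
        v += dt / tau * (-(v - v_rest) + I_ext)
        if v >= v_thresh:
            spikes.append(step * dt)   # record the spike time
            v = v_reset                # and reset the membrane
    return np.array(spikes)

spikes = lif_spike_times()
isi = np.diff(spikes)
# Analytically, the inter-spike interval is T = -tau * ln(1 - 15/20),
# i.e. about 13.9 ms for these parameters, so ~14 spikes in 200 ms.
```

Full neurosimulators add exactly what this sketch omits: realistic ionic currents, second-messenger modulation, synapses and plasticity, typically specified through parameter files rather than code.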
Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and build on them for further research. Its use can aid the field of computational neuroscience to enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA A reversal potential (E GABA ) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity.
Ideally, such interfaces should provide high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic types (pyramidal cell dendrites and olivary axonal projection terminals), and labeling methods (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida.
None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions coming to the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductances of their dendritic membrane. This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high- and low-depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodical impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for the synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli.
These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single-neuron models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. 
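The TD algorithm referenced above rests on a simple prediction-error update. As a point of reference for the modeling work that follows, here is a tabular TD(0) step; the function name and parameter values are illustrative, not taken from the paper's model:

```python
def td0_update(V, s, r, s_next, alpha=0.1, gamma=0.9):
    """One temporal-difference (TD(0)) step on a value table V:
    prediction error delta = r + gamma * V(s') - V(s), then move V(s)
    a fraction alpha toward the corrected estimate."""
    delta = r + gamma * V.get(s_next, 0.0) - V.get(s, 0.0)
    V[s] = V.get(s, 0.0) + alpha * delta
    return delta
```

In dopamine models, `delta` is the quantity identified with the phasic dopaminergic signal: it is large for unexpected rewards and shrinks to zero as the conditioned stimulus comes to predict them.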
One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS–US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive, biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. On mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of dendritic morphology on passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses uniformly distributed along the dendrites. 
For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and the function of effectiveness of somatopetal current transfer (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization as a whole and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals decreasing with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of patterns of output discharges of different complexities, including stochastic ones. The relatively short length of the apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are morphological factors responsible for the predominance of periodic spike doublets. 
As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promote the predominance of network interaction in shaping the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using nonhomogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of its involvement in spatial tasks. 
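The model ingredients named above, an integrate-and-fire neuron with a conductance synapse driven by an inhomogeneous Poisson process, can be sketched compactly. This is a generic single-cell sketch, not the paper's PC/IC circuit; all parameter values are illustrative, and the Poisson train is approximated by a per-timestep Bernoulli draw.

```python
import random

def lif_with_poisson_input(T=1.0, dt=1e-4, rate_fn=lambda t: 50.0,
                           tau_m=0.02, v_rest=-70.0, v_th=-54.0, v_reset=-60.0,
                           tau_s=0.005, g_peak=0.05, e_syn=0.0, seed=0):
    """Leaky integrate-and-fire cell with an exponentially decaying
    conductance synapse driven by an inhomogeneous Poisson spike train
    (thinned to one Bernoulli trial per time bin). Returns spike times."""
    rng = random.Random(seed)
    v, g, spikes, t = v_rest, 0.0, [], 0.0
    while t < T:
        if rng.random() < rate_fn(t) * dt:      # presynaptic Poisson event
            g += g_peak                         # conductance jump (leak units)
        g -= dt * g / tau_s                     # synaptic conductance decay
        v += dt * (-(v - v_rest) + g * (e_syn - v)) / tau_m  # membrane update
        if v >= v_th:                           # threshold crossing: spike
            spikes.append(t)
            v = v_reset
        t += dt
    return spikes
```

Making `rate_fn` depend on position and theta phase, as the abstract describes for the EC inputs, is what turns this generic block into a place-field drive.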
Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed-rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database of neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in computational neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org. In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed morphological data when generating those cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators such as the NEURON simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new and available data can be included in NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. 
Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation, with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows sharing not only of models but also of research tools. Other promising recent developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results of genomics, transcriptomics, proteomics, and metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. 
However, sharing such large amounts of frequently heterogeneous and distributed experimental data requires both standard data formats and public repositories. Standardization and a public storage system are also important for modeling, as they make it possible to share models irrespective of the software tools used. Furthermore, rapid model development benefits strongly from available software packages that relieve the modeler of recurring tasks such as numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. It is often feasible to divide a 3D-reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. 
However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. 
However, the range of current strength needed for a depolarization block could easily be reached with random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block. We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, impeding conventional analysis methods from keeping pace. Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed, and analyzed. This process is relatively simple and easy to learn because it takes place entirely in MATLAB, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and of exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable. 
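The workflow described above — extracting characteristics such as firing rate from raw traces and querying a relational table instead of the recordings themselves — can be sketched in a few lines. PANDORA itself is a MATLAB toolbox; this is a deliberately generic Python/SQLite analogue with made-up trial data, shown only to illustrate the idea.

```python
import sqlite3

def firing_rate(spike_times, duration_s):
    """Extracted characteristic: mean firing rate (spikes per second)."""
    return len(spike_times) / duration_s

# Store only extracted characteristics, not the raw traces (illustrative data).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trials (trial_id INTEGER, firing_rate REAL)")
recordings = {1: [0.1, 0.4, 0.9], 2: [0.2], 3: [0.05, 0.3, 0.6, 0.95]}
for trial_id, spikes in recordings.items():
    conn.execute("INSERT INTO trials VALUES (?, ?)",
                 (trial_id, firing_rate(spikes, 1.0)))

# The reduced representation can now be queried like any relational table.
high = conn.execute(
    "SELECT trial_id FROM trials WHERE firing_rate >= 3 ORDER BY trial_id"
).fetchall()
```

The storage saving the abstract mentions comes from exactly this reduction: a handful of scalars per trial replaces megabytes of sampled voltage.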
PANDORA is free to use and modify because it is open source ( http://software.incf.org/software/pandora/home ). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known by the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop. Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics. First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test the sufficiency of multiple, independent spike initiation zones in the C-interneuron to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. 
Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made in understanding cerebellar cortical processing, but neural coding at the output stage of the cerebellum, in the deep cerebellar nuclei (DCN), remains poorly understood. The DCN are unlikely to be just a relay nucleus, because Purkinje cell inhibition has to be turned into an excitatory output signal, and DCN neurons exhibit complex intrinsic properties. In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals. Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation, we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. We obtained a good match between our simulations and DCN current-clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV, to allow deinactivation of rebound currents. 
Fast rebound bursts due to T-type calcium current and slow rebounds due to persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by HCN current. Our findings suggest that active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior. This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation of channel gating, mechanisms for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing, with the ultimate goal of creating a modular framework that will help researchers across different disciplines to collaborate effectively using a modern neural simulation platform. 
Abstract No Abstract Available Abstract We have developed a simulation tool within the NEURON simulator to assist in the organization, verification, and analysis of simulations. This tool, named the Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single-cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB. Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems, much as single cells function as systems of protein networks, in which, for example, the ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from how it behaves alone. The dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. Interactions occur through different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts to produce heart beats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics representing neural information. 
In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of the limbs can be performed. This chapter illustrates several mathematical techniques through examples from the modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments in adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data. Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation. For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice that of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits, as expected in mature animals. 
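The peak conductances and times-to-peak reported above (0.9 nS at 1.0 ms for perforant path, 0.5 nS at 3.3 ms for associational/commissural AMPA synapses) fully specify a synaptic waveform once a functional form is chosen. The alpha function used here is a common choice but an assumption on our part, not necessarily the kinetic scheme fitted in the study:

```python
import math

def alpha_conductance(t_ms, g_peak_ns, t_peak_ms):
    """Alpha-function synaptic conductance g(t) = g_peak * (t/tp) * exp(1 - t/tp),
    which rises from zero and peaks at exactly g_peak when t == t_peak."""
    if t_ms < 0:
        return 0.0
    x = t_ms / t_peak_ms
    return g_peak_ns * x * math.exp(1.0 - x)

# Waveforms parameterized with the values reported in the abstract.
g_pp = alpha_conductance(1.0, 0.9, 1.0)   # perforant path, at its peak time
g_ac = alpha_conductance(3.3, 0.5, 3.3)   # associational/commissural, at its peak
```

With this parameterization, evaluating each waveform at its time-to-peak recovers the reported peak conductance, so the two reported numbers per pathway pin down the whole curve.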
Predictions were made for passive-cell current-clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousand neurons can be studied, and spike sorting becomes unnecessary. Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron. Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials. 
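The leaky-integrator representation described above treats the subthreshold potential as presynaptic spikes filtered with a fixed kernel. A minimal sketch with an exponential kernel (the kernel shape, time constant, and weight are illustrative assumptions, not the paper's fitted values):

```python
import math

def filtered_potential(spike_times, t, tau=0.01, weight=0.5):
    """Subthreshold potential as presynaptic activity filtered with a fixed
    exponential kernel: V(t) = sum over spikes s<=t of w * exp(-(t - s)/tau),
    i.e. the response of a leaky integrator to a presynaptic spike train."""
    return sum(weight * math.exp(-(t - s) / tau)
               for s in spike_times if s <= t)
```

Because this map from input spikes to voltage is linear, cumulants of the presynaptic population activity propagate to cumulants of V(t), which is what lets a CuBIC-style analysis read higher-order input correlations directly off the membrane potential.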
Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet the precise input–output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites. We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature. To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns, and the corresponding firing-frequency input–output function was logarithmic. The regular/irregular nature of the input synaptic intervals was only reflected in the regularity of output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input–output relationships. 
Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input–output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate, and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats. In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database http://www.ebi.ac.uk/biomodels/ is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. 
BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems, and to study the clustering of models based upon their annotations. Model deposition to the database today is advised by several publishers of scientific journals. The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive? Schema theory is a computational framework that allows the simulation of perceptuomotor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We first present its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. 
We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performances of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world-knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performance of agrammatic aphasics measured using sentence-picture matching tasks. This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Abstract Background The "inverse" problem is the determination of unknown causes on the basis of the observation of their effects. This is the opposite of the corresponding "direct" problem, which relates to the prediction of the effects generated by a complete description of some agencies. The solution of an inverse problem entails the construction of a mathematical model and starts from a set of experimental data. In this respect, inverse problems are often ill-conditioned, as the experimental data available are often insufficient to unambiguously solve the mathematical model. Several approaches to solving inverse problems are possible, both computational and experimental, some of which are mentioned in this article. In this work, we describe in detail an attempt to solve an inverse problem which arose in the study of an intracellular signaling pathway. 
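The genetic-algorithm route to such an inverse problem can be sketched in a few lines. The toy pathway (A → B → degradation), the rate constants, the population size and the mutation scale below are all invented for illustration; they stand in for the mass-action model and the parallel implementation described next.

```python
# Hypothetical sketch: recover unknown mass-action rate constants of a
# toy pathway (A -> B -> degraded) from a target time course with a
# simple genetic algorithm. All names and values are illustrative.
import random
import statistics

random.seed(1)
TRUE_K, TRUE_D = 0.8, 0.3          # hidden rates the GA must recover

def simulate(k, d, a0=1.0, dt=0.01, steps=500):
    """Euler integration of dA/dt = -k*A, dB/dt = k*A - d*B; returns B(t)."""
    a, b, traj = a0, 0.0, []
    for _ in range(steps):
        a, b = a + dt * (-k * a), b + dt * (k * a - d * b)
        traj.append(b)
    return traj

DATA = simulate(TRUE_K, TRUE_D)    # stands in for the experimental data

def fitness(params):
    """Negative sum-of-squares error against the target time course."""
    model = simulate(*params)
    return -sum((m - o) ** 2 for m, o in zip(model, DATA))

def evolve(pop_size=40, generations=60, sigma=0.05):
    """Truncation-selection GA: midpoint crossover plus Gaussian mutation."""
    pop = [(random.uniform(0, 2), random.uniform(0, 2)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(parents) + len(children) < pop_size:
            (k1, d1), (k2, d2) = random.choice(parents), random.choice(parents)
            children.append((max(0.0, (k1 + k2) / 2 + random.gauss(0, sigma)),
                             max(0.0, (d1 + d2) / 2 + random.gauss(0, sigma))))
        pop = parents + children
    return max(pop, key=fitness)

# Several independent runs form an ensemble of solutions; parameters with a
# small coefficient of variation across the ensemble are well constrained.
solutions = [evolve() for _ in range(5)]
cv_k = (statistics.stdev(s[0] for s in solutions)
        / statistics.mean(s[0] for s in solutions))
```

The ensemble-and-coefficient-of-variation step mirrors the selection criterion described in the Results below, scaled down to a two-parameter problem.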
Results Using a genetic algorithm to find a suboptimal solution to the optimization problem, we have estimated a set of unknown parameters describing a kinetic model of a signaling pathway in the neuronal cell. The model is composed of mass-action ordinary differential equations, where the kinetic parameters describe protein-protein interactions, protein synthesis and degradation. The algorithm has been implemented on a parallel platform. Several potential solutions of the problem have been computed, each solution being a set of model parameters. A subset of parameters has been selected on the basis of their small coefficient of variation across the ensemble of solutions. Conclusion Despite the lack of sufficiently reliable and homogeneous experimental data, the genetic algorithm approach has allowed us to estimate the approximate values of a number of model parameters in a kinetic model of a signaling pathway: these parameters have been assessed to be relevant for the reproduction of the available experimental data. Abstract Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown. Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms. 
First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axodendritic feedback on pyramidal cells is only effective in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase, the theta/gamma ratio. Moreover, electric fields differentially affect average theta frequency depending on specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. Abstract The brain is extraordinarily complex, containing 10^11 neurons linked with 10^14 connections. We can improve our understanding of individual neurons and neuronal networks by describing their behavior in mathematical and computational models. This chapter provides an introduction to neural modeling, laying the foundation for several basic models and surveying key topics. After some discussion on the motivations of modelers and the uses of neural models, we explore the properties of electrically excitable membranes. We describe in some detail the Hodgkin–Huxley model, the first neural model to describe biophysically the behavior of biological membranes. We explore how this model can be extended to describe a variety of excitable membrane behaviors, including axonal propagation, dendritic processing, and synaptic communication. This chapter also covers mathematical models that replicate basic neural behaviors through more abstract mechanisms. 
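As a concrete reference, the Hodgkin–Huxley equations mentioned above can be integrated in a few dozen lines. This is a minimal forward-Euler sketch with the standard squid-axon parameters; the injected-current value, duration and spike-counting convention are arbitrary choices for illustration.

```python
# Minimal Hodgkin-Huxley sketch (standard squid-axon parameters,
# shifted so that rest is near -65 mV), integrated with forward Euler.
import math

G_NA, G_K, G_L = 120.0, 36.0, 0.3      # maximal conductances (mS/cm^2)
E_NA, E_K, E_L = 50.0, -77.0, -54.4    # reversal potentials (mV)
C_M = 1.0                              # membrane capacitance (uF/cm^2)

def a_n(v): return 0.01 * (v + 55) / (1 - math.exp(-(v + 55) / 10))
def b_n(v): return 0.125 * math.exp(-(v + 65) / 80)
def a_m(v): return 0.1 * (v + 40) / (1 - math.exp(-(v + 40) / 10))
def b_m(v): return 4.0 * math.exp(-(v + 65) / 18)
def a_h(v): return 0.07 * math.exp(-(v + 65) / 20)
def b_h(v): return 1.0 / (1 + math.exp(-(v + 35) / 10))

def simulate(i_inj, t_max=50.0, dt=0.01):
    """Return the number of spikes (upward crossings of 0 mV)."""
    v = -65.0
    # start gating variables at their steady-state values for v
    n = a_n(v) / (a_n(v) + b_n(v))
    m = a_m(v) / (a_m(v) + b_m(v))
    h = a_h(v) / (a_h(v) + b_h(v))
    spikes, above = 0, False
    for _ in range(int(t_max / dt)):
        i_ion = (G_NA * m ** 3 * h * (v - E_NA)
                 + G_K * n ** 4 * (v - E_K)
                 + G_L * (v - E_L))
        v += dt * (i_inj - i_ion) / C_M
        n += dt * (a_n(v) * (1 - n) - b_n(v) * n)
        m += dt * (a_m(v) * (1 - m) - b_m(v) * m)
        h += dt * (a_h(v) * (1 - h) - b_h(v) * h)
        if v > 0 and not above:
            spikes, above = spikes + 1, True
        elif v < 0:
            above = False
    return spikes
```

A suprathreshold current (e.g. 10 uA/cm^2) yields repetitive firing, while zero current leaves the model at rest, reproducing the excitability the chapter describes.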
We briefly explore efforts to extend single-neuron models to the network level and provide several examples of insights gained through this process. Finally, we list common resources, including modeling environments and repositories, that provide the guidance and parameter sets necessary to begin building neural models. Abstract We have developed a program, NeuroText, to populate the neuroscience databases in SenseLab (http://senselab.med.yale.edu/senselab) by mining the natural language text of neuroscience articles. NeuroText uses a two-step approach to identify relevant articles. The first step (preprocessing), aimed at 100% sensitivity, identifies abstracts containing database keywords. In the second step, potentially relevant abstracts identified in the first step are processed for specificity dictated by database architecture, and neuroscience, lexical and semantic contexts. NeuroText results were presented to the experts for validation using a dynamically generated interface that also allows expert-validated articles to be automatically deposited into the databases. Of the test set of 912 articles, 735 were rejected at the preprocessing step. For the remaining articles, the accuracy of predicting database-relevant articles was 85%. Twenty-two articles were erroneously identified. NeuroText deferred decisions on 29 articles to the expert. A comparison of NeuroText results versus the experts’ analyses revealed that the program failed to correctly identify articles’ relevance due to concepts that did not yet exist in the knowledge base or due to vaguely presented information in the abstracts. NeuroText uses two “evolution” techniques (supervised and unsupervised) that play an important role in the continual improvement of the retrieval results. Software that uses the NeuroText approach can facilitate the creation of curated, special-interest bibliography databases. Abstract Dendrites play an important role in neuronal function and connectivity. 
This chapter introduces the first section of the book, focusing on the morphological features of dendritic tree structures and the role of dendritic trees in the circuit. We provide an overview of quantitative procedures for data collection, analysis, and modeling of dendrite shape. Our main focus lies on the description of morphological complexity and how one can use this description to unravel neuronal function in dendritic trees and neural circuits. Abstract The chapter is organised in two parts: In the first part, the focus is on a combined power spectral and nonlinear behavioural analysis of a neural mass model of the thalamocortical circuitry. The objective is to study the effectiveness of such “multimodal” analytical techniques in model-based studies investigating the neural correlates of abnormal brain oscillations in Alzheimer’s disease (AD). The power spectral analysis presented here is a study of the “slowing” (decreasing dominant frequency of oscillation) within the alpha frequency band (8–13 Hz), a hallmark of electroencephalogram (EEG) dynamics in AD. Analysis of the nonlinear dynamical behaviour focuses on the bifurcating property of the model. The results show that the alpha rhythmic content is maximal at close proximity to the bifurcation point—an observation made possible by the “multimodal” approach adopted herein. Furthermore, a slowing in alpha rhythm is observed for increasing inhibitory connectivity—a consistent feature of our research into neuropathological oscillations associated with AD. In the second part, we present a power spectral analysis of a model that implements multiple feedforward and feedback connectivities in the thalamo-cortico-thalamic circuitry, and is thus more advanced in terms of biological plausibility. This study looks at the effects of synaptic connectivity variation on the power spectra within the delta (1–3 Hz), theta (4–7 Hz), alpha (8–13 Hz) and beta (14–30 Hz) bands. 
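The band definitions above translate directly into a band-power measure of the kind used to quantify EEG "slowing". The sketch below computes relative theta versus alpha power with a naive DFT over a synthetic sinusoid; the sampling rate and the test signals are invented, and a real analysis would use an FFT with Welch-style averaging.

```python
# Illustrative band-power computation: which canonical EEG band
# (theta 4-7 Hz vs alpha 8-13 Hz) dominates a signal's spectrum.
import cmath
import math

FS = 128                 # sampling rate in Hz (assumed)
N = 2 * FS               # two seconds of data

def band_power(signal, lo, hi, fs=FS):
    """Sum |X(f)|^2 over DFT bins whose frequency lies in [lo, hi] Hz."""
    n = len(signal)
    total = 0.0
    for k in range(n // 2):
        f = k * fs / n
        if lo <= f <= hi:
            x = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
            total += abs(x) ** 2
    return total

def dominant_band(peak_hz):
    """Classify a pure sinusoid at peak_hz as theta- or alpha-dominated."""
    sig = [math.sin(2 * math.pi * peak_hz * t / FS) for t in range(N)]
    theta = band_power(sig, 4, 7)
    alpha = band_power(sig, 8, 13)
    return "alpha" if alpha > theta else "theta"
```

A shift of the dominant frequency from 10 Hz toward 5 Hz, as in the AD "slowing" described above, flips the classification from alpha to theta.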
An overall slowing of EEG with decreasing synaptic connectivity is observed, indicated by a decrease of power within the alpha and beta bands and an increase in power within the theta and delta bands. Thus, the model behaviour conforms to longitudinal studies in AD indicating an overall slowing of EEG. Abstract Neuronal processes grow under a variety of constraints, both immediate and evolutionary. Their pattern of growth provides insight into their function. This chapter begins by reviewing morphological metrics used in analyses and computational models. Molecular mechanisms underlying growth and plasticity are then discussed, followed by several types of modeling approaches. Computer simulation of morphology can be used to describe and reproduce the statistics of neuronal types or to evaluate growth and functional hypotheses. For instance, models in which branching is probabilistically determined by diameter produce realistic virtual dendrites of most neuronal types, though more complicated statistical models are required for other types. Virtual dendrites grown under environmental and/or functional constraints are also discussed, offering a broad perspective on dendritic morphology. Abstract Chopper neurons in the cochlear nucleus are characterized by intrinsic oscillations with short average interspike intervals (ISIs) and relative level independence of their response (Pfeiffer, Exp Brain Res 1:220–235, 1966; Blackburn and Sachs, J Neurophysiol 62:1303–1329, 1989), properties which are unattained by models of single chopper neurons (e.g., Rothman and Manis, J Neurophysiol 89:3070–3082, 2003a). In order to achieve short ISIs, we optimized the time constants of the Rothman and Manis single-neuron model with genetic algorithms. Some parameters in the optimization, such as the temperature and the capacitance of the cell, turned out to be crucial for the required acceleration of the response. 
In order to achieve the relative level independence, we have simulated an interconnected network consisting of Rothman and Manis neurons. The results indicate that by stabilization of intrinsic oscillations, it is possible to simulate the physiologically observed level independence of ISIs. As previously reviewed and demonstrated (Bahmer and Langner, Biol Cybern 95:371–379, 2006a), chopper neurons show a preference for ISIs which are multiples of 0.4 ms. It was also demonstrated that a network consisting of two optimized Rothman and Manis neurons which activate each other with synaptic delays of 0.4 ms shows a preference for ISIs of 0.8 ms. Oscillations with various multiples of 0.4 ms as ISIs may be derived from neurons in a more complex network that is activated by simultaneous input from an onset neuron and several auditory nerve fibers. Abstract Recently, a class of two-dimensional integrate-and-fire models has been used to faithfully model spiking neurons. This class includes the Izhikevich model, the adaptive exponential integrate-and-fire model, and the quartic integrate-and-fire model. The bifurcation types for the individual neurons have been thoroughly analyzed by Touboul (SIAM J Appl Math 68(4):1045–1079, 2008). However, when the models are coupled together to form networks, the networks can display bifurcations that an uncoupled oscillator cannot. For example, the networks can transition from firing with a constant rate to burst firing. This paper introduces a technique to reduce a full network of this class of neurons to a mean field model, in the form of a system of switching ordinary differential equations. The reduction uses population density methods and a quasi-steady-state approximation to arrive at the mean field system. Reduced models are derived for networks with different topologies and different model neurons with biologically derived parameters. 
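One member of this model class, the Izhikevich neuron, is compact enough to state in full. The a, b, c, d values below are the published "regular spiking" parameters; the drive current, duration and time step are arbitrary illustrative choices, and the network reduction itself is beyond this sketch.

```python
# Two-dimensional integrate-and-fire sketch: the Izhikevich model with
# "regular spiking" parameters, integrated with forward Euler.
def izhikevich(i_inj, t_max=1000.0, dt=0.25, a=0.02, b=0.2, c=-65.0, d=8.0):
    """Count spikes of v' = 0.04 v^2 + 5 v + 140 - u + I, u' = a(bv - u),
    with the reset v <- c, u <- u + d whenever v reaches 30 mV."""
    v, u, spikes = -65.0, -13.0, 0
    for _ in range(int(t_max / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + i_inj)
        u += dt * a * (b * v - u)
        if v >= 30.0:            # spike cutoff and reset
            v, u, spikes = c, u + d, spikes + 1
    return spikes
```

With a constant suprathreshold drive the model fires tonically with adaptation (via u); without drive it settles to rest, the two regimes whose network-level interplay the mean field reduction captures.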
The mean field equations are able to qualitatively and quantitatively describe the bifurcations that the full networks display. Extensions and higher order approximations are discussed. Conclusions Our proposed database schema for managing heterogeneous data is a significant departure from conventional approaches. It is suitable only when the following conditions hold: • The number of classes of entity is numerous, while the number of actual instances in most classes is expected to be very modest. • The number (and nature) of the axes describing an arbitrary fact (as an N-ary association) varies greatly. We believe that nervous system data is an appropriate problem domain to test such an approach. Abstract Stereotactic human brain atlases, either in print or electronic form, are useful not only in functional neurosurgery, but also in neuroradiology, human brain mapping, and neuroscience education. The existing atlases represent structures on 2D plates taken at variable, often large intervals, which limit their applications. To overcome this problem, we propose a hybrid interpolation approach to build high-resolution brain atlases from the existing ones. In this approach, all section regions of each object are grouped into two types of components: simple and complex. A NURBS-based method is designed for interpolation of the simple components, and a distance-map-based method for the complex components. Once all individual objects in the atlas are interpolated, the results are combined hierarchically in a bottom-up manner to produce the interpolation of the entire atlas. In the procedure, different knowledge-based and heuristic strategies are used to preserve various topological relationships. The proposed approach has been validated quantitatively and used for interpolation of two stereotactic brain atlases: the Talairach–Tournoux atlas and the Schaltenbrand–Wahren atlas. 
The interpolations produced are of high resolution and feature high accuracy, 3D consistency, smooth surfaces, and preserved topology. They potentially open new applications for electronic stereotactic brain atlases, such as atlas reformatting, accurate 3D display, and 3D nonlinear warping against normal and pathological scans. The proposed approach is also potentially useful in other applications which require interpolation and 3D modeling from sparse data and/or data with variable inter-section intervals. An example of 3D modeling of an infarct from MR diffusion images is presented. Abstract Quantitative neuroanatomical data are important for the study of many areas of neuroscience, and the complexity of problems associated with neuronal structure requires that research from multiple groups across many disciplines be combined. However, existing neuron-tracing systems, simulation environments, and tools for the visualization and analysis of neuronal morphology data use a variety of data formats, making it difficult to exchange data in a readily usable way. The NeuroML project was initiated to address these issues, and here we describe an extensible markup language standard, MorphML, which defines a common data format for neuronal morphology data and associated metadata to facilitate data and model exchange, database creation, model publication, and data archiving. We describe the elements of the standard in detail and outline the mappings between this format and those used by a number of popular applications for reconstruction, simulation, and visualization of neuronal morphology. Abstract A major part of biology has become a class of physical and mathematical sciences. 
We have begun to feel, though still with some caution, that it will become possible to predict biological events in the future of one's life, and to control some of them if so desired, based upon an understanding of the genomic information of individuals and the physical and chemical principles governing the physiological functions of living organisms at multiple scales and levels, from molecules to cells and organs. Abstract A half-center oscillator (HCO) is a common circuit building block of central pattern generator networks that produce rhythmic motor patterns in animals. Here we constructed an efficient relational database table with the resulting characteristics of the simple conductance-based HCO model of Hill et al. (J Comput Neurosci 10:281–302, 2001). The model consists of two reciprocally inhibitory neurons and replicates the electrical activity of the oscillator interneurons of the leech heartbeat central pattern generator under a variety of experimental conditions. Our long-range goal is to understand how this basic circuit building block produces functional activity under a variety of parameter regimes and how different parameter regimes influence stability and modulatability. By using the latest developments in computer technology, we simulated and stored large amounts of data (on the order of terabytes). We systematically explored the parameter space of the HCO and corresponding isolated neuron models using a brute-force approach. We varied a set of selected parameters (maximal conductances of intrinsic and synaptic currents) in all combinations, resulting in about 10 million simulations. We classified these HCO and isolated neuron model simulations by their activity characteristics into identifiable groups and quantified their prevalence. 
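The sweep-classify-store workflow scales down to a few lines. The sketch below replaces the Hill et al. conductance-based model with a toy leaky integrate-and-fire cell (the grids and parameters are invented) but keeps the structure: enumerate parameter combinations, classify each simulation's activity, insert one row per simulation, then query prevalence with SQL.

```python
# Brute-force parameter sweep stored in a relational table (toy version).
import itertools
import sqlite3

def classify(g_leak, i_drive, e_rest=-65.0, v_th=-50.0, v_reset=-65.0,
             t_max=500.0, dt=0.1):
    """Classify a toy leaky integrate-and-fire cell as 'silent' or 'spiking'."""
    v, spikes = e_rest, 0
    for _ in range(int(t_max / dt)):
        v += dt * (-g_leak * (v - e_rest) + i_drive)
        if v >= v_th:
            v, spikes = v_reset, spikes + 1
    return "spiking" if spikes else "silent"

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sims (g_leak REAL, i_drive REAL, activity TEXT)")
grid_g = [0.05 * i for i in range(1, 9)]   # arbitrary leak-conductance grid
grid_i = [0.5 * i for i in range(1, 9)]    # arbitrary drive-current grid
for g, i in itertools.product(grid_g, grid_i):
    db.execute("INSERT INTO sims VALUES (?, ?, ?)", (g, i, classify(g, i)))

# Query the prevalence of each activity class across the whole sweep.
counts = dict(db.execute(
    "SELECT activity, COUNT(*) FROM sims GROUP BY activity").fetchall())
```

Once the classifications live in a table, questions like "what fraction of models spike?" become one SQL query, which is the point of the database approach described above.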
By querying the database, we compared the activity characteristics of the identified groups of our simulated HCO models with those of our simulated isolated neuron models and found that regularly bursting neurons compose only a small minority of functional HCO models; the vast majority was composed of spiking neurons. Abstract This paper describes how an emerging standard neural network modelling language can be used to configure a general-purpose neural multichip system, by describing the process of writing and loading neural network models on the SpiNNaker neuromimetic hardware. It focuses on the implementation of a SpiNNaker module for PyNN, a simulator-independent language for neural network modelling. We successfully extend PyNN to deal with different non-standard (e.g. Izhikevich) cell types, rapidly switch between them, and load applications on parallel hardware by orchestrating the software layers below it, so that they are abstracted from the final user. Finally, we run some simulations in PyNN and compare them against other simulators, successfully reproducing single neuron and network dynamics and validating the implementation. Abstract The present study examines the biophysical properties and functional implications of Ih in hippocampal area CA3 interneurons with somata in strata radiatum and lacunosum-moleculare. Characterization studies showed a small maximum h-conductance (2.6 ± 0.3 nS, n = 11), shallow voltage dependence with a hyperpolarized half-maximal activation (V1/2 = −91 mV), and kinetics characterized by double-exponential functions. The functional consequences of Ih were examined with regard to temporal summation and impedance measurements. For temporal summation experiments, 5-pulse mossy fiber input trains were activated. Blocking Ih with 50 μM ZD7288 resulted in an increase in temporal summation, suggesting that Ih supports sensitivity of response amplitude to relative input timing. 
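The temporal-summation readout has a simple quantitative core: when the effective EPSP decay time constant lengthens, as when Ih is blocked, each pulse in a train rides on a larger residual from its predecessors. A toy sketch with invented EPSP kinetics:

```python
# Temporal summation of a 5-pulse EPSP train (all kinetics invented).
import math

def summation_ratio(tau_decay, isi=20.0, n_pulses=5, t_rise=2.0, dt=0.1):
    """Peak of the 5th response divided by the peak of the 1st, for a train
    of difference-of-exponentials EPSPs (time constants in ms)."""
    onsets = [i * isi for i in range(n_pulses)]
    def v(t):
        # each EPSP contributes only after its onset
        return sum(math.exp(-(t - t0) / tau_decay)
                   - math.exp(-(t - t0) / t_rise)
                   for t0 in onsets if t > t0)
    n_steps = int((onsets[-1] + 5 * tau_decay) / dt)
    trace = [v(k * dt) for k in range(n_steps)]
    first_peak = max(trace[: int(isi / dt)])
    last_peak = max(trace[int(onsets[-1] / dt):])
    return last_peak / first_peak

# A longer decay (standing in for Ih block) yields greater summation.
control = summation_ratio(10.0)
ih_blocked = summation_ratio(40.0)
```

The monotonic increase of the summation ratio with the decay time constant is the mechanism behind the ZD7288 result reported above; the model of Ih used in the paper's own computational analyses is of course far more detailed.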
Impedance was assessed by applying sinusoidal current commands. From impedance measurements, we found that Ih did not confer theta-band resonance, but instead flattened the impedance–frequency relations. Double immunolabeling for hyperpolarization-activated cyclic nucleotide-gated proteins and glutamate decarboxylase 67 suggests that all four subunits are present in GABAergic interneurons from the strata considered for electrophysiological studies. Finally, a model of Ih was employed in computational analyses to confirm and elaborate upon the contributions of Ih to impedance and temporal summation. Abstract Modelling and simulation methods gain increasing importance for the understanding of biological systems. The growing number of available computational models makes support in the maintenance and retrieval of those models essential to the community. This article discusses which model information is helpful for efficient retrieval and how existing similarity measures and ranking techniques can be used to enhance the retrieval process, i.e. model reuse. With the development of new tools and modelling formalisms, there is also an increasing demand for performing search independently of the models’ encoding. Therefore, the presented approach is not restricted to certain model storage formats. Instead, the model meta-information is used for retrieval and ranking of the search results. Meta-information includes general information about the model, its encoded species and reactions, but also information about the model behaviour and related simulation experiment descriptions. Abstract To understand the details of brain function, a large-scale system model that reflects anatomical and neurophysiological characteristics needs to be implemented. 
Though numerous computational models of different brain areas have been proposed, their integration into a large-scale model has not yet been accomplished, because these models were described in different programming languages and, above all, used different data formats. This paper introduces a platform for collaborative brain system modeling (PLATO), where one can construct computational models using several programming languages and connect them at the I/O level with a common data format. As an example, a whole visual system model including eye movement, eye optics, retinal network and visual cortex is being developed. Preliminary results demonstrate that the integrated model successfully simulates the signal processing flow at the different stages of the visual system. Abstract Brain rhythms are the most prominent signal measured noninvasively in humans with magneto/electroencephalography (MEG/EEG). MEG/EEG-measured rhythms have been shown to be functionally relevant, and signature changes are used as markers of disease states. Despite the importance of understanding the underlying neural mechanisms creating these rhythms, relatively little is known about their in vivo origin in humans. There are obvious challenges in linking the extracranially measured signals directly to neural activity with invasive studies in humans, and although animal models are well suited for such studies, the connection to human brain function under cognitively relevant tasks is often lacking. Biophysically principled computational neural modeling provides an attractive means to bridge this critical gap. Here, we describe a method for creating a computational neural model capturing the laminar structure of cortical columns, and how this model can be used to make predictions on the cellular- and circuit-level mechanisms of brain oscillations measured with MEG/EEG. 
Specifically, we describe how the model can be used to simulate current dipole activity, the common macroscopic signal inferred from MEG/EEG data. We detail the development and application of the model to study the spontaneous somatosensory mu-rhythm, containing mu-alpha (7–14 Hz) and mu-beta (15–29 Hz) components. We describe a novel prediction on the neural origin of the mu-rhythm that accurately reproduces many characteristic features of MEG data and accounts for changes in the rhythm with attention, detection, and healthy aging. While the details of the model are specific to the somatosensory system, the model design and application are based on general principles of cortical circuitry and MEG/EEG physics, and are thus amenable to the study of rhythms in other frequency bands and sensory systems. Abstract GABAergic interneurons in cortical circuits control the activation of principal cells and orchestrate network activity patterns, including oscillations at different frequency ranges. Recruitment of interneurons depends on integration of convergent synaptic inputs along the dendrosomatic axis; however, dendritic processing in these cells is still poorly understood. In this chapter, we summarise our results on the cable properties, electrotonic structure and dendritic processing in “basket cells” (BCs; Nörenberg et al. 2010), one of the most prevalent types of cortical interneurons mediating perisomatic inhibition. In order to investigate integrative properties, we have performed two-electrode whole-cell patch-clamp recordings, visualised and reconstructed the recorded interneurons, and created passive single-cell models with biophysical properties derived from the experiments. Our results indicate that membrane properties, in particular membrane resistivity, are inhomogeneous along the somatodendritic axis of the cell. Derived values and the gradient of membrane resistivity are different from those obtained for excitatory principal cells. 
The divergent passive membrane properties of BCs facilitate rapid signalling from proximal basal dendritic inputs but at the same time increase synapse-to-soma transfer for slow signals from the distal apical dendrites. Our results demonstrate that BCs possess distinct integrative properties. Future computational models investigating the diverse functions of neuronal circuits need to consider this diversity and incorporate realistic dendritic properties not only of excitatory principal cells but also of various types of inhibitory interneurons. Abstract New surgical and localization techniques allow for precise and personalized evaluation and treatment of intractable epilepsies. These techniques include the use of subdural and depth electrodes for localization, and the potential use of cell-targeted stimulation using optogenetics as part of treatment. Computer modeling of seizures, also individualized to the patient, will be important in order to make full use of the potential of these new techniques. This is because epilepsy is a complex dynamical disease involving multiple scales across both time and space. These complex dynamics make prediction extremely difficult. Cause and effect are not cleanly separable, as multiple embedded causal loops allow for many scales of unintended consequence. We demonstrate here a small model of sensory neocortex which can be used to look at the effects of microablations or microstimulation. We show that ablations in this network can either prevent spread or prevent occurrence of the seizure. In this example, focal electrical stimulation was not able to terminate a seizure, but selective stimulation of inhibitory cells, a future possibility through the use of optogenetics, was efficacious. Abstract The basal ganglia nuclei form a complex network of nuclei often assumed to perform selection, yet their individual roles and how they influence each other is still largely unclear. 
In particular, the ties between the external and internal parts of the globus pallidus are paradoxical, as anatomical data suggest a potent inhibitory projection between them while electrophysiological recordings indicate that they have similar activities. Here we introduce a theoretical study that reconciles both views on the intrapallidal projection, by providing a plausible characterization of the relationship between the external and internal globus pallidus. Specifically, we developed a mean-field model of the whole basal ganglia, whose parameterization is optimized to best respect a large collection of anatomical and electrophysiological data. We first obtained models respecting all our constraints; hence anatomical and electrophysiological data on the intrapallidal projection are globally consistent. This model furthermore predicts that both aforementioned views about the intrapallidal projection may be reconciled when this projection is weakly inhibitory, thus making it possible to support similar neural activity in both nuclei and for the entire basal ganglia to select between actions. Second, we predict that afferent projections are substantially unbalanced towards the external segment, as it receives the strongest excitation from the STN and the weakest inhibition from the striatum. Finally, our study strongly suggests that the intrapallidal connection pattern is not focused but diffuse, as the latter pattern is more efficient for the overall selection performed in the basal ganglia. KInNeSS: A Modular Framework for Computational Neuroscience. Neuroinformatics. Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. 
Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Pre-computed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. 
Programming language: Perl/BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI BLAST package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. In reality, however, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have had problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. 
The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can link out directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to link out to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages, or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. 
However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell includes organelles and intracellular solutions, and it is separated from the outer environment of extracellular fluid surrounding the cell by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charges and concentrations cause, respectively, electrical and chemical potentials, driving the transport of materials across the membrane. Here we look at the core elements of mathematical modeling of the dynamic behaviors of single cells, as well as the bases of numerical simulation. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks: insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 
2008, 2009) is a software program on which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command-line interface, based on Python, a powerful scripting language, for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases: a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at www.physiome.jp. Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases have started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. 
We describe a method, and make available computer scripts, which automate the process of citation entry. We use an open citation project Perl module (PARSER) for parsing citation data, which is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. This is because, in order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the spatial scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension, if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (balanced truncation) and frequency-domain ( ${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. 
These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front-ends demands considerable bioinformatics skill. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques have become increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched with numerous selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is an XML-based language for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells, positioned and synaptically connected in 3D, can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. 
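The XML-based approach that languages such as NeuroML take can be illustrated with a minimal sketch. The element and attribute names below are simplified illustrations of the general style, not the official NeuroML schema:

```python
import xml.etree.ElementTree as ET

# A schematic, simplified cell description in a NeuroML-like style.
# Element/attribute names here are illustrative, not the real schema.
doc = """
<cell id="granule_cell">
  <morphology>
    <segment id="0" name="soma" length="20" diameter="15"/>
    <segment id="1" name="dend0" parent="0" length="100" diameter="2"/>
  </morphology>
</cell>
"""

root = ET.fromstring(doc)
segments = {
    s.get("id"): {
        "name": s.get("name"),
        "parent": s.get("parent"),          # None for the root segment
        "length": float(s.get("length")),
        "diameter": float(s.get("diameter")),
    }
    for s in root.iter("segment")
}

# A simulator-independent tool can now traverse the tree, e.g. to
# compute total neurite length.
total_length = sum(s["length"] for s in segments.values())
print(total_length)  # prints 120.0
```

The point of such declarative descriptions is exactly this separation: the morphology is data that any simulator or analysis tool can consume, rather than code tied to one simulation platform.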
Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or fail to explain) existing empirical results, and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of a model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to representing and integrating complex data sets is to develop mathematical models [Arbib ( The Handbook of Brain Theory and Neural Networks , pp. 
741–745, 2003); Arbib and Grethe ( Computing the Brain: A Guide to Neuroinformatics , 2001); Ascoli ( Computational Neuroanatomy: Principles and Methods , 2002); Bower and Bolouri ( Computational Modeling of Genetic and Biochemical Networks , 2001); Hines et al. ( J. Comput. Neurosci. 17 , 7–11, 2004); Shepherd et al. ( Trends Neurosci. 21 , 460–468, 1998); Sivakumaran et al. ( Bioinformatics 19 , 408–415, 2003); Smolen et al. ( Neuron 26 , 567–580, 2000); Vadigepalli et al. ( OMICS 7 , 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv ( J. Neurophysiol. 71 , 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu . Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and to build on them for further research. Its use can help the field of computational neuroscience enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. 
PPI reflects GABA A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA A reversal potential (E GABA ) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons are typically correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. 
In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic types (pyramidal cell dendrites and olivary axonal projection terminals), and labeling methods (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of this study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions arriving at the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductivities of their dendritic membrane. 
This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output, associated with receiving a short series of presynaptic action potentials, are determined not only by the time of arrival of such a series but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. 
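As a minimal illustration of this concise quantitative theory, the following sketch integrates a single passive membrane compartment obeying C dV/dt = -(V - E_leak)/R + I. The parameter values are illustrative, not taken from any particular model:

```python
# Forward-Euler integration of a passive membrane patch:
#   C dV/dt = -(V - E_leak)/R + I_inj
# Units: mV, ms, nA, MOhm, nF (values are illustrative).
def simulate_vm(i_inj=0.5, c_m=1.0, r_m=10.0, e_leak=-65.0,
                t_stop=100.0, dt=0.01):
    v = e_leak                       # start at rest
    for _ in range(int(t_stop / dt)):
        dv = (-(v - e_leak) / r_m + i_inj) / c_m
        v += dt * dv
    return v

# Theory predicts a steady state of E_leak + R*I = -65 + 10*0.5 = -60 mV,
# approached with time constant tau = R*C = 10 ms.
v_final = simulate_vm()
print(abs(v_final - (-60.0)) < 0.01)  # prints True
```

The closed-form behavior (exponential relaxation to E_leak + R*I) is exactly the kind of prediction that makes this theory easy to take for granted; the numerical result matches it to within the integration error.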
Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS–US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive, biologically plausible framework, without recourse to delay lines or other special-purpose circuits. 
Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. Using mathematical models of pyramidal neurons of neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of dendritic morphology on the passive electrical transfer characteristics of the dendrites and on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses distributed uniformly along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and effectiveness of somatopetal current transfer (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization as a whole and for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, corresponding to particular temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. 
The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals that decreased with rising intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably more limited than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of output discharge patterns of different complexities, including stochastic ones. The relatively short length of the apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites, and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promote the predominance of network interaction in shaping the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis. 
Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using non-homogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. 
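To make the channel formalism concrete, here is a sketch of the delayed-rectifier potassium gating variable n, using the classic Hodgkin–Huxley squid-axon rate functions (expressed with rest near -65 mV); forward Euler is used purely for illustration:

```python
import math

# Classic Hodgkin-Huxley delayed-rectifier rate functions
# (V in mV, rates in 1/ms, resting potential near -65 mV).
def alpha_n(v):
    return 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))

def beta_n(v):
    return 0.125 * math.exp(-(v + 65.0) / 80.0)

def simulate_n(v, t_stop=50.0, dt=0.01):
    """Relax the gating variable n toward steady state at a clamped voltage v:
       dn/dt = alpha_n(v) * (1 - n) - beta_n(v) * n."""
    n = 0.0
    for _ in range(int(t_stop / dt)):
        n += dt * (alpha_n(v) * (1.0 - n) - beta_n(v) * n)
    return n

# At a fixed voltage, n approaches n_inf = alpha / (alpha + beta),
# and the channel conductance follows as g = gbar * n**4.
v = -40.0
n_inf = alpha_n(v) / (alpha_n(v) + beta_n(v))
print(abs(simulate_n(v) - n_inf) < 1e-3)  # prints True
```

Each additional channel type in a model contributes the same ingredients: voltage-dependent rate functions, one or more gating variables, and a conductance term in the membrane current balance.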
The reader who is interested in finding out about other channels and other models of the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, a database of neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in computational neuroscience. Great efforts have been made to systematically record and store the anatomical data of cells, an effort visible in databases such as NeuroMorpho.org . In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed morphological data when generating cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to incorporate known, published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators such as the NEURON simulator of Hines and Carnevale ( 2003 ). NeuGen 2.0 is designed in a modular way, so any new available data can be included. New brain areas and cell types can also be defined, with the possibility of constructing user-defined cell types and networks. NeuGen 2.0 is therefore a software package that grows with each new piece of anatomical data, continually increasing the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionality to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. 
I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. New simulator architectures such as GENESIS 3 allow the use of standard well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising recent developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data needs both standard data formats and public repositories. Standardization and a public storage system are also important for modeling due to the possibility of sharing models irrespective of the used software tools. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks like numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given.
Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed as the main components of the software suite are not yet available. However they may fare, they have thrown the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques.
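In miniature, the forward-model idea behind inverse CSD can be sketched as follows: assume point sources at known grid positions, build the matrix mapping source amplitudes to electrode potentials, and invert it to recover the sources from recorded potentials. All numbers below (conductivity, geometry, the point-source formula phi = I/(4*pi*sigma*r)) are generic textbook choices for illustration, not the parameters or source model of the published method:

```python
import math

# Toy inverse CSD: forward matrix from point sources to electrodes,
# then direct inversion. Illustrative only; the published iCSD 2D
# method uses richer source models and boundary handling.
SIGMA = 0.3   # extracellular conductivity (S/m), a typical value
H = 1e-4      # offset between source plane and electrode plane (m)

def forward_matrix(positions):
    # F[j][i]: potential at electrode j per unit current of source i;
    # sources sit H below the electrodes, avoiding the 1/0 singularity
    n = len(positions)
    return [[1.0 / (4.0 * math.pi * SIGMA *
                    math.hypot(positions[j] - positions[i], H))
             for i in range(n)] for j in range(n)]

def solve(A, b):
    # plain Gaussian elimination with partial pivoting
    n = len(b)
    M = [row[:] + [b[j]] for j, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k]
                              for k in range(r + 1, n))) / M[r][r]
    return x

grid = [0.0, 1e-4, 2e-4, 3e-4]       # electrode positions (m)
true_src = [1e-9, -2e-9, 1e-9, 0.0]  # ground-truth source currents (A)
F = forward_matrix(grid)
phi = [sum(F[j][i] * true_src[i] for i in range(4)) for j in range(4)]
est = solve(F, phi)                  # recovered source amplitudes
```

Because the surrogate potentials were generated by the same forward model, the inversion recovers the sources essentially exactly; real data adds noise and model mismatch, which is where regularized variants come in.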
We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources, and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. However, the range of current strength needed for a depolarization block could be easily reached with a random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block.
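A minimal reduced model that genuinely exhibits this phenomenon is the Morris–Lecar neuron (with the classic "Hopf" parameter set of Rinzel and Ermentrout): firing emerges at intermediate drive and is lost again at strong drive, when a depolarized fixed point becomes stable. This two-variable caricature is for intuition only and is not the morphologically detailed CA1 models discussed above:

```python
import math

# Morris-Lecar model, "Hopf" parameter set; units are mV, ms, uA/cm^2.
C, GL, EL = 20.0, 2.0, -60.0
GCA, ECA = 4.4, 120.0
GK, EK = 8.0, -84.0
V1, V2, V3, V4, PHI = -1.2, 18.0, 2.0, 30.0, 0.04

def spike_count(i_inj, t_stop=2000.0, dt=0.05):
    v, w = -60.0, 0.015
    spikes, armed = 0, True
    for _ in range(int(t_stop / dt)):
        m_inf = 0.5 * (1.0 + math.tanh((v - V1) / V2))
        w_inf = 0.5 * (1.0 + math.tanh((v - V3) / V4))
        tau_w = 1.0 / math.cosh((v - V3) / (2.0 * V4))
        dv = (i_inj - GL * (v - EL) - GCA * m_inf * (v - ECA)
              - GK * w * (v - EK)) / C
        dw = PHI * (w_inf - w) / tau_w
        v += dt * dv
        w += dt * dw
        if armed and v > 10.0:    # upstroke crosses +10 mV: one spike
            spikes += 1
            armed = False
        elif v < -10.0:           # re-arm only after repolarization
            armed = True
    return spikes

low, mid, high = (spike_count(i) for i in (60.0, 100.0, 300.0))
# low drive: silent; intermediate drive: tonic firing;
# strong drive: depolarization block (at most a spike or two)
```

With these parameters the fixed point is unstable roughly between 94 and 212 uA/cm^2 of injected current, so the intermediate drive fires tonically while the strong drive settles onto a depolarized plateau.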
We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever growing amounts of data, impeding conventional analysis methods from keeping pace. Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable. PANDORA is available to be freely used and modified because it is open-source (http://software.incf.org/software/pandora/home). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then we examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop.
Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics. First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test the sufficiency of multiple, independent spike initiation zones in the C-interneuron to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made to understand cerebellar cortical processing but neural coding at the output stage of the cerebellum in the deep cerebellar nuclei (DCN) remains poorly understood. The DCN are unlikely to be just a relay nucleus because Purkinje cell inhibition has to be turned into an excitatory output signal, and DCN neurons exhibit complex intrinsic properties.
In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals. Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. We generated a good match of our simulations with DCN current clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow deinactivation of rebound currents. Fast rebound bursts due to T-type calcium current and slow rebounds due to persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by HCN current. Our findings suggest that active properties of DCN neurons could play a crucial role for signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches.
This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior. This open source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation channel gating, the mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local-field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform. Action and Language Mechanisms in the Brain: Data, Models and Neuroinformatics Neuroinformatics Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience.
Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on the knowledge of the three-dimensional structure of the protein of interest. However it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess which is the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a nonredundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php. Operating system(s): Platform independent. Programming language: Perl/BioPerl (program); mySQL, Perl DBI and DBD modules (database); php, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI Blast package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free.
Any restrictions to use by nonacademics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have encountered problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that often it is futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system.
The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org).
We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and functions. They make up tissues and our bodies. A single cell includes organelles and intracellular solutions, and it is separated from the outer environment of extracellular fluid surrounding the cell by its cell membrane (plasma membrane), generating differences in concentrations of ions and molecules including enzymes. The differences in ionic charge and concentration give rise, respectively, to electrical and chemical potentials, driving the transport of materials across the membrane. Here we look at the core elements of mathematical modeling of the dynamic behaviors of single cells, as well as the basics of numerical simulation. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program on which users can simulate and/or create a model with graphical representations corresponding to the concept of ISML, such as modules and edges. ISIDE also has a command line interface to manipulate large scale models based on Python, which is a powerful scripting language.
ISIDE exports ISML models into C++ source codes, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp. Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the information necessary and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method and make available computer scripts which automate the process of citation entry. We use an Open Citation Project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article.
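The field extraction that such citation-entry scripts perform can be sketched with a single regular expression. The pattern below handles one common "Authors. Title. Journal Year;Volume:Pages" layout and is a hypothetical stand-in for illustration, not the Perl PARSER module referenced above:

```python
import re

# Minimal regex-based citation parser: pull out the fields needed to
# look a reference up in PubMed. Handles only one simple layout.
CITATION_RE = re.compile(
    r"^(?P<authors>[^.]+)\.\s+"
    r"(?P<title>[^.]+)\.\s+"
    r"(?P<journal>[^.;]+)\s+"
    r"(?P<year>\d{4});(?P<volume>\d+):(?P<pages>[\d-]+)"
)

def parse_citation(text):
    # return a dict of named fields, or None if the layout doesn't match
    m = CITATION_RE.match(text.strip())
    return m.groupdict() if m else None

fields = parse_citation(
    "Hines ML, Carnevale NT. The NEURON simulation environment. "
    "Neural Comput 1997;9:1179-1209"
)
```

A real pipeline would then use the extracted fields to query PubMed and replace the hand-typed citation with the validated record.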
Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. For, in order to understand how a cell distinguishes between input patterns we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain (${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and frontends requires high bioinformatic skills.
Consequently, the manual inspection of database entries and citations is a time-consuming task for which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques become increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in life sciences. Enriched by a large number of selected references to further literature, the following sections will successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or not) existing empirical results and make testable predictions.
The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Effects of electrical coupling among layer 4 inhibitory interneurons on contrast-invariant orientation tuning Experimental Brain Research Summary This chapter constitutes mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was a part of the Annual Computational Neuroscience Meeting CNS*2007 that took place in July 2007 in Toronto, Canada (http://www.cnsorg.org). The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data.
Selected discussed topics, including the review of some current and potential applications of Computational Intelligence (CI) in electrophysiology, database and electrophysiological data exchange platforms, languages, and formats, as well as exemplary analysis problems, are presented in this chapter. The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers, whose interest will be sparked by its contents. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. We are particularly interested in the slower timescale dynamics that determine overall excitability, and set the stage for transient episodes of persistent oscillations, working memory, or seizures. In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics on the stability of persistent activity. We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions. Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in neonatal brain or spinal cord, up states, seizures, and cortical oscillations.
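The qualitative role of glial buffering in this picture can be caricatured with a one-variable extracellular potassium balance: neuronal efflux against pump reuptake and glial uptake. All rate constants below are arbitrary values chosen only for shape, not parameters from these papers:

```python
# Toy extracellular K+ balance: constant neuronal efflux, pump
# clearance, and glial buffering. Weakened glial uptake lets [K+]o
# settle at a higher, potentially seizure-supporting level.
def steady_ko(glia_strength, efflux=0.6, t_stop=2000.0, dt=0.5):
    ko = 4.0                               # [K+]o in mM, near baseline
    for _ in range(int(t_stop / dt)):
        pump = 1.0 * (ko - 4.0)            # Na/K pump clears excess K+
        glia = glia_strength * (ko - 4.0)  # glial buffering, same form
        ko += dt * 0.01 * (efflux - pump - glia)
    return ko

healthy = steady_ko(glia_strength=2.0)    # strong buffering
impaired = steady_ko(glia_strength=0.1)   # weakened glia
```

At steady state [K+]o = 4 + efflux/(1 + glia_strength) mM, so reducing glial uptake raises the baseline; in the full conductance-based networks this elevated potassium is what destabilizes persistent activity.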
Abstract The spatial variation of the extracellular action potentials (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from the EAP data sampled by an extracellular probe at multiple independent recording locations. We used a dipole for the model source because there is extensive evidence it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell-probe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole moment optimization where the dipole location is fixed, and a second, nonlinear, global optimization of the source location. We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis. We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among various model error contributions to the residual, we address especially the error in probe geometry, and the extent to which it biases estimates of dipole parameters.
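The linear half of such a procedure is ordinary ridge (Tikhonov-regularized) least squares: with the dipole location fixed, the moment m minimizing |Lm − φ|² + λ|m|² solves the 3×3 normal equations (LᵀL + λI)m = Lᵀφ, where the rows of L are the probe's lead fields. The lead-field numbers below are made up for the demo; in a real application L would come from a finite-element probe + brain model:

```python
# Ridge-regularized dipole-moment fit, the linear step of dipole
# characterization (toy numbers, not the published pipeline).
def ridge_moment(L, phi, lam=1e-3):
    n = len(L)
    # normal equations: A = L^T L + lam*I, b = L^T phi
    A = [[sum(L[k][i] * L[k][j] for k in range(n))
          + (lam if i == j else 0.0) for j in range(3)] for i in range(3)]
    b = [sum(L[k][i] * phi[k] for k in range(n)) for i in range(3)]

    def det3(M):
        return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
              - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
              + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

    # Cramer's rule is fine for a fixed 3x3 system
    d = det3(A)
    m = []
    for i in range(3):
        Ai = [row[:] for row in A]
        for r in range(3):
            Ai[r][i] = b[r]
        m.append(det3(Ai) / d)
    return m

# four recording sites; lead-field rows chosen arbitrarily for the demo
L = [[1.0, 0.2, 0.1], [0.3, 1.1, 0.2], [0.1, 0.4, 0.9], [0.5, 0.5, 0.5]]
true_m = [2.0, -1.0, 0.5]
phi = [sum(L[k][i] * true_m[i] for i in range(3)) for k in range(4)]
est = ridge_moment(L, phi)   # close to true_m for small lam
```

Because this inner step is linear and cheap, it can be repeated at every candidate location while a nonlinear outer search optimizes the source position.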
This dipole characterization method can be applied to any recording technique that has the capabilities of taking multiple independent measurements of the same single units. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics. We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodium-potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. These models reveal several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy. Abstract This paper introduces dyadic brain modeling – the simultaneous, computational modeling of the brains of two interacting agents – to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the ‘dyad’). 
The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists and modelers that will help speed the emergence of what may be called comparative neuroprimatology. Abstract The phase response curve (PRC) reflects the dynamics of the interplay between diverse intrinsic conductances that lead to spike generation. PRCs measure the spike time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddle-node) dynamics, while type II (subcritical Hopf) dynamics yield a biphasic PRC with both negative and positive lobes. Previous computational work hypothesized that cholinergic modulation of the M-type potassium current can switch a neuron with type II dynamics to type I dynamics. We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with downregulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential specific K-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to downregulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent I_AHP is downregulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified at low ACh levels, while the PRC became type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model. 
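Measuring a PRC of the kind described above takes only a perturbation loop around a spiking model. The sketch below uses a quadratic integrate-and-fire neuron, the normal form of type I (saddle-node) dynamics, rather than the authors' layer 2/3 models; as expected for type I, the resulting PRC is positive at every phase:

```python
import numpy as np

def qif_spike_time(kick_time=None, kick=0.05, v0=-1.0, I=1.0,
                   v_spike=20.0, dt=1e-4):
    """Time to spike for a quadratic integrate-and-fire neuron
    (dv/dt = v^2 + I), optionally perturbed once by a small
    depolarizing voltage kick delivered at t = kick_time."""
    v, t, kicked = v0, 0.0, False
    while v < v_spike:
        if kick_time is not None and not kicked and t >= kick_time:
            v += kick
            kicked = True
        v += dt * (v * v + I)
        t += dt
    return t

T0 = qif_spike_time()                  # unperturbed period
phases = np.linspace(0.1, 0.9, 9)
# PRC sample: spike-time advance as a function of perturbation phase
prc = np.array([T0 - qif_spike_time(kick_time=p * T0) for p in phases])
```

A type II model would instead show a negative lobe over part of the cycle; here the depolarizing kick advances the spike at every phase.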
This work suggests that cholinergic modulation of slow potassium currents may shift neuronal responding between “resonator” and “integrator” modes. Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double precision values by each subtree at each time step. Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For compute-bound simulations, load balance results in almost ideal runtime scaling. Application of the cell splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing. Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts. It has been recently reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. Ca2+-activated K+ currents (I_K(Ca)) of cultured human cardiac fibroblasts were characterized in this study. In whole-cell configuration, depolarizing pulses evoked I_K(Ca) with outward rectification in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance, Ca2+-activated K+ (BK_Ca) channel with a single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BK_Ca channels. The dynamic Luo-Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. 
In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; however, they were electrically inert when there was no gap-junctional coupling. The simulation predicts that changes in gap junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. I_K(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes. This study demonstrates that a BK_Ca channel is functionally expressed in human cardiac fibroblasts. The activity of these BK_Ca channels present in human cardiac fibroblasts may contribute to the functional activities of heart cells through transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of their solutions and the transitions between them as parameters are varied. To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multicompartment models. We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation: tonic spiking, intermediate bursting, and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. 
The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multicompartment conductance-based model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli. Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement. Results We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. 
Conclusion Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement and underlies the concentration-independence of odor quality representations. Abstract Mathematical neuronal models are normally expressed using differential equations. The Parker-Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker-Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin-Huxley type neuron, comparing the results with those obtained using the Runge-Kutta and Bulirsch-Stoer methods. Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC-granule cell (GC) circuit with and without gap junctions between GoCs. 
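Returning to the Parker-Sochacki method described above: it can be illustrated on the scalar polynomial equation dv/dt = v², whose Maclaurin coefficients follow from a Cauchy product, and raising the order until the newest term falls below tolerance gives the adaptive error control the abstract mentions. This is a minimal sketch under those assumptions, not the paper's Izhikevich or Hodgkin-Huxley implementation:

```python
def ps_step(v0, dt, tol=1e-14, max_order=40):
    """One Parker-Sochacki step for dv/dt = v**2.
    The Maclaurin coefficients satisfy (n+1)*c[n+1] = sum_i c[i]*c[n-i]
    (a Cauchy product); the order grows until the newest term of the
    series evaluated at dt drops below tol."""
    c = [v0]          # power-series coefficients of v(t)
    v = v0            # running partial sum v(dt)
    term, n = v0, 0
    while abs(term) > tol and n < max_order:
        cauchy = sum(c[i] * c[n - i] for i in range(n + 1))
        c.append(cauchy / (n + 1))
        n += 1
        term = c[n] * dt ** n
        v += term
    return v

# Exact solution of dv/dt = v^2 is v0 / (1 - v0*t)
v_ps = ps_step(0.5, 0.1)
v_exact = 0.5 / (1 - 0.5 * 0.1)
```

Each added order costs one Cauchy product; for systems of equations, one maintains a coefficient list per state variable and forms the products term by term in the same way.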
Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change oscillation frequency or the mean firing rate of either GoCs or GCs. Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs. The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours or days of supercomputing time. Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. As a consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics and intrinsic noise. 
We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from the near-perfect matches obtained in the unperturbed case. Curiously, some perturbations have little effect on the error because they are compensated by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data. Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes from development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models. Recent studies show that both the conductance and the kinetics of these receptors change voltage-dependently in some parts of the brain. Therefore, several models have been introduced to simulate their current. However, on the one hand, kinetic models, which are able to simulate these voltage-dependent phenomena, are computationally expensive for the modeling of large neural networks. On the other hand, classic exponential models, which are computationally less expensive, are not able to simulate the voltage dependency of these receptors accurately. In this study, we have modified these classic models to endow them with voltage-dependent conductance and time constants. Temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of the NMDA receptor’s behavior using only three to four differential equations, far fewer than in previous kinetic models. Consequently, it seems that our model is both fast and physiologically plausible, and it is therefore a suitable candidate for the modeling of large neural networks. 
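A minimal runnable version of the "classic exponential model" that the NMDA work starts from is shown below: a double-exponential gating variable multiplied by the standard Jahr-Stevens magnesium-block factor. The time constants and gmax are illustrative, and the paper's actual contribution (making the conductance and time constants themselves voltage-dependent, plus temperature sensitivity and desensitization) is not reproduced here:

```python
import math

def nmda_current(t, V, gmax=1.0, tau_rise=2.0, tau_decay=100.0,
                 mg=1.0, E_rev=0.0):
    """Classic double-exponential NMDA synapse with the Jahr-Stevens
    magnesium block.  Times in ms, voltages in mV, gmax in arbitrary
    units; returns the synaptic current (negative = inward)."""
    s = math.exp(-t / tau_decay) - math.exp(-t / tau_rise)   # gating
    block = 1.0 / (1.0 + mg * math.exp(-0.062 * V) / 3.57)   # Mg2+ unblock
    return gmax * s * block * (V - E_rev)

i_hyp = nmda_current(20.0, -70.0)   # hyperpolarized: strong Mg2+ block
i_dep = nmda_current(20.0, -20.0)   # depolarized: block largely relieved
```

At -20 mV the relieved Mg2+ block more than offsets the smaller driving force, so the inward current is larger than at -70 mV, reproducing the characteristic J-shaped NMDA current-voltage relation.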
Abstract Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons. This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal cell firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that there is a clear rhythmic gating effect of the γ-oscillating interneuron network on the pyramidal neuron’s signal transmission. This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and appropriate inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of coherence and oscillation frequency of network activities. Abstract Background Propagation of simulated action potentials (APs) was previously studied in short single chains and in two-dimensional sheets of myocardial cells [1-3]. The present study was undertaken to examine propagation in a long single chain of cells of various lengths, and with varying numbers of gap-junction (gj) channels, and to compare propagation velocity with cable properties such as the length constant (λ). Methods and Results Simulations were carried out using the PSpice program as previously described. When the electric field (EF) mechanism was dominant (0, 1, and 10 gj channels), the longer the chain length, the faster the overall velocity (θ_ov). 
There seems to be no simple explanation for this phenomenon. In contrast, when the local-circuit current mechanism was dominant (100 gj channels or more), θ_ov was slightly slowed with lengthening of the chain. Increasing the number of gj channels produced an increase in θ_ov and caused the firing order to become more uniform. The end effect was more pronounced at longer chain lengths and at greater numbers of gj channels. When there were no or only few gj channels (namely 0, 10, or 30), the voltage change (ΔV_m) in the two cells (#50 and #52) contiguous to the cell injected with current (#51) was nearly zero, i.e., there was a sharp discontinuity in voltage between the adjacent cells. When there were many gj channels (e.g., 300, 1000, 3000), there was an exponential decay of voltage on either side of the injected cell, with the length constant (λ) increasing at higher numbers of gj channels. The effect of increasing the number of gj channels on increasing λ was relatively small compared to the larger effect on θ_ov. θ_ov became very non-physiological at 300 gj channels or higher. Conclusion Thus, when there were only 0, 1, or 10 gj channels, θ_ov increased with increase in chain length, whereas at 100 gj channels or higher, θ_ov did not increase with chain length. When there were only 0, 10, or 30 gj channels, there was a very sharp decrease in ΔV_m in the two contiguous cells on either side of the injected cell, whereas at 300, 1000, or 3000 gj channels, the voltage decay was exponential along the length of the chain. The effect of increasing the number of gj channels on the spread of current was relatively small compared to the large effect on θ_ov. Abstract This article provides a demonstration of an analytical technique that can be used to investigate the causes of perceptual phenomena. The technique is based on the concept of the ideal observer, an optimal signal classifier that makes decisions that maximize the probability of a correct response. 
To demonstrate the technique, an analysis was conducted to investigate the role of the auditory periphery in the production of temporal masking effects. The ideal observer classified output from four models of the periphery. Since the ideal observer is the best of all possible observers, if it demonstrates masking effects, then all other observers must as well. If it does not demonstrate masking effects, then nothing about the periphery requires masking to occur, and therefore masking must arise somewhere else. The ideal observer exhibited several forward masking effects but did not exhibit backward masking, implying that the periphery has a causal role in forward but not backward masking. A general discussion of the strengths of the technique and supplementary equations are also included. Abstract Understanding the human brain and its function in health and disease represents one of the greatest scientific challenges of our time. In the post-genomic era, an overwhelming accumulation of new data, at all levels of exploration from DNA to human brain imaging, has been acquired. This accumulation of facts has not given rise to a corresponding increase in the understanding of integrated functions in this vast area of research, which involves a large number of fields extending from genetics to psychology. Neuroinformatics (NI) is uniquely placed at the intersection between neuroscience and information technology, and emerges as an area of critical importance to facilitate future conceptual development in neuroscience by creating databases which transcend different organizational levels and allow for the development of computational models from the subcellular to the global brain level. Abstract This paper studied synaptic and dendritic integration with different spatial distributions of synapses on the dendrites of a biophysically detailed layer 5 pyramidal neuron model. 
It has been observed that temporally synchronous and spatially clustered synaptic inputs make dendrites perform a highly nonlinear integration. The effect of the clustering degree of the synaptic distribution on neuronal responsiveness is investigated by changing the number of top apical dendrites to which active synapses are allocated. The neuron shows maximum responsiveness to synaptic inputs with an intermediate clustering degree of spatial distribution, indicating complex interactions among dendrites in the presence of nonlinear synaptic and dendritic integration. Abstract This paper describes a pilot query interface that has been constructed to help us explore a “concept-based” approach for searching the Neuroscience Information Framework (NIF). The query interface is concept-based in the sense that the search terms submitted through the interface are selected from a standardized vocabulary of terms (concepts) that are structured in the form of an ontology. The NIF contains three primary resources: the NIF Resource Registry, the NIF Document Archive, and the NIF Database Mediator. These NIF resources are very different in their nature and therefore pose challenges when designing a single interface from which searches can be automatically launched against all three resources simultaneously. The paper first briefly discusses several background issues involving the use of standardized biomedical vocabularies in biomedical information retrieval, and then presents a detailed example that illustrates how the pilot concept-based query interface operates. The paper concludes by discussing certain lessons learned in the development of the current version of the interface. 
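The essence of a concept-based search of this kind is that the query term is expanded through the ontology before being matched against resource annotations. The sketch below uses a made-up two-level vocabulary and registry; none of these names come from the actual NIF resources:

```python
# Toy ontology: each concept maps to its direct subclasses.  A real
# concept-based interface would draw these relations from a
# standardized vocabulary; all names here are hypothetical.
ONTOLOGY = {
    "neuron": ["pyramidal cell", "interneuron"],
    "interneuron": ["basket cell", "chandelier cell"],
}

REGISTRY = [
    {"name": "CellPropDB", "tags": {"pyramidal cell", "ion channel"}},
    {"name": "SpikeArchive", "tags": {"basket cell"}},
    {"name": "GeneAtlas", "tags": {"gene expression"}},
]

def expand(concept):
    """Return the concept plus all of its transitive subclasses."""
    out, stack = set(), [concept]
    while stack:
        c = stack.pop()
        if c not in out:
            out.add(c)
            stack.extend(ONTOLOGY.get(c, []))
    return out

def search(concept):
    """Match registry entries against the ontology-expanded query."""
    terms = expand(concept)
    return [r["name"] for r in REGISTRY if r["tags"] & terms]
```

An entry tagged only with "basket cell" is still returned for the broader query "neuron", which plain keyword matching would miss; this is the advantage the concept-based approach aims for.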
Abstract Simulations of orientation selectivity in visual cortex have shown that layer 4 complex cells lacking orientation tuning are ideal for providing global inhibition that scales with contrast in order to produce simple cells with contrast-invariant orientation tuning (Lauritzen and Miller in J Neurosci 23:10201–10213, 2003). Inhibitory cortical cells have been shown to be electrically coupled by gap junctions (Fukuda and Kosaka in J Neurosci 120:5–20, 2003). Such coupling promotes, among other effects, spike synchronization and coordination of postsynaptic IPSPs (Beierlein et al. in Nat Neurosci 3:904–910, 2000; Galarreta and Hestrin in Nat Rev Neurosci 2:425–433, 2001). Consequently, it was expected (Miller in Cereb Cortex 13:73–82, 2003) that electrical coupling would promote nonspecific functional responses consistent with the complex inhibitory cells seen in layer 4, which provide broad inhibition in response to stimuli of all orientations (Miller et al. in Curr Opin Neurobiol 11:488–497, 2001). This was tested using a mechanistic modeling approach. The orientation selectivity model of Lauritzen and Miller (J Neurosci 23:10201–10213, 2003) was reproduced with and without electrical coupling between complex inhibitory neurons. Although extensive coupling promotes uniform firing in complex cells, there were no detectable improvements in contrast-invariant orientation selectivity unless there were coincident changes in complex cell firing rates to offset the untuned excitatory component that grows with contrast. Thus, changes in complex cell firing rates (with or without coupling), rather than synchronization of complex inhibitory neurons alone, could improve the contrast-invariant orientation tuning of simple cells. 
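The logic being tested above can be seen in a toy rate model: if untuned inhibition is subtractive and scales with contrast, it clips off exactly the part of the tuned input that grows with contrast, leaving the tuning width contrast-invariant. This caricature is not the Lauritzen-Miller spiking network, and all parameter values are illustrative:

```python
import numpy as np

def simple_cell_response(theta_deg, contrast, kappa=4.0, inh_gain=0.5):
    """Rate caricature of a layer-4 simple cell: orientation-tuned
    excitation scaling with contrast, minus untuned inhibition from
    complex cells that also scales with contrast, then rectification."""
    theta = np.deg2rad(theta_deg)
    excitation = contrast * np.exp(kappa * (np.cos(2 * theta) - 1.0))
    inhibition = contrast * inh_gain          # untuned, contrast-scaled
    return np.maximum(excitation - inhibition, 0.0)

angles = np.linspace(-90, 90, 181)
low = simple_cell_response(angles, 0.2)       # low contrast
high = simple_cell_response(angles, 1.0)      # high contrast
```

Raising contrast scales the response multiplicatively, while the set of orientations with nonzero response is unchanged; that is the contrast-invariant tuning the untuned, contrast-scaled inhibition provides.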
Ontogenetic Ritualization of Primate Gesture as a Case Study in Dyadic Brain Modeling Neuroinformatics Summary This chapter constitutes mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was a part of the Annual Computational Neuroscience Meeting CNS*2007 that took place in July 2007 in Toronto, Canada (http://www.cnsorg.org). The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Plant Genome DataBase Japan (PGDBj): A Portal Website for the Integration of Plant Genome-Related Databases. Plant & cell physiology DBM-DB: the diamondback moth genome database. Database: the journal of biological databases and curation Modeling and Validating Chronic Pharmacological Manipulation of Circadian Rhythms CPT: Pharmacometrics & Systems Pharmacology Circadian rhythms can be entrained by a light-dark (LD) cycle and can also be reset pharmacologically, for example, by the CK1δ/ε inhibitor PF-670462. Here, we determine how these two independent signals affect circadian timekeeping from the molecular to the behavioral level. 
By developing a systems pharmacology model, we predict and experimentally validate that chronic CK1δ/ε inhibition during the earlier hours of an LD cycle can produce a constant stable delay of rhythm. However, chronic dosing later during the day, or in the presence of longer light intervals, is not predicted to yield an entrained rhythm. We also propose a simple method based on phase response curves (PRCs) that predicts the effects of an LD cycle and chronic dosing of a circadian drug. This work indicates that dosing timing and environmental signals must be carefully considered for accurate pharmacological manipulation of circadian phase. CPT: Pharmacometrics & Systems Pharmacology (2013) 2, e57; doi:10.1038/psp.2013.34; published online 17 July 2013 Pharmacological upregulation of h-channels reduces the excitability of pyramidal neuron dendrites Nature Neuroscience The dendrites of pyramidal neurons have markedly different electrical properties from those of the soma, owing to the non-uniform distribution of voltage-gated ion channels in dendrites. It is thus possible that drugs acting on ion channels might preferentially alter dendritic, but not somatic, excitability. Using dendritic and somatic whole-cell and cell-attached recordings in rat hippocampal slices, we found that the anticonvulsant lamotrigine selectively reduced action potential firing from dendritic depolarization, while minimally affecting firing at the soma. This regional and input-specific effect resulted from an increase in the hyperpolarization-activated cation current (Ih), a voltage-gated current present predominantly in dendrites. These results demonstrate that neuronal excitability can be altered by drugs acting selectively on dendrites, and suggest an important role for Ih in controlling dendritic excitability and epileptogenesis. 
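The PRC-based prediction method proposed in the circadian pharmacology abstract above can be caricatured as a one-dimensional phase map: each day the clock advances by its free-running period and is shifted by the phase response curve of the daily perturbation, and stable entrainment corresponds to a stable fixed point of the map. A toy sketch (the sinusoidal PRC and all parameter values here are illustrative assumptions, not taken from the paper):

```python
import math

def iterate_phase(phi0, prc, period=23.7, zeitgeber=24.0, n=300):
    # Iterate the daily phase map: phi -> phi + free-running period + PRC(phi),
    # modulo the zeitgeber cycle. Convergence to a fixed point models stable
    # entrainment; failure to converge models drift of the rhythm.
    phi = phi0
    for _ in range(n):
        phi = (phi + period + prc(phi)) % zeitgeber
    return phi

def toy_prc(phi):
    # Illustrative sinusoidal phase response curve (hours of shift per day).
    return 0.5 * math.sin(2.0 * math.pi * phi / 24.0)
```

Chronic drug dosing adds a second, timing-dependent PRC term to the same map, which is why the paper finds that dose timing relative to the light cycle determines whether an entrained phase exists at all.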
Temporal synchrony and gamma-to-theta power conversion in the dendrites of CA1 pyramidal neurons Nature Neuroscience Timing is a crucial aspect of synaptic integration. For pyramidal neurons that integrate thousands of synaptic inputs spread across hundreds of microns, it is thus a challenge to maintain the timing of incoming inputs at the axo-somatic integration site. Here we show that pyramidal neurons in the rodent hippocampus use a gradient of inductance in the form of hyperpolarization-activated cation-nonselective (HCN) channels as an active mechanism to counteract location-dependent temporal differences of dendritic inputs at the soma. Using simultaneous multi-site whole-cell recordings complemented by computational modeling, we find that this intrinsic biophysical mechanism produces temporal synchrony of rhythmic inputs in the theta and gamma frequency ranges across wide regions of the dendritic tree. While gamma and theta oscillations are known to synchronize activity across space in neuronal networks, our results identify a new mechanism by which this synchrony extends to activity within single pyramidal neurons with complex dendritic arbors. Axiope Neuroinformatics Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. 
The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. Programming language: Perl, BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI Blast package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. 
Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that often it is futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past 2 decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. 
A second approach, which is operational on a pilot basis, is for the NLB Web server to create dynamically its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and functions. They make up tissues and our bodies. 
A single cell includes organelles and intracellular solutions, and it is separated from the extracellular fluid surrounding it by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charge and concentration give rise to electrical and chemical potentials, respectively, which drive the transport of materials across the membrane. Here we examine the core elements of mathematical modeling of the dynamic behavior of single cells, as well as the foundations of numerical simulation. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program on which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a Python-based command-line interface for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. 
database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the information necessary and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method and make available computer scripts which automate the process of citation entry. We use an Open Citation Project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. 
For, in order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (balanced truncation) and frequency-domain (H2 approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front ends requires high bioinformatic skills. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. 
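The time-domain reduction named in the dimension-reduction abstract above (balanced truncation of the quasi-active, i.e. linearized, cable model) follows a standard recipe for any stable linear system: compute the controllability and observability Gramians, balance them, and discard the states with small Hankel singular values. A generic sketch, not the authors' code (function and variable names are ours):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

def balanced_truncation(A, B, C, r):
    # Gramians of the stable LTI system x' = Ax + Bu, y = Cx:
    #   A Wc + Wc A^T + B B^T = 0   (controllability)
    #   A^T Wo + Wo A + C^T C = 0   (observability)
    Wc = solve_continuous_lyapunov(A, -B @ B.T)
    Wo = solve_continuous_lyapunov(A.T, -C.T @ C)
    Lc = cholesky(Wc, lower=True)
    Lo = cholesky(Wo, lower=True)
    U, s, Vt = svd(Lo.T @ Lc)            # s holds the Hankel singular values
    S_r = np.diag(s[:r] ** -0.5)
    T = Lc @ Vt[:r].T @ S_r              # projection onto dominant states
    Ti = S_r @ U[:, :r].T @ Lo.T
    return Ti @ A @ T, Ti @ B, C @ T, s
```

For a quasi-active cable model, `A` couples compartment voltages and linearized gating variables, so thousands of states can often be compressed to a handful while preserving the soma response, which is the source of the paper's four-orders-of-magnitude speedups.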
While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or not) existing empirical results and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. 
We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib ( The Handbook of Brain Theory and Neural Networks , pp. 741–745, 2003); Arbib and Grethe ( Computing the Brain: A Guide to Neuroinformatics , 2001); Ascoli ( Computational Neuroanatomy: Principles and Methods , 2002); Bower and Bolouri ( Computational Modeling of Genetic and Biochemical Networks , 2001); Hines et al. ( J. Comput. Neurosci. 17 , 7–11, 2004); Shepherd et al. ( Trends Neurosci. 21 , 460–468, 1998); Sivakumaran et al. ( Bioinformatics 19 , 408–415, 2003); Smolen et al. ( Neuron 26 , 567–580, 2000); Vadigepalli et al. ( OMICS 7 , 235–252, 2003)]. 
Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the simulator for neural networks and action potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu . Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by allowing them to run the models as published and to build on them for further research. Its use can help the field of computational neuroscience enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA-A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. 
Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA-A reversal potential (E_GABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide a high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal to elicit/detect the electrical excitation at the single-cell level and to observe the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have been typically correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of the reconstruction files is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. 
Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic type (pyramidal cell dendrites and olivary axonal projection terminals), and labeling method (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions coming to the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductances of their dendritic membrane. This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand. 
This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high- and low-depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single-neuron models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. 
Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS-US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. 
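The TD algorithm referred to in the dopamine abstract has a very small core: a prediction error delta = r + gamma*V(next) - V(current) that both trains the value function and mimics the phasic dopamine signal. A minimal tapped-delay-line TD(0) sketch (parameter values and the serial-state task representation are illustrative; the paper's point is precisely that a learned LSTM representation can replace this hand-built delay line):

```python
import numpy as np

def td_learn(n_states=10, reward_state=9, episodes=500, alpha=0.1, gamma=0.98):
    # Tapped-delay-line TD(0): each state is one time step of a trial and
    # reward arrives at 'reward_state'. The prediction error 'delta' plays
    # the role of the phasic dopamine signal.
    V = np.zeros(n_states + 1)            # value per time step (terminal = 0)
    for _ in range(episodes):
        for s in range(n_states):
            r = 1.0 if s == reward_state else 0.0
            delta = r + gamma * V[s + 1] - V[s]
            V[s] += alpha * delta
    return V
```

After training, value has propagated back to the trial's first state (the CS), and the prediction error at reward time is near zero, reproducing the classic shift of dopamine responses from the US to the CS.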
The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. On mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses uniformly distributed along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and function of effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization in general and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and with interburst intervals decreasing as the intensity of activation rose. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. 
Such a repertoire (consisting of two patterns of activity, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of output discharge patterns of different complexities, including stochastic ones. A relatively short apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites, and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in the processes of formation of the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the most well-known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta-modulated and space-modulated. 
The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using nonhomogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture that is subject to a clocking rhythm, independently of involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed-rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in computational neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org.
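The PC/IC scheme above rests on two standard ingredients: an inhomogeneous (theta-modulated) Poisson input and a conductance-based integrate-and-fire cell. A minimal single-cell sketch of both, with illustrative parameter values that are assumptions of this example, not values taken from the paper:

```python
import math
import random

def theta_poisson_spikes(t_stop, rate_max, theta_freq=8.0, seed=1):
    """Inhomogeneous Poisson spike train with a theta-modulated rate,
    generated by thinning: draw candidates at rate_max, then accept each
    with probability rate(t) / rate_max. Times are in seconds."""
    rng = random.Random(seed)
    spikes, t = [], 0.0
    while True:
        t += rng.expovariate(rate_max)
        if t > t_stop:
            return spikes
        rate = 0.5 * rate_max * (1.0 + math.cos(2.0 * math.pi * theta_freq * t))
        if rng.random() < rate / rate_max:
            spikes.append(t)

def conductance_lif(spikes, t_stop, dt=1e-4):
    """Leaky integrate-and-fire neuron with an exponential excitatory
    conductance driven by the input train. All parameters (weight, time
    constants, thresholds) are illustrative, not the published values."""
    v, g, out, i = -65.0, 0.0, [], 0
    v_rest, v_thresh, v_reset = -65.0, -50.0, -65.0
    tau_m, tau_syn, e_syn, w = 0.020, 0.005, 0.0, 1.0
    for k in range(int(t_stop / dt)):
        t = k * dt
        while i < len(spikes) and spikes[i] <= t:
            g += w          # instantaneous conductance jump per input spike
            i += 1
        g -= dt * g / tau_syn
        v += dt * ((v_rest - v) + g * (e_syn - v)) / tau_m
        if v >= v_thresh:   # fire and reset
            out.append(t)
            v = v_reset
    return out
```

With these numbers the cell fires preferentially near the theta peaks, where the input rate is highest; an inhibitory cell would be modeled the same way with a hyperpolarized synaptic reversal potential.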
In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed morphological data when generating cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to incorporate known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators such as the NEURON simulator of Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any newly available data can be included in NeuGen 2.0. New brain areas and cell types can also be defined, with the possibility of constructing user-defined cell types and networks. NeuGen 2.0 is therefore a software package that grows with each new piece of anatomical data, continually increasing the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionality to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is done in the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation.
New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently of the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising developments of the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from genomics, transcriptomics, proteomics, and metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data requires both standard data formats and public repositories. Standardization and a public storage system are also important for modeling because they make it possible to share models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks such as numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor.
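For a single unbranched cable, the direct Gaussian elimination mentioned above reduces to the tridiagonal Thomas algorithm; branched trees, and the subtree splitting described here, generalize the same forward-elimination/back-substitution pass. A minimal sketch of that core step (illustrative only, not NEURON's actual implementation):

```python
def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system A x = d, where A has sub-diagonal a,
    main diagonal b, and super-diagonal c (a[0] and c[-1] are unused).
    This is Gaussian elimination specialized to the banded matrices that
    arise from implicit discretization of the cable equation."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                      # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

Splitting a tree at points with at most two connections to other subtrees lets each processor run this pass on its own subtree, exchanging only the few boundary unknowns.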
It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average per-processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” The B6 database: a tool for the description and classification of vitamin B6-dependent enzymatic activities and of the corresponding protein families. BMC bioinformatics A new method to infer higher-order spike correlations from membrane potentials. Journal of Computational Neuroscience Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means of facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes.
The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. Comparing the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): platform independent. Programming language: Perl/BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI BLAST package, DSSP, Speedfill (Surfnet) and PSAIA. License: free. Any restrictions to use by non-academics: no. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge.
Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have encountered problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can link out directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways.
A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach allows other resources (in addition to NCBI Entrez) to link out to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages, and the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th Computational Neuroscience Meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies.
A single cell includes organelles and intracellular solutions, and it is separated from the outer environment of extracellular fluid surrounding it by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charge and in concentration give rise, respectively, to electrical and chemical potentials, driving transport of materials across the membrane. Here we look at the core elements of mathematical modeling of the dynamic behaviors of single cells, as well as the bases of numerical simulation. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks: insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML for describing mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program with which users can simulate and/or create a model using graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command-line interface for manipulating large-scale models, based on Python, a powerful scripting language. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e.
a database of ISML models (Model DB), a database of time-series data (Timeseries DB) and a database of morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results be reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup needed to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases have started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method, and make available computer scripts, that automate the process of citation entry. We use an Open Citation Project Perl module (PARSER) for parsing citation data, which is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations.
Indeed, in order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (balanced truncation) and frequency-domain ($\mathcal{H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. Biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in the granularity of their data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front ends demands considerable bioinformatic skill. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing these challenges.
While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life-science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references for further reading, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is an XML-based language for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or fail to explain) existing empirical results, and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations.
We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to representing and integrating complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp. 741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)].
Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, as well as synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu . Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and to build on them for further research. Its use can help the field of computational neuroscience enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA-A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties.
Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA-A reversal potential (E_GABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced using specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and it allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology.
Here we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and the code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic types (pyramidal cell dendrites and olivary axonal projection terminals), and labeling methods (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of this study was to elucidate the biophysical mechanisms that can determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions arriving at the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductances of their dendritic membrane. This work continues our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metric asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other.
This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high- and low-depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the arrival time of such a series but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single-neuron models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and to the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and in this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling.
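In the simplest passive case, a multicompartmental model is just one current-balance equation per compartment, coupled through axial conductances. A two-compartment soma-dendrite sketch with illustrative (not measured) parameters:

```python
def two_compartment_step(v_s, v_d, i_inj, dt=1e-5):
    """One forward-Euler step of a passive soma-dendrite pair.
    Units: mV, nA, uS, nF (so uS*mV/nF = mV/ms); all parameter
    values are illustrative assumptions."""
    c_s, c_d = 0.1, 0.2        # compartment capacitances (nF)
    g_l, e_l = 0.01, -65.0     # leak conductance (uS) and reversal (mV)
    g_c = 0.05                 # axial coupling conductance (uS)
    dv_s = (g_l * (e_l - v_s) + g_c * (v_d - v_s) + i_inj) / c_s
    dv_d = (g_l * (e_l - v_d) + g_c * (v_s - v_d)) / c_d
    return v_s + dt * 1e3 * dv_s, v_d + dt * 1e3 * dv_d   # dt s -> ms

def run(i_inj=0.2, t_stop=0.2, dt=1e-5):
    """Inject a constant current into the soma and relax to steady state."""
    v_s = v_d = -65.0
    for _ in range(int(t_stop / dt)):
        v_s, v_d = two_compartment_step(v_s, v_d, i_inj, dt)
    return v_s, v_d
```

With 0.2 nA into the soma, the pair settles near -54.1 mV (soma) and -55.9 mV (dendrite), the dendrite lagging because the injected current must flow through the coupling conductance. Real multicompartment models add voltage-gated channels to each compartment, but the coupling structure is the same.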
Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS–US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive, biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network.
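For contrast with the learned time representation above, the classical TD account can be sketched with a tabular TD(0) learner over explicit time bins, which is exactly the delay-line assumption the LSTM model avoids. Over training, value spreads back from the reward, so the prediction error at the US (a stand-in for the phasic dopamine response) vanishes:

```python
def train_td(n_trials=200, n_steps=20, us=15, alpha=0.2, gamma=1.0):
    """Tabular TD(0) on one trial type: state = time bin since cue onset
    (a delay-line time code), with a reward of 1 delivered at bin `us`.
    Returns the learned values and the TD errors from the final trial.
    The bin count and learning rate are illustrative choices."""
    v = [0.0] * (n_steps + 1)          # v[n_steps] is the terminal state
    delta = [0.0] * n_steps
    for _ in range(n_trials):
        for t in range(n_steps):
            r = 1.0 if t == us else 0.0
            delta[t] = r + gamma * v[t + 1] - v[t]   # prediction error
            v[t] += alpha * delta[t]
    return v, delta
```

On the first trial the error at the US is exactly 1; after training, value is high throughout the interval and the final-trial error at the US is near zero, mirroring the transfer of the phasic signal away from the reward.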
The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. Using mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborization possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via uniformly distributed excitatory synapses along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and function of effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization in general and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals decreasing with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials.
Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of output discharge patterns of different complexities, including stochastic ones. A relatively short length of the apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in shaping the activity of layer 2/3 pyramidal neurons and, thereby, a higher efficiency of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated.
The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using nonhomogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of any involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in Computational Neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org.
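The phase-precession model above is built from integrate-and-fire neurons with conductance synapses driven by Poisson inputs; a minimal single-cell sketch of that model class follows. All parameter values are illustrative, not the paper's:

```python
import random

# Leaky integrate-and-fire neuron with one conductance-based excitatory
# synapse driven by homogeneous Poisson input: a minimal sketch of the model
# class used above. All parameter values are illustrative, not the paper's.
random.seed(1)
dt, T = 0.1, 1000.0                 # time step, duration (ms)
C_m, g_L, E_L = 1.0, 0.05, -65.0    # capacitance (nF), leak (uS), rest (mV)
v_th, v_reset = -50.0, -65.0        # threshold and reset (mV)
E_exc, tau_syn = 0.0, 5.0           # excitatory reversal (mV), decay (ms)
w, rate = 0.01, 800.0               # synaptic weight (uS), input rate (Hz)

v, g_syn, spikes = E_L, 0.0, 0
p_spike = rate * dt / 1000.0        # Poisson spike probability per step
for _ in range(int(T / dt)):
    if random.random() < p_spike:
        g_syn += w                  # presynaptic spike opens channels
    g_syn -= dt * g_syn / tau_syn   # exponential synaptic decay
    v += dt * (-g_L * (v - E_L) - g_syn * (v - E_exc)) / C_m
    if v >= v_th:                   # threshold crossing: emit a spike
        spikes += 1
        v = v_reset

# The output rate rises with the input rate through the conductance drive;
# making the input rate theta- and space-modulated, as in the model above,
# would turn the input into a nonhomogeneous Poisson process.
```
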
In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed data of morphologies when generating those cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators, such as the NEURON Simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new and available data can be included in NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionality to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation.
New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising recent developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data needs both standard data formats and public repositories. Standardization and a public storage system are also important for modeling due to the possibility of sharing models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks like numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor.
It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources, and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum.
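The "traditional approach" that iCSD 2D is compared against estimates CSD as the negative second spatial derivative of the recorded potential. Below is a 1D sketch on synthetic data; the conductivity, electrode spacing, and Gaussian "LFP" profile are assumed values, not from the study:

```python
import numpy as np

# Traditional CSD estimate: negative second spatial difference of the
# potential, scaled by conductivity. A 1D sketch on synthetic data; the
# conductivity, spacing, and Gaussian "LFP" profile are assumed values.
sigma = 0.3                    # extracellular conductivity (S/m), assumed
h = 0.1e-3                     # electrode spacing (m)
z = np.arange(16) * h          # a 16-contact linear probe
phi = np.exp(-((z - z.mean()) ** 2) / (2 * (0.2e-3) ** 2))   # synthetic LFP

# Second central difference at interior electrodes; note this cannot
# estimate CSD at the grid boundaries, one of the limitations iCSD lifts.
csd = -sigma * (phi[2:] - 2.0 * phi[1:-1] + phi[:-2]) / h**2

# The estimate peaks at the center of the potential bump and reverses sign
# in the flanks.
```

The iCSD family instead inverts an explicit forward model of how sources generate potentials, which is what allows boundary estimates and system-specific assumptions.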
The main advantages of the proposed method are the explicit specification of its assumptions, the possibility of including system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. However, the range of current strength needed for a depolarization block could easily be reached with a random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents both on an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block. We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, impeding conventional analysis methods from keeping pace.
Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting their characteristics of interest (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed, and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, making analysis simpler and highly interoperable. PANDORA is available to be freely used and modified because it is open-source (http://software.incf.org/software/pandora/home). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then we examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop. Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics.
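The database paradigm described above for PANDORA (store extracted characteristics rather than raw traces, then query them) can be sketched with plain SQL. PANDORA itself is a Matlab toolbox; the table, cell names, and spike times below are invented for illustration:

```python
import sqlite3

# Sketch of the database paradigm described above: store extracted
# characteristics (here, just a firing rate) instead of raw traces, then
# query them with SQL. PANDORA itself is a Matlab toolbox; the table,
# cell names, and spike times below are invented for illustration.
recordings = {
    "cell_a": [12.0, 13.1, 14.9, 80.2, 81.0],   # spike times (ms)
    "cell_b": [5.0, 505.0],
}
T = 1000.0                                       # recording duration (ms)

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE traits (cell TEXT, firing_rate_hz REAL)")
for cell, spike_times in recordings.items():
    rate = len(spike_times) / T * 1000.0         # extracted characteristic
    db.execute("INSERT INTO traits VALUES (?, ?)", (cell, rate))

# Query the compact representation instead of re-reading the raw data.
fast = db.execute(
    "SELECT cell FROM traits WHERE firing_rate_hz > 3 ORDER BY cell"
).fetchall()
# fast == [("cell_a",)]: only cell_a fires above 3 Hz
```

The storage and query cost scales with the number of extracted characteristics, not with the length of the raw recordings, which is the efficiency gain claimed above.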
First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test the sufficiency of multiple, independent spike initiation zones in the C-interneuron to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made to understand cerebellar cortical processing, but neural coding at the output stage of the cerebellum in the deep cerebellar nuclei (DCN) remains poorly understood. The DCN are unlikely to be just a relay nucleus because Purkinje cell inhibition has to be turned into an excitatory output signal, and DCN neurons exhibit complex intrinsic properties. In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals.
Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation, we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. We obtained a good match of our simulations with DCN current-clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow deinactivation of rebound currents. Fast rebound bursts due to the T-type calcium current and slow rebounds due to the persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by the HCN current. Our findings suggest that active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior.
This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation of channel gating, a mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local-field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform. Abstract No Abstract Available Abstract We have developed a simulation tool within the NEURON simulator to assist in the organization, verification, and analysis of simulations. This tool, named the Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single-cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB. Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems.
This is similar to how single cells function as systems of protein networks, where, for example, ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from how it behaves alone. The dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. Interactions occur through different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts to produce heartbeats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics representing neural information. In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of the limbs can be performed. This chapter illustrates several mathematical techniques through examples from the modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data. Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation.
For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice those of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits as expected in mature animals. Predictions were made for passive-cell current clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulation. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations in neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousand neurons can be studied, and spike sorting is unnecessary. Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron.
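The CA3 synaptic-response statistics above (peak amplitude, time-to-peak) can be illustrated with a dual-exponential conductance waveform, a common functional form for AMPA-type synapses. The 0.9 nS perforant-path peak conductance is from the text; the time constants, and the dual-exponential form itself, are assumptions rather than the authors' exact kinetic scheme:

```python
import numpy as np

# Dual-exponential conductance waveform of the kind commonly used for
# AMPA-receptor synapses, from which peak amplitude and time-to-peak can be
# read off. The 0.9 nS peak conductance is from the text; the time constants
# and the dual-exponential form itself are illustrative assumptions.
g_max = 0.9                       # peak conductance (nS), from the text
tau_rise, tau_decay = 0.2, 2.0    # rise/decay time constants (ms), assumed
t = np.arange(0.0, 20.0, 0.01)    # time after the presynaptic spike (ms)

g = np.exp(-t / tau_decay) - np.exp(-t / tau_rise)
g = g_max * g / g.max()           # normalize so the peak equals g_max

# Analytically the peak sits at tau_r*tau_d/(tau_d - tau_r)*ln(tau_d/tau_r),
# about 0.51 ms for these constants.
time_to_peak = t[g.argmax()]
```

Fitting such waveforms to somatic recordings is what yields summary statistics like the conductances and peak times quoted above.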
Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials. A hybrid approach to shape-based interpolation of stereotactic atlases of the human brain. Neuroinformatics Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead.
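The leaky-integrator representation described above (membrane potential as presynaptic population spiking filtered with a fixed kernel) can be sketched as follows. The exponential kernel, rates, and population size are illustrative assumptions, and independent Poisson inputs stand in for the correlated populations the method is designed to probe:

```python
import numpy as np

# Sketch of the representation described above: a subthreshold "membrane
# potential" modeled as presynaptic population spiking filtered with a fixed
# exponential kernel (a leaky integrator). Kernel, rates, and population
# size are illustrative assumptions; independent Poisson spiking stands in
# for the correlated populations the method is designed to probe.
rng = np.random.default_rng(0)
dt, tau = 1.0, 10.0            # time bin (ms), kernel time constant (ms)
n_pre, rate = 200, 5.0         # presynaptic neurons, rate per neuron (Hz)
steps = 2000

# Population spike count per bin; correlations among inputs would alter the
# higher-order cumulants of this count, and hence of the filtered trace.
counts = rng.poisson(n_pre * rate * dt / 1000.0, size=steps)

kernel = np.exp(-np.arange(0.0, 10.0 * tau, dt) / tau)
v = np.convolve(counts, kernel)[:steps] * dt   # leaky-integrator "Vm"

# The trace fluctuates around n_pre * rate * tau / 1000 (about 10 here);
# its cumulants are what a CuBIC-style analysis would interrogate.
```
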
On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. Programming language: Perl/BioPerl (program); mySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI BLAST package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist.
However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have encountered problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database.
This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages, or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies.
A single cell contains organelles and intracellular solutions, and is separated by its cell membrane (plasma membrane) from the extracellular fluid that surrounds it, giving rise to differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charge and concentration produce electrical and chemical potentials, respectively, which drive the transport of materials across the membrane. Here we examine the core elements of mathematical modeling of the dynamic behavior of single cells, as well as the foundations of numerical simulation. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program on which users can simulate and/or create a model with graphical representations corresponding to the concept of ISML, such as modules and edges. ISIDE also has a command line interface, based on Python, a powerful scripting language, for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. 
database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp. Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method and make available computer scripts which automate the process of citation entry. We use an open citation project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. 
To understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the spatial scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain (${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front ends requires advanced bioinformatic skills. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. 
While originally developed to manage and query less structured data, information retrieval techniques have become increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by numerous selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or do not explain) existing empirical results and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. 
We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib ( The Handbook of Brain Theory and Neural Networks , pp. 741–745, 2003); Arbib and Grethe ( Computing the Brain: A Guide to Neuroinformatics , 2001); Ascoli ( Computational Neuroanatomy: Principles and Methods , 2002); Bower and Bolouri ( Computational Modeling of Genetic and Biochemical Networks , 2001); Hines et al. ( J. Comput. Neurosci. 17 , 7–11, 2004); Shepherd et al. ( Trends Neurosci. 21 , 460–468, 1998); Sivakumaran et al. ( Bioinformatics 19 , 408–415, 2003); Smolen et al. ( Neuron 26 , 567–580, 2000); Vadigepalli et al. ( OMICS 7 , 235–252, 2003)]. 
Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the simulator for neural networks and action potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu . Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and build on them for further research. Its use can aid the field of computational neuroscience to enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA_A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. 
Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA_A reversal potential (E_GABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal to elicit/detect electrical excitation at the single-cell level and to observe the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. 
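The "binary trees of branching cylinders" representation described above can be sketched in a few lines. The layout below loosely follows the SWC convention used by reconstruction tools (one sample per row, each pointing at its parent); `total_length` is a hypothetical helper computing one simple morphometric, not part of any of the tools discussed.

```python
import math

# Each sample: (id, parent_id, x, y, z, radius); parent_id == -1 marks
# the root. This loosely mirrors the SWC morphology convention.
def total_length(samples):
    """Sum the lengths of all parent-child segments of the tree
    (in the same units as the coordinates)."""
    pos = {s[0]: (s[2], s[3], s[4]) for s in samples}
    length = 0.0
    for sid, parent, *_ in samples:
        if parent != -1:  # the root has no incoming segment
            length += math.dist(pos[sid], pos[parent])
    return length
```

The same parent-pointer structure supports other morphometrics (branch counts, path distances) and is the starting point for compartmental model generation.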
Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic types (pyramidal cell dendrites and olivary axonal projection terminals), and labeling methods (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions coming to the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductivities of their dendritic membrane. This work is a continuation of our previous studies that showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand. 
This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodical impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by their spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in Biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. 
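The essence of multicompartmental modeling is a set of coupled ODEs, one per compartment, linked by axial conductances. A minimal passive two-compartment sketch, integrated with forward Euler, is shown below; all parameters and the function name are illustrative, not taken from any published model.

```python
def simulate_two_compartments(t_stop=50.0, dt=0.025, i_inj=0.1):
    """Passive soma-dendrite pair coupled by an axial conductance.
    Units: ms, mV, and conductances/capacitance in arbitrary but
    mutually consistent units. All values are illustrative."""
    c_m, g_leak, e_leak, g_axial = 1.0, 0.05, -65.0, 0.1
    v = [-65.0, -65.0]  # [soma, dendrite] membrane potentials (mV)
    trace = []
    for _ in range(int(t_stop / dt)):
        i_axial = g_axial * (v[1] - v[0])  # axial current into the soma
        dv0 = (g_leak * (e_leak - v[0]) + i_axial + i_inj) / c_m
        dv1 = (g_leak * (e_leak - v[1]) - i_axial) / c_m
        v = [v[0] + dt * dv0, v[1] + dt * dv1]
        trace.append(tuple(v))
    return trace
```

Real simulators replace forward Euler with implicit methods for stability, and add voltage-gated conductances per compartment, but the coupling structure is the same.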
Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS–US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. 
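The temporal-difference (TD) algorithm at the heart of such dopamine models can be sketched in its simplest tabular TD(0) form, where the prediction error delta plays the role of the phasic dopamine-like signal. This is standard textbook TD, not the LSTM-based model of the abstract, and the CS/delay/US task states are a hypothetical toy conditioning sequence.

```python
def td_update(V, s, r, s_next, alpha=0.1, gamma=0.95):
    """One tabular TD(0) step; delta is the reward-prediction error
    that dopamine activity is thought to resemble."""
    delta = r + gamma * V.get(s_next, 0.0) - V.get(s, 0.0)
    V[s] = V.get(s, 0.0) + alpha * delta
    return delta

# Train on a fixed CS -> delay -> US sequence (toy conditioning task):
# reward of 1.0 is delivered on leaving the US state.
V = {}
for _ in range(200):
    for s, r, s_next in [("CS", 0.0, "delay"),
                         ("delay", 0.0, "US"),
                         ("US", 1.0, "end")]:
        td_update(V, s, r, s_next)
```

After training, value propagates back to the CS and the prediction error at the (now expected) reward shrinks toward zero, the classic account of phasic dopamine responses transferring from reward to predictor.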
The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. On mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborization possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses distributed uniformly along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and function of effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization in general and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals decreasing with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. 
Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of patterns of output discharges of different complexities, including stochastic ones. A relatively short length of the apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites and a reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in the processes of formation of the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated. 
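Theta-modulated inputs of the kind just described are commonly generated as inhomogeneous Poisson processes. One standard recipe is thinning (rejection sampling) against the peak rate, sketched below; the function name and all parameter values are illustrative, not taken from the model in the abstract.

```python
import math
import random

def theta_poisson_spikes(t_stop, rate_mean, rate_depth, theta_hz=8.0, seed=0):
    """Inhomogeneous Poisson spike train with a theta-modulated rate,
    generated by thinning. Rates in Hz, times in seconds.
    rate(t) = rate_mean * (1 + rate_depth * cos(2*pi*theta_hz*t))."""
    rng = random.Random(seed)
    rate_max = rate_mean * (1.0 + rate_depth)  # peak of the rate function
    spikes, t = [], 0.0
    while True:
        t += rng.expovariate(rate_max)  # candidate from homogeneous process
        if t >= t_stop:
            return spikes
        rate = rate_mean * (1.0 + rate_depth * math.cos(2 * math.pi * theta_hz * t))
        if rng.random() < rate / rate_max:  # accept with prob rate(t)/rate_max
            spikes.append(t)
```

Thinning is exact for any bounded rate function, which is why it is a convenient default for driving model neurons with structured stochastic input.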
The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using nonhomogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent from rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of the involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in Computational Neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org. 
In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed data of morphologies when generating those cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators, such as the NEURON Simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new and available data can be included into NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. 
New simulator architectures such as GENESIS 3 allow the use of standard well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data needs both standard data formats and public repositories. Standardization and a public storage system are also important for modeling due to the possibility of sharing models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks like numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. 
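The direct Gaussian elimination referred to above has a particularly simple form in the unbranched case: the compartmental system is tridiagonal, and elimination specializes to the O(n) Thomas algorithm sketched below. This is a generic numerical routine for illustration, not NEURON's actual implementation, which additionally handles branched trees and the multi-processor splitting described in the abstract.

```python
def solve_tridiagonal(a, b, c, d):
    """Thomas algorithm: solve a tridiagonal linear system in O(n).
    a: sub-diagonal (length n-1), b: main diagonal (length n),
    c: super-diagonal (length n-1), d: right-hand side (length n).
    Assumes the system is well-conditioned (e.g. diagonally dominant,
    as compartmental membrane equations typically are)."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):  # forward elimination
        denom = b[i] - a[i - 1] * cp[i - 1]
        cp[i] = c[i] / denom if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / denom
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):  # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

Splitting a tree at a few connection points, as the abstract describes, lets each processor run this kind of elimination on its own subtree, with only the coupling unknowns exchanged between processors.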
It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources, and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. 
The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. However, the range of current strength needed for a depolarization block could easily be reached with random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block. We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, impeding conventional analysis methods from keeping pace. 
Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and of exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable. PANDORA is freely available to use and modify because it is open source ( http://software.incf.org/software/pandora/home ). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then we examine how well we can use L = 1, 2, … of the model's state-variable time series to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop. Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics.
First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test whether multiple, independent spike initiation zones in the C-interneuron suffice to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made into understanding cerebellar cortical processing, but neural coding at the output stage of the cerebellum in the deep cerebellar nuclei (DCN) remains poorly understood. The DCN are unlikely to be just a relay nucleus, because Purkinje cell inhibition has to be turned into an excitatory output signal, and DCN neurons exhibit complex intrinsic properties. In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals.
Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. Our simulations closely matched DCN current-clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow de-inactivation of rebound currents. Fast rebound bursts due to T-type calcium current and slow rebounds due to persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by HCN current. Our findings suggest that active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior.
This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multi-compartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation of channel gating, mechanisms for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plug-in development is also presented. Further development of KInNeSS is ongoing with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform. Abstract No Abstract Available Abstract We have developed a simulation tool within the NEURON simulator to assist in the organization, verification, and analysis of simulations. This tool, denominated Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB. Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems.
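The kind of query NQS supports, relating simulation parameters to extracted outputs through SELECT-style statements, can be illustrated with a minimal relational sketch. NQS itself runs inside NEURON; this stand-in uses Python's sqlite3, and the table layout, column names, and numbers are all invented for the example.

```python
import sqlite3

# One row per simulation run: swept parameters plus an extracted output.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE sims (
    sim_id INTEGER PRIMARY KEY, gna REAL, gk REAL, spike_count INTEGER)""")

# Fabricated parameter-sweep results for the sketch.
rows = [(1, 0.10, 0.03, 4), (2, 0.12, 0.03, 7),
        (3, 0.10, 0.05, 2), (4, 0.12, 0.05, 5)]
con.executemany("INSERT INTO sims VALUES (?, ?, ?, ?)", rows)

# Which parameter settings produced more than 3 spikes?
hits = con.execute(
    "SELECT gna, gk FROM sims WHERE spike_count > 3 ORDER BY sim_id"
).fetchall()
# hits -> [(0.1, 0.03), (0.12, 0.03), (0.12, 0.05)]
```

Once outputs live in a table alongside the parameters that produced them, questions like "which parameter regions fire?" become one-line queries instead of ad hoc scripts.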
This is similar to how single cells function as systems of protein networks, where, for example, the ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from how it behaves alone. The dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. These interactions occur through different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts to produce heart beats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics representing neural information. In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of the limbs can be performed. This chapter illustrates several mathematical techniques through examples from the modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data. Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation.
For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice that of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits as expected in mature animals. Predictions were made for passive-cell current clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron, and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting becomes unnecessary. Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron.
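The three response statistics compared in the CA3 synaptic study above (peak amplitude, time-to-peak, half-height width) are simple to compute from a simulated trace. The double-exponential EPSP below is a generic textbook idealization with made-up time constants, not the study's fitted model.

```python
import numpy as np

def epsp_stats(t, v):
    # Peak amplitude, time of the peak, and full width at half the peak
    # height, assuming a single-peaked (unimodal) response.
    peak = v.max()
    t_peak = t[v.argmax()]
    above = t[v >= peak / 2.0]          # samples at or above half-height
    half_width = above[-1] - above[0]
    return peak, t_peak, half_width

t = np.linspace(0, 50, 5001)            # time axis in ms, 0.01 ms step
tau_rise, tau_decay = 1.0, 10.0         # illustrative kinetics (ms)
v = np.exp(-t / tau_decay) - np.exp(-t / tau_rise)
v *= 1.0 / v.max()                      # normalize peak to 1 (e.g., 1 mV)

peak, t_peak, half_width = epsp_stats(t, v)
```

With these time constants the analytic time-to-peak is ln(tau_decay/tau_rise) * tau_rise * tau_decay / (tau_decay - tau_rise), about 2.56 ms, which the sampled trace reproduces; comparing such statistics between model and experiment is what constrains the fitted conductances.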
Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials. Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet, the precise input-output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites. We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature.
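The filtered-activity assumption underlying the correlation-inference method above reduces to one line of signal processing: treat the subthreshold membrane potential as the presynaptic population spike count convolved with a fixed leaky-integrator kernel. All parameter values below are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
dt, tau = 0.001, 0.01                   # 1 ms bins, 10 ms membrane kernel

# Surrogate population spike counts per bin (e.g., ~1000 Poisson inputs).
counts = rng.poisson(lam=5.0, size=2000)

t = np.arange(0.0, 5 * tau, dt)
kernel = np.exp(-t / tau)               # leaky-integrator impulse response

# Causal filtering: full convolution truncated to the signal length.
v = np.convolve(counts, kernel)[:len(counts)] * dt
```

Because the kernel is fixed and known (or assumed), cumulants of the fluctuation trace `v` can be mapped back to cumulants of the population spike count, which is what makes the CuBIC machinery applicable to intracellular data.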
To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns and the corresponding firing-frequency input-output function was logarithmic. The regular/irregular nature of the input synaptic intervals was only reflected in the regularity of output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input-output relationships. Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input-output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats.
In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database http://www.ebi.ac.uk/biomodels/ is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems, and to study the clustering of models based upon their annotations. Model deposition to the database is now recommended by several publishers of scientific journals. The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive?
Schema theory is a computational framework that allows the simulation of perceptuo-motor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We first present its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performance of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world-knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performance of agrammatic aphasics measured using sentence-picture matching tasks. This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Abstract Background The "inverse" problem is related to the determination of unknown causes on the basis of the observation of their effects. This is the opposite of the corresponding "direct" problem, which relates to the prediction of the effects generated by a complete description of some agencies.
The solution of an inverse problem entails the construction of a mathematical model and starts from a set of experimental data. In this respect, inverse problems are often ill-conditioned, as the available experimental data are often insufficient to unambiguously solve the mathematical model. Several approaches to solving inverse problems are possible, both computational and experimental, some of which are mentioned in this article. In this work, we describe in detail an attempt to solve an inverse problem which arose in the study of an intracellular signaling pathway. Results Using a genetic algorithm to find a suboptimal solution to the optimization problem, we have estimated a set of unknown parameters describing a kinetic model of a signaling pathway in the neuronal cell. The model is composed of mass-action ordinary differential equations, where the kinetic parameters describe protein-protein interactions, protein synthesis and degradation. The algorithm has been implemented on a parallel platform. Several potential solutions of the problem have been computed, each solution being a set of model parameters. A subset of parameters has been selected on the basis of their small coefficient of variation across the ensemble of solutions. Conclusion Despite the lack of sufficiently reliable and homogeneous experimental data, the genetic algorithm approach has allowed us to estimate the approximate value of a number of model parameters in a kinetic model of a signaling pathway; these parameters have been assessed to be relevant for the reproduction of the available experimental data. Abstract Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown.
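The genetic-algorithm parameter estimation described above can be shown in miniature: here the "pathway" is a single decay reaction dx/dt = -k*x with one unknown rate constant, and a toy GA (selection plus Gaussian mutation, no crossover) recovers it from synthetic data. This is a didactic sketch, not the paper's parallel implementation.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 5, 50)
k_true = 0.8
data = np.exp(-k_true * t)              # noiseless synthetic observations

def fitness(k):
    # Negative squared error between model prediction and data.
    return -np.sum((np.exp(-k * t) - data) ** 2)

pop = rng.uniform(0.0, 2.0, size=40)    # initial population of candidate k
for _ in range(60):
    scores = np.array([fitness(k) for k in pop])
    parents = pop[np.argsort(scores)][-20:]              # keep the best half
    children = rng.choice(parents, 20) + rng.normal(0, 0.05, 20)  # mutate
    pop = np.concatenate([parents, children])

k_best = pop[np.argmax([fitness(k) for k in pop])]       # close to 0.8
```

Running such a search many times and keeping only parameters with a small coefficient of variation across the ensemble, as the study does, filters the well-constrained parameters from the unidentifiable ones.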
Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms. First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axo-dendritic feedback on pyramidal cells is only effective in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase, the theta/gamma ratio. Moreover, electric fields differentially affect average theta frequency depending on specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. Abstract The brain is extraordinarily complex, containing 10^11 neurons linked by 10^14 connections. We can improve our understanding of individual neurons and neuronal networks by describing their behavior in mathematical and computational models.
This chapter provides an introduction to neural modeling, laying the foundation for several basic models and surveying key topics. After some discussion of the motivations of modelers and the uses of neural models, we explore the properties of electrically excitable membranes. We describe in some detail the Hodgkin–Huxley model, the first neural model to describe biophysically the behavior of biological membranes. We explore how this model can be extended to describe a variety of excitable membrane behaviors, including axonal propagation, dendritic processing, and synaptic communication. This chapter also covers mathematical models that replicate basic neural behaviors through more abstract mechanisms. We briefly explore efforts to extend single-neuron models to the network level and provide several examples of insights gained through this process. Finally, we list common resources, including modeling environments and repositories, that provide the guidance and parameter sets necessary to begin building neural models. Abstract We have developed a program, NeuroText, to populate the neuroscience databases in SenseLab (http://senselab.med.yale.edu/senselab) by mining the natural-language text of neuroscience articles. NeuroText uses a two-step approach to identify relevant articles. The first step (preprocessing), aimed at 100% sensitivity, identifies abstracts containing database keywords. In the second step, potentially relevant abstracts identified in the first step are processed for specificity dictated by database architecture and by neuroscience, lexical and semantic contexts. NeuroText results were presented to experts for validation using a dynamically generated interface that also allows expert-validated articles to be automatically deposited into the databases. Of the test set of 912 articles, 735 were rejected at the preprocessing step. For the remaining articles, the accuracy of predicting database-relevant articles was 85%.
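The Hodgkin–Huxley model introduced in the chapter above fits in a few dozen lines. This sketch uses the standard squid-axon parameters and rate functions with simple forward-Euler integration (chosen for brevity, not accuracy); it counts spikes as upward crossings of 0 mV.

```python
import numpy as np

def hh(i_inj=10.0, t_end=50.0, dt=0.01):
    # Classic HH constants: capacitance (uF/cm^2), max conductances
    # (mS/cm^2), reversal potentials (mV).
    C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3
    ENa, EK, EL = 50.0, -77.0, -54.4
    v, m, h, n = -65.0, 0.05, 0.6, 0.32   # near-rest initial state
    spikes, above = 0, False
    for _ in range(int(t_end / dt)):
        # Voltage-dependent opening/closing rates for m, h, n gates.
        am = 0.1 * (v + 40) / (1 - np.exp(-(v + 40) / 10))
        bm = 4.0 * np.exp(-(v + 65) / 18)
        ah = 0.07 * np.exp(-(v + 65) / 20)
        bh = 1.0 / (1 + np.exp(-(v + 35) / 10))
        an = 0.01 * (v + 55) / (1 - np.exp(-(v + 55) / 10))
        bn = 0.125 * np.exp(-(v + 65) / 80)
        m += dt * (am * (1 - m) - bm * m)
        h += dt * (ah * (1 - h) - bh * h)
        n += dt * (an * (1 - n) - bn * n)
        ina = gNa * m**3 * h * (v - ENa)
        ik = gK * n**4 * (v - EK)
        il = gL * (v - EL)
        v += dt * (i_inj - ina - ik - il) / C
        if v > 0 and not above:           # count upward 0 mV crossings
            spikes += 1
        above = v > 0
    return spikes
```

With a suprathreshold current (e.g., 10 uA/cm^2) the model fires repetitively; with zero injected current it stays at rest, which is the qualitative behavior the chapter builds on.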
Twenty-two articles were erroneously identified. NeuroText deferred decisions on 29 articles to the experts. A comparison of NeuroText results with the experts’ analyses revealed that the program failed to correctly identify articles’ relevance due to concepts that did not yet exist in the knowledge base or due to vaguely presented information in the abstracts. NeuroText uses two “evolution” techniques (supervised and unsupervised) that play an important role in the continual improvement of the retrieval results. Software that uses the NeuroText approach can facilitate the creation of curated, special-interest bibliography databases. Abstract Dendrites play an important role in neuronal function and connectivity. This chapter introduces the first section of the book, focusing on the morphological features of dendritic tree structures and the role of dendritic trees in the circuit. We provide an overview of quantitative procedures for data collection, analysis, and modeling of dendrite shape. Our main focus lies on the description of morphological complexity and how one can use this description to unravel neuronal function in dendritic trees and neural circuits. Abstract The chapter is organised in two parts. In the first part, the focus is on a combined power spectral and nonlinear behavioural analysis of a neural mass model of the thalamocortical circuitry. The objective is to study the effectiveness of such “multimodal” analytical techniques in model-based studies investigating the neural correlates of abnormal brain oscillations in Alzheimer’s disease (AD). The power spectral analysis presented here is a study of the “slowing” (decreasing dominant frequency of oscillation) within the alpha frequency band (8–13 Hz), a hallmark of electroencephalogram (EEG) dynamics in AD. Analysis of the nonlinear dynamical behaviour focuses on the bifurcating property of the model.
The results show that the alpha rhythmic content is maximal in close proximity to the bifurcation point, an observation made possible by the “multimodal” approach adopted herein. Furthermore, a slowing in alpha rhythm is observed with increasing inhibitory connectivity, a consistent feature of our research into neuropathological oscillations associated with AD. In the second part, we present power spectral analysis on a model that implements multiple feedforward and feedback connectivities in the thalamo-cortico-thalamic circuitry, and is thus more advanced in terms of biological plausibility. This study looks at the effects of synaptic connectivity variation on the power spectra within the delta (1–3 Hz), theta (4–7 Hz), alpha (8–13 Hz) and beta (14–30 Hz) bands. An overall slowing of the EEG with decreasing synaptic connectivity is observed, indicated by a decrease of power within the alpha and beta bands and an increase in power within the theta and delta bands. Thus, the model behaviour conforms to longitudinal studies in AD indicating an overall slowing of the EEG. Abstract Neuronal processes grow under a variety of constraints, both immediate and evolutionary. Their pattern of growth provides insight into their function. This chapter begins by reviewing morphological metrics used in analyses and computational models. Molecular mechanisms underlying growth and plasticity are then discussed, followed by several types of modeling approaches. Computer simulation of morphology can be used to describe and reproduce the statistics of neuronal types or to evaluate growth and functional hypotheses. For instance, models in which branching is probabilistically determined by diameter produce realistic virtual dendrites of most neuronal types, though more complicated statistical models are required for other types. Virtual dendrites grown under environmental and/or functional constraints are also discussed, offering a broad perspective on dendritic morphology.
Abstract Chopper neurons in the cochlear nucleus are characterized by intrinsic oscillations with short average interspike intervals (ISIs) and relative level independence of their response (Pfeiffer, Exp Brain Res 1:220–235, 1966; Blackburn and Sachs, J Neurophysiol 62:1303–1329, 1989), properties which are unattained by models of single chopper neurons (e.g., Rothman and Manis, J Neurophysiol 89:3070–3082, 2003a). In order to achieve short ISIs, we optimized the time constants of the Rothman and Manis single-neuron model with genetic algorithms. Some parameters in the optimization, such as the temperature and the capacitance of the cell, turned out to be crucial for the required acceleration of the response. In order to achieve the relative level independence, we simulated an interconnected network consisting of Rothman and Manis neurons. The results indicate that by stabilization of intrinsic oscillations, it is possible to simulate the physiologically observed level independence of ISIs. As previously reviewed and demonstrated (Bahmer and Langner, Biol Cybern 95:371–379, 2006a), chopper neurons show a preference for ISIs which are multiples of 0.4 ms. It was also demonstrated that a network consisting of two optimized Rothman and Manis neurons which activate each other with synaptic delays of 0.4 ms shows a preference for ISIs of 0.8 ms. Oscillations with various multiples of 0.4 ms as ISIs may be derived from neurons in a more complex network that is activated by simultaneous input from an onset neuron and several auditory nerve fibers. Abstract Recently, a class of two-dimensional integrate-and-fire models has been used to faithfully model spiking neurons. This class includes the Izhikevich model, the adaptive exponential integrate-and-fire model, and the quartic integrate-and-fire model. The bifurcation types for the individual neurons have been thoroughly analyzed by Touboul (SIAM J Appl Math 68(4):1045–1079, 2008).
However, when the models are coupled together to form networks, the networks can display bifurcations that an uncoupled oscillator cannot. For example, the networks can transition from firing with a constant rate to burst firing. This paper introduces a technique to reduce a full network of this class of neurons to a mean field model, in the form of a system of switching ordinary differential equations. The reduction uses population density methods and a quasi-steady-state approximation to arrive at the mean field system. Reduced models are derived for networks with different topologies and different model neurons with biologically derived parameters. The mean field equations are able to qualitatively and quantitatively describe the bifurcations that the full networks display. Extensions and higher-order approximations are discussed. Conclusions Our proposed database schema for managing heterogeneous data is a significant departure from conventional approaches. It is suitable only when the following conditions hold: • The number of classes of entity is numerous, while the number of actual instances in most classes is expected to be very modest. • The number (and nature) of the axes describing an arbitrary fact (as an N-ary association) varies greatly. We believe that nervous system data is an appropriate problem domain in which to test such an approach. Abstract Stereotactic human brain atlases, either in print or electronic form, are useful not only in functional neurosurgery, but also in neuroradiology, human brain mapping, and neuroscience education. The existing atlases represent structures on 2D plates taken at variable, often large intervals, which limits their applications. To overcome this problem, we propose a hybrid interpolation approach to build high-resolution brain atlases from the existing ones. In this approach, all section regions of each object are grouped into two types of components: simple and complex.
A NURBS-based method is designed for interpolation of the simple components, and a distance-map-based method for the complex components. Once all individual objects in the atlas are interpolated, the results are combined hierarchically in a bottom-up manner to produce the interpolation of the entire atlas. In the procedure, different knowledge-based and heuristic strategies are used to preserve various topological relationships. The proposed approach has been validated quantitatively and used for interpolation of two stereotactic brain atlases: the Talairach-Tournoux atlas and the Schaltenbrand-Wahren atlas. The interpolations produced are of high resolution and feature high accuracy, 3D consistency, smooth surfaces, and preserved topology. They potentially open new applications for electronic stereotactic brain atlases, such as atlas reformatting, accurate 3D display, and 3D nonlinear warping against normal and pathological scans. The proposed approach is also potentially useful in other applications that require interpolation and 3D modeling from sparse data and/or data with variable inter-section intervals. An example of 3D modeling of an infarct from MR diffusion images is presented. Parameter estimation of signal transduction pathways BMC Neuroscience Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes.
The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Pre-computed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. Programming language: Perl/BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI Blast package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge.
Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have encountered problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways.
A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages, or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies.
A single cell includes organelles and intracellular solutions, and it is separated from the surrounding extracellular fluid by its cell membrane (plasma membrane), giving rise to differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charge and in concentration cause electrical and chemical potentials, respectively, which drive the transport of materials across the membrane. Here we look at the core elements of mathematical modeling of the dynamic behavior of single cells, as well as the foundations of numerical simulation. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program with which users can simulate and/or create a model using graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a Python-based command line interface to manipulate large-scale models; Python is a powerful scripting language. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e.
a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method and make available computer scripts which automate the process of citation entry. We use an Open Citation Project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations.
To understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain (${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front ends requires strong bioinformatics skills. Consequently, the manual inspection of database entries and citations is a time-consuming task for which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges.
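The time-domain (balanced truncation) reduction mentioned above for quasi-active cable models can be sketched for a generic stable linear system x' = Ax + Bu, y = Cx. The square-root method below and the six-compartment "cable" chain are illustrative only, assuming scipy is available:

```python
import numpy as np
from scipy.linalg import cholesky, solve_continuous_lyapunov, svd

def balanced_truncation(A, B, C, k):
    """Reduce a stable system x' = Ax + Bu, y = Cx to k states via
    square-root balanced truncation."""
    Wc = solve_continuous_lyapunov(A, -B @ B.T)    # controllability Gramian
    Wo = solve_continuous_lyapunov(A.T, -C.T @ C)  # observability Gramian
    Lc = cholesky(Wc, lower=True)
    Lo = cholesky(Wo, lower=True)
    U, s, Vt = svd(Lo.T @ Lc)                      # Hankel singular values
    S = np.diag(s[:k] ** -0.5)
    T = Lc @ Vt[:k].T @ S                          # balancing projections
    Ti = S @ U[:, :k].T @ Lo.T
    return Ti @ A @ T, Ti @ B, C @ T

# Toy "cable": a chain of six leaky, nearest-neighbor-coupled compartments
n = 6
A = -2.0 * np.eye(n) + np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
B = np.eye(n)[:, :1]        # current injected at one end
C = np.eye(n)[n - 1:, :]    # potential read out at the other end
Ar, Br, Cr = balanced_truncation(A, B, C, 3)
```

States with small Hankel singular values contribute little to the input-output map, so discarding them preserves the transfer behavior (here, of the end-to-end cable response) while halving the dimension.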
While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by numerous selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is an XML-based language for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or not) existing empirical results, and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations.
We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp. 741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)].
Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, as well as synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu . Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and to build on them for further research. Its use can aid the field of computational neuroscience to enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA-A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties.
Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA-A reversal potential (E-GABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology.
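Given a reconstruction stored as a tree of truncated cones, basic morphometry reduces to sums over parent-child segments. A sketch using a hypothetical minimal SWC-like tuple layout (id, x, y, z, radius, parent); this exact layout is an assumption for illustration, not any tool's actual file format:

```python
import numpy as np

def morphometry(rows):
    """Total neurite length and lateral membrane area for a tree of
    truncated cones. rows: (id, x, y, z, radius, parent), parent -1 = root."""
    pos = {int(i): (np.array([x, y, z]), r) for i, x, y, z, r, p in rows}
    parent = {int(i): int(p) for i, x, y, z, r, p in rows}
    length = area = 0.0
    for i, p in parent.items():
        if p == -1:
            continue                      # root has no incoming segment
        (xi, ri), (xp, rp) = pos[i], pos[p]
        h = np.linalg.norm(xi - xp)       # segment length
        slant = np.hypot(h, ri - rp)      # frustum slant height
        length += h
        area += np.pi * (ri + rp) * slant
    return length, area

# Tiny example: an L-shaped neurite of two unit-radius segments
rows = [(1, 0.0, 0.0, 0.0, 1.0, -1),
        (2, 3.0, 0.0, 0.0, 1.0, 1),
        (3, 3.0, 4.0, 0.0, 1.0, 2)]
length, area = morphometry(rows)
```

Comparing such per-segment statistics between two tracings of the same cell is one way to quantify agreement between reconstruction systems.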
Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and the code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neurite types (pyramidal cell dendrites and olivary axonal projection terminals), and labeling methods (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstruction systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions arriving at the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductances of their dendritic membrane. This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand.
This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in Biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling.
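The simplest instance of multicompartmental modeling is a passive two-compartment (soma plus lumped dendrite) model; the sketch below uses forward Euler and illustrative parameter values, not those of any specific cell:

```python
def two_compartment(I=1e-10, T=0.2, dt=1e-5):
    """Passive soma + dendrite coupled by an axial conductance; returns
    the two membrane potentials after time T (near steady state)."""
    C, gL, EL, gc = 1e-10, 5e-9, -0.065, 2e-9   # F, S, V, S (illustrative)
    Vs = Vd = EL
    for _ in range(int(T / dt)):
        dVs = (-gL * (Vs - EL) + gc * (Vd - Vs) + I) / C  # current at soma
        dVd = (-gL * (Vd - EL) + gc * (Vs - Vd)) / C
        Vs, Vd = Vs + dt * dVs, Vd + dt * dVd
    return Vs, Vd

Vs, Vd = two_compartment()
```

At steady state the axial conductance attenuates the somatic depolarization seen by the dendrite, the minimal form of the voltage attenuation that full compartmental cables describe.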
Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS-US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network.
The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. Using mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses uniformly distributed along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, and metric asymmetry of the reconstructed dendritic arborization as a whole and of its apical and basal subtrees, as well as the effectiveness function of somatopetal current transfer (with estimation of the sensitivity of this effectiveness to changes in the uniform membrane conductance). Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals decreasing with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials.
Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably more limited than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of output discharge patterns of different complexities, including stochastic ones. A relatively short apical dendritic subtree in layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in the processes of formation of the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated.
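Theta-modulated inputs of this kind are conveniently drawn from a nonhomogeneous Poisson process. A minimal thinning (Lewis-Shedler) sketch with an assumed 8 Hz theta-modulated rate; the rate function and its parameters are illustrative, not the paper's:

```python
import numpy as np

def inhomogeneous_poisson(rate_fn, T, rate_max, rng):
    """Spike times on [0, T) from rate_fn(t) by thinning: draw candidates
    at the bounding rate rate_max, keep each with prob rate_fn(t)/rate_max."""
    t, spikes = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_max)
        if t >= T:
            return np.array(spikes)
        if rng.random() < rate_fn(t) / rate_max:
            spikes.append(t)

rng = np.random.default_rng(0)
theta_rate = lambda t: 10.0 * (1.0 + np.cos(2 * np.pi * 8.0 * t))  # peaks at 20 Hz
spk = inhomogeneous_poisson(theta_rate, 10.0, 20.0, rng)
```

A spatially tuned input would simply multiply the theta factor by a place-field envelope before passing the product to the same sampler.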
The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using nonhomogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of the involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in Computational Neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org .
In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed data of morphologies when generating those cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators, such as the NEURON Simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new and available data can be included in NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation.
New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data requires both standard data formats and public repositories. Standardization and a public storage system are also important for modeling, because they make it possible to share models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks like numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor.
It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources, and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum.
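The forward-modeling idea behind the iCSD method just described can be illustrated in its simplest form: assume the current source density lives on the electrode grid, compute the potentials such sources would generate (a forward matrix F), and invert F to recover the CSD from recorded potentials. For brevity the sketch below uses a one-dimensional laminar analogue with planar disk sources; the geometry, conductivity, and all numbers are schematic assumptions, not the authors' 2D implementation.

```python
import numpy as np

def forward_matrix(z, h, R=5e-4, sigma=0.3):
    """Forward matrix F mapping planar disk current sources of radius R (m)
    at depths z (m) to extracellular potentials at the same depths
    (delta-source 1D iCSD geometry; sigma is tissue conductivity in S/m).
    Values are illustrative only."""
    dz = np.abs(z[:, None] - z[None, :])
    return (h / (2.0 * sigma)) * (np.sqrt(dz ** 2 + R ** 2) - dz)

# synthetic ground-truth CSD (a sink/source pair) on a 16-contact probe
z = np.arange(16) * 1e-4                       # 100 um contact spacing
csd_true = (np.exp(-((z - 8e-4) / 2e-4) ** 2)
            - 0.5 * np.exp(-((z - 12e-4) / 2e-4) ** 2))
F = forward_matrix(z, h=1e-4)
phi = F @ csd_true                             # forward model: predicted potentials
csd_est = np.linalg.solve(F, phi)              # inverse step of iCSD
```

With noise-free synthetic potentials the inversion recovers the ground-truth CSD essentially exactly; real recordings require the regularization and boundary handling discussed in the abstract.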
The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. However, the range of current strength needed for a depolarization block could easily be reached with random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block. We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, impeding conventional analysis methods from keeping pace.
Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting their interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable. PANDORA is free to use and modify because it is open-source (http://software.incf.org/software/pandora/home). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then we examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop. Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics.
First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test whether multiple, independent spike initiation zones in the C-interneuron are sufficient to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made toward understanding cerebellar cortical processing, but neural coding at the output stage of the cerebellum, in the deep cerebellar nuclei (DCN), remains poorly understood. The DCN are unlikely to be just a relay nucleus, because Purkinje cell inhibition has to be turned into an excitatory output signal, and DCN neurons exhibit complex intrinsic properties. In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals.
Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. We obtained a good match between our simulations and DCN current clamp data recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow deinactivation of rebound currents. Fast rebound bursts due to T-type calcium current and slow rebounds due to persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by HCN current. Our findings suggest that active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior.
This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation of channel gating, a mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plug-in development is also presented. Further development of KInNeSS is ongoing, with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform. Abstract We have developed a simulation tool within the NEURON simulator to assist in the organization, verification, and analysis of simulations. This tool, named the Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB. Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems.
It is similar to how single cells function as systems of protein networks, where, for example, ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from how it behaves alone. The dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. Interactions occur through different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts mechanically to produce heart beats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics, representing neural information. In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of the limbs can be performed. This chapter illustrates several mathematical techniques through examples from the modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data. Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation.
For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice that of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits, as expected in mature animals. Predictions were made for passive-cell current clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron, and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting is not required. Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron.
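The AMPA parameters quoted above (perforant path ≈0.9 nS peaking at ≈1.0 ms vs. associational/commissural ≈0.5 nS peaking at ≈3.3 ms) can be realized with a standard double-exponential conductance waveform. The rise and decay time constants below were chosen only to reproduce those peak times and are our assumption, not the paper's fitted kinetics.

```python
import numpy as np

def double_exp(t, gmax, tau_rise, tau_decay):
    """Double-exponential synaptic conductance, normalized so that its
    maximum equals gmax (nS); t in ms, tau_rise < tau_decay."""
    # analytic time of the peak, used to normalize the amplitude
    tp = (tau_decay * tau_rise / (tau_decay - tau_rise)) * np.log(tau_decay / tau_rise)
    norm = 1.0 / (np.exp(-tp / tau_decay) - np.exp(-tp / tau_rise))
    return gmax * norm * (np.exp(-t / tau_decay) - np.exp(-t / tau_rise))

t = np.arange(0.0, 20.0, 0.01)                             # ms
g_pp = double_exp(t, 0.9, tau_rise=0.5, tau_decay=3.0)     # perforant path (assumed taus)
g_ac = double_exp(t, 0.5, tau_rise=1.5, tau_decay=10.0)    # assoc./commissural (assumed taus)
```

With these assumed time constants the two waveforms peak near 1.1 ms and 3.3 ms respectively, matching the contrast described in the abstract.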
Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials. Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet, the precise input-output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites. We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature.
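The leaky-integrator representation used by the CuBIC-based method above (membrane potential as presynaptic population activity filtered with a fixed kernel) can be sketched directly: pool the population spike count per time bin and convolve it with an exponential kernel. Higher-order correlations are introduced here with a simple common-input construction; all rates, sizes, and time constants are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
dt, n_bins, n_neurons, tau = 1e-3, 5000, 500, 0.01   # 1 ms bins, 10 ms membrane kernel

# presynaptic population with higher-order correlations: each neuron fires
# its own independent Poisson train plus a shared ("common input") train
shared = rng.random(n_bins) < 2.0 * dt                          # 2 Hz common events
spikes = (rng.random((n_neurons, n_bins)) < 5.0 * dt) | shared  # 5 Hz independent part
pop_count = spikes.sum(axis=0)                                  # population spike count Z(t)

# leaky-integrator filtering: exponential kernel, as for the model neuron above
kernel = np.exp(-np.arange(0.0, 5 * tau, dt) / tau)
v = np.convolve(pop_count, kernel)[:n_bins] * dt                # surrogate 'membrane potential'
```

The rare common events show up as large excursions of `v` riding on the background fluctuations, which is exactly the signature the cumulant-based inference exploits.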
To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns and the corresponding firing frequency input-output function was logarithmic. The regular/irregular nature of the input synaptic intervals was only reflected in the regularity of output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input-output relationships. Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input-output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats.
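As a concrete taste of the community-supported encodings just mentioned, here is a minimal SBML-style fragment parsed with Python's standard library. The fragment is hand-written for illustration only (a tiny subset of SBML Level 2 Version 4, far from a complete valid model); the element and attribute names follow the SBML schema.

```python
import xml.etree.ElementTree as ET

# a hand-written, minimal SBML-like fragment (illustration, not a full model)
SBML = """<sbml xmlns="http://www.sbml.org/sbml/level2/version4" level="2" version="4">
  <model id="decay">
    <listOfSpecies>
      <species id="A" initialConcentration="10"/>
      <species id="B" initialConcentration="0"/>
    </listOfSpecies>
    <listOfReactions>
      <reaction id="A_to_B"/>
    </listOfReactions>
  </model>
</sbml>"""

ns = {"s": "http://www.sbml.org/sbml/level2/version4"}
root = ET.fromstring(SBML)
# extract species with initial concentrations, and the reaction ids
species = {sp.get("id"): float(sp.get("initialConcentration"))
           for sp in root.findall(".//s:species", ns)}
reactions = [r.get("id") for r in root.findall(".//s:reaction", ns)]
```

Because the format is declarative, any standards-aware tool can recover the same species and reactions from the same file, which is precisely what makes repository-based model exchange work.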
In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database http://www.ebi.ac.uk/biomodels/ is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems, and to study the clustering of models based upon their annotations. Model deposition to the database is today advised by several publishers of scientific journals. The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive?
Schema theory is a computational framework that allows the simulation of perceptuomotor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We first present its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performances of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world-knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performances of agrammatic aphasics measured using sentence-picture matching tasks. This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Abstract Background The "inverse" problem is related to the determination of unknown causes on the basis of the observation of their effects. This is the opposite of the corresponding "direct" problem, which relates to the prediction of the effects generated by a complete description of some agencies.
The solution of an inverse problem entails the construction of a mathematical model and starts from a set of experimental data. In this respect, inverse problems are often ill-conditioned, as the experimental data available are often insufficient to unambiguously solve the mathematical model. Several approaches to solving inverse problems are possible, both computational and experimental, some of which are mentioned in this article. In this work, we describe in detail an attempt to solve an inverse problem which arose in the study of an intracellular signaling pathway. Results Using a genetic algorithm to find a suboptimal solution to the optimization problem, we have estimated a set of unknown parameters describing a kinetic model of a signaling pathway in the neuronal cell. The model is composed of mass action ordinary differential equations, where the kinetic parameters describe protein-protein interactions, protein synthesis and degradation. The algorithm has been implemented on a parallel platform. Several potential solutions of the problem have been computed, each solution being a set of model parameters. A subset of parameters has been selected on the basis of their small coefficient of variation across the ensemble of solutions. Conclusion Despite the lack of sufficiently reliable and homogeneous experimental data, the genetic algorithm approach has allowed us to estimate the approximate value of a number of model parameters in a kinetic model of a signaling pathway; these parameters have been assessed to be relevant for the reproduction of the available experimental data. Bifurcations of large networks of two-dimensional integrate and fire neurons Journal of Computational Neuroscience Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model.
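A minimal version of the genetic-algorithm approach described in the Results above: evolve candidate rate constants of a toy mass-action model (a single decay reaction, dA/dt = −kA) against synthetic "experimental" data. The model, GA settings, and data are invented for illustration and are far simpler than the parallel, multi-parameter implementation the paper describes.

```python
import numpy as np

def simulate(k, a0=1.0, t_end=5.0, dt=0.01):
    """Euler integration of the toy mass-action kinetics dA/dt = -k*A."""
    n = int(t_end / dt)
    a = np.empty(n)
    a[0] = a0
    for i in range(1, n):
        a[i] = a[i - 1] - dt * k * a[i - 1]
    return a

rng = np.random.default_rng(0)
data = simulate(0.8)                               # synthetic data, true k = 0.8

def fitness(k):
    return -np.sum((simulate(k) - data) ** 2)      # negative squared error

pop = rng.uniform(0.0, 2.0, size=40)               # initial population of k values
for gen in range(30):
    scores = np.array([fitness(k) for k in pop])
    parents = pop[np.argsort(scores)[-10:]]        # selection: keep the 10 fittest
    children = rng.choice(parents, size=30) + rng.normal(0.0, 0.05, 30)  # mutation
    pop = np.concatenate([parents, children])      # elitism: parents survive

best = pop[np.argmax([fitness(k) for k in pop])]   # best parameter estimate
```

Because the elite parents are carried over unchanged, the best fitness is monotone over generations, and the estimate converges close to the true rate constant; running the GA repeatedly and inspecting the spread of solutions mirrors the coefficient-of-variation selection used in the paper.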
Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php. Operating system(s): Platform independent.
Programming language: Perl and BioPerl (program); mySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI Blast package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have encountered problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project.
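The claim above that simulations of neural models can be highly sensitive to numerical details is easy to demonstrate even for a single leaky integrate-and-fire neuron: the threshold-crossing time reported by a forward-Euler integrator shifts with the integration step. The model and all values below are generic illustrations, not taken from the paper.

```python
def first_spike_time(dt, tau=10.0, drive=2.0, v_th=1.0):
    """Forward-Euler integration of the LIF equation dV/dt = (-V + drive)/tau
    from V(0)=0; returns the first threshold-crossing time in ms."""
    v, t = 0.0, 0.0
    while v < v_th:
        v += dt * (-v + drive) / tau
        t += dt
    return t

t_coarse = first_spike_time(0.1)    # crossing reported at 6.90 ms
t_fine = first_spike_time(0.01)     # crossing reported at 6.93 ms
# exact continuous-time crossing: t* = -tau*ln(1 - v_th/drive) ≈ 6.931 ms
```

A 0.03 ms shift per spike seems tiny, but in a recurrently coupled network such discrepancies compound from spike to spike, which is why exact cross-simulator replicability of spike trains is often unattainable.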
The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity.
However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell includes organelles and intracellular solutions, and it is separated from the outer environment of extracellular fluid surrounding it by its cell membrane (plasma membrane), generating differences in concentrations of ions and molecules, including enzymes. The differences in ionic charges and concentrations cause, respectively, electrical and chemical potentials, driving the transport of materials across the membrane. Here we look at the core of mathematical modeling of the dynamic behaviors of single cells, as well as the basics of numerical simulation. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al.
2008, 2009) is a software program on which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command-line interface, based on Python, a powerful scripting language, for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases have started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors.
We describe a method and make available computer scripts which automate the process of citation entry. We use an Open Citation project Perl module (PARSER) for parsing citation data, which is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. For, in order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (balanced truncation) and frequency-domain ($\mathcal{H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies.
These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of the available data resources, the high number of heterogeneous query methods and front ends requires considerable bioinformatic skill. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function.
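The XML-based model descriptions that NeuroML provides can be handled with entirely standard tooling. The sketch below parses a minimal, hand-written cell fragment; the element and attribute names (cell, morphology, segment) loosely echo NeuroML conventions but are illustrative only and not tied to any particular schema version.

```python
# Illustrative sketch: reading a minimal NeuroML-like cell description.
# Element and attribute names are simplified for illustration; they are
# not guaranteed to match any real NeuroML schema version.
import xml.etree.ElementTree as ET

NEUROML_SNIPPET = """
<cell id="pyramidal0">
  <morphology>
    <segment id="0" name="soma">
      <proximal x="0" y="0" z="0" diameter="20"/>
      <distal x="0" y="20" z="0" diameter="20"/>
    </segment>
    <segment id="1" name="dend0" parent="0">
      <proximal x="0" y="20" z="0" diameter="2"/>
      <distal x="0" y="120" z="0" diameter="1"/>
    </segment>
  </morphology>
</cell>
"""

def list_segments(xml_text):
    """Return (id, name, parent) tuples for every segment in the cell."""
    root = ET.fromstring(xml_text)
    return [(seg.get("id"), seg.get("name"), seg.get("parent"))
            for seg in root.iter("segment")]

if __name__ == "__main__":
    print(list_segments(NEUROML_SNIPPET))
```

Because the format is plain XML, the same declarative description can be consumed by any simulator or analysis tool with an XML parser, which is the interoperability argument the abstract makes.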
Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded in empirical evidence, explain (or not) existing empirical results, and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp.
741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu . Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and build on them for further research. Its use can help the field of computational neuroscience enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition.
PPI reflects GABA-A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA-A reversal potential (E_GABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide a high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy.
In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic types (pyramidal cell dendrites and olivary axonal projection terminals), and labeling methods (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions coming to the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductivities of their dendritic membrane.
This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons.
Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS–US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive, biologically plausible framework, without recourse to delay lines or other special-purpose circuits.
Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. Using mathematical models of pyramidal neurons located in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of dendritic morphology on the passive electrical transfer characteristics of the dendrites and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses distributed uniformly along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and function of effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization as a whole and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns.
The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals that decreased with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of output discharge patterns of different complexities, including stochastic ones. A relatively short apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites, and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in the processes of formation of the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis.
Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta-modulated and space-modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using non-homogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent from rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of the involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties.
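As a concrete instance of the channel formulas just mentioned, here is a minimal sketch of the delayed-rectifier potassium gate of the Hodgkin–Huxley model. The rate constants are the standard modern-convention fits (membrane potential V in mV, rates in 1/ms); the forward-Euler update is for illustration only.

```python
# Sketch of the Hodgkin-Huxley delayed-rectifier potassium gate (n).
# Standard modern-convention rate constants; V in mV, rates in 1/ms.
import math

def alpha_n(v):
    # L'Hopital limit at v = -55 mV avoids a 0/0 singularity
    if abs(v + 55.0) < 1e-7:
        return 0.1
    return 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))

def beta_n(v):
    return 0.125 * math.exp(-(v + 65.0) / 80.0)

def n_inf(v):
    """Steady-state activation of the n gate."""
    a, b = alpha_n(v), beta_n(v)
    return a / (a + b)

def step_n(n, v, dt=0.01):
    """One forward-Euler step of dn/dt = alpha*(1-n) - beta*n."""
    return n + dt * (alpha_n(v) * (1.0 - n) - beta_n(v) * n)

if __name__ == "__main__":
    # At rest the gate is mostly closed; with depolarization it opens.
    print(round(n_inf(-65.0), 3), round(n_inf(0.0), 3))
```

The transient sodium channel follows the same scheme with two gates (activation m, inactivation h), each obeying the same first-order kinetics with its own rate functions.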
The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in computational neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org . In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed morphological data when generating cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators, such as the NEURON simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new and available data can be included into NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today.
I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently of the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data needs both standard data formats and public repositories. Standardization and a public storage system are also important for modeling because they make it possible to share models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks such as numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given.
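One of the recurring tasks mentioned above, numerical integration of rate equations, takes only a few lines to sketch. The model here (a reversible conversion A ⇌ B) and its rate constants are invented for illustration; real tools would use adaptive solvers rather than fixed-step Euler.

```python
# Minimal sketch of numerically integrating a rate equation: a reversible
# reaction A <-> B with forward rate kf and reverse rate kr. All values
# are illustrative, not taken from any real model.
def simulate(a0=1.0, b0=0.0, kf=2.0, kr=1.0, dt=1e-3, steps=10000):
    """Forward-Euler integration of dA/dt = -kf*A + kr*B (B gains what A loses)."""
    a, b = a0, b0
    for _ in range(steps):
        flux = kf * a - kr * b          # net conversion rate A -> B
        a, b = a - dt * flux, b + dt * flux
    return a, b

if __name__ == "__main__":
    a, b = simulate()
    # At equilibrium kf*A = kr*B, so the ratio B/A approaches kf/kr = 2
    print(round(b / a, 3))
```

Note that because A and B exchange the same flux, their sum is conserved exactly at every step, a simple sanity check of the kind a standardized model exchange format lets any downstream tool reproduce.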
Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. It is often feasible to divide a 3D-reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques.
We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing and enter a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. However, the range of current strength needed for a depolarization block could easily be reached with random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block. 
We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, impeding conventional analysis methods from keeping pace. Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed, and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable. PANDORA is open source and freely available for use and modification ( http://software.incf.org/software/pandora/home ). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop. 
Serotonin from the R cell enhances S cell excitability, S cell impulses cross an electrical synapse into the C interneuron, and the C interneuron excites the R cell via a glutamatergic synapse. The C interneuron has two unusual characteristics. First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R cell. A compartmental, computational model was developed to test whether multiple, independent spike initiation zones in the C interneuron are sufficient to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made into understanding cerebellar cortical processing, but neural coding at the output stage of the cerebellum, in the deep cerebellar nuclei (DCN), remains poorly understood. The DCN are unlikely to be just a relay nucleus, because Purkinje cell inhibition has to be turned into an excitatory output signal and DCN neurons exhibit complex intrinsic properties. 
In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals. Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. Our simulations closely matched DCN current clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow de-inactivation of rebound currents. Fast rebound bursts due to T-type calcium current and slow rebounds due to persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by HCN current. Our findings suggest that active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches. 
This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior. This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation of channel gating, mechanisms for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plug-in development is also presented. Further development of KInNeSS is ongoing with the ultimate goal of creating a modular framework that will help researchers across different disciplines to collaborate effectively using a modern neural simulation platform. Abstract No Abstract Available Abstract We have developed a simulation tool within the NEURON simulator to assist in the organization, verification, and analysis of simulations. This tool, named the Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single-cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB. 
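The relational approach NQS takes can be illustrated with Python's built-in sqlite3 module: store one row of extracted parameters and measurements per simulation run, then retrieve subsets with SELECT. The table and values below are invented for illustration; NQS itself runs inside NEURON's interpreter, not Python.

```python
import sqlite3

# In-memory database with one row per simulation run (invented example data:
# a sodium conductance parameter and the resulting firing rate).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sims (run INTEGER, gna REAL, rate_hz REAL)")
con.executemany(
    "INSERT INTO sims VALUES (?, ?, ?)",
    [(1, 0.10, 12.0), (2, 0.12, 18.5), (3, 0.08, 4.2)],
)

# SELECT-style query: which parameter settings produced firing above 10 Hz?
rows = con.execute(
    "SELECT run, gna FROM sims WHERE rate_hz > 10 ORDER BY rate_hz DESC"
).fetchall()
print(rows)  # [(2, 0.12), (1, 0.1)]
```

The same pattern scales to thousands of runs: queries over extracted characteristics replace re-reading the raw voltage traces, which is the storage and speed advantage the PANDORA abstract above also emphasizes.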
Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems, much as single cells function as systems of protein networks in which, for example, the ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from the way it does alone. The dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. Interactions occur through different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts to produce heart beats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics representing neural information. In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of the limbs can be performed. This chapter illustrates several mathematical techniques through examples from the modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data. 
Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation. For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice those of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits, as expected in mature animals. Predictions were made for passive-cell current clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting is obsolete. 
Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron. Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials. Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet the precise input-output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites. 
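The leaky-integrator assumption behind the membrane-potential method described above (presynaptic spikes filtered with a fixed exponential kernel) amounts to a one-line recursive filter. A minimal sketch, with invented per-bin spike counts and an arbitrary time constant:

```python
import math

def filtered_potential(spike_counts, tau=10.0, dt=1.0):
    """Leaky-integrator filtering: each input event decays exponentially
    with time constant tau (ms); spike_counts holds inputs per dt-ms bin."""
    decay = math.exp(-dt / tau)
    v, trace = 0.0, []
    for s in spike_counts:
        v = v * decay + s  # exponential decay plus new input
        trace.append(v)
    return trace

# Two input events: the trace jumps at each event and decays in between.
trace = filtered_potential([1, 0, 0, 1, 0])
```

Because the kernel is fixed and known, cumulants of the filtered trace can be mapped back to cumulants of the population spike count, which is what makes the CuBIC-style inference possible.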
We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature. To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns and the corresponding firing frequency input-output function was logarithmic. The regular/irregular nature of the input synaptic intervals was only reflected in the regularity of output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input-output relationships. Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input-output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. 
To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats. In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database http://www.ebi.ac.uk/biomodels/ is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems and to study the clustering of models based upon their annotations. Model deposition to the database today is advised by several publishers of scientific journals. 
The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive? Schema theory is a computational framework that allows the simulation of perceptuomotor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We present first its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performance of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world-knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performance of agrammatic aphasics measured using sentence-picture matching tasks. 
This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Abstract Background The "inverse" problem is related to the determination of unknown causes on the basis of the observation of their effects. This is the opposite of the corresponding "direct" problem, which relates to the prediction of the effects generated by a complete description of some agencies. The solution of an inverse problem entails the construction of a mathematical model and starts from a number of experimental data. In this respect, inverse problems are often ill-conditioned, as the experimental conditions available are often insufficient to unambiguously solve the mathematical model. Several approaches to solving inverse problems are possible, both computational and experimental, some of which are mentioned in this article. In this work, we describe in detail an attempt to solve an inverse problem which arose in the study of an intracellular signaling pathway. Results Using a genetic algorithm to find a suboptimal solution to the optimization problem, we have estimated a set of unknown parameters describing a kinetic model of a signaling pathway in the neuronal cell. The model is composed of mass-action ordinary differential equations, where the kinetic parameters describe protein-protein interactions, protein synthesis, and degradation. The algorithm has been implemented on a parallel platform. Several potential solutions of the problem have been computed, each solution being a set of model parameters. A subset of parameters has been selected on the basis of their small coefficient of variation across the ensemble of solutions. 
Conclusion Despite the lack of sufficiently reliable and homogeneous experimental data, the genetic algorithm approach has allowed us to estimate the approximate values of a number of model parameters in a kinetic model of a signaling pathway; these parameters have been assessed to be relevant for the reproduction of the available experimental data. Abstract Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown. Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms. First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axo-dendritic feedback on pyramidal cells is only effective in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase, the theta/gamma ratio. 
Moreover, electric fields differentially affect average theta frequency depending on specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. Abstract The brain is extraordinarily complex, containing 10^11 neurons linked by 10^14 connections. We can improve our understanding of individual neurons and neuronal networks by describing their behavior in mathematical and computational models. This chapter provides an introduction to neural modeling, laying the foundation for several basic models and surveying key topics. After some discussion of the motivations of modelers and the uses of neural models, we explore the properties of electrically excitable membranes. We describe in some detail the Hodgkin–Huxley model, the first neural model to describe biophysically the behavior of biological membranes. We explore how this model can be extended to describe a variety of excitable membrane behaviors, including axonal propagation, dendritic processing, and synaptic communication. This chapter also covers mathematical models that replicate basic neural behaviors through more abstract mechanisms. We briefly explore efforts to extend single-neuron models to the network level and provide several examples of insights gained through this process. Finally, we list common resources, including modeling environments and repositories, that provide the guidance and parameter sets necessary to begin building neural models. Abstract We have developed a program, NeuroText, to populate the neuroscience databases in SenseLab (http://senselab.med.yale.edu/senselab) by mining the natural-language text of neuroscience articles. NeuroText uses a two-step approach to identify relevant articles. The first step (preprocessing), aimed at 100% sensitivity, identifies abstracts containing database keywords. 
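A keyword prefilter of the kind NeuroText's first step describes can be approximated in a few lines: keep any abstract mentioning at least one database keyword, deliberately favoring sensitivity over specificity. The keyword list and abstracts below are invented placeholders, not SenseLab's actual vocabulary.

```python
KEYWORDS = {"neuron", "dendrite", "ion channel", "receptor"}  # invented list

def passes_prefilter(abstract, keywords=KEYWORDS):
    """Step 1: keep any abstract mentioning at least one database keyword.
    Biased toward sensitivity; false positives are handled in step 2."""
    text = abstract.lower()
    return any(kw in text for kw in keywords)

abstracts = [
    "Dendrite morphology shapes synaptic integration.",
    "A relational database for chromatography columns.",
]
kept = [a for a in abstracts if passes_prefilter(a)]
print(kept)  # only the dendrite abstract survives
```

Anything this screen passes still needs the second, specificity-oriented stage; the point of the first stage is only to avoid discarding relevant articles.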
In the second step, potentially relevant abstracts identified in the first step are processed for specificity as dictated by the database architecture and by neuroscience, lexical, and semantic contexts. NeuroText results were presented to the experts for validation using a dynamically generated interface that also allows expert-validated articles to be automatically deposited into the databases. Of the test set of 912 articles, 735 were rejected at the preprocessing step. For the remaining articles, the accuracy of predicting database-relevant articles was 85%. Twenty-two articles were erroneously identified. NeuroText deferred decisions on 29 articles to the expert. A comparison of NeuroText results versus the experts’ analyses revealed that the program failed to correctly identify articles’ relevance due to concepts that did not yet exist in the knowledge base or due to vaguely presented information in the abstracts. NeuroText uses two “evolution” techniques (supervised and unsupervised) that play an important role in the continual improvement of the retrieval results. Software that uses the NeuroText approach can facilitate the creation of curated, special-interest bibliography databases. Abstract Dendrites play an important role in neuronal function and connectivity. This chapter introduces the first section of the book, focusing on the morphological features of dendritic tree structures and the role of dendritic trees in the circuit. We provide an overview of quantitative procedures for data collection, analysis, and modeling of dendrite shape. Our main focus lies on the description of morphological complexity and how one can use this description to unravel neuronal function in dendritic trees and neural circuits. Abstract The chapter is organised in two parts: in the first part, the focus is on a combined power spectral and nonlinear behavioural analysis of a neural mass model of the thalamocortical circuitry. 
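The Hodgkin–Huxley model introduced in the neural-modeling chapter abstract above can be integrated with a short forward-Euler loop. This is a textbook sketch using the standard squid-axon parameters, not code from any of the works collected here; the injected current and duration are arbitrary choices.

```python
import math

# Standard Hodgkin-Huxley squid-axon parameters (mV, ms, uA/cm^2, mS/cm^2).
C, g_na, g_k, g_l = 1.0, 120.0, 36.0, 0.3
e_na, e_k, e_l = 50.0, -77.0, -54.4

def rates(v):
    """Voltage-dependent opening/closing rates for the n, m, h gates."""
    an = 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
    bn = 0.125 * math.exp(-(v + 65.0) / 80.0)
    am = 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
    bm = 4.0 * math.exp(-(v + 65.0) / 18.0)
    ah = 0.07 * math.exp(-(v + 65.0) / 20.0)
    bh = 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
    return an, bn, am, bm, ah, bh

def simulate(i_inj=10.0, t_stop=50.0, dt=0.01):
    """Forward-Euler integration; returns the membrane potential trace."""
    v = -65.0
    an, bn, am, bm, ah, bh = rates(v)
    n, m, h = an / (an + bn), am / (am + bm), ah / (ah + bh)  # steady state
    vs = []
    for _ in range(int(t_stop / dt)):
        an, bn, am, bm, ah, bh = rates(v)
        i_ion = (g_na * m**3 * h * (v - e_na)
                 + g_k * n**4 * (v - e_k) + g_l * (v - e_l))
        v += dt * (i_inj - i_ion) / C
        n += dt * (an * (1.0 - n) - bn * n)
        m += dt * (am * (1.0 - m) - bm * m)
        h += dt * (ah * (1.0 - h) - bh * h)
        vs.append(v)
    return vs

vs = simulate()
print(max(vs))  # action-potential overshoot rises above 0 mV
```

With a sustained 10 uA/cm^2 injection the model fires repetitively; the same four equations are the starting point for the axonal, dendritic, and synaptic extensions the chapter surveys.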
The objective is to study the effectiveness of such “multimodal” analytical techniques in model-based studies investigating the neural correlates of abnormal brain oscillations in Alzheimer’s disease (AD). The power spectral analysis presented here is a study of the “slowing” (decreasing dominant frequency of oscillation) within the alpha frequency band (8–13 Hz), a hallmark of electroencephalogram (EEG) dynamics in AD. Analysis of the nonlinear dynamical behaviour focuses on the bifurcating property of the model. The results show that the alpha rhythmic content is maximal in close proximity to the bifurcation point, an observation made possible by the “multimodal” approach adopted herein. Furthermore, a slowing in alpha rhythm is observed for increasing inhibitory connectivity, a consistent feature of our research into neuropathological oscillations associated with AD. In the second part, we present power spectral analysis on a model that implements multiple feedforward and feedback connectivities in the thalamo-cortico-thalamic circuitry, and is thus more advanced in terms of biological plausibility. This study looks at the effects of synaptic connectivity variation on the power spectra within the delta (1–3 Hz), theta (4–7 Hz), alpha (8–13 Hz) and beta (14–30 Hz) bands. An overall slowing of the EEG with decreasing synaptic connectivity is observed, indicated by a decrease of power within the alpha and beta bands and an increase in power within the theta and delta bands. Thus, the model behaviour conforms to longitudinal studies in AD indicating an overall slowing of the EEG. Abstract Neuronal processes grow under a variety of constraints, both immediate and evolutionary. Their pattern of growth provides insight into their function. This chapter begins by reviewing morphological metrics used in analyses and computational models. Molecular mechanisms underlying growth and plasticity are then discussed, followed by several types of modeling approaches. 
Computer simulation of morphology can be used to describe and reproduce the statistics of neuronal types or to evaluate growth and functional hypotheses. For instance, models in which branching is probabilistically determined by diameter produce realistic virtual dendrites of most neuronal types, though more complicated statistical models are required for other types. Virtual dendrites grown under environmental and/or functional constraints are also discussed, offering a broad perspective on dendritic morphology. Abstract Chopper neurons in the cochlear nucleus are characterized by intrinsic oscillations with short average interspike intervals (ISIs) and relative level independence of their response (Pfeiffer, Exp Brain Res 1:220–235, 1966; Blackburn and Sachs, J Neurophysiol 62:1303–1329, 1989), properties which are unattained by models of single chopper neurons (e.g., Rothman and Manis, J Neurophysiol 89:3070–3082, 2003a). In order to achieve short ISIs, we optimized the time constants of the Rothman and Manis single-neuron model with genetic algorithms. Some parameters in the optimization, such as the temperature and the capacitance of the cell, turned out to be crucial for the required acceleration of the response. In order to achieve relative level independence, we simulated an interconnected network consisting of Rothman and Manis neurons. The results indicate that by stabilization of intrinsic oscillations, it is possible to simulate the physiologically observed level independence of ISIs. As previously reviewed and demonstrated (Bahmer and Langner, Biol Cybern 95:371–379, 2006a), chopper neurons show a preference for ISIs which are multiples of 0.4 ms. It was also demonstrated that a network consisting of two optimized Rothman and Manis neurons which activate each other with synaptic delays of 0.4 ms shows a preference for ISIs of 0.8 ms. 
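The genetic-algorithm optimization used above to shorten the model's ISIs can be sketched in miniature: mutate a population of candidate time constants, score each against a target, and keep the best. The fitness function below is a stand-in (distance of a candidate time constant from a hypothetical 0.4 ms target), not the Rothman and Manis model itself.

```python
import random

random.seed(0)

TARGET_MS = 0.4  # stand-in target value

def fitness(tau):
    """Stand-in objective: distance of the candidate from the target.
    A real run would simulate the neuron model and score its ISIs."""
    return abs(tau - TARGET_MS)

def evolve(generations=30, pop_size=20, sigma=0.2):
    """Tiny elitist GA: keep the best quarter, refill with mutated copies."""
    pop = [random.uniform(0.1, 5.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 4]                 # selection
        pop = parents + [p + random.gauss(0.0, sigma)  # mutation
                         for p in parents for _ in range(3)]
    return min(pop, key=fitness)

best = evolve()
print(best)
```

Because the parents are carried over unchanged, the best fitness never worsens between generations; real applications add crossover and evaluate fitness by running the full neuron simulation.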
Oscillations with various multiples of 0.4 ms as ISIs may be derived from neurons in a more complex network that is activated by simultaneous input from an onset neuron and several auditory nerve fibers. Abstract Recently, a class of two-dimensional integrate-and-fire models has been used to faithfully model spiking neurons. This class includes the Izhikevich model, the adaptive exponential integrate-and-fire model, and the quartic integrate-and-fire model. The bifurcation types for the individual neurons have been thoroughly analyzed by Touboul (SIAM J Appl Math 68(4):1045–1079, 2008). However, when the models are coupled together to form networks, the networks can display bifurcations that an uncoupled oscillator cannot. For example, the networks can transition from firing with a constant rate to burst firing. This paper introduces a technique to reduce a full network of this class of neurons to a mean field model, in the form of a system of switching ordinary differential equations. The reduction uses population density methods and a quasi-steady-state approximation to arrive at the mean field system. Reduced models are derived for networks with different topologies and different model neurons with biologically derived parameters. The mean field equations are able to qualitatively and quantitatively describe the bifurcations that the full networks display. Extensions and higher-order approximations are discussed. The impact of the NIH public access policy on literature informatics. Neuroinformatics Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model.
Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php. Operating system(s): Platform independent.
Programming language: Perl/BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI Blast package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project.
The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity.
However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell includes organelles and intracellular solutions, and it is separated from the surrounding extracellular fluid by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charge and concentration give rise, respectively, to electrical and chemical potentials, driving transport of materials across the membrane. Here we look at the core mathematical modeling associated with the dynamic behaviors of single cells, as well as the bases of numerical simulations. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al.
2008, 2009) is a software program on which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command-line interface, based on Python, a powerful scripting language, for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp. Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information, and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors.
We describe a method and make available computer scripts which automate the process of citation entry. We use an Open Citation project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. To understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain (${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies.
These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of the available data resources, the high number of heterogeneous query methods and front-ends requires considerable bioinformatic skill. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references for further reading, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function.
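To give a flavour of the kind of XML-based model description NeuroML provides, the sketch below generates a minimal cell document with Python's standard library. The element and attribute names follow the general NeuroML style but are illustrative only; the normative tags and units are defined by the NeuroML schema, which should be consulted for real use.

```python
import xml.etree.ElementTree as ET

# Illustrative, NeuroML-style document: a single cell with a one-segment soma.
# Tag and attribute names here are a sketch, not the normative NeuroML schema.
root = ET.Element("neuroml", id="example_doc")
cell = ET.SubElement(root, "cell", id="pyramidal_0")
morph = ET.SubElement(cell, "morphology", id="morph_0")
soma = ET.SubElement(morph, "segment", id="0", name="soma")
ET.SubElement(soma, "proximal", x="0", y="0", z="0", diameter="20")
ET.SubElement(soma, "distal", x="0", y="20", z="0", diameter="20")

xml_text = ET.tostring(root, encoding="unicode")
```

The point of such declarative descriptions, as the text notes, is that the same document can be consumed by multiple simulators and visualisation tools rather than being tied to one package's scripting language.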
Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded in empirical evidence, explain (or fail to explain) existing empirical results, and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp. 741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu. Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and build on them for further research. Its use can aid the field of computational neuroscience to enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition.
PPI reflects GABA-A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA-A reversal potential (E_GABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal to elicit/detect electrical excitation at the single-cell level and to observe the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy.
In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic types (pyramidal cell dendrites and olivary axonal projection terminals), and labeling methods (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstruction systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions arriving at the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductivities of their dendritic membrane.
This work is a continuation of our previous studies that showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons.
Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS–US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive, biologically plausible framework, without recourse to delay lines or other special-purpose circuits.
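The temporal-difference algorithm invoked above can be stated very compactly. The sketch below is the textbook tabular TD(0) update, in which the prediction error delta plays the dopamine-like role; it is a generic illustration, not the LSTM-based model of the study, and the five-state task is hypothetical.

```python
def td0_update(V, s, r, s_next, alpha=0.1, gamma=0.9):
    """One tabular TD(0) step: move V(s) toward r + gamma * V(s')."""
    delta = r + gamma * V[s_next] - V[s]  # prediction error ("dopamine-like" signal)
    V[s] += alpha * delta
    return delta

# Hypothetical task: states 0..4 visited in sequence; reward on entering state 4.
V = [0.0] * 5
for _ in range(200):
    for s in range(4):
        r = 1.0 if s + 1 == 4 else 0.0
        td0_update(V, s, r, s + 1)
```

After training, the learned values ramp up toward the rewarded state, which is the basic mechanism the richer network models above build upon.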
Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. On mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses distributed uniformly along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and function of effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization in general and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns.
The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals decreasing with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two patterns of activity, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of patterns of output discharges of different complexities, including stochastic ones. A relatively short length of the apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites and a reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promote the predominance of network interaction in the processes of formation of the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the most well-known examples within the temporal coding hypothesis.
Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta-modulated and space-modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using non-homogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent from rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties.
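As an instance of the channel formulas just mentioned, the sketch below encodes the gating of the Hodgkin–Huxley delayed-rectifier potassium channel in the standard alpha/beta rate form (voltages in mV, rest near −65 mV, rates in 1/ms). These are the classic textbook expressions, given only as an illustration of the gating-variable scheme.

```python
import math

def alpha_n(v):
    """Opening rate of the delayed-rectifier K+ gate n (1/ms)."""
    return 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))

def beta_n(v):
    """Closing rate of the n gate (1/ms)."""
    return 0.125 * math.exp(-(v + 65.0) / 80.0)

def n_inf(v):
    """Steady-state activation: fraction of open gates at voltage v."""
    return alpha_n(v) / (alpha_n(v) + beta_n(v))

def tau_n(v):
    """Time constant (ms) with which n relaxes toward n_inf(v)."""
    return 1.0 / (alpha_n(v) + beta_n(v))
```

The gate's dynamics are then dn/dt = (n_inf(V) − n) / tau_n(V), and the channel contributes a current g_K * n**4 * (V − E_K); a cell's firing properties follow from how such steady-state curves and time constants differ across its channel types.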
The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in Computational Neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org. In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed data of morphologies when generating those cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators, such as the NEURON Simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new and available data can be included into NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today.
I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation, in the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising recent developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data needs both standard data formats and public repositories. Standardization and a public storage system are also important for modeling due to the possibility of sharing models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks like numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given.
Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques.
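For contrast with the inverse method described above, the "traditional approach" estimates CSD as the negative, conductivity-scaled second spatial derivative of the recorded potential. A minimal numerical sketch (a generic illustration, not the authors' code; the spacing and conductivity values are placeholders):

```python
def csd_traditional_2d(phi, h=0.1, sigma=0.3):
    """Traditional CSD estimate on a 2D electrode grid: the negative,
    conductivity-scaled discrete Laplacian of the potential.
    phi: 2D list of potentials; h: contact spacing; sigma: tissue
    conductivity (placeholder values, not from the paper).
    Note: no estimate is produced for the outermost contacts --
    handling the grid boundaries is one limitation iCSD 2D addresses."""
    rows, cols = len(phi), len(phi[0])
    csd = []
    for i in range(1, rows - 1):
        row = []
        for j in range(1, cols - 1):
            lap = (phi[i - 1][j] + phi[i + 1][j] + phi[i][j - 1]
                   + phi[i][j + 1] - 4.0 * phi[i][j]) / h ** 2
            row.append(-sigma * lap)
        csd.append(row)
    return csd
```

Because the estimate shrinks the grid by one contact on every side and makes implicit assumptions about the sources, a forward-model (inverse) formulation of the kind described above can do strictly better.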
We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility of including system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing and enter a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. However, the range of current strength needed for a depolarization block could easily be reached with a random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block.
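Depolarization block can be reproduced even in a textbook single-compartment model. The sketch below uses the standard Hodgkin–Huxley squid-axon equations (not the CA1 model or the ionic currents of the study): at moderate sustained current the model fires repetitively, while at strong current sodium inactivation silences it after a brief transient.

```python
import math

def simulate_hh(i_amp, t_stop=200.0, dt=0.01):
    """Forward-Euler integration of the classic Hodgkin-Huxley squid-axon
    model under a constant current i_amp (uA/cm^2). Returns the number of
    spikes, counted as upward crossings of 0 mV. Textbook parameters."""
    g_na, g_k, g_l = 120.0, 36.0, 0.3       # mS/cm^2
    e_na, e_k, e_l = 50.0, -77.0, -54.4     # mV
    v, m, h, n = -65.0, 0.05, 0.6, 0.32
    spikes, above = 0, False
    for _ in range(int(t_stop / dt)):
        # Rate functions, with the removable singularities guarded.
        am = 1.0 if abs(v + 40.0) < 1e-7 else \
            0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
        bm = 4.0 * math.exp(-(v + 65.0) / 18.0)
        ah = 0.07 * math.exp(-(v + 65.0) / 20.0)
        bh = 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
        an = 0.1 if abs(v + 55.0) < 1e-7 else \
            0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
        bn = 0.125 * math.exp(-(v + 65.0) / 80.0)
        i_ion = (g_na * m ** 3 * h * (v - e_na)
                 + g_k * n ** 4 * (v - e_k) + g_l * (v - e_l))
        v += dt * (i_amp - i_ion)           # C_m = 1 uF/cm^2
        m += dt * (am * (1.0 - m) - bm * m)
        h += dt * (ah * (1.0 - h) - bh * h)
        n += dt * (an * (1.0 - n) - bn * n)
        if v > 0.0 and not above:
            spikes += 1
        above = v > 0.0
    return spikes
```

Comparing spike counts at, say, 10 versus 200 uA/cm^2 shows the firing range bounded from above, the qualitative behavior the study characterizes in CA1 models.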
We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, impeding conventional analysis methods from keeping pace. Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed, and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, making analysis simpler and highly interoperable. PANDORA is open source and freely available to be used and modified ( http://software.incf.org/software/pandora/home ). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then we examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop.
Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics. First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test the sufficiency of multiple, independent spike initiation zones in the C-interneuron to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made to understand cerebellar cortical processing, but neural coding at the output stage of the cerebellum in the deep cerebellar nuclei (DCN) remains poorly understood. The DCN are unlikely to be just a relay nucleus, because Purkinje cell inhibition has to be turned into an excitatory output signal, and DCN neurons exhibit complex intrinsic properties.
In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals. Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. We obtained a good match between our simulations and DCN current-clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow deinactivation of rebound currents. Fast rebound bursts due to T-type calcium current and slow rebounds due to persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by HCN current. Our findings suggest that active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches.
This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior. This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation channel gating, the mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform. Abstract We have developed a simulation tool within the NEURON simulator to assist in organization, verification, and analysis of simulations. This tool, called the Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single-cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB.
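The SELECT-style workflow NQS provides can be illustrated with a generic relational sketch (plain sqlite3, not NQS itself; the table, columns, and values are invented): extract one characteristic per simulation run, store it as a row, then query it with SQL.

```python
import sqlite3

# One row of extracted characteristics per simulation run, queried with
# SQL -- the same SELECT-style workflow NQS exposes inside NEURON
# (illustrative schema only, not NQS's own).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sims (run_id INTEGER, g_na REAL, rate_hz REAL)")
rows = [(1, 100.0, 12.5), (2, 120.0, 21.0), (3, 140.0, 33.5)]
conn.executemany("INSERT INTO sims VALUES (?, ?, ?)", rows)

# Which parameter values produced firing above 20 Hz?
fast = conn.execute(
    "SELECT run_id, g_na FROM sims WHERE rate_hz > 20 ORDER BY g_na"
).fetchall()
print(fast)  # [(2, 120.0), (3, 140.0)]
```

Storing extracted characteristics rather than raw traces is what makes this kind of query cheap to run over thousands of simulations.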
Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems, much as single cells function as systems of protein networks in which, for example, ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from what it does alone. The dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. Interactions are made by different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts to produce heartbeats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics representing neural information. In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of the limbs can be performed. This chapter illustrates several mathematical techniques through examples from the modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data.
Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation. For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice that of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits as expected in mature animals. Predictions were made for passive-cell current-clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting becomes obsolete.
Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron. Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials. Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet the precise input–output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites.
We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature. To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns, and the corresponding firing-frequency input–output function was logarithmic. The regular/irregular nature of the input synaptic intervals was only reflected in the regularity of output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input–output relationships. Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input–output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware.
To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats. In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database http://www.ebi.ac.uk/biomodels/ is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller sub-models. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems, and to study the clustering of models based upon their annotations. Model deposition to the database today is advised by several publishers of scientific journals.
The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive? Schema theory is a computational framework that allows the simulation of perceptuomotor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We first present its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performances of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world-knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performances of agrammatic aphasics measured using sentence–picture matching tasks.
This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Abstract Background The "inverse" problem is related to the determination of unknown causes on the basis of the observation of their effects. This is the opposite of the corresponding "direct" problem, which relates to the prediction of the effects generated by a complete description of some agencies. The solution of an inverse problem entails the construction of a mathematical model and starts from a set of experimental data. In this respect, inverse problems are often ill-conditioned, as the experimental data available are often insufficient to unambiguously solve the mathematical model. Several approaches to solving inverse problems are possible, both computational and experimental, some of which are mentioned in this article. In this work, we describe in detail an attempt to solve an inverse problem which arose in the study of an intracellular signaling pathway. Results Using a genetic algorithm to find a suboptimal solution to the optimization problem, we have estimated a set of unknown parameters describing a kinetic model of a signaling pathway in the neuronal cell. The model is composed of mass-action ordinary differential equations, where the kinetic parameters describe protein–protein interactions, protein synthesis, and degradation. The algorithm has been implemented on a parallel platform. Several potential solutions of the problem have been computed, each solution being a set of model parameters. A subset of parameters has been selected on the basis of their small coefficient of variation across the ensemble of solutions.
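The parameter-search scheme described in the Results can be illustrated with a toy genetic algorithm. The sketch below fits a single invented rate constant of a one-exponential model to data; it stands in for, but is not, the study's parallel GA over many mass-action kinetic parameters.

```python
import math
import random

def fitness(k, data):
    """Negative sum-of-squared errors between the toy model exp(-k*t)
    and the data points (t, y). Higher is better."""
    return -sum((math.exp(-k * t) - y) ** 2 for t, y in data)

def ga_estimate(data, pop_size=40, generations=60, seed=0):
    """Toy GA: truncation selection plus Gaussian mutation over one
    rate parameter k. Illustrative only; a real pathway model would
    evolve a whole parameter vector."""
    rng = random.Random(seed)
    pop = [rng.uniform(0.0, 2.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda k: fitness(k, data), reverse=True)
        parents = pop[: pop_size // 4]          # keep the best quarter
        pop = [max(0.0, rng.choice(parents) + rng.gauss(0.0, 0.05))
               for _ in range(pop_size)]
        pop[0] = parents[0]                      # elitism
    return max(pop, key=lambda k: fitness(k, data))
```

Running several GA instances and computing each parameter's coefficient of variation across the resulting ensemble is the selection criterion the Results describe.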
Conclusion Despite the lack of sufficiently reliable and homogeneous experimental data, the genetic algorithm approach has made it possible to estimate the approximate values of a number of model parameters in a kinetic model of a signaling pathway; these parameters have been assessed to be relevant for the reproduction of the available experimental data. Abstract Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown. Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms. First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axodendritic feedback on pyramidal cells is only effective in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase, the theta/gamma ratio.
Moreover, electric fields differentially affect average theta frequency depending on specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. Abstract The brain is extraordinarily complex, containing 10^11 neurons linked with 10^14 connections. We can improve our understanding of individual neurons and neuronal networks by describing their behavior in mathematical and computational models. This chapter provides an introduction to neural modeling, laying the foundation for several basic models and surveying key topics. After some discussion on the motivations of modelers and the uses of neural models, we explore the properties of electrically excitable membranes. We describe in some detail the Hodgkin–Huxley model, the first neural model to describe biophysically the behavior of biological membranes. We explore how this model can be extended to describe a variety of excitable membrane behaviors, including axonal propagation, dendritic processing, and synaptic communication. This chapter also covers mathematical models that replicate basic neural behaviors through more abstract mechanisms. We briefly explore efforts to extend single-neuron models to the network level and provide several examples of insights gained through this process. Finally, we list common resources, including modeling environments and repositories, that provide the guidance and parameter sets necessary to begin building neural models. Abstract We have developed a program NeuroText to populate the neuroscience databases in SenseLab (http://senselab.med.yale.edu/senselab) by mining the natural-language text of neuroscience articles. NeuroText uses a two-step approach to identify relevant articles. The first step (preprocessing), aimed at 100% sensitivity, identifies abstracts containing database keywords.
In the second step, potentially relevant abstracts identified in the first step are processed for specificity dictated by database architecture and by neuroscience, lexical, and semantic contexts. NeuroText results were presented to experts for validation using a dynamically generated interface that also allows expert-validated articles to be automatically deposited into the databases. Of the test set of 912 articles, 735 were rejected at the preprocessing step. For the remaining articles, the accuracy of predicting database-relevant articles was 85%. Twenty-two articles were erroneously identified. NeuroText deferred decisions on 29 articles to the expert. A comparison of NeuroText results against the experts’ analyses revealed that the program failed to correctly identify articles’ relevance because of concepts that did not yet exist in the knowledge base or because of vaguely presented information in the abstracts. NeuroText uses two “evolution” techniques (supervised and unsupervised) that play an important role in the continual improvement of the retrieval results. Software that uses the NeuroText approach can facilitate the creation of curated, special-interest bibliography databases. Abstract Dendrites play an important role in neuronal function and connectivity. This chapter introduces the first section of the book, focusing on the morphological features of dendritic tree structures and the role of dendritic trees in the circuit. We provide an overview of quantitative procedures for data collection, analysis, and modeling of dendrite shape. Our main focus lies on the description of morphological complexity and how one can use this description to unravel neuronal function in dendritic trees and neural circuits. Abstract The chapter is organised in two parts. In the first part, the focus is on a combined power spectral and nonlinear behavioural analysis of a neural mass model of the thalamocortical circuitry.
The objective is to study the effectiveness of such “multimodal” analytical techniques in model-based studies investigating the neural correlates of abnormal brain oscillations in Alzheimer’s disease (AD). The power spectral analysis presented here is a study of the “slowing” (decreasing dominant frequency of oscillation) within the alpha frequency band (8–13 Hz), a hallmark of electroencephalogram (EEG) dynamics in AD. Analysis of the nonlinear dynamical behaviour focuses on the bifurcating property of the model. The results show that the alpha rhythmic content is maximal in close proximity to the bifurcation point—an observation made possible by the “multimodal” approach adopted herein. Furthermore, a slowing in alpha rhythm is observed for increasing inhibitory connectivity—a consistent feature of our research into neuropathological oscillations associated with AD. In the second part, we present power spectral analysis of a model that implements multiple feedforward and feedback connectivities in the thalamo-cortico-thalamic circuitry, and is thus more advanced in terms of biological plausibility. This study looks at the effects of synaptic connectivity variation on the power spectra within the delta (1–3 Hz), theta (4–7 Hz), alpha (8–13 Hz) and beta (14–30 Hz) bands. An overall slowing of the EEG with decreasing synaptic connectivity is observed, indicated by a decrease of power within the alpha and beta bands and an increase of power within the theta and delta bands. Thus, the model behaviour conforms to longitudinal studies in AD indicating an overall slowing of the EEG. Abstract Neuronal processes grow under a variety of constraints, both immediate and evolutionary. Their pattern of growth provides insight into their function. This chapter begins by reviewing morphological metrics used in analyses and computational models. Molecular mechanisms underlying growth and plasticity are then discussed, followed by several types of modeling approaches.
Computer simulation of morphology can be used to describe and reproduce the statistics of neuronal types or to evaluate growth and functional hypotheses. For instance, models in which branching is probabilistically determined by diameter produce realistic virtual dendrites of most neuronal types, though more complicated statistical models are required for other types. Virtual dendrites grown under environmental and/or functional constraints are also discussed, offering a broad perspective on dendritic morphology. Abstract Chopper neurons in the cochlear nucleus are characterized by intrinsic oscillations with short average interspike intervals (ISIs) and relative level independence of their response (Pfeiffer, Exp Brain Res 1:220–235, 1966; Blackburn and Sachs, J Neurophysiol 62:1303–1329, 1989), properties which are unattained by models of single chopper neurons (e.g., Rothman and Manis, J Neurophysiol 89:3070–3082, 2003a). In order to achieve short ISIs, we optimized the time constants of the Rothman and Manis single-neuron model with genetic algorithms. Some parameters in the optimization, such as the temperature and the capacitance of the cell, turned out to be crucial for the required acceleration of the response. In order to achieve relative level independence, we simulated an interconnected network of Rothman and Manis neurons. The results indicate that, by stabilization of intrinsic oscillations, it is possible to simulate the physiologically observed level independence of ISIs. As previously reviewed and demonstrated (Bahmer and Langner, Biol Cybern 95:371–379, 2006a), chopper neurons show a preference for ISIs that are multiples of 0.4 ms. It was also demonstrated that a network consisting of two optimized Rothman and Manis neurons which activate each other with synaptic delays of 0.4 ms shows a preference for ISIs of 0.8 ms.
Oscillations with various multiples of 0.4 ms as ISIs may be derived from neurons in a more complex network that is activated by simultaneous input from an onset neuron and several auditory nerve fibers. Abstract Recently, a class of two-dimensional integrate-and-fire models has been used to faithfully model spiking neurons. This class includes the Izhikevich model, the adaptive exponential integrate-and-fire model, and the quartic integrate-and-fire model. The bifurcation types for the individual neurons have been thoroughly analyzed by Touboul (SIAM J Appl Math 68(4):1045–1079, 2008). However, when the models are coupled together to form networks, the networks can display bifurcations that an uncoupled oscillator cannot. For example, the networks can transition from firing with a constant rate to burst firing. This paper introduces a technique to reduce a full network of this class of neurons to a mean field model, in the form of a system of switching ordinary differential equations. The reduction uses population density methods and a quasi-steady-state approximation to arrive at the mean field system. Reduced models are derived for networks with different topologies and different model neurons with biologically derived parameters. The mean field equations are able to qualitatively and quantitatively describe the bifurcations that the full networks display. Extensions and higher-order approximations are discussed. Conclusions Our proposed database schema for managing heterogeneous data is a significant departure from conventional approaches. It is suitable only when the following conditions hold: • The number of classes of entity is numerous, while the number of actual instances in most classes is expected to be very modest. • The number (and nature) of the axes describing an arbitrary fact (as an N-ary association) varies greatly. We believe that nervous system data is an appropriate problem domain to test such an approach.
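The generic schema sketched in these conclusions resembles an entity–attribute–value design with N-ary facts. A minimal illustration using Python's sqlite3 module might look like the following; all table, column, and role names here are hypothetical, chosen only to show how a fact can carry a varying number of named axes, and are not taken from the paper.

```python
import sqlite3

# Hypothetical generic schema: many entity classes, few instances per
# class, and "facts" whose named roles (axes) vary from fact to fact.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE entity_class (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE entity (id INTEGER PRIMARY KEY,
                     class_id INTEGER REFERENCES entity_class(id),
                     label TEXT);
CREATE TABLE fact (id INTEGER PRIMARY KEY, description TEXT);
-- Each row binds one entity to one fact under a named role, so a
-- single fact may have any number of axes (an N-ary association).
CREATE TABLE fact_role (fact_id INTEGER REFERENCES fact(id),
                        role TEXT,
                        entity_id INTEGER REFERENCES entity(id));
""")

def add_entity(cls, label):
    """Register an entity, creating its class row on first use."""
    conn.execute("INSERT OR IGNORE INTO entity_class(name) VALUES (?)", (cls,))
    cid = conn.execute("SELECT id FROM entity_class WHERE name=?",
                       (cls,)).fetchone()[0]
    cur = conn.execute("INSERT INTO entity(class_id, label) VALUES (?,?)",
                       (cid, label))
    return cur.lastrowid

def add_fact(description, **roles):
    """Store a fact with an arbitrary set of role -> entity bindings."""
    cur = conn.execute("INSERT INTO fact(description) VALUES (?)",
                       (description,))
    for role, eid in roles.items():
        conn.execute("INSERT INTO fact_role VALUES (?,?,?)",
                     (cur.lastrowid, role, eid))
    return cur.lastrowid

cell = add_entity("NeuronType", "CA1 pyramidal cell")
chan = add_entity("IonChannel", "Ih")
region = add_entity("BrainRegion", "hippocampus")
add_fact("channel expressed in cell within region",
         subject=chan, target=cell, location=region)

rows = conn.execute("""SELECT role, e.label FROM fact_role
                       JOIN entity e ON e.id = entity_id
                       ORDER BY role""").fetchall()
print(rows)
```

The trade-off the conclusions describe is visible here: adding a new entity class or a new kind of fact requires no schema change, at the cost of queries that must join through the role table rather than read class-specific columns.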
OREMPdb: a semantic dictionary of computational pathway models. BMC Bioinformatics. Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a nonredundant set of deposited protein structures are available for analysis and download in the ModelDB database.
Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. Programming language: Perl/BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI BLAST package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. In reality, however, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results.
We present tools and best practices developed over the past 2 decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to create dynamically its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. 
Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages, or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell contains organelles and intracellular solutions, and is separated from the surrounding extracellular fluid by its cell membrane (plasma membrane), which maintains differences in the concentrations of ions and molecules, including enzymes. Differences in ionic charge and concentration give rise to electrical and chemical potentials, respectively, which drive the transport of materials across the membrane. Here we look at the core elements of mathematical modeling of the dynamic behaviors of single cells, as well as the foundations of numerical simulation. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB.
Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML for describing mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program with which users can simulate and/or create a model through graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command line interface for manipulating large-scale models, based on Python, a powerful scripting language. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB comprises three databases: a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results be reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases have started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards.
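The central point of this reproducibility workflow, that both the model and the experimental setup must exist in coded form, can be sketched in a few lines of Python. This is only a toy illustration under stated assumptions: the "model" is a trivial logistic map rather than a real cell-cycle model, and all names (`experiment`, `run`, `model_step`) are hypothetical, not part of any standard.

```python
import hashlib
import json

# Toy "model": one update of the logistic map x -> r*x*(1-x).
# A real workflow would load a curated model from a repository instead.
def model_step(x, r):
    return r * x * (1 - x)

# Coded experiment setup, kept separate from the model code. This is
# the kind of information a simulation-experiment description captures:
# parameters, initial conditions, and what to record.
experiment = {
    "model": "logistic_map",
    "parameters": {"r": 3.5, "x0": 0.2},
    "n_steps": 100,
    "record": "final_state",
}

def run(setup):
    """Execute the experiment exactly as described by the setup."""
    x = setup["parameters"]["x0"]
    for _ in range(setup["n_steps"]):
        x = model_step(x, setup["parameters"]["r"])
    return x

result = run(experiment)

# A digest over setup plus result gives a simple provenance record:
# anyone re-running the coded setup can check against it.
provenance = hashlib.sha256(
    json.dumps({"setup": experiment, "result": result},
               sort_keys=True).encode()
).hexdigest()
print(result, provenance[:12])
```

Because the setup is data rather than something implicit in a script, a third party needs only the model code and the `experiment` record to repeat the analysis and compare digests.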
Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method, and make available computer scripts, which automate the process of citation entry. We use an Open Citation project Perl module (PARSER) for parsing citation data, which is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. Indeed, in order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain (H2 approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences.
The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of the available data resources, the high number of heterogeneous query methods and front-ends demands considerable bioinformatic skill. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing these challenges. While originally developed to manage and query less structured data, information retrieval techniques have become increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells, positioned and synaptically connected in 3D, can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments, and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks.
These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded in empirical evidence, explain (or fail to explain) existing empirical results, and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to representing and integrating complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp.
741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, as well as synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu . Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by allowing them to run the models as published and to build on them for further research. Its use can help the field of computational neuroscience enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition.
PPI reflects GABA_A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA_A reversal potential (E_GABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy.
In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic type (pyramidal cell dendrites and olivary axonal projection terminals), and labeling method (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions arriving at the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductivities of their dendritic membrane.
This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high- and low-depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output, associated with receiving a short series of presynaptic action potentials, are determined not only by the time of arrival of such a series but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and to the networks of signalling pathways that are in turn embedded in neurons.
Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS–US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive, biologically plausible framework, without recourse to delay lines or other special-purpose circuits.
Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. Using mathematical models of pyramidal neurons of neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses distributed uniformly along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization in general and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns.
The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals decreasing with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two patterns of activity, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of patterns of output discharges of different complexities, including stochastic ones. A relatively short length of the apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites and a reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in the processes of formation of the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the most well-known examples within the temporal coding hypothesis.
Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using nonhomogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent from rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties.
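As an instance of such channel formulas, here is a sketch of the delayed-rectifier activation variable from the classic Hodgkin–Huxley squid-axon fits (membrane potential in mV, rest near −65 mV); the gating variable n obeys dn/dt = α(V)(1−n) − β(V)n:

```python
import math

# Hodgkin-Huxley delayed-rectifier potassium activation gate n,
# using the classic squid-axon rate functions (V in mV).

def alpha_n(v):
    """Opening rate (1/ms)."""
    return 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))

def beta_n(v):
    """Closing rate (1/ms)."""
    return 0.125 * math.exp(-(v + 65.0) / 80.0)

def n_inf(v):
    """Steady-state activation: where opening and closing rates balance."""
    return alpha_n(v) / (alpha_n(v) + beta_n(v))

def tau_n(v):
    """Time constant (ms) of relaxation toward n_inf."""
    return 1.0 / (alpha_n(v) + beta_n(v))

# Depolarization opens the channel (n_inf rises) and speeds its kinetics
for v in (-65.0, -40.0, 0.0):
    print(v, round(n_inf(v), 3), round(tau_n(v), 2))
```

The same α/β template, with different rate fits, covers most of the channel varieties the chapter surveys.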
The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in computational neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org. In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed data of morphologies when generating those cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators, such as the NEURON Simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new and available data can be included into NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today.
I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. New simulator architectures such as GENESIS 3 allow the use of standard well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising recent developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from genomics, transcriptomics, proteomics, and metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data needs both standard data formats and public repositories. Standardization and a public storage system are also important for modeling because they make it possible to share models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks like numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given.
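One of the recurring tasks mentioned, numerical integration of rate equations, can be sketched in a few lines. Here a fourth-order Runge–Kutta step is applied to a hypothetical reversible reaction A ⇌ B with illustrative rate constants (this is the kind of chore the modeling packages discussed automate):

```python
# RK4 integration of the rate equations for a reversible reaction A <-> B:
# dA/dt = -(k_fwd*A - k_rev*B), dB/dt = +(k_fwd*A - k_rev*B).

def rhs(state, k_fwd=0.5, k_rev=0.1):
    a, b = state
    flux = k_fwd * a - k_rev * b
    return (-flux, flux)

def rk4_step(state, dt):
    """One classic fourth-order Runge-Kutta step."""
    k1 = rhs(state)
    k2 = rhs(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = rhs(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = rhs(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6.0 * (a + 2 * b_ + 2 * c + d)
                 for s, a, b_, c, d in zip(state, k1, k2, k3, k4))

state = (1.0, 0.0)            # all mass starts as species A
for _ in range(1000):         # integrate 10 time units at dt = 0.01
    state = rk4_step(state, 0.01)
# Equilibrium satisfies k_fwd*A = k_rev*B with A + B conserved, so A -> 1/6
print(round(state[0], 3), round(state[1], 3))
```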
Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques.
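The forward-modeling idea behind inverse CSD can be sketched in one dimension (a hypothetical toy, not the authors' iCSD 2D implementation): assume point sources at known depths, build the forward matrix F mapping sources to electrode potentials (here a regularized 1/distance kernel), then recover the sources by solving the linear system F·c = φ, here with plain Gaussian elimination:

```python
# Toy 1-D inverse CSD: forward-model potentials from assumed point sources,
# then invert the linear mapping. Geometry and kernel are illustrative only.

def forward_matrix(src_z, elec_z, h=0.1):
    """F[i][j]: potential at electrode i from a unit source j (~1/distance)."""
    return [[1.0 / (abs(ez - sz) + h) for sz in src_z] for ez in elec_z]

def solve(a, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]   # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

depths = [0.0, 0.5, 1.0, 1.5]                 # electrodes coincide with sources
F = forward_matrix(depths, depths)
true_csd = [1.0, -2.0, 1.0, 0.0]              # a source-sink-source profile
phi = [sum(F[i][j] * true_csd[j] for j in range(4)) for i in range(4)]
print([round(c, 6) for c in solve(F, phi)])   # recovers the assumed profile
```

The real method's advantages listed below (explicit assumptions, boundary handling) come from the choices made when constructing F.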
We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility of including system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. However, the range of current strength needed for a depolarization block could easily be reached with a random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block.
We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, impeding conventional analysis methods from keeping pace. Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting their interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed, and analyzed. This process is relatively simple and easy to learn because it takes place entirely in MATLAB, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable. PANDORA is free to use and modify because it is open source (http://software.incf.org/software/pandora/home). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then we examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop.
Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics. First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test the sufficiency of multiple, independent spike initiation zones in the C-interneuron to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made to understand cerebellar cortical processing, but neural coding at the output stage of the cerebellum in the deep cerebellar nuclei (DCN) remains poorly understood. The DCN are unlikely to be just a relay nucleus because Purkinje cell inhibition has to be turned into an excitatory output signal, and DCN neurons exhibit complex intrinsic properties.
In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals. Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. We obtained a good match between our simulations and DCN current-clamp data recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow deinactivation of rebound currents. Fast rebound bursts due to T-type calcium current and slow rebounds due to persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by HCN current. Our findings suggest that active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches.
This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior. This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation channel gating, the mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing, with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform. Abstract We have developed a simulation tool within the NEURON simulator to assist in the organization, verification, and analysis of simulations. This tool, named the Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single-cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB.
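The relational pattern NQS applies inside NEURON, one row per simulation linking parameters to extracted output measures, then declarative SELECT queries instead of ad hoc loops, can be illustrated with plain SQL; the schema and values below are hypothetical, not NQS's actual tables:

```python
import sqlite3

# Illustrative parameters-to-outputs table in the style of NQS/PANDORA-type
# simulation databases (schema and numbers are made up for this sketch).
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE sims (
    sim_id INTEGER PRIMARY KEY,
    g_na REAL, g_k REAL,          -- varied model parameters
    firing_rate REAL              -- measure extracted from each run (Hz)
)""")
rows = [
    (1, 0.12, 0.036, 41.0),
    (2, 0.12, 0.050, 28.5),
    (3, 0.10, 0.036, 35.0),
    (4, 0.14, 0.036, 55.5),
]
conn.executemany("INSERT INTO sims VALUES (?, ?, ?, ?)", rows)

# SELECT-style data mining: which parameter sets fired faster than 30 Hz?
fast = conn.execute(
    "SELECT sim_id, g_na, g_k FROM sims "
    "WHERE firing_rate > 30 ORDER BY firing_rate DESC"
).fetchall()
print(fast)
```

The payoff the abstract describes is exactly this: once runs are rows, verification and parameter-output analysis become one-line queries.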
Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems. This is similar to how single cells function as systems of protein networks, where, for example, ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from how it behaves alone. The dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. These interactions occur through different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts to produce heart beats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics representing neural information. In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of the limbs can be performed. This chapter illustrates several mathematical techniques through examples from the modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data.
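Unitary synaptic responses of this kind are commonly modeled with double-exponential conductance waveforms; a sketch with illustrative time constants (not the paper's fitted values), extracting two of the summary statistics used for comparison with experiment:

```python
import math

# Double-exponential synaptic conductance, peak-normalized to g_max.
# tau_rise < tau_decay; all parameter values are illustrative only.

def dual_exp_g(t_ms, g_max_nS=0.5, tau_rise=0.5, tau_decay=3.0):
    """Conductance (nS) at time t after synaptic activation."""
    if t_ms < 0:
        return 0.0
    # analytic time of the waveform's maximum, used to normalize to g_max
    t_pk = (tau_rise * tau_decay / (tau_decay - tau_rise)) * math.log(tau_decay / tau_rise)
    norm = math.exp(-t_pk / tau_decay) - math.exp(-t_pk / tau_rise)
    return g_max_nS * (math.exp(-t_ms / tau_decay) - math.exp(-t_ms / tau_rise)) / norm

# Extract summary statistics of the simulated response
ts = [i * 0.01 for i in range(2000)]          # 0-20 ms at 0.01 ms resolution
gs = [dual_exp_g(t) for t in ts]
peak = max(gs)
t_peak = ts[gs.index(peak)]
print(round(peak, 3), round(t_peak, 2))       # peak amplitude, time-to-peak
```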
Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation. For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice that of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits, as expected in mature animals. Predictions were made for passive-cell current clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting is obsolete.
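This membrane-potential-based inference rests on a generative picture in which the subthreshold fluctuations are presynaptic spikes convolved with a fixed leaky-integrator kernel. A sketch of that picture with illustrative values (not the method's actual inference step):

```python
import math, random

# Membrane potential as a presynaptic spike train filtered through an
# exponential (leaky-integrator) kernel. All parameters are illustrative.

def membrane_from_spikes(spike_times_ms, t_stop_ms=200.0, dt_ms=0.1,
                         tau_ms=10.0, w_mV=0.5, v_rest_mV=-65.0):
    """Convolve a presynaptic spike train with an exponential kernel."""
    n = int(t_stop_ms / dt_ms)
    v = [v_rest_mV] * n
    decay = math.exp(-dt_ms / tau_ms)     # per-step kernel decay factor
    s = 0.0                               # filtered synaptic drive (mV)
    spikes = sorted(spike_times_ms)
    idx = 0
    for i in range(n):
        t = i * dt_ms
        while idx < len(spikes) and spikes[idx] <= t:
            s += w_mV                     # each spike adds a kernel onset
            idx += 1
        v[i] = v_rest_mV + s
        s *= decay
    return v

random.seed(1)
train = [random.uniform(0.0, 200.0) for _ in range(50)]
v = membrane_from_spikes(train)
print(round(min(v), 1), round(max(v), 1))
```

Because correlations among presynaptic spikes survive this linear filtering, cumulants of the resulting trace carry the higher-order structure the method extracts.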
Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron. Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials. Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet, the precise input-output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites.
We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature. To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns, and the corresponding firing frequency input-output function was logarithmic. The regular/irregular nature of the input synaptic intervals was only reflected in the regularity of output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input-output relationships. Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input-output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware.
To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats. In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database http://www.ebi.ac.uk/biomodels/ is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems and to study the clustering of models based upon their annotations. Model deposition to the database today is advised by several publishers of scientific journals.
The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive? Schema theory is a computational framework that allows the simulation of perceptuomotor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We first present its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performance of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world-knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performance of agrammatic aphasics measured using sentence–picture matching tasks.
This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Abstract Background The "inverse" problem is related to the determination of unknown causes on the basis of the observation of their effects. This is the opposite of the corresponding "direct" problem, which relates to the prediction of the effects generated by a complete description of some agencies. The solution of an inverse problem entails the construction of a mathematical model and starts from a number of experimental data. In this respect, inverse problems are often ill-conditioned, as the number of experimental conditions available is often insufficient to unambiguously solve the mathematical model. Several approaches to solving inverse problems are possible, both computational and experimental, some of which are mentioned in this article. In this work, we describe in detail an attempt to solve an inverse problem which arose in the study of an intracellular signaling pathway. Results Using a genetic algorithm to find a suboptimal solution to the optimization problem, we have estimated a set of unknown parameters describing a kinetic model of a signaling pathway in the neuronal cell. The model is composed of mass-action ordinary differential equations, where the kinetic parameters describe protein-protein interactions, protein synthesis and degradation. The algorithm has been implemented on a parallel platform. Several potential solutions of the problem have been computed, each solution being a set of model parameters. A subset of parameters has been selected on the basis of their small coefficient of variation across the ensemble of solutions. 
Conclusion Despite the lack of sufficiently reliable and homogeneous experimental data, the genetic algorithm approach has allowed us to estimate the approximate values of a number of model parameters in a kinetic model of a signaling pathway; these parameters have been assessed to be relevant for the reproduction of the available experimental data. Abstract Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown. Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms. First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axo-dendritic feedback on pyramidal cells is only effective in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase, the theta/gamma ratio. 
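The genetic-algorithm estimation of kinetic parameters described above can be sketched in miniature. The one-parameter decay model dx/dt = -k*x, the operator choices (truncation selection, averaging crossover, Gaussian mutation), and all numbers below are illustrative assumptions, not the study's actual pathway model or algorithm settings.

```python
import random

# Illustrative sketch of genetic-algorithm parameter estimation for a
# mass-action ODE, in the spirit of the inverse-problem study above.
# Model, operators, and constants are hypothetical.

def simulate(k, x0=1.0, dt=0.01, steps=200):
    """Forward-Euler integration of dx/dt = -k*x."""
    xs, x = [], x0
    for _ in range(steps):
        x += dt * (-k * x)
        xs.append(x)
    return xs

def fitness(k, data):
    """Negative sum of squared residuals against the observed trace."""
    return -sum((a - b) ** 2 for a, b in zip(simulate(k), data))

def ga_estimate(data, pop_size=30, generations=40, seed=0):
    rng = random.Random(seed)
    pop = [rng.uniform(0.0, 5.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda k: fitness(k, data), reverse=True)
        parents = pop[: pop_size // 2]                       # truncation selection
        children = [
            (rng.choice(parents) + rng.choice(parents)) / 2  # averaging crossover
            + rng.gauss(0.0, 0.05)                           # Gaussian mutation
            for _ in range(pop_size - len(parents))
        ]
        pop = parents + children
    return max(pop, key=lambda k: fitness(k, data))

data = simulate(1.7)      # synthetic "experimental" trace, k_true = 1.7
k_hat = ga_estimate(data)
```

Running the search across many seeds and keeping only parameters with a small coefficient of variation across the ensemble mirrors the selection criterion reported in the abstract.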
Moreover, electric fields differentially affect average theta frequency depending on specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. Abstract The brain is extraordinarily complex, containing 10^11 neurons linked by 10^14 connections. We can improve our understanding of individual neurons and neuronal networks by describing their behavior in mathematical and computational models. This chapter provides an introduction to neural modeling, laying the foundation for several basic models and surveying key topics. After some discussion on the motivations of modelers and the uses of neural models, we explore the properties of electrically excitable membranes. We describe in some detail the Hodgkin–Huxley model, the first neural model to describe biophysically the behavior of biological membranes. We explore how this model can be extended to describe a variety of excitable membrane behaviors, including axonal propagation, dendritic processing, and synaptic communication. This chapter also covers mathematical models that replicate basic neural behaviors through more abstract mechanisms. We briefly explore efforts to extend single-neuron models to the network level and provide several examples of insights gained through this process. Finally, we list common resources, including modeling environments and repositories, that provide the guidance and parameter sets necessary to begin building neural models. Abstract We have developed a program NeuroText to populate the neuroscience databases in SenseLab (http://senselab.med.yale.edu/senselab) by mining the natural-language text of neuroscience articles. NeuroText uses a two-step approach to identify relevant articles. The first step (preprocessing), aimed at 100% sensitivity, identifies abstracts containing database keywords. 
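The Hodgkin–Huxley model introduced above can be sketched as a single compartment integrated with forward Euler. The parameter values are the classic textbook squid-axon constants (in the modern -65 mV resting convention); the injected current and duration are arbitrary choices for illustration, not values from the chapter.

```python
import math

# Minimal single-compartment Hodgkin-Huxley sketch using the standard
# squid-axon parameters; the drive current is an arbitrary illustration.

C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3        # uF/cm^2, mS/cm^2
ENa, EK, EL = 50.0, -77.0, -54.387            # mV

def a_m(V): return 0.1 * (V + 40) / (1 - math.exp(-(V + 40) / 10))
def b_m(V): return 4.0 * math.exp(-(V + 65) / 18)
def a_h(V): return 0.07 * math.exp(-(V + 65) / 20)
def b_h(V): return 1.0 / (1 + math.exp(-(V + 35) / 10))
def a_n(V): return 0.01 * (V + 55) / (1 - math.exp(-(V + 55) / 10))
def b_n(V): return 0.125 * math.exp(-(V + 65) / 80)

def run(I=10.0, dt=0.01, t_end=50.0):
    """Voltage trace (mV) under constant current injection I (uA/cm^2)."""
    V, m, h, n = -65.0, 0.053, 0.596, 0.317   # approximate rest steady state
    trace = []
    for _ in range(int(t_end / dt)):
        INa = gNa * m**3 * h * (V - ENa)      # sodium current
        IK = gK * n**4 * (V - EK)             # potassium current
        IL = gL * (V - EL)                    # leak current
        V += dt * (I - INa - IK - IL) / C
        m += dt * (a_m(V) * (1 - m) - b_m(V) * m)
        h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
        n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
        trace.append(V)
    return trace

trace = run()
```

With this level of drive the model fires repetitively; more elaborate behaviors (propagation, dendrites, synapses) extend the same equations across coupled compartments.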
In the second step, potentially relevant abstracts identified in the first step are processed for specificity dictated by database architecture, and neuroscience, lexical and semantic contexts. NeuroText results were presented to the experts for validation using a dynamically generated interface that also allows expert-validated articles to be automatically deposited into the databases. Of the test set of 912 articles, 735 were rejected at the preprocessing step. For the remaining articles, the accuracy of predicting database-relevant articles was 85%. Twenty-two articles were erroneously identified. NeuroText deferred decisions on 29 articles to the expert. A comparison of NeuroText results versus the experts’ analyses revealed that the program failed to correctly identify articles’ relevance due to concepts that did not yet exist in the knowledge base or due to vaguely presented information in the abstracts. NeuroText uses two “evolution” techniques (supervised and unsupervised) that play an important role in the continual improvement of the retrieval results. Software that uses the NeuroText approach can facilitate the creation of curated, special-interest bibliography databases. Abstract Dendrites play an important role in neuronal function and connectivity. This chapter introduces the first section of the book, focusing on the morphological features of dendritic tree structures and the role of dendritic trees in the circuit. We provide an overview of quantitative procedures for data collection, analysis, and modeling of dendrite shape. Our main focus lies on the description of morphological complexity and how one can use this description to unravel neuronal function in dendritic trees and neural circuits. Abstract The chapter is organised in two parts: in the first part, the focus is on a combined power spectral and nonlinear behavioural analysis of a neural mass model of the thalamocortical circuitry. 
The objective is to study the effectiveness of such “multimodal” analytical techniques in model-based studies investigating the neural correlates of abnormal brain oscillations in Alzheimer’s disease (AD). The power spectral analysis presented here is a study of the “slowing” (decreasing dominant frequency of oscillation) within the alpha frequency band (8–13 Hz), a hallmark of electroencephalogram (EEG) dynamics in AD. Analysis of the nonlinear dynamical behaviour focuses on the bifurcating property of the model. The results show that the alpha rhythmic content is maximal in close proximity to the bifurcation point—an observation made possible by the “multimodal” approach adopted herein. Furthermore, a slowing in alpha rhythm is observed for increasing inhibitory connectivity—a consistent feature of our research into neuropathological oscillations associated with AD. In the second part, we present power spectral analysis on a model that implements multiple feedforward and feedback connectivities in the thalamo-cortico-thalamic circuitry, and is thus more advanced in terms of biological plausibility. This study looks at the effects of synaptic connectivity variation on the power spectra within the delta (1–3 Hz), theta (4–7 Hz), alpha (8–13 Hz) and beta (14–30 Hz) bands. An overall slowing of EEG with decreasing synaptic connectivity is observed, indicated by a decrease of power within the alpha and beta bands and an increase in power within the theta and delta bands. Thus, the model behaviour conforms to longitudinal studies in AD indicating an overall slowing of EEG. Abstract Neuronal processes grow under a variety of constraints, both immediate and evolutionary. Their pattern of growth provides insight into their function. This chapter begins by reviewing morphological metrics used in analyses and computational models. Molecular mechanisms underlying growth and plasticity are then discussed, followed by several types of modeling approaches. 
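Band-limited power of the kind analysed in the thalamocortical study above can be computed from a simple FFT periodogram: sum the squared spectral magnitudes over each band's frequency range. The synthetic 10 Hz test signal and sampling rate below are invented for demonstration; the band edges follow the abstract.

```python
import numpy as np

# Sketch of per-band EEG power from an FFT periodogram; the synthetic
# 10 Hz signal and sampling rate are illustrative choices.

BANDS = {"delta": (1, 3), "theta": (4, 7), "alpha": (8, 13), "beta": (14, 30)}

def band_powers(signal, fs):
    """Return summed periodogram power per EEG band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    return {
        name: float(psd[(freqs >= lo) & (freqs <= hi)].sum())
        for name, (lo, hi) in BANDS.items()
    }

fs = 256.0
t = np.arange(0, 4.0, 1.0 / fs)           # 4 s of data
signal = np.sin(2 * np.pi * 10.0 * t)     # pure 10 Hz "alpha" oscillation
powers = band_powers(signal, fs)
```

An EEG "slowing" of the kind described above would show up in such a summary as power shifting from the alpha and beta entries into theta and delta.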
Computer simulation of morphology can be used to describe and reproduce the statistics of neuronal types or to evaluate growth and functional hypotheses. For instance, models in which branching is probabilistically determined by diameter produce realistic virtual dendrites of most neuronal types, though more complicated statistical models are required for other types. Virtual dendrites grown under environmental and/or functional constraints are also discussed, offering a broad perspective on dendritic morphology. Abstract Chopper neurons in the cochlear nucleus are characterized by intrinsic oscillations with short average interspike intervals (ISIs) and relative level independence of their response (Pfeiffer, Exp Brain Res 1:220–235, 1966; Blackburn and Sachs, J Neurophysiol 62:1303–1329, 1989), properties which are unattained by models of single chopper neurons (e.g., Rothman and Manis, J Neurophysiol 89:3070–3082, 2003a). In order to achieve short ISIs, we optimized the time constants of the Rothman and Manis single-neuron model with genetic algorithms. Some parameters in the optimization, such as the temperature and the capacitance of the cell, turned out to be crucial for the required acceleration of the response. In order to achieve the relative level independence, we simulated an interconnected network consisting of Rothman and Manis neurons. The results indicate that by stabilization of intrinsic oscillations, it is possible to simulate the physiologically observed level independence of ISIs. As previously reviewed and demonstrated (Bahmer and Langner, Biol Cybern 95:371–379, 2006a), chopper neurons show a preference for ISIs which are multiples of 0.4 ms. It was also demonstrated that a network consisting of two optimized Rothman and Manis neurons which activate each other with synaptic delays of 0.4 ms shows a preference for ISIs of 0.8 ms. 
Oscillations with various multiples of 0.4 ms as ISIs may be derived from neurons in a more complex network that is activated by simultaneous input from an onset neuron and several auditory nerve fibers. Abstract Recently, a class of two-dimensional integrate-and-fire models has been used to faithfully model spiking neurons. This class includes the Izhikevich model, the adaptive exponential integrate-and-fire model, and the quartic integrate-and-fire model. The bifurcation types for the individual neurons have been thoroughly analyzed by Touboul (SIAM J Appl Math 68(4):1045–1079, 2008). However, when the models are coupled together to form networks, the networks can display bifurcations that an uncoupled oscillator cannot. For example, the networks can transition from firing with a constant rate to burst firing. This paper introduces a technique to reduce a full network of this class of neurons to a mean-field model, in the form of a system of switching ordinary differential equations. The reduction uses population density methods and a quasi-steady-state approximation to arrive at the mean-field system. Reduced models are derived for networks with different topologies and different model neurons with biologically derived parameters. The mean-field equations are able to qualitatively and quantitatively describe the bifurcations that the full networks display. Extensions and higher-order approximations are discussed. Conclusions Our proposed database schema for managing heterogeneous data is a significant departure from conventional approaches. It is suitable only when the following conditions hold: • The number of classes of entity is numerous, while the number of actual instances in most classes is expected to be very modest. • The number (and nature) of the axes describing an arbitrary fact (as an N-ary association) varies greatly. We believe that nervous system data is an appropriate problem domain to test such an approach. 
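The Izhikevich model, one of the two-dimensional integrate-and-fire models named above, reduces to two coupled equations plus a reset rule. The sketch below uses the standard regular-spiking parameters from Izhikevich's published formulation; the drive current, time step, and duration are arbitrary choices for illustration.

```python
# Sketch of a single Izhikevich neuron (standard regular-spiking
# parameters); drive current and integration settings are illustrative.

def izhikevich(I=10.0, a=0.02, b=0.2, c=-65.0, d=8.0, dt=0.25, t_end=200.0):
    """Simulate one neuron with forward Euler; return spike times in ms."""
    v, u, spikes = -65.0, -13.0, []           # u starts at b*v
    for step in range(int(t_end / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                         # spike cutoff and reset
            spikes.append(step * dt)
            v, u = c, u + d
    return spikes

spikes = izhikevich()
```

In the network-reduction setting described above, a population of such units is summarized by mean-field variables rather than simulated unit by unit; the switching ODE system tracks the population firing rate and mean adaptation instead of individual v and u.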
Abstract Stereotactic human brain atlases, either in print or electronic form, are useful not only in functional neurosurgery, but also in neuroradiology, human brain mapping, and neuroscience education. The existing atlases represent structures on 2D plates taken at variable, often large intervals, which limits their applications. To overcome this problem, we propose a hybrid interpolation approach to build high-resolution brain atlases from the existing ones. In this approach, all section regions of each object are grouped into two types of components: simple and complex. A NURBS-based method is designed for interpolation of the simple components, and a distance-map-based method for the complex components. Once all individual objects in the atlas are interpolated, the results are combined hierarchically in a bottom-up manner to produce the interpolation of the entire atlas. In the procedure, different knowledge-based and heuristic strategies are used to preserve various topological relationships. The proposed approach has been validated quantitatively and used for interpolation of two stereotactic brain atlases: the Talairach–Tournoux atlas and the Schaltenbrand–Wahren atlas. The interpolations produced are of high resolution and feature high accuracy, 3D consistency, smooth surfaces, and preserved topology. They potentially open new applications for electronic stereotactic brain atlases, such as atlas reformatting, accurate 3D display, and 3D nonlinear warping against normal and pathological scans. The proposed approach is also potentially useful in other applications which require interpolation and 3D modeling from sparse data and/or data with variable inter-section intervals. An example of 3D modeling of an infarct from MR diffusion images is presented. 
Abstract Quantitative neuroanatomical data are important for the study of many areas of neuroscience, and the complexity of problems associated with neuronal structure requires that research from multiple groups across many disciplines be combined. However, existing neuron-tracing systems, simulation environments, and tools for the visualization and analysis of neuronal morphology data use a variety of data formats, making it difficult to exchange data in a readily usable way. The NeuroML project was initiated to address these issues, and here we describe an extensible markup language standard, MorphML, which defines a common data format for neuronal morphology data and associated metadata to facilitate data and model exchange, database creation, model publication, and data archiving. We describe the elements of the standard in detail and outline the mappings between this format and those used by a number of popular applications for reconstruction, simulation, and visualization of neuronal morphology. Abstract A major part of biology has become a class of physical and mathematical sciences. We have started to feel, though still with some suspicion, that it will become possible to predict biological events that will happen in the future of one's life, and to control some of them if so desired, based upon an understanding of the genomic information of individuals and the physical and chemical principles governing the physiological functions of living organisms at multiple scales and levels, from molecules to cells and organs. Abstract A half-center oscillator (HCO) is a common circuit building block of central pattern generator networks that produce rhythmic motor patterns in animals. Here we constructed an efficient relational database table with the resulting characteristics of the Hill et al. (J Comput Neurosci 10:281–302, 2001) HCO simple conductance-based model. 
The model consists of two reciprocally inhibitory neurons and replicates the electrical activity of the oscillator interneurons of the leech heartbeat central pattern generator under a variety of experimental conditions. Our long-range goal is to understand how this basic circuit building block produces functional activity under a variety of parameter regimes and how different parameter regimes influence stability and modulatability. By using the latest developments in computer technology, we simulated and stored large amounts of data (on the order of terabytes). We systematically explored the parameter space of the HCO and corresponding isolated-neuron models using a brute-force approach. We varied a set of selected parameters (maximal conductances of intrinsic and synaptic currents) in all combinations, resulting in about 10 million simulations. We classified these HCO and isolated-neuron model simulations by their activity characteristics into identifiable groups and quantified their prevalence. By querying the database, we compared the activity characteristics of the identified groups of our simulated HCO models with those of our simulated isolated-neuron models and found that regularly bursting neurons compose only a small minority of functional HCO models; the vast majority are composed of spiking neurons. Abstract This paper describes how an emerging standard neural network modelling language can be used to configure a general-purpose neural multichip system, by describing the process of writing and loading neural network models on the SpiNNaker neuromimetic hardware. It focuses on the implementation of a SpiNNaker module for PyNN, a simulator-independent language for neural network modelling. We successfully extend PyNN to deal with different non-standard (e.g. Izhikevich) cell types, to switch rapidly between them, and to load applications onto parallel hardware by orchestrating the software layers below it, so that they are abstracted from the final user. 
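The brute-force sweep database described in the HCO study above amounts to one row per simulation, keyed by the varied conductances and tagged with a classified activity type, which can then be aggregated by SQL queries. The schema, column names, and the toy classification rule below are hypothetical illustrations, not the authors' actual design.

```python
import sqlite3

# Illustrative sketch of storing a parameter sweep in a relational table
# and querying it by activity class. Schema and the toy classifier are
# hypothetical, not the HCO study's actual design.

conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE simulations (
           g_syn REAL,        -- maximal synaptic conductance (nS)
           g_h   REAL,        -- maximal h-current conductance (nS)
           activity TEXT      -- classified activity characteristic
       )"""
)

def classify(g_syn, g_h):
    """Toy stand-in for classifying one simulation's activity."""
    return "bursting" if g_syn > 2.0 and g_h > 1.0 else "spiking"

# sweep all combinations of two parameters
for g_syn in (1.0, 2.0, 3.0, 4.0):
    for g_h in (0.5, 1.5):
        conn.execute(
            "INSERT INTO simulations VALUES (?, ?, ?)",
            (g_syn, g_h, classify(g_syn, g_h)),
        )

n_bursting = conn.execute(
    "SELECT COUNT(*) FROM simulations WHERE activity = 'bursting'"
).fetchone()[0]
```

The same COUNT/GROUP BY style of query is what lets the prevalence of each activity group be quantified across millions of stored simulations.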
Finally, we run some simulations in PyNN and compare them against other simulators, successfully reproducing single-neuron and network dynamics and validating the implementation. Abstract The present study examines the biophysical properties and functional implications of I_h in hippocampal area CA3 interneurons with somata in strata radiatum and lacunosum-moleculare. Characterization studies showed a small maximum h-conductance (2.6 ± 0.3 nS, n = 11), shallow voltage dependence with a hyperpolarized half-maximal activation (V_1/2 = −91 mV), and kinetics characterized by double-exponential functions. The functional consequences of I_h were examined with regard to temporal summation and impedance measurements. For temporal summation experiments, 5-pulse mossy fiber input trains were activated. Blocking I_h with 50 μM ZD7288 resulted in an increase in temporal summation, suggesting that I_h supports sensitivity of response amplitude to relative input timing. Impedance was assessed by applying sinusoidal current commands. From impedance measurements, we found that I_h did not confer theta-band resonance, but instead flattened the impedance–frequency relations. Double immunolabeling for hyperpolarization-activated cyclic nucleotide-gated proteins and glutamate decarboxylase 67 suggests that all four subunits are present in GABAergic interneurons from the strata considered for the electrophysiological studies. Finally, a model of I_h was employed in computational analyses to confirm and elaborate upon the contributions of I_h to impedance and temporal summation. Abstract Modelling and simulation methods gain increasing importance for the understanding of biological systems. The growing number of available computational models makes support in maintenance and retrieval of those models essential to the community. 
This article discusses which model information is helpful for efficient retrieval and how existing similarity measures and ranking techniques can be used to enhance the retrieval process, i.e. model reuse. With the development of new tools and modelling formalisms, there is also an increasing demand for performing search independently of the models’ encoding. Therefore, the presented approach is not restricted to certain model storage formats. Instead, the model meta-information is used for retrieval and ranking of the search results. Meta-information includes general information about the model, its encoded species and reactions, but also information about the model behaviour and related simulation experiment descriptions. Abstract To understand the details of brain function, a large-scale system model that reflects anatomical and neurophysiological characteristics needs to be implemented. Though numerous computational models of different brain areas have been proposed, their integration into a large-scale model has not yet been accomplished, because these models were described in different programming languages and, above all, used different data formats. This paper introduces a platform for collaborative brain system modeling (PLATO) where one can construct computational models using several programming languages and connect them at the I/O level with a common data format. As an example, a whole visual system model including eye movement, eye optics, retinal network and visual cortex is being developed. Preliminary results demonstrate that the integrated model successfully simulates the signal processing flow at the different stages of the visual system. Abstract Brain rhythms are the most prominent signal measured noninvasively in humans with magneto/electroencephalography (MEG/EEG). MEG/EEG-measured rhythms have been shown to be functionally relevant, and signature changes are used as markers of disease states. 
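Format-independent, meta-information-based retrieval of the kind described above can be sketched with a simple set-overlap score: represent each model by its annotation terms and rank by Jaccard similarity to the query. The toy catalogue and term sets are invented for illustration; a real system would draw the terms from curated annotations and simulation descriptions.

```python
# Sketch of metadata-based model retrieval: rank models by Jaccard
# similarity between query terms and each model's annotation terms.
# The catalogue and term sets below are invented for illustration.

CATALOGUE = {
    "modelA": {"calcium", "oscillation", "neuron", "sbml"},
    "modelB": {"glycolysis", "yeast", "oscillation"},
    "modelC": {"neuron", "sodium", "potassium", "action-potential"},
}

def jaccard(a, b):
    """|A intersect B| / |A union B|, defined as 0 for two empty sets."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def rank(query_terms):
    """Return model ids sorted by descending similarity to the query."""
    scores = {m: jaccard(set(query_terms), terms) for m, terms in CATALOGUE.items()}
    return sorted(scores, key=scores.get, reverse=True)

ranking = rank({"neuron", "oscillation"})
```

Because the score operates on extracted terms rather than on the model files themselves, the same ranking works regardless of whether a model is stored in SBML, CellML, or any other encoding.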
Despite the importance of understanding the underlying neural mechanisms creating these rhythms, relatively little is known about their in vivo origin in humans. There are obvious challenges in linking the extracranially measured signals directly to neural activity with invasive studies in humans, and although animal models are well suited for such studies, the connection to human brain function under cognitively relevant tasks is often lacking. Biophysically principled computational neural modeling provides an attractive means to bridge this critical gap. Here, we describe a method for creating a computational neural model capturing the laminar structure of cortical columns, and how this model can be used to make predictions on the cellular- and circuit-level mechanisms of brain oscillations measured with MEG/EEG. Specifically, we describe how the model can be used to simulate current dipole activity, the common macroscopic signal inferred from MEG/EEG data. We detail the development and application of the model to study the spontaneous somatosensory mu-rhythm, containing mu-alpha (7–14 Hz) and mu-beta (15–29 Hz) components. We describe a novel prediction on the neural origin of the mu-rhythm that accurately reproduces many characteristic features of MEG data and accounts for changes in the rhythm with attention, detection, and healthy aging. While the details of the model are specific to the somatosensory system, the model design and application are based on general principles of cortical circuitry and MEG/EEG physics, and are thus amenable to the study of rhythms in other frequency bands and sensory systems. Abstract GABAergic interneurons in cortical circuits control the activation of principal cells and orchestrate network activity patterns, including oscillations at different frequency ranges. 
Recruitment of interneurons depends on integration of convergent synaptic inputs along the dendrosomatic axis; however, dendritic processing in these cells is still poorly understood. In this chapter, we summarise our results on the cable properties, electrotonic structure and dendritic processing in “basket cells” (BCs; Nörenberg et al. 2010), one of the most prevalent types of cortical interneurons mediating perisomatic inhibition. In order to investigate integrative properties, we performed two-electrode whole-cell patch clamp recordings, visualised and reconstructed the recorded interneurons, and created passive single-cell models with biophysical properties derived from the experiments. Our results indicate that membrane properties, in particular membrane resistivity, are inhomogeneous along the somatodendritic axis of the cell. The derived values and the gradient of membrane resistivity are different from those obtained for excitatory principal cells. The divergent passive membrane properties of BCs facilitate rapid signalling from proximal basal dendritic inputs but at the same time increase synapse-to-soma transfer for slow signals from the distal apical dendrites. Our results demonstrate that BCs possess distinct integrative properties. Future computational models investigating the diverse functions of neuronal circuits need to consider this diversity and incorporate realistic dendritic properties not only of excitatory principal cells but also of the various types of inhibitory interneurons. Abstract New surgical and localization techniques allow for precise and personalized evaluation and treatment of intractable epilepsies. These techniques include the use of subdural and depth electrodes for localization, and the potential use of cell-targeted stimulation using optogenetics as part of treatment. Computer modeling of seizures, also individualized to the patient, will be important in order to make full use of the potential of these new techniques. 
This is because epilepsy is a complex dynamical disease involving multiple scales across both time and space. These complex dynamics make prediction extremely difficult. Cause and effect are not cleanly separable, as multiple embedded causal loops allow for many scales of unintended consequence. We demonstrate here a small model of sensory neocortex which can be used to look at the effects of microablations or microstimulation. We show that ablations in this network can either prevent the spread or prevent the occurrence of the seizure. In this example, focal electrical stimulation was not able to terminate a seizure, but selective stimulation of inhibitory cells, a future possibility through the use of optogenetics, was efficacious. Abstract The basal ganglia nuclei form a complex network of nuclei often assumed to perform selection, yet their individual roles and how they influence each other are still largely unclear. In particular, the ties between the external and internal parts of the globus pallidus are paradoxical, as anatomical data suggest a potent inhibitory projection between them while electrophysiological recordings indicate that they have similar activities. Here we introduce a theoretical study that reconciles both views on the intrapallidal projection by providing a plausible characterization of the relationship between the external and internal globus pallidus. Specifically, we developed a mean-field model of the whole basal ganglia, whose parameterization is optimized to best respect a collection of numerous anatomical and electrophysiological data. We first obtained models respecting all our constraints; hence anatomical and electrophysiological data on the intrapallidal projection are globally consistent. 
This model furthermore predicts that both aforementioned views about the intrapallidal projection may be reconciled when this projection is weakly inhibitory, thus making it possible to support similar neural activity in both nuclei and for the entire basal ganglia to select between actions. Second, it predicts that afferent projections are substantially unbalanced towards the external segment, as it receives the strongest excitation from the STN and the weakest inhibition from the striatum. Finally, our study strongly suggests that the intrapallidal connection pattern is not focused but diffuse, as the latter pattern is more efficient for the overall selection performed in the basal ganglia. Abstract Background The information coming from biomedical ontologies and computational pathway models is expanding continuously: research communities keep this process up, and their advances are generally shared by means of dedicated resources published on the web. In fact, such models are shared to provide the characterization of molecular processes, while biomedical ontologies detail a semantic context for the majority of those pathways. Recent advances in both fields pave the way for a scalable information integration based on aggregate knowledge repositories, but the lack of overall standard formats impedes this progress. Indeed, having different objectives and different abstraction levels, most of these resources "speak" different languages. Semantic web technologies are explored here as a means to address some of these problems. Methods Employing an extensible collection of interpreters, we developed OREMP (Ontology Reasoning Engine for Molecular Pathways), a system that abstracts the information from different resources and combines them together into a coherent ontology. Continuing this effort, we present OREMPdb; once different pathways are fed into OREMP, species are linked to the external ontologies referenced and to the reactions in which they participate. 
Exploiting these links, the system builds species-sets, which encapsulate species that operate together. Composing all of the reactions together, the system computes all of the reaction paths from and to all of the species-sets. Results OREMP has been applied to the curated branch of BioModels (2011/04/15 release), which overall contains 326 models, 9244 reactions, and 5636 species. OREMPdb is the semantic dictionary created as a result, which is made of 7360 species-sets. For each one of these sets, OREMPdb links the original pathway and the link to the original paper where this information first appeared. Model storage, exchange and integration BMC Neuroscience Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. 
The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Pre-computed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. Programming language: Perl/BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI BLAST package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. In reality, however, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience.
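One common source of such replication failures is purely numerical. The sketch below (an illustrative forward-Euler leaky integrate-and-fire neuron with assumed parameters, not a model from any of the papers summarized here) shows how the integration step size alone shifts a "reproduced" spike time:

```python
import math

def first_spike_time(dt, tau=10.0, v_rest=-65.0, v_th=-50.0, drive=20.0):
    """Forward-Euler leaky integrate-and-fire: time (ms) of the first
    threshold crossing under a constant input drive. All parameter
    values are assumptions chosen for illustration."""
    v, t = v_rest, 0.0
    while v < v_th:
        v += dt / tau * (-(v - v_rest) + drive)   # Euler step
        t += dt
    return t

exact = 10.0 * math.log(4.0)      # analytic crossing time, ~13.86 ms
coarse = first_spike_time(dt=0.5)
fine = first_spike_time(dt=0.01)
```

The fine step reproduces the analytic spike time to within a few microseconds, while the coarse step misses it by more than a tenth of a millisecond; in a recurrent network such discrepancies compound, which is why bitwise replicability across simulators is usually unattainable.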
We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information's (NCBI's) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena.
As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages, or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell contains organelles and intracellular solutions, and it is separated from the outer environment of extracellular fluid that surrounds it by its cell membrane (plasma membrane), giving rise to differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charge and concentration produce electrical and chemical potentials, respectively, which drive the transport of materials across the membrane. Here we look at the core elements of mathematical modeling of the dynamic behavior of single cells, as well as the foundations of numerical simulation.
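The balance between the chemical and electrical potentials described above is captured by the Nernst equation, E = (RT/zF) ln([C]_out/[C]_in). A minimal sketch, using typical (illustrative) mammalian ionic concentrations at body temperature:

```python
import math

def nernst_mV(z, c_out_mM, c_in_mM, T=310.15):
    """Nernst equilibrium potential in mV for an ion of valence z,
    given extracellular/intracellular concentrations, at body
    temperature by default."""
    R, F = 8.314, 96485.0        # J/(mol K), C/mol
    return 1000.0 * (R * T) / (z * F) * math.log(c_out_mM / c_in_mM)

# Typical K+ gradient (illustrative values: 5 mM outside, 140 mM inside)
e_k = nernst_mV(z=1, c_out_mM=5.0, c_in_mM=140.0)   # roughly -89 mV
```

Because K+ is concentrated inside the cell, its equilibrium potential is strongly negative, while the reverse Na+ gradient yields a positive one; the difference between a membrane potential and these equilibria is the driving force for transmembrane currents.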
Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program on which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command-line interface, based on the powerful scripting language Python, for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at www.physiome.jp . Abstract Science requires that results be reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information.
In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup needed to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases have started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method and make available computer scripts which automate the process of citation entry. We use an open citation project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron's ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. For, in order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation.
The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (balanced truncation) and frequency-domain ( ${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front ends requires high bioinformatic skills. Consequently, the manual inspection of database entries and citations is a time-consuming task for which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques have become increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by numerous selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians.
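The time-domain (balanced truncation) reduction mentioned above can be sketched for a linear quasi-active surrogate. The square-root algorithm below is a hedged illustration, not the authors' implementation: it uses an assumed passive 10-compartment chain with injection and recording at compartment 0, and retains the states with the largest Hankel singular values:

```python
import numpy as np

def lyap(A, Q):
    """Solve A X + X A^T + Q = 0 via the Kronecker-product identity
    (fine for small n; dedicated Lyapunov solvers scale better)."""
    n = A.shape[0]
    M = np.kron(np.eye(n), A) + np.kron(A, np.eye(n))
    return np.linalg.solve(M, -Q.reshape(-1)).reshape(n, n)

# Illustrative passive chain: coupling 1, leak values are assumptions
n = 10
A = -2.2 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
A[0, 0] = A[-1, -1] = -1.2            # sealed ends: one neighbour only
B = np.zeros((n, 1)); B[0, 0] = 1.0   # current injection at compartment 0
C = B.T.copy()                        # record voltage at compartment 0

Wc = lyap(A, B @ B.T)                 # controllability Gramian
Wo = lyap(A.T, C.T @ C)               # observability Gramian

# Square-root balancing: Hankel singular values from Gramian factors
Lc = np.linalg.cholesky((Wc + Wc.T) / 2)
Lo = np.linalg.cholesky((Wo + Wo.T) / 2)
U, hsv, Vt = np.linalg.svd(Lo.T @ Lc)
k = 3                                  # retained balanced states
T = Lc @ Vt.T[:, :k] / np.sqrt(hsv[:k])
Ti = (U[:, :k] / np.sqrt(hsv[:k])).T @ Lo.T
Ar, Br, Cr = Ti @ A @ T, Ti @ B, C @ T

# DC gain (steady-state input resistance) before and after reduction
g0_full = (-C @ np.linalg.solve(A, B)).item()
g0_red = (-Cr @ np.linalg.solve(Ar, Br)).item()
```

The reduced 3-state model reproduces the full system's DC response to within the classical error bound of twice the sum of the discarded Hankel singular values.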
Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells, positioned and synaptically connected in 3D, can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded in empirical evidence, explain (or fail to explain) existing empirical results, and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of the model's operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data.
We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding, separately or together, neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to representing and integrating complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp. 741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks.
SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu . Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and to build on them for further research. Its use can help the field of computational neuroscience enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA-A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA-A reversal potential (E_GABA) and by alterations in the intrinsic excitability of granule cells.
In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and of observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux).
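The tree-of-truncated-cones representation described above lends itself directly to morphometric analysis. A hedged sketch (the rows below are a hypothetical SWC-style fragment, not Neuron_Morpho's actual file handling): each child-parent pair is treated as a frustum, and total cable length and lateral membrane area follow from elementary geometry:

```python
import math

# SWC-style rows: node id -> (parent_id, x, y, z, radius); parent -1 = root.
# A tiny hypothetical dendrite: a root point with a bifurcating branch.
nodes = {
    1: (-1, 0.0, 0.0, 0.0, 5.0),
    2: (1, 10.0, 0.0, 0.0, 2.0),
    3: (2, 10.0, 8.0, 0.0, 1.0),
    4: (2, 10.0, 0.0, 6.0, 1.0),
}

def segment_metrics(nodes):
    """Total cable length and lateral surface area, treating each
    child-parent pair as a truncated cone (frustum)."""
    length = area = 0.0
    for nid, (pid, x, y, z, r) in nodes.items():
        if pid == -1:
            continue                      # root has no parent segment
        px, py, pz, pr = nodes[pid][1:]
        h = math.dist((x, y, z), (px, py, pz))
        slant = math.sqrt(h * h + (r - pr) ** 2)
        length += h
        area += math.pi * (r + pr) * slant   # frustum lateral area
    return length, area

total_length, total_area = segment_metrics(nodes)
```

The same traversal generalizes to any morphometric measure defined per segment (path distance, branch order, taper), which is what makes the compact cylinder-tree format so convenient for analysis.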
We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic types (pyramidal cell dendrites and olivary axonal projection terminals), and labeling methods (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions arriving at the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductances of their dendritic membrane. This work is a continuation of our previous studies, which showed the existence of certain relationships between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other. This relationship determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs.
In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron's output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models.
Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS-US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive, biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training.
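For contrast with the LSTM approach, the conventional delay-line TD(0) baseline that the abstract says most models use can be sketched in a few lines (all parameters are illustrative): after training, the dopamine-like prediction error migrates from the reward (US) to the CS onset:

```python
# Delay-line TD(0) sketch of trace conditioning: one state per time
# step after CS onset, reward delivered at a fixed later step.
gamma, alpha = 0.95, 0.1        # discount and learning rate (assumed)
n_steps, reward_step = 10, 9
V = [0.0] * (n_steps + 1)       # V[n_steps] is the terminal state

for _ in range(2000):           # training trials
    for t in range(n_steps):
        r = 1.0 if t == reward_step else 0.0
        delta = r + gamma * V[t + 1] - V[t]   # TD prediction error
        V[t] += alpha * delta

# Probe trial: error at CS onset (pre-CS baseline value is 0) vs. at US
onset_delta = gamma * V[0] - 0.0
us_delta = 1.0 + gamma * V[reward_step + 1] - V[reward_step]
```

After learning, the error at the now-predicted reward is near zero while a large positive error appears at CS onset, the classic TD account of dopamine responses; the limitation the abstract targets is that this scheme needs one dedicated state per elapsed time step.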
Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. Using mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of dendritic morphology on the passive electrical transfer characteristics of the dendrites and on the formation of spike discharge patterns at the output of the cell under tonic activation via excitatory synapses distributed uniformly along the dendrites. For this purpose, we calculated morphometric characteristics of size, complexity, and metric asymmetry, together with the effectiveness function of somatopetal current transfer (estimating the sensitivity of this efficacy to changes in the uniform membrane conductance), for the reconstructed dendritic arborization as a whole and for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and with interburst intervals that decreased as the intensity of activation rose. Under intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably more limited than that described earlier in pyramidal neurons of neocortical layer 5.
Under analogous conditions of activation, we observed in the latter cells a variety of output discharge patterns of different complexities, including stochastic ones. The relatively short length of the apical dendritic subtree of layer 2/3 neurons and, correspondingly, their smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites, and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in shaping the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model's functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using non-homogeneous Poisson processes.
Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC's firing rate, and this modulates the PC's firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with a similar architecture that is subject to a clocking rhythm, independently of its involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed-rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell's firing properties. The reader who is interested in finding out about other channels and other models of the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database of neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in computational neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org .
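As one instance of the channel formulas just mentioned, the classic Hodgkin–Huxley delayed-rectifier K+ gate (written in the modern convention with rest near -65 mV) has rate functions from which the steady-state activation and time constant follow directly:

```python
import math

def n_inf_tau(v):
    """Steady-state activation n_inf and time constant tau_n (ms) of the
    Hodgkin-Huxley delayed-rectifier K+ gate at membrane potential v (mV).
    Note the removable singularity of alpha at v = -55 mV."""
    alpha = 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
    beta = 0.125 * math.exp(-(v + 65.0) / 80.0)
    return alpha / (alpha + beta), 1.0 / (alpha + beta)

n_rest, tau_rest = n_inf_tau(-65.0)   # at rest: n_inf ~ 0.32
n_dep, tau_dep = n_inf_tau(-20.0)     # depolarized: gate mostly open
```

Because n_inf rises with depolarization, the K+ conductance (proportional to n^4) turns on during a spike and repolarizes the cell, which is the firing-property influence the chapter describes.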
In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed morphological data when generating cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators, such as the NEURON simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new available data can be included in NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation.
New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently of the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising developments of the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means of integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and of gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data requires both standard data formats and public repositories. Standardization and a public storage system are also important for modeling because they make it possible to share models irrespective of the software tools used. Furthermore, rapid model development benefits strongly from available software packages that relieve the modeler of recurring tasks such as the numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed, and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor.
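On an unbranched cable, the per-timestep linear system underlying this elimination is tridiagonal and is solved by a single forward/backward sweep (the Thomas algorithm); the tree case generalizes the same sweep along branches. A sketch with an illustrative diagonally dominant system (the numbers are assumptions, not simulator output):

```python
def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system (sub-diagonal a, diagonal b,
    super-diagonal c, right-hand side d) by Gaussian elimination
    without pivoting: the unbranched-cable special case of the
    tree elimination used in compartmental simulators."""
    n = len(b)
    b, d = b[:], d[:]                 # work on copies
    for i in range(1, n):             # forward elimination sweep
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    x = [0.0] * n
    x[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):    # back substitution sweep
        x[i] = (d[i] - c[i] * x[i + 1]) / b[i]
    return x

# 4-compartment demo with known solution [1, 2, 3, 4]
x = thomas_solve(a=[0.0, 1.0, 1.0, 1.0],
                 b=[4.0, 4.0, 4.0, 4.0],
                 c=[1.0, 1.0, 1.0, 0.0],
                 d=[6.0, 12.0, 18.0, 19.0])
```

The cost is O(n) per timestep, which is why splitting a tree at a small number of connection points adds only modest overhead while allowing the subtrees to be eliminated on separate processors.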
It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for load balancing in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using rectangular multielectrode arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources, and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. 
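For contrast with the inverse methods just described, the traditional CSD estimate is simply a scaled second spatial difference of the recorded potential. A minimal one-dimensional sketch of ours (numpy assumed; h is the electrode spacing and sigma an assumed homogeneous conductivity):

```python
import numpy as np

def csd_second_difference(phi, h, sigma=0.3):
    """Traditional CSD estimate: CSD ~ -sigma * d2(phi)/dz2 on a uniform
    grid. The boundary electrodes are lost, one of the drawbacks the
    inverse (iCSD) approach addresses."""
    return -sigma * (phi[:-2] - 2.0 * phi[1:-1] + phi[2:]) / h**2

# A quadratic potential profile has a constant second derivative (2),
# so the estimated CSD should be flat at -sigma * 2.
z = np.linspace(0.0, 1.0, 11)
phi = z**2
csd = csd_second_difference(phi, h=z[1] - z[0], sigma=0.3)
assert np.allclose(csd, -0.6)
```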
The main advantages of the proposed method are the explicit specification of its assumptions, the possibility of including system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. However, the range of current strength needed for a depolarization block could be easily reached with a random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block. We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, impeding conventional analysis methods from keeping pace. 
Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting characteristics of interest (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed, and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable. PANDORA is free to use and modify because it is open-source (http://software.incf.org/software/pandora/home). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then we examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop. Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics. 
First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental computational model was developed to test whether multiple independent spike initiation zones in the C-interneuron suffice to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made in understanding cerebellar cortical processing, but neural coding at the output stage of the cerebellum in the deep cerebellar nuclei (DCN) remains poorly understood. The DCN are unlikely to be a mere relay nucleus, because Purkinje cell inhibition has to be turned into an excitatory output signal, and DCN neurons exhibit complex intrinsic properties. In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals. 
Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. Our simulations closely matched DCN current clamp data that we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates, and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow deinactivation of rebound currents. Fast rebound bursts due to T-type calcium current and slow rebounds due to persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by HCN current. Our findings suggest that active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior. 
This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation channel gating, the mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform. Abstract We have developed a simulation tool within the NEURON simulator to assist in organization, verification, and analysis of simulations. This tool, called the Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single-cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB. Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems. 
This is similar to how single cells function as systems of protein networks, where, for example, ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from what it does alone. The dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. Interactions occur through different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts to produce heartbeats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics, representing neural information. In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired limb motion can be performed. This chapter illustrates several mathematical techniques through examples from modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data. Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation. 
For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice that of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits as expected in mature animals. Predictions were made for passive-cell current clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron, and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting is unnecessary. Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron. 
Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials. Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet, the precise input-output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites. We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature. 
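The fixed-kernel picture described in the correlation-inference abstract above (subthreshold membrane potential as pooled presynaptic spike counts convolved with a leaky-integrator kernel) can be sketched in a few lines. This is an illustration of ours, not the CuBIC implementation; numpy is assumed, and all parameter values are arbitrary:

```python
import numpy as np

def leaky_filter(spike_counts, dt=1.0, tau=10.0):
    """Model the subthreshold membrane potential as the pooled presynaptic
    spike count per bin convolved with an exponential (leaky integrator)
    kernel of time constant tau."""
    t = np.arange(0.0, 5 * tau, dt)
    kernel = np.exp(-t / tau)
    return np.convolve(spike_counts, kernel)[: len(spike_counts)] * dt

rng = np.random.default_rng(0)
counts = rng.poisson(2.0, size=1000)  # pooled presynaptic spikes per bin
v = leaky_filter(counts)
# After the initial transient, the mean of the filtered trace approaches
# rate * (integral of the kernel) = 2.0 * ~tau = ~20.
assert abs(v[100:].mean() - 20.0) / 20.0 < 0.1
```

Correlations among presynaptic neurons show up as excess variance of such a filtered trace, which is what the cumulant-based inference exploits.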
To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns and the corresponding firing frequency input-output function was logarithmic. The regular/irregular nature of the input synaptic intervals was only reflected in the regularity of output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input-output relationships. Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input-output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats. 
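The community-supported formats called for above are typically XML-based (SBML being the best-known example in this field). The parsing pattern can be sketched with only the Python standard library; the fragment below is invented for illustration and omits the real SBML namespace and several required attributes:

```python
import xml.etree.ElementTree as ET

# Made-up, minimal SBML-like fragment (real SBML documents carry a
# namespace, units, and many more attributes; this only shows the shape).
doc = """<sbml><model id="toy">
  <listOfSpecies>
    <species id="A" initialAmount="10"/>
    <species id="B" initialAmount="0"/>
  </listOfSpecies>
  <listOfReactions>
    <reaction id="conv"/>
  </listOfReactions>
</model></sbml>"""

root = ET.fromstring(doc)
species = {s.get("id"): float(s.get("initialAmount"))
           for s in root.iter("species")}
reactions = [r.get("id") for r in root.iter("reaction")]
print(species, reactions)  # → {'A': 10.0, 'B': 0.0} ['conv']
```

Because the model structure is machine-readable in this way, repositories can validate, annotate, and cross-link model elements automatically.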
In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database (http://www.ebi.ac.uk/biomodels/) is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems, and to study the clustering of models based upon their annotations. Model deposition to the database today is advised by several publishers of scientific journals. The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge (https://sourceforge.net/projects/biomodels/) under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive? 
Schema theory is a computational framework that allows the simulation of perceptuomotor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We first present its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performance of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world-knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performance of agrammatic aphasics measured using sentence-picture matching tasks. This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Abstract Background The "inverse" problem is related to the determination of unknown causes on the basis of the observation of their effects. This is the opposite of the corresponding "direct" problem, which relates to the prediction of the effects generated by a complete description of some agencies. 
The solution of an inverse problem entails the construction of a mathematical model and starts from a set of experimental data. In this respect, inverse problems are often ill-conditioned, as the available experimental data are often insufficient to unambiguously solve the mathematical model. Several approaches to solving inverse problems are possible, both computational and experimental, some of which are mentioned in this article. In this work, we will describe in detail the attempt to solve an inverse problem which arose in the study of an intracellular signaling pathway. Results Using a genetic algorithm to find a suboptimal solution to the optimization problem, we have estimated a set of unknown parameters describing a kinetic model of a signaling pathway in the neuronal cell. The model is composed of mass-action ordinary differential equations, where the kinetic parameters describe protein-protein interactions, protein synthesis, and degradation. The algorithm has been implemented on a parallel platform. Several potential solutions of the problem have been computed, each solution being a set of model parameters. A subset of parameters has been selected on the basis of their small coefficient of variation across the ensemble of solutions. Conclusion Despite the lack of sufficiently reliable and homogeneous experimental data, the genetic algorithm approach has allowed us to estimate the approximate value of a number of model parameters in a kinetic model of a signaling pathway; these parameters have been assessed to be relevant for the reproduction of the available experimental data. Abstract Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown. 
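The genetic-algorithm estimation described in the inverse-problem abstract above can be miniaturized to a single unknown rate constant. This is our own illustrative sketch (a serial mutation-selection loop, not the authors' parallel implementation), assuming numpy and an invented one-parameter decay model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "experimental" data from x(t) = exp(-k t) with true k = 0.7
t = np.linspace(0.0, 5.0, 30)
data = np.exp(-0.7 * t)

def cost(k):
    """Sum-of-squares mismatch between model prediction and data."""
    return np.sum((np.exp(-k * t) - data) ** 2)

# Tiny evolutionary search: mutate a population of candidate k values
# and keep the best 20 of parents plus children each generation.
pop = rng.uniform(0.0, 5.0, size=20)
for _ in range(60):
    children = pop + rng.normal(0.0, 0.1, size=pop.size)
    both = np.concatenate([pop, children])
    pop = both[np.argsort([cost(k) for k in both])][:20]

best = pop[0]
assert abs(best - 0.7) < 0.05  # recovered the true rate constant
```

In the real problem the "coefficient of variation across the ensemble of solutions" criterion corresponds to running many such searches and trusting only the parameters that converge consistently.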
Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic-targeting and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms. First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axodendritic feedback on pyramidal cells is only effective in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase, the theta/gamma ratio. Moreover, electric fields differentially affect average theta frequency depending on specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. Abstract The brain is extraordinarily complex, containing 10¹¹ neurons linked by 10¹⁴ connections. We can improve our understanding of individual neurons and neuronal networks by describing their behavior in mathematical and computational models. 
This chapter provides an introduction to neural modeling, laying the foundation for several basic models and surveying key topics. After some discussion of the motivations of modelers and the uses of neural models, we explore the properties of electrically excitable membranes. We describe in some detail the Hodgkin–Huxley model, the first neural model to biophysically describe the behavior of biological membranes. We explore how this model can be extended to describe a variety of excitable membrane behaviors, including axonal propagation, dendritic processing, and synaptic communication. This chapter also covers mathematical models that replicate basic neural behaviors through more abstract mechanisms. We briefly explore efforts to extend single-neuron models to the network level and provide several examples of insights gained through this process. Finally, we list common resources, including modeling environments and repositories, that provide the guidance and parameter sets necessary to begin building neural models. Abstract We have developed a program, NeuroText, to populate the neuroscience databases in SenseLab (http://senselab.med.yale.edu/senselab) by mining the natural language text of neuroscience articles. NeuroText uses a two-step approach to identify relevant articles. The first step (preprocessing), aimed at 100% sensitivity, identifies abstracts containing database keywords. In the second step, potentially relevant abstracts identified in the first step are processed for specificity, as dictated by the database architecture and by neuroscience, lexical, and semantic contexts. NeuroText results were presented to the experts for validation using a dynamically generated interface that also allows expert-validated articles to be automatically deposited into the databases. Of the test set of 912 articles, 735 were rejected at the preprocessing step. For the remaining articles, the accuracy of predicting database-relevant articles was 85%. 
Twenty-two articles were erroneously identified. NeuroText deferred decisions on 29 articles to the expert. A comparison of NeuroText results versus the experts’ analyses revealed that the program failed to correctly identify articles’ relevance due to concepts that did not yet exist in the knowledge base or due to vaguely presented information in the abstracts. NeuroText uses two “evolution” techniques (supervised and unsupervised) that play an important role in the continual improvement of the retrieval results. Software that uses the NeuroText approach can facilitate the creation of curated, special-interest bibliography databases. Abstract Dendrites play an important role in neuronal function and connectivity. This chapter introduces the first section of the book, focusing on the morphological features of dendritic tree structures and the role of dendritic trees in the circuit. We provide an overview of quantitative procedures for data collection, analysis, and modeling of dendrite shape. Our main focus lies on the description of morphological complexity and how one can use this description to unravel neuronal function in dendritic trees and neural circuits. Abstract The chapter is organised in two parts. In the first part, the focus is on a combined power spectral and nonlinear behavioural analysis of a neural mass model of the thalamocortical circuitry. The objective is to study the effectiveness of such “multimodal” analytical techniques in model-based studies investigating the neural correlates of abnormal brain oscillations in Alzheimer’s disease (AD). The power spectral analysis presented here is a study of the “slowing” (decreasing dominant frequency of oscillation) within the alpha frequency band (8–13 Hz), a hallmark of electroencephalogram (EEG) dynamics in AD. Analysis of the nonlinear dynamical behaviour focuses on the bifurcating property of the model. 
The results show that the alpha rhythmic content is maximal in close proximity to the bifurcation point, an observation made possible by the “multimodal” approach adopted herein. Furthermore, a slowing in alpha rhythm is observed for increasing inhibitory connectivity, a consistent feature of our research into neuropathological oscillations associated with AD. In the second part, we present power spectral analysis on a model that implements multiple feedforward and feedback connectivities in the thalamo-cortico-thalamic circuitry, and is thus more advanced in terms of biological plausibility. This study looks at the effects of synaptic connectivity variation on the power spectra within the delta (1–3 Hz), theta (4–7 Hz), alpha (8–13 Hz) and beta (14–30 Hz) bands. An overall slowing of EEG with decreasing synaptic connectivity is observed, indicated by a decrease of power within the alpha and beta bands and an increase in power within the theta and delta bands. Thus, the model behaviour conforms to longitudinal studies in AD indicating an overall slowing of EEG. Abstract Neuronal processes grow under a variety of constraints, both immediate and evolutionary. Their pattern of growth provides insight into their function. This chapter begins by reviewing morphological metrics used in analyses and computational models. Molecular mechanisms underlying growth and plasticity are then discussed, followed by several types of modeling approaches. Computer simulation of morphology can be used to describe and reproduce the statistics of neuronal types or to evaluate growth and functional hypotheses. For instance, models in which branching is probabilistically determined by diameter produce realistic virtual dendrites of most neuronal types, though more complicated statistical models are required for other types. Virtual dendrites grown under environmental and/or functional constraints are also discussed, offering a broad perspective on dendritic morphology. 
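The diameter-dependent branching idea mentioned in the dendritic-growth abstract above can be caricatured in a few lines: the probability of bifurcating grows with segment diameter, daughter branches taper, and growth stops below a diameter threshold. This sketch is ours, and every parameter value in it is invented for illustration:

```python
import random

def grow(diameter, rng, min_diam=0.2, p_scale=0.35):
    """Recursively grow a virtual dendrite and return its segment count.
    At each segment, the branching probability increases with diameter;
    daughters taper faster than continuing segments."""
    if diameter < min_diam:
        return 1                                 # terminal tip
    segments = 1
    if rng.random() < p_scale * diameter:        # thicker => branch more often
        for _ in range(2):                       # bifurcation
            segments += grow(diameter * 0.7, rng)
    else:
        segments += grow(diameter * 0.9, rng)    # continue, taper slowly
    return segments

rng = random.Random(42)
sizes = [grow(2.0, rng) for _ in range(5)]
print(sizes)  # five virtual trees of varying segment counts
```

Because the diameter shrinks multiplicatively at every step, growth always terminates; the stochastic branching reproduces the across-tree variability that such statistical models are fit to.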
Abstract Chopper neurons in the cochlear nucleus are characterized by intrinsic oscillations with short average interspike intervals (ISIs) and relative level independence of their response (Pfeiffer, Exp Brain Res 1:220–235, 1966; Blackburn and Sachs, J Neurophysiol 62:1303–1329, 1989), properties which are unattained by models of single chopper neurons (e.g., Rothman and Manis, J Neurophysiol 89:3070–3082, 2003a). In order to achieve short ISIs, we optimized the time constants of the Rothman and Manis single-neuron model with genetic algorithms. Some parameters in the optimization, such as the temperature and the capacity of the cell, turned out to be crucial for the required acceleration of the response. In order to achieve the relative level independence, we simulated an interconnected network consisting of Rothman and Manis neurons. The results indicate that by stabilization of intrinsic oscillations, it is possible to simulate the physiologically observed level independence of ISIs. As previously reviewed and demonstrated (Bahmer and Langner, Biol Cybern 95:371–379, 2006a), chopper neurons show a preference for ISIs which are multiples of 0.4 ms. It was also demonstrated that a network consisting of two optimized Rothman and Manis neurons which activate each other with synaptic delays of 0.4 ms shows a preference for ISIs of 0.8 ms. Oscillations with various multiples of 0.4 ms as ISIs may be derived from neurons in a more complex network that is activated by simultaneous input from an onset neuron and several auditory nerve fibers. Abstract Recently, a class of two-dimensional integrate-and-fire models has been used to faithfully model spiking neurons. This class includes the Izhikevich model, the adaptive exponential integrate-and-fire model, and the quartic integrate-and-fire model. The bifurcation types for the individual neurons have been thoroughly analyzed by Touboul (SIAM J Appl Math 68(4):1045–1079, 2008).
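The two-dimensional integrate-and-fire class named above can be illustrated with the Izhikevich model; the sketch below is a minimal forward-Euler integration of a single neuron with standard regular-spiking parameters, not the coupled networks or bifurcation analysis of the cited work.

```python
def izhikevich(I, T=200.0, dt=0.5, a=0.02, b=0.2, c=-65.0, d=8.0):
    """Euler integration of the two-variable Izhikevich model; returns
    spike times in ms. Defaults are the regular-spiking parameter set."""
    v, u, spikes, t = c, b * c, [], 0.0
    while t < T:
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:          # spike cutoff: record and reset (v, u)
            spikes.append(t)
            v, u = c, u + d
        t += dt
    return spikes

print(len(izhikevich(I=10.0)) > 0)   # True: tonic firing under constant drive
print(izhikevich(I=0.0))             # []: rests near the stable fixed point
```

Burst firing, the network-level transition discussed next, arises in this model class from the interplay of the recovery variable u with coupling, not from the single-cell dynamics shown here.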
However, when the models are coupled together to form networks, the networks can display bifurcations that an uncoupled oscillator cannot. For example, the networks can transition from firing with a constant rate to burst firing. This paper introduces a technique to reduce a full network of this class of neurons to a mean field model, in the form of a system of switching ordinary differential equations. The reduction uses population density methods and a quasi-steady-state approximation to arrive at the mean field system. Reduced models are derived for networks with different topologies and different model neurons with biologically derived parameters. The mean field equations are able to qualitatively and quantitatively describe the bifurcations that the full networks display. Extensions and higher-order approximations are discussed. Conclusions Our proposed database schema for managing heterogeneous data is a significant departure from conventional approaches. It is suitable only when the following conditions hold: • The classes of entities are numerous, while the number of actual instances in most classes is expected to be very modest. • The number (and nature) of the axes describing an arbitrary fact (as an N-ary association) varies greatly. We believe that nervous system data is an appropriate problem domain to test such an approach. Abstract Stereotactic human brain atlases, either in print or electronic form, are useful not only in functional neurosurgery, but also in neuroradiology, human brain mapping, and neuroscience education. The existing atlases represent structures on 2D plates taken at variable, often large intervals, which limits their applications. To overcome this problem, we propose a hybrid interpolation approach to build high-resolution brain atlases from the existing ones. In this approach, all section regions of each object are grouped into two types of components: simple and complex.
A NURBS-based method is designed for interpolation of the simple components, and a distance-map-based method for the complex components. Once all individual objects in the atlas are interpolated, the results are combined hierarchically in a bottom-up manner to produce the interpolation of the entire atlas. In the procedure, different knowledge-based and heuristic strategies are used to preserve various topological relationships. The proposed approach has been validated quantitatively and used for interpolation of two stereotactic brain atlases: the Talairach-Tournoux atlas and the Schaltenbrand-Wahren atlas. The interpolations produced are of high resolution and feature high accuracy, 3D consistency, smooth surfaces, and preserved topology. They potentially open new applications for electronic stereotactic brain atlases, such as atlas reformatting, accurate 3D display, and 3D nonlinear warping against normal and pathological scans. The proposed approach is also potentially useful in other applications that require interpolation and 3D modeling from sparse data and/or data with variable inter-section intervals. An example of 3D modeling of an infarct from MR diffusion images is presented. Abstract Quantitative neuroanatomical data are important for the study of many areas of neuroscience, and the complexity of problems associated with neuronal structure requires that research from multiple groups across many disciplines be combined. However, existing neuron-tracing systems, simulation environments, and tools for the visualization and analysis of neuronal morphology data use a variety of data formats, making it difficult to exchange data in a readily usable way. The NeuroML project was initiated to address these issues, and here we describe an extensible markup language standard, MorphML, which defines a common data format for neuronal morphology data and associated metadata to facilitate data and model exchange, database creation, model publication, and data archiving.
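A common-format morphology file of the kind MorphML standardizes can be consumed with a few lines of XML parsing; the toy below reads a MorphML-inspired fragment and computes per-segment lengths. The element and attribute names here are illustrative, not the normative NeuroML/MorphML schema.

```python
import math
import xml.etree.ElementTree as ET

def segment_lengths(xml_text):
    """Map segment name -> 3-D length from a toy MorphML-like fragment.
    Element names are illustrative, not the normative schema."""
    out = {}
    for seg in ET.fromstring(xml_text).iter("segment"):
        p = seg.find("proximal").attrib
        d = seg.find("distal").attrib
        out[seg.get("name")] = math.dist(
            [float(p[k]) for k in "xyz"], [float(d[k]) for k in "xyz"])
    return out

doc = """<morphology cell="example">
  <segment id="0" name="soma">
    <proximal x="0" y="0" z="0" diameter="20"/>
    <distal x="0" y="20" z="0" diameter="20"/>
  </segment>
  <segment id="1" name="dend0" parent="0">
    <proximal x="0" y="20" z="0" diameter="2"/>
    <distal x="0" y="120" z="0" diameter="1.5"/>
  </segment>
</morphology>"""

print(segment_lengths(doc))  # {'soma': 20.0, 'dend0': 100.0}
```

The point of a shared schema is exactly this: one parser serves reconstruction, simulation, and visualization tools alike.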
We describe the elements of the standard in detail and outline the mappings between this format and those used by a number of popular applications for reconstruction, simulation, and visualization of neuronal morphology. Abstract A major part of biology has become a physical and mathematical science. We have begun to feel, though still with some caution, that it will become possible to predict biological events in an individual’s future life, and to control some of them if so desired, based upon an understanding of individual genomic information and of the physical and chemical principles governing the physiological functions of living organisms at multiple scales and levels, from molecules to cells and organs. Abstract A half-center oscillator (HCO) is a common circuit building block of central pattern generator networks that produce rhythmic motor patterns in animals. Here we constructed an efficient relational database table with the resulting characteristics of Hill et al.’s (J Comput Neurosci 10:281–302, 2001) simple conductance-based HCO model. The model consists of two reciprocally inhibitory neurons and replicates the electrical activity of the oscillator interneurons of the leech heartbeat central pattern generator under a variety of experimental conditions. Our long-range goal is to understand how this basic circuit building block produces functional activity under a variety of parameter regimes and how different parameter regimes influence stability and modulatability. By using the latest developments in computer technology, we simulated and stored large amounts of data (on the order of terabytes). We systematically explored the parameter space of the HCO and corresponding isolated neuron models using a brute-force approach. We varied a set of selected parameters (maximal conductances of intrinsic and synaptic currents) in all combinations, resulting in about 10 million simulations.
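An all-combinations sweep of the kind described above, with runs grouped by activity class for later database queries, can be sketched as follows; the grid values and the classifier are hypothetical stand-ins for the actual conductance ranges and simulation-based classification.

```python
from itertools import product

def sweep(param_grid, classify):
    """Brute-force sweep over every combination of parameter values,
    grouping runs by the activity class assigned to each (toy sketch)."""
    groups = {}
    names = sorted(param_grid)
    for values in product(*(param_grid[n] for n in names)):
        params = dict(zip(names, values))
        groups.setdefault(classify(params), []).append(params)
    return groups

# Hypothetical 3-level grid over two maximal conductances (nS).
grid = {"g_syn": [0, 30, 60], "g_h": [0, 2, 4]}
# Toy classifier standing in for running a simulation and analyzing it.
label = lambda p: "bursting" if p["g_syn"] and p["g_h"] else "spiking"

result = sweep(grid, label)
print({k: len(v) for k, v in result.items()})  # {'spiking': 5, 'bursting': 4}
```

In the real study each grid point is a full HCO simulation and the grouped results are stored in a relational table, so prevalence questions become simple counting queries.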
We classified these HCO and isolated neuron model simulations by their activity characteristics into identifiable groups and quantified their prevalence. By querying the database, we compared the activity characteristics of the identified groups of our simulated HCO models with those of our simulated isolated neuron models and found that regularly bursting neurons compose only a small minority of functional HCO models; the vast majority was composed of spiking neurons. Abstract This paper describes how an emerging standard neural network modelling language can be used to configure a general-purpose neural multi-chip system, by describing the process of writing and loading neural network models on the SpiNNaker neuromimetic hardware. It focuses on the implementation of a SpiNNaker module for PyNN, a simulator-independent language for neural network modelling. We successfully extend PyNN to deal with different non-standard (e.g., Izhikevich) cell types, rapidly switch between them, and load applications on parallel hardware by orchestrating the software layers below it, so that they are abstracted from the final user. Finally, we run simulations in PyNN and compare them against other simulators, successfully reproducing single neuron and network dynamics and validating the implementation. Abstract The present study examines the biophysical properties and functional implications of Ih in hippocampal area CA3 interneurons with somata in strata radiatum and lacunosum-moleculare. Characterization studies showed a small maximum h-conductance (2.6 ± 0.3 nS, n = 11), shallow voltage dependence with a hyperpolarized half-maximal activation (V1/2 = −91 mV), and kinetics characterized by double-exponential functions. The functional consequences of Ih were examined with regard to temporal summation and impedance measurements. For temporal summation experiments, 5-pulse mossy fiber input trains were activated.
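The shallow voltage dependence reported above (half-maximal activation at −91 mV) is conventionally modeled with a Boltzmann function; in the sketch below only V1/2 comes from the text, while the slope factor k is an assumed illustrative value.

```python
import math

def h_activation(v_mv, v_half=-91.0, k=8.0):
    """Steady-state I_h activation as a Boltzmann function of voltage.
    v_half is from the text; the slope factor k is an assumed value.
    I_h activates with hyperpolarization, hence the sign convention."""
    return 1.0 / (1.0 + math.exp((v_mv - v_half) / k))

print(round(h_activation(-91.0), 2))                    # 0.5 at V1/2
print(h_activation(-120.0) > h_activation(-60.0))       # True: opens when hyperpolarized
```

A larger k flattens the curve, which is what "shallow voltage dependence" means operationally.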
Blocking Ih with 50 μM ZD7288 resulted in an increase in temporal summation, suggesting that Ih supports sensitivity of response amplitude to relative input timing. Impedance was assessed by applying sinusoidal current commands. From impedance measurements, we found that Ih did not confer theta-band resonance, but instead flattened the impedance–frequency relations. Double immunolabeling for hyperpolarization-activated cyclic nucleotide-gated proteins and glutamate decarboxylase 67 suggests that all four subunits are present in GABAergic interneurons from the strata considered for electrophysiological studies. Finally, a model of Ih was employed in computational analyses to confirm and elaborate upon the contributions of Ih to impedance and temporal summation. Abstract Modelling and simulation methods gain increasing importance for the understanding of biological systems. The growing number of available computational models makes support for the maintenance and retrieval of those models essential to the community. This article discusses which model information is helpful for efficient retrieval and how existing similarity measures and ranking techniques can be used to enhance the retrieval process, i.e., model reuse. With the development of new tools and modelling formalisms, there is also an increasing demand for performing searches independent of the models’ encoding. Therefore, the presented approach is not restricted to certain model storage formats. Instead, the model meta-information is used for retrieval and ranking of the search results. Meta-information includes general information about the model and its encoded species and reactions, but also information about the model behaviour and related simulation experiment descriptions. Abstract To understand the details of brain function, a large-scale system model that reflects anatomical and neurophysiological characteristics needs to be implemented.
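The encoding-independent, meta-information-based retrieval described above can be illustrated with a toy ranking over species annotations; the Jaccard score and the example metadata are illustrative assumptions, not the actual measures discussed in the article.

```python
def jaccard(a, b):
    """Set-overlap similarity between two annotation sets, in [0, 1]."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def rank_models(query_species, models):
    """Rank stored models by metadata similarity to a query, independent
    of how each model is encoded on disk (toy sketch)."""
    return sorted(models,
                  key=lambda m: jaccard(query_species, m["species"]),
                  reverse=True)

models = [
    {"id": "M1", "species": {"Ca2+", "IP3", "PLC"}},
    {"id": "M2", "species": {"glucose", "ATP"}},
]
print([m["id"] for m in rank_models({"Ca2+", "IP3"}, models)])  # ['M1', 'M2']
```

Richer meta-information (reactions, observed behaviour, linked simulation descriptions) slots into the same scheme as additional weighted similarity terms.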
Though numerous computational models of different brain areas have been proposed, their integration into a large-scale model has not yet been accomplished, because these models are described in different programming languages and, above all, use different data formats. This paper introduces a platform for collaborative brain system modeling (PLATO), where one can construct computational models in several programming languages and connect them at the I/O level with a common data format. As an example, a whole visual system model including eye movement, eye optics, the retinal network and the visual cortex is being developed. Preliminary results demonstrate that the integrated model successfully simulates the signal processing flow at the different stages of the visual system. Abstract Brain rhythms are the most prominent signal measured noninvasively in humans with magneto-/electroencephalography (MEG/EEG). MEG/EEG-measured rhythms have been shown to be functionally relevant, and signature changes are used as markers of disease states. Despite the importance of understanding the underlying neural mechanisms creating these rhythms, relatively little is known about their in vivo origin in humans. There are obvious challenges in linking the extracranially measured signals directly to neural activity with invasive studies in humans, and although animal models are well suited for such studies, the connection to human brain function under cognitively relevant tasks is often lacking. Biophysically principled computational neural modeling provides an attractive means to bridge this critical gap. Here, we describe a method for creating a computational neural model capturing the laminar structure of cortical columns, and how this model can be used to make predictions on the cellular- and circuit-level mechanisms of brain oscillations measured with MEG/EEG.
Specifically, we describe how the model can be used to simulate current dipole activity, the common macroscopic signal inferred from MEG/EEG data. We detail the development and application of the model to study the spontaneous somatosensory mu-rhythm, containing mu-alpha (7–14 Hz) and mu-beta (15–29 Hz) components. We describe a novel prediction on the neural origin of the mu-rhythm that accurately reproduces many characteristic features of MEG data and accounts for changes in the rhythm with attention, detection, and healthy aging. While the details of the model are specific to the somatosensory system, the model design and application are based on general principles of cortical circuitry and MEG/EEG physics, and are thus amenable to the study of rhythms in other frequency bands and sensory systems. Abstract GABAergic interneurons in cortical circuits control the activation of principal cells and orchestrate network activity patterns, including oscillations at different frequency ranges. Recruitment of interneurons depends on the integration of convergent synaptic inputs along the dendrosomatic axis; however, dendritic processing in these cells is still poorly understood. In this chapter, we summarise our results on the cable properties, electrotonic structure and dendritic processing in “basket cells” (BCs; Nörenberg et al. 2010), one of the most prevalent types of cortical interneurons mediating perisomatic inhibition. In order to investigate integrative properties, we performed two-electrode whole-cell patch-clamp recordings, visualised and reconstructed the recorded interneurons, and created passive single-cell models with biophysical properties derived from the experiments. Our results indicate that membrane properties, in particular membrane resistivity, are inhomogeneous along the somatodendritic axis of the cell. Derived values and the gradient of membrane resistivity are different from those obtained for excitatory principal cells.
The divergent passive membrane properties of BCs facilitate rapid signalling from proximal basal dendritic inputs but at the same time increase synapse-to-soma transfer for slow signals from the distal apical dendrites. Our results demonstrate that BCs possess distinct integrative properties. Future computational models investigating the diverse functions of neuronal circuits need to consider this diversity and incorporate realistic dendritic properties not only of excitatory principal cells but also of the various types of inhibitory interneurons. Abstract New surgical and localization techniques allow for precise and personalized evaluation and treatment of intractable epilepsies. These techniques include the use of subdural and depth electrodes for localization, and the potential use of cell-targeted stimulation using optogenetics as part of treatment. Computer modeling of seizures, also individualized to the patient, will be important in order to make full use of the potential of these new techniques. This is because epilepsy is a complex dynamical disease involving multiple scales across both time and space. These complex dynamics make prediction extremely difficult. Cause and effect are not cleanly separable, as multiple embedded causal loops allow for many scales of unintended consequence. We demonstrate here a small model of sensory neocortex which can be used to look at the effects of microablations or microstimulation. We show that ablations in this network can either prevent spread or prevent occurrence of the seizure. In this example, focal electrical stimulation was not able to terminate a seizure, but selective stimulation of inhibitory cells, a future possibility through the use of optogenetics, was efficacious. Abstract The basal ganglia nuclei form a complex network of nuclei often assumed to perform selection, yet their individual roles and how they influence each other are still largely unclear.
In particular, the ties between the external and internal parts of the globus pallidus are paradoxical, as anatomical data suggest a potent inhibitory projection between them while electrophysiological recordings indicate that they have similar activities. Here we introduce a theoretical study that reconciles both views on the intrapallidal projection by providing a plausible characterization of the relationship between the external and internal globus pallidus. Specifically, we developed a mean-field model of the whole basal ganglia, whose parameterization is optimized to best respect a collection of numerous anatomical and electrophysiological data. We first obtained models respecting all our constraints, showing that anatomical and electrophysiological data on the intrapallidal projection are globally consistent. This model furthermore predicts that both aforementioned views about the intrapallidal projection may be reconciled when this projection is weakly inhibitory, thus making it possible to support similar neural activity in both nuclei and for the entire basal ganglia to select between actions. Second, we predict that afferent projections are substantially unbalanced towards the external segment, as it receives the strongest excitation from the STN and the weakest inhibition from the striatum. Finally, our study strongly suggests that the intrapallidal connection pattern is not focused but diffuse, as the latter pattern is more efficient for the overall selection performed in the basal ganglia. Abstract Background The information coming from biomedical ontologies and computational pathway models is expanding continuously: research communities keep this process up, and their advances are generally shared by means of dedicated resources published on the web. In fact, such models are shared to provide the characterization of molecular processes, while biomedical ontologies detail a semantic context for the majority of those pathways.
Recent advances in both fields pave the way for a scalable information integration based on aggregate knowledge repositories, but the lack of overall standard formats impedes this progress. Indeed, having different objectives and different abstraction levels, most of these resources "speak" different languages. Semantic web technologies are here explored as a means to address some of these problems. Methods Employing an extensible collection of interpreters, we developed OREMP (Ontology Reasoning Engine for Molecular Pathways), a system that abstracts the information from different resources and combines them together into a coherent ontology. Continuing this effort, we present OREMPdb; once different pathways are fed into OREMP, species are linked to the external ontologies referred to and to the reactions in which they participate. Exploiting these links, the system builds species sets, which encapsulate species that operate together. Composing all of the reactions together, the system computes all of the reaction paths from and to all of the species sets. Results OREMP has been applied to the curated branch of BioModels (2011/04/15 release), which overall contains 326 models, 9244 reactions, and 5636 species. OREMPdb is the semantic dictionary created as a result, which is made of 7360 species sets. For each of these sets, OREMPdb links the original pathway and the original paper where this information first appeared. Abstract Conductance-based neuron models are frequently employed to study the dynamics of biological neural networks. For speed and ease of use, these models are often reduced in morphological complexity. Simplified dendritic branching structures may process inputs differently than full branching structures, however, and could thereby fail to reproduce important aspects of biological neural processing. It is not yet well understood which processing capabilities require detailed branching structures.
Therefore, we analyzed the processing capabilities of fully or partially branched reduced models. These models were created by collapsing the dendritic tree of a full morphological model of a globus pallidus (GP) neuron while preserving its total surface area and electrotonic length, as well as its passive and active parameters. Dendritic trees were either collapsed into single cables (unbranched models) or the full complement of branch points was preserved (branched models). Both reduction strategies allowed us to compare dynamics between all models using the same channel density settings. Full model responses to somatic inputs were generally preserved by both types of reduced model, while dendritic input responses could be more closely preserved by branched than by unbranched reduced models. However, features strongly influenced by local dendritic input resistance, such as active dendritic sodium spike generation and propagation, could not be accurately reproduced by any reduced model. Based on our analyses, we suggest that there are intrinsic differences in processing capabilities between unbranched and branched models. We also indicate suitable applications for different levels of reduction, including fast searches of full model parameter space. Summary Processing text from the scientific literature has become a necessity due to the burgeoning amounts of information that are fast becoming available, stemming from advances in electronic information technology. We created a program, NeuroText ( http://senselab.med.yale.edu/textmine/neurotext.pl ), designed specifically to extract information relevant to neuroscience-specific databases, NeuronDB and CellPropDB ( http://senselab.med.yale.edu/senselab/ ), housed at the Yale University School of Medicine. NeuroText extracts relevant information from the neuroscience literature in a two-step process: each step parses text at a different level of granularity.
NeuroText uses an expert-mediated knowledge base and combines the techniques of indexing, contextual parsing, semantic and lexical parsing, and supervised and unsupervised learning to extract information. The constraints, metadata elements, and rules for information extraction are stored in the knowledge base. NeuroText was created as a pilot project to process 3 years of publications in the Journal of Neuroscience and was subsequently tested on 40,000 PubMed abstracts. We also present here a template for creating a domain-nonspecific knowledge base that, when linked to a text-processing tool like NeuroText, can be used to extract knowledge in other fields of research. Abstract Background We present a software tool called SENB, which allows the geometric and biophysical neuronal properties in a simple computational model of a Hodgkin-Huxley (HH) axon to be changed. The aim of this work is to develop a didactic and easy-to-use computational tool in the NEURON simulation environment, which allows graphical visualization of both the passive and active conduction parameters and the geometric characteristics of a cylindrical axon with HH properties. Results The SENB software offers several advantages for teaching and learning electrophysiology. First, SENB offers ease and flexibility in determining the number of stimuli. Second, SENB allows immediate and simultaneous visualization, in the same window and time frame, of the evolution of the electrophysiological variables. Third, SENB calculates parameters such as the time and space constants, stimulus frequency, cellular area and volume, sodium and potassium equilibrium potentials, and propagation velocity of the action potentials. Furthermore, it allows the user to see all this information immediately in the main window. Finally, with just one click SENB can save an image of the main window as evidence.
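The time and space constants that SENB reports for a cylindrical axon follow the standard cable formulas tau = Rm*Cm and lambda = sqrt(Rm*d/(4*Ra)); the sketch below uses assumed squid-axon-scale values, not SENB's actual defaults.

```python
import math

def cable_constants(d_um, Rm=10000.0, Cm=1.0, Ra=100.0):
    """Membrane time constant tau (ms) and space constant lambda (um)
    for a cylindrical axon. Default biophysical values are assumptions:
    Rm in ohm*cm^2, Cm in uF/cm^2, Ra in ohm*cm, diameter d in um."""
    tau_ms = Rm * Cm * 1e-3                      # ohm*uF = us, so *1e-3 -> ms
    d_cm = d_um * 1e-4                           # um -> cm
    lam_cm = math.sqrt(Rm * d_cm / (4.0 * Ra))   # classic cable result
    return tau_ms, lam_cm * 1e4                  # cm -> um

tau, lam = cable_constants(d_um=500.0)           # squid-giant-axon-scale diameter
print(round(tau, 1), round(lam))                 # 10.0 11180
```

The fat squid axon's very large space constant (on the order of a centimetre here) is exactly why it was such a convenient preparation for the original HH measurements.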
Conclusions The SENB software is didactic and versatile, and can be used to improve and facilitate the teaching and learning of the underlying mechanisms in the electrical activity of an axon using the biophysical properties of the squid giant axon. Abstract Grid cells (GCs) in the medial entorhinal cortex (mEC) have the property of having their firing activity spatially tuned to a regular triangular lattice. Several theoretical models for grid field formation have been proposed, but most assume that place cells (PCs) are a product of the grid cell system. There is, however, an alternative possibility that is supported by various strands of experimental data. Here we present a novel model for the emergence of grid-like firing patterns that stands on two key hypotheses: (1) spatial information in GCs is provided by PC activity and (2) grid fields result from a combined synaptic plasticity mechanism involving inhibitory and excitatory neurons mediating the connections between PCs and GCs. Depending on the spatial location, each PC can contribute excitatory or inhibitory inputs to GC activity. The nature and magnitude of the PC input are a function of the distance to the place field center, which is inferred from rate decoding. A biologically plausible learning rule drives the evolution of the connection strengths from PCs to a GC. In this model, PCs compete for GC activation, and the plasticity rule favors efficient packing of the space representation. This leads to grid-like firing patterns. In a new environment, GCs continuously recruit new PCs to cover the entire space. The model described here makes important predictions and can represent the feedforward connections from hippocampal CA1 to the deeper mEC layers. Abstract Because of its highly branched dendrite, the Purkinje neuron requires significant computational resources if coupled electrical and biochemical activity are to be simulated.
To address this challenge, we developed a scheme for reducing the geometric complexity while preserving the essential features of activity in both the soma and a remote dendritic spine. We merged our previously published biochemical model of calcium dynamics and lipid signaling in the Purkinje neuron, developed in the Virtual Cell modeling and simulation environment, with an electrophysiological model based on a Purkinje neuron model available in NEURON. A novel reduction method was applied to the Purkinje neuron geometry to obtain a model with fewer compartments that is tractable in Virtual Cell. Most of the dendritic tree was subject to reduction, but we retained the neuron’s explicit electrical and geometric features along a specified path from spine to soma. Further, unlike previous simplification methods, the dendrites that branch off along the preserved explicit path are retained as reduced branches. We conserved axial resistivity and adjusted passive properties and active channel conductances for the reduction in surface area, and cytosolic calcium for the reduction in volume. Rallpacks are used to validate the reduction algorithm and show that it can be generalized to other complex neuronal geometries. For the Purkinje cell, we found that current injections at the soma were able to produce similar trains of action potentials and membrane potential propagation in the full and reduced models in NEURON; the reduced model produces identical spiking patterns in NEURON and Virtual Cell. Importantly, our reduced model can simulate communication between the soma and a distal spine; an alpha function applied at the spine to represent synaptic stimulation gave similar results in the full and reduced models for potential changes associated with both the spine and the soma. Finally, we combined phosphoinositol signaling and electrophysiology in the reduced model in Virtual Cell.
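The conductance adjustment for reduced surface area mentioned above amounts to scaling channel densities by the area ratio so that each total conductance is conserved; the densities and areas below are hypothetical illustrative values, not those of the Purkinje model.

```python
def rescale_conductances(densities, area_full, area_reduced):
    """Scale channel densities so total conductance (density * area) is
    conserved when a dendritic tree is collapsed onto a smaller surface
    area. A toy sketch of one step of such a reduction."""
    factor = area_full / area_reduced
    return {name: g * factor for name, g in densities.items()}

# Hypothetical densities (S/cm^2) and membrane areas (um^2).
full = {"na": 0.012, "kdr": 0.036}
scaled = rescale_conductances(full, area_full=50000.0, area_reduced=20000.0)
print(round(scaled["na"], 6))  # 0.03: g_na * area is unchanged
```

Cytosolic concentrations are rescaled by the volume ratio for the same conservation reason, as the text notes for calcium.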
Thus, a strategy has been developed to combine electrophysiology and biochemistry as a step toward merging neuronal and systems biology modeling. Abstract The advent of techniques with the ability to scan massive changes in cellular makeup (genomics, proteomics, etc.) has revealed the compelling need for analytical methods to interpret and make sense of those changes. Computational models built on a sound physicochemical mechanistic basis are indispensable for integrating, interpreting, and simulating high-throughput experimental data. Another powerful role of computational models is predicting new behavior, provided they are adequately validated. Mitochondrial energy transduction has been traditionally studied with thermodynamic models. More recently, kinetic or thermokinetic models have been proposed, leading the path toward an understanding of the control and regulation of mitochondrial energy metabolism and its interaction with cytoplasmic and other compartments. In this work, we outline, step by step, the methods that should be followed to build a computational model of mitochondrial energetics in isolation or integrated into a network of cellular processes. Depending on the question addressed by the modeler, the methodology explained herein can be applied with different levels of detail, from the mitochondrial energy-producing machinery in a network of cellular processes to the dynamics of a single enzyme during its catalytic cycle. Abstract The voltage and time dependence of ion channels can be regulated, notably by phosphorylation, interaction with phospholipids, and binding to auxiliary subunits. Many parameter variation studies have set conductance densities free while leaving kinetic channel properties fixed, as the experimental constraints on the latter are usually better than on the former.
Because individual cells can tightly regulate their ion channel properties, we suggest that kinetic parameters may be profitably set free during model optimization in order to both improve matches to data and refine kinetic parameters. To this end, we analyzed the parameter optimization of reduced models of three electrophysiologically characterized and morphologically reconstructed globus pallidus neurons. We performed two automated searches with different types of free parameters. First, conductance density parameters were set free. Even the best resulting models exhibited unavoidable problems which were due to limitations in our channel kinetics. We next set channel kinetics free for the optimized density matches and obtained significantly improved model performance. Some kinetic parameters consistently shifted to similar new values in multiple runs across three models, suggesting the possibility for tailored improvements to channel models. These results suggest that optimized channel kinetics can improve model matches to experimental voltage traces, particularly for channels characterized under different experimental conditions than recorded data to be matched by a model. The resulting shifts in channel kinetics from the original template provide valuable guidance for future experimental efforts to determine the detailed kinetics of channel isoforms and possible modulated states in particular types of neurons. Abstract Electrical synapses continuously transfer signals bidirectionally from one cell to another, directly or indirectly via intermediate cells. Electrical synapses are common in many brain structures such as the inferior olive, the subcoeruleus nucleus and the neocortex, between neurons and between glial cells. In the cortex, interneurons have been shown to be electrically coupled and proposed to participate in large, continuous cortical syncytia, as opposed to smaller spatial domains of electrically coupled cells. 
However, to explore the significance of these findings it is imperative to map the electrical synaptic microcircuits, in analogy with in vitro studies on monosynaptic and disynaptic chemical coupling. Since “walking” from cell to cell over large distances with a glass pipette is challenging, microinjection of (fluorescent) dyes diffusing through gap junctions remains so far the only method available to decipher such microcircuits, even though technical limitations exist. Based on circuit theory, we derive analytical descriptions of the AC electrical coupling in networks of isopotential cells. We then suggest an operative electrophysiological protocol to distinguish between direct electrical connections and connections involving one or more intermediate cells. This method allows inferring the number of intermediate cells, generalizing the conventional coupling coefficient, which provides limited information. We validate our method through computer simulations, theoretical and numerical methods, and electrophysiological paired recordings. Abstract Because electrical coupling among the neurons of the brain is much faster than chemical synaptic coupling, it is natural to hypothesize that gap junctions may play a crucial role in mechanisms underlying very fast oscillations (VFOs), i.e., oscillations at more than 80 Hz. There is now a substantial body of experimental and modeling literature supporting this hypothesis. A series of modeling papers, starting with work by Roger Traub and collaborators, have suggested that VFOs may arise from expanding waves propagating through an “axonal plexus”, a large random network of electrically coupled axons. Traub et al. also proposed a cellular automaton (CA) model to study the mechanisms of VFOs in the axonal plexus. In this model, the expanding waves take the appearance of topologically circular “target patterns”. Random external stimuli initiate each wave. We therefore call this kind of VFO “externally driven”.
Using a computational model, we show that an axonal plexus can also exhibit a second, distinctly different kind of VFO in a wide parameter range. These VFOs arise from activity propagating around cycles in the network. Once triggered, they persist without any source of excitation. With idealized, regular connectivity, they take the appearance of spiral waves. We call these VFOs “reentrant”. The behavior of the axonal plexus depends on the reliability with which action potentials propagate from one axon to the next, which, in turn, depends on the somatic membrane potential V_s and the gap-junction conductance g_gj. To study these dependencies, we impose a fixed value of V_s, then study the effects of varying V_s and g_gj. Not surprisingly, propagation becomes more reliable with rising V_s and g_gj. Externally driven VFOs occur when V_s and g_gj are so high that propagation never fails. For lower V_s or g_gj, propagation is nearly reliable, but fails in rare circumstances. Surprisingly, the parameter regime where this occurs is fairly large. Even a single propagation failure can trigger reentrant VFOs in this regime. Lowering V_s and g_gj further, one finds a third parameter regime in which propagation is unreliable, and no VFOs arise. We analyze these three parameter regimes by means of computations using model networks adapted from Traub et al., as well as much smaller model networks. Abstract Research with barn owls suggested that sound source location is represented topographically in the brain by an array of neurons, each tuned to a narrow range of locations. However, research with small-headed mammals has offered an alternative view, in which location is represented by the balance of activity in two opponent channels broadly tuned to the left and right auditory space. Both channels may be present in each auditory cortex, although the channel representing contralateral space may be dominant.
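A minimal sketch of the opponent-channel idea just described: two broadly tuned hemifield channels whose balance of activity encodes azimuth. The logistic tuning curve and its width are illustrative assumptions, not the tuning inferred in the study.

```python
import numpy as np

def channel_rate(azimuth_deg, preferred_side, sigma=60.0):
    """Broadly tuned hemifield channel (logistic tuning over azimuth).

    preferred_side is +1 for the right-tuned channel, -1 for the left;
    sigma (degrees) sets the tuning width. Both values are illustrative.
    """
    return 1.0 / (1.0 + np.exp(-preferred_side * azimuth_deg / sigma))

def decode_azimuth(rate_left, rate_right, sigma=60.0):
    """Read azimuth back out of the left/right activity balance."""
    p = rate_right / (rate_left + rate_right)
    return sigma * np.log(p / (1.0 - p))

az_true = 30.0
r_left = channel_rate(az_true, -1)
r_right = channel_rate(az_true, +1)
az_decoded = decode_azimuth(r_left, r_right)
```

Because the two logistic channels are mirror images, their rates sum to one and the decoder inverts the code exactly in the noise-free case; the tuning slope flattens at lateral azimuths, so fixed-size rate noise would translate into larger azimuth errors there, qualitatively matching the loss of acuity at lateral reference azimuths discussed below.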
Recent studies have suggested that opponent channel coding of space may also apply in humans, although these studies have used a restricted set of spatial cues or probed a restricted set of spatial locations, and there have been contradictory reports as to the relative dominance of the ipsilateral and contralateral channels in each cortex. The current study used electroencephalography (EEG) in conjunction with sound field stimulus presentation to address these issues and to inform the development of an explicit computational model of human sound source localization. Neural responses were compatible with the opponent channel account of sound source localization and with contralateral channel dominance in the left, but not the right, auditory cortex. A computational opponent channel model reproduced every important aspect of the EEG data and allowed inferences about the width of tuning in the spatial channels. Moreover, the model predicted the oft-reported decrease in spatial acuity measured psychophysically with increasing reference azimuth. Predictions of spatial acuity closely matched those measured psychophysically by previous authors. Abstract Calretinin is thought to be the main endogenous calcium buffer in cerebellar granule cells (GrCs). However, little is known about the impact of cooperative Ca²⁺ binding to calretinin on highly localized and more global (regional) Ca²⁺ signals in these cells. Using numerical simulations, we show that an essential property of calretinin is a delayed equilibration with Ca²⁺. Therefore, the amount of Ca²⁺ that calretinin can accumulate with respect to equilibrium levels depends on stimulus conditions.
Based on our simulations of buffered Ca²⁺ diffusion near a single Ca²⁺ channel or a large cluster of Ca²⁺ channels, and on previous experimental findings that 150 μM 1,2-bis(o-aminophenoxy)ethane-N,N,N′,N′-tetraacetic acid (BAPTA) and endogenous calretinin have similar effects on GrC excitability, we estimated the concentration of mobile calretinin in GrCs to be in the range of 0.7–1.2 mM. Our results suggest that this estimate can provide a starting point for further analysis. We find that calretinin prominently reduces the action-potential-associated increase in cytosolic free Ca²⁺ concentration ([Ca²⁺]i) even at a distance of 30 nm from a single Ca²⁺ channel. In spite of a buildup of residual Ca²⁺, it maintains almost constant maximal [Ca²⁺]i levels during repetitive channel openings at frequencies below 80 Hz. This occurs because of accelerated Ca²⁺ binding as calretinin binds more Ca²⁺. Unlike the buffering of high Ca²⁺ levels within Ca²⁺ nano/microdomains sensed by large-conductance Ca²⁺-activated K⁺ channels, the buffering of regional Ca²⁺ signals by calretinin cannot be mimicked by any single concentration of BAPTA across different experimental conditions. Abstract The field of Computational Systems Neurobiology is maturing quickly. If one wants it to fulfil its central role in the new Integrative Neurobiology, the reuse of quantitative models needs to be facilitated. The community has to develop standards and guidelines in order to maximise the diffusion of its scientific production, but also to render it more trustworthy. In recent years, various projects have tackled the problems of the syntax and semantics of quantitative models. More recently, the international initiative BioModels.net launched three projects: (1) MIRIAM is a standard to curate and annotate models, in order to facilitate their reuse.
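The delayed, use-dependent equilibration of a cooperative buffer described in the calretinin study above can be illustrated with a minimal two-site binding scheme integrated by forward Euler. The rate constants, influx, and pump terms below are invented for illustration and are not the calretinin parameters used in the study; only the total buffer concentration (about 1 mM) echoes the estimate quoted above.

```python
def free_ca(b_total, j_in=5.0, t_end=0.05, dt=1e-5):
    """Free Ca2+ (uM) after t_end seconds of constant influx.

    Two-site cooperative buffer: the second binding step has a faster
    on-rate than the first, so binding accelerates as the buffer loads.
    All rate constants are illustrative, not fitted values.
    """
    k1_on, k1_off = 2.0, 10.0      # first site (/uM/s, /s)
    k2_on, k2_off = 20.0, 10.0     # second site: cooperative speed-up
    k_pump = 50.0                  # linear extrusion (/s)
    ca, b1, b2 = 0.05, 0.0, 0.0    # free Ca2+, single- and double-bound buffer
    for _ in range(int(t_end / dt)):
        b0 = b_total - b1 - b2                 # unbound buffer
        f1 = k1_on * ca * b0 - k1_off * b1     # net flux into single-bound
        f2 = k2_on * ca * b1 - k2_off * b2     # net flux into double-bound
        ca += dt * (j_in - k_pump * ca - f1 - f2)
        b1 += dt * (f1 - f2)
        b2 += dt * f2
    return ca

ca_with_buffer = free_ca(b_total=1000.0)   # ~1 mM total buffer
ca_without = free_ca(b_total=0.0)
```

Before equilibrium is reached, the loading buffer absorbs most of the influx and keeps free Ca²⁺ well below its unbuffered value; at true steady state the two cases converge, which is precisely why a buffer with delayed equilibration shapes responses in a stimulus-dependent way.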
(2) The Systems Biology Ontology is a set of controlled vocabularies intended to be used in conjunction with models, in order to characterise their components. (3) BioModels Database is a resource that allows biologists to store, search and retrieve published mathematical models of biological interest. We expect that those resources, together with the use of formal languages such as SBML, will support the fruitful exchange and reuse of quantitative models. Collaborative Modeling in Neuroscience: Time to Go Open Model? Neuroinformatics Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy.
The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. Programming language: Perl/BioPerl (program); mySQL, Perl DBI and DBD modules (database); php, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI Blast package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience.
We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena.
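One concrete reason exact cross-simulator replicability is often futile, as argued in the reproducibility discussion above, is that floating-point arithmetic is not associative: merely summing the same synaptic contributions in a different order changes the result in the last bits, and threshold dynamics can then amplify such discrepancies into different spike times. A minimal, self-contained demonstration:

```python
# Summing the same three numbers in two orders gives two different
# double-precision results: with the large term first, the small
# contributions fall below the rounding granularity and are lost.
terms = [1e16, 1.0, 1.0]
large_first = (terms[0] + terms[1]) + terms[2]   # 1e16: the 1.0s vanish
small_first = (terms[2] + terms[1]) + terms[0]   # 1.0000000000000002e16
difference = small_first - large_first            # 2.0, purely from rounding
```

The magnitudes here are exaggerated for clarity, but the same effect at realistic scales is enough to make two otherwise identical simulators drift apart once a membrane potential crosses a threshold on one machine and not the other.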
As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell contains organelles and intracellular solutions, and it is separated from the extracellular fluid surrounding it by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. Differences in ionic charge and concentration give rise to electrical and chemical potentials, respectively, driving the transport of materials across the membrane. Here we examine the core concepts of mathematical modeling of the dynamic behavior of single cells, as well as the bases of numerical simulation.
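The electrical and chemical potentials mentioned above balance, for a single ion species, at the equilibrium potential given by the Nernst equation. A short sketch, using typical textbook mammalian concentrations (illustrative values, not measurements from this chapter):

```python
import math

def nernst(z, c_out, c_in, temp_c=37.0):
    """Equilibrium (Nernst) potential in volts for an ion of valence z.

    E = (R * T) / (z * F) * ln(c_out / c_in)
    """
    R = 8.314       # gas constant, J/(mol*K)
    F = 96485.0     # Faraday constant, C/mol
    T = temp_c + 273.15
    return (R * T) / (z * F) * math.log(c_out / c_in)

# Illustrative mammalian concentrations (mM)
e_k = nernst(+1, c_out=5.0, c_in=140.0)     # potassium, about -89 mV
e_na = nernst(+1, c_out=145.0, c_in=12.0)   # sodium, about +67 mV
```

At the Nernst potential the electrical driving force on the ion exactly cancels the diffusive (chemical) one, so there is no net flux; deviations of the membrane potential from it drive the transport across the membrane referred to above.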
Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program on which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command-line interface, based on Python, a powerful scripting language, for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the information necessary and tools that enable others to reuse this information.
In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method and make available computer scripts which automate the process of citation entry. We use an Open Citation project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. Indeed, to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation.
The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain (H₂ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in the granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front-ends requires considerable bioinformatic skill. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians.
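The time-domain (Balanced Truncation) reduction mentioned in the model-reduction abstract above can be sketched on a generic stable linear system. The 5-state diagonal example below is an arbitrary stand-in for a discretized quasi-active cable, not the authors' model, and the square-root algorithm shown is one standard formulation of balanced truncation.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, eigh

def balanced_truncation(A, B, C, k):
    """Reduce dx/dt = A x + B u, y = C x to k states (square-root method)."""
    # Gramians: A Wc + Wc A' + B B' = 0 and A' Wo + Wo A + C' C = 0
    Wc = solve_continuous_lyapunov(A, -B @ B.T)
    Wo = solve_continuous_lyapunov(A.T, -C.T @ C)
    L = cholesky(Wc, lower=True)
    lam, U = eigh(L.T @ Wo @ L)        # lam: squared Hankel singular values
    lam, U = lam[::-1], U[:, ::-1]     # sort in descending order
    T = L @ U * lam ** -0.25           # balancing transformation (columns)
    Tinv = (U.T @ np.linalg.inv(L)) * lam[:, None] ** 0.25
    return Tinv[:k] @ A @ T[:, :k], Tinv[:k] @ B, C @ T[:, :k]

def gain(A, B, C, w):
    """|H(jw)| for a single-input single-output system."""
    n = A.shape[0]
    return abs((C @ np.linalg.solve(1j * w * np.eye(n) - A, B))[0, 0])

# Arbitrary stable 5-state SISO system standing in for a quasi-active cable
A = np.diag(-np.arange(1.0, 6.0))
B = np.ones((5, 1))
C = np.ones((1, 5))
Ar, Br, Cr = balanced_truncation(A, B, C, k=3)
```

For this example the Hankel singular values decay rapidly, so three balanced states reproduce the transfer function to within a fraction of a percent across frequencies; the same rapid decay is what makes orders-of-magnitude reduction of quasi-active cable models possible.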
Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or not) existing empirical results and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. 
We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding, separately or together, neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp. 741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks.
SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu . Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and build on them for further research. Its use can aid the field of computational neuroscience to enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA_A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA_A reversal potential (E_GABA) and by alterations in the intrinsic excitability of granule cells.
In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux).
We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic types (pyramidal cell dendrites and olivary axonal projection terminals), and labeling methods (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions arriving at the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductivities of their dendritic membrane. This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs.
In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in Biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models.
Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS–US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive, biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training.
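For contrast with the learned time representation above, the conventional delay-line (complete serial compound) TD model that the abstract argues against can be sketched in a few lines of tabular TD(0); the reward size, interval length, and learning rate below are arbitrary. The prediction error δ plays the role of phasic dopamine activity: early in training it fires at the reward (US), and after training it transfers to the cue (CS).

```python
import numpy as np

def trace_conditioning(n_steps=10, n_trials=1000, alpha=0.1, gamma=0.98):
    """Tabular TD(0) with a delay-line ('complete serial compound') clock.

    State 0 is the pre-CS (intertrial) state; because CS onset is
    unpredictable, its value is pinned at zero. States 1..n_steps tick
    off time since CS onset, and the US (reward 1.0) arrives on the
    final transition. Returns the delta profiles of the first and last
    trials, the model analogue of phasic dopamine responses.
    """
    V = np.zeros(n_steps + 2)            # last entry is the terminal state
    first = last = None
    for trial in range(n_trials):
        deltas = np.zeros(n_steps + 1)
        for t in range(n_steps + 1):     # transition from state t to t+1
            r = 1.0 if t == n_steps else 0.0
            deltas[t] = r + gamma * V[t + 1] - V[t]
            if t > 0:                    # never update the pre-CS state
                V[t] += alpha * deltas[t]
        if trial == 0:
            first = deltas.copy()
        last = deltas
    return first, last

delta_first, delta_last = trace_conditioning()
# delta_first peaks at the US; delta_last peaks at CS onset instead
```

The abstract's point is that this delay-line clock is biologically unrealistic for seconds-long intervals; the LSTM model replaces it with a learned representation of elapsed time while leaving the TD error computation itself unchanged.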
Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. On mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborization possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses distributed uniformly along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and function of effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization in general and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals decreasing with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5. 
Under analogous conditions of activation, we observed in the latter cells a variety of patterns of output discharges of different complexities, including stochastic ones. A relatively short length of the apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites and a reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in the processes of formation of the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using nonhomogeneous Poisson processes. 
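The model class named above, an integrate-and-fire neuron with a conductance synapse driven by Poisson input, can be sketched minimally as follows. This is a generic illustration with assumed parameter values and a homogeneous (rather than theta-modulated) Poisson drive, not the published model.

```python
import random

def lif_conductance_poisson(t_stop=1000.0, dt=0.1, rate_hz=800.0,
                            weight=0.1, seed=1):
    """Leaky integrate-and-fire cell with one conductance-based excitatory
    synapse driven by a homogeneous Poisson spike train. Returns the
    number of output spikes in t_stop ms."""
    random.seed(seed)
    v, v_rest, v_thresh, v_reset = -65.0, -65.0, -50.0, -65.0
    tau_m, tau_syn, e_exc = 20.0, 5.0, 0.0   # ms, ms, mV
    g = 0.0                                  # synaptic conductance (rel. leak)
    p_spike = rate_hz * dt / 1000.0          # arrival probability per step
    n_spikes = 0
    for _ in range(int(t_stop / dt)):
        if random.random() < p_spike:        # presynaptic Poisson arrival
            g += weight
        v += dt * (-(v - v_rest) - g * (v - e_exc)) / tau_m
        g -= dt * g / tau_syn                # exponential synaptic decay
        if v >= v_thresh:                    # threshold crossing: spike, reset
            n_spikes += 1
            v = v_reset
    return n_spikes
```

Making the Poisson rate a function of time (theta phase and position) is the step that turns this generic sketch into the kind of space- and theta-modulated drive the abstract describes.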
Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent from rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of the involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed-rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in computational neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org. 
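The "instances of their formulas" mentioned in the channel chapter above typically take the form of voltage-dependent opening and closing rates. As a concrete example, here are the classic Hodgkin–Huxley rate functions for the sodium activation gate (m), in the modern voltage convention with rest near -65 mV:

```python
import math

def alpha_m(v):
    """Opening rate of the Na+ activation gate (classic squid-axon values)."""
    if abs(v + 40.0) < 1e-9:                 # removable singularity at -40 mV
        return 1.0
    return 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))

def beta_m(v):
    """Closing rate of the Na+ activation gate."""
    return 4.0 * math.exp(-(v + 65.0) / 18.0)

def m_inf(v):
    """Steady-state activation: where opening and closing rates balance."""
    a = alpha_m(v)
    return a / (a + beta_m(v))
```

The steady-state curve is sigmoidal: near rest the gate is mostly closed, and it opens steeply with depolarization, which is what makes the sodium channel regenerative during the action potential upstroke.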
In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed data of morphologies when generating those cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators, such as the NEURON simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new and available data can be included into NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. 
New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising recent developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from genomics, transcriptomics, proteomics, and metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data needs both standard data formats and public repositories. Standardization and a public storage system are also important for modeling due to the possibility of sharing models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks like numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. 
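For an unbranched cable, the direct Gaussian elimination mentioned above reduces to the O(n) tridiagonal (Thomas) solve; branched trees extend the same idea with a suitable compartment ordering. The following is a generic textbook sketch of the tridiagonal case, not the multisplit code itself.

```python
def thomas_solve(sub, diag, sup, rhs):
    """Direct elimination for a tridiagonal system (the unbranched-cable
    case). sub/diag/sup are the sub-, main and super-diagonals; sub[0]
    and sup[-1] are unused. Runs in O(n) time."""
    n = len(diag)
    c_prime = [0.0] * n
    d_prime = [0.0] * n
    c_prime[0] = sup[0] / diag[0]
    d_prime[0] = rhs[0] / diag[0]
    for i in range(1, n):                       # forward elimination
        denom = diag[i] - sub[i] * c_prime[i - 1]
        c_prime[i] = sup[i] / denom if i < n - 1 else 0.0
        d_prime[i] = (rhs[i] - sub[i] * d_prime[i - 1]) / denom
    x = [0.0] * n
    x[-1] = d_prime[-1]
    for i in range(n - 2, -1, -1):              # back-substitution
        x[i] = d_prime[i] - c_prime[i] * x[i + 1]
    return x
```

The multisplit method keeps this sequential elimination within each subtree and handles the few coupling points between subtrees separately, which is why accuracy matches the single-processor solve.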
It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources, and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. 
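For context on what the inverse-CSD methods improve upon: the "traditional approach" estimates current source density as the negative second spatial difference of the recorded potential. A minimal one-dimensional sketch (with assumed unit electrode spacing and conductivity):

```python
def csd_second_difference(phi, h=1.0, sigma=1.0):
    """Traditional CSD estimate along one dimension: -sigma * d2(phi)/dz2
    by central differences. Only interior electrodes get an estimate; the
    boundary loss is one limitation inverse-CSD methods are designed to
    overcome."""
    return [-sigma * (phi[i - 1] - 2.0 * phi[i] + phi[i + 1]) / h ** 2
            for i in range(1, len(phi) - 1)]
```

Because the estimate at each electrode uses its two neighbors, an n-electrode penetration yields only n - 2 CSD values, and no model of the sources outside the grid is included; both points motivate the forward-modeling formulation described above.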
The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. However, the range of current strength needed for a depolarization block could easily be reached with random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block. We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, impeding conventional analysis methods from keeping pace. 
Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting their interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed, and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable. PANDORA is open source and freely available to be used and modified (http://software.incf.org/software/pandora/home). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then we examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop. Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics. 
First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test the sufficiency of multiple, independent spike initiation zones in the C-interneuron to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made in understanding cerebellar cortical processing, but neural coding at the output stage of the cerebellum in the deep cerebellar nuclei (DCN) remains poorly understood. The DCN are unlikely to be just a relay nucleus, because Purkinje cell inhibition has to be turned into an excitatory output signal, and DCN neurons exhibit complex intrinsic properties. In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals. 
Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. We generated a good match of our simulations with DCN current-clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow de-inactivation of rebound currents. Fast rebound bursts due to T-type calcium current and slow rebounds due to persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by HCN current. Our findings suggest that active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior. 
This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation channel gating, the mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing, with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform. Abstract No Abstract Available Abstract We have developed a simulation tool within the NEURON simulator to assist in the organization, verification, and analysis of simulations. This tool, denominated Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single-cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB. Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems. 
This is similar to how single cells function as systems of protein networks, where, for example, ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from what it does alone. The dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. Interactions are made by different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts to produce heart beats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics representing neural information. In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of the limbs can be performed. This chapter illustrates several mathematical techniques through examples from the modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data. Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation. 
For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice that of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits as expected in mature animals. Predictions were made for passive-cell current clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron, and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting is obsolete. Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron. 
Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials. Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet, the precise input–output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites. We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature. 
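The filtered-kernel representation described at the start of this passage, subthreshold potential as presynaptic population activity convolved with a fixed exponential kernel, can be sketched as simple shot noise. This is an illustrative toy with assumed parameters and Bernoulli-per-bin arrivals approximating the population spike count, not the CuBIC-based estimator itself.

```python
import random

def filtered_input_trace(rate_hz=1000.0, tau=10.0, dt=0.1,
                         t_stop=1000.0, seed=7):
    """Leaky-integrator view of the subthreshold potential: each population
    spike adds a unit deflection, which then decays with time constant tau
    (an exponential kernel). Returns the sampled trace."""
    random.seed(seed)
    p = rate_hz * dt / 1000.0   # arrival probability per bin (here 0.1)
    v, trace = 0.0, []
    for _ in range(int(t_stop / dt)):
        if random.random() < p:
            v += 1.0            # unit deflection per population spike
        v -= dt * v / tau       # exponential kernel decay
        trace.append(v)
    return trace
```

For independent inputs the stationary mean is approximately rate × amplitude × tau (here about 10 units); correlated presynaptic firing leaves the mean unchanged but inflates the higher cumulants of this trace, which is the signature the method above exploits.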
To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns and the corresponding firing frequency input–output function was logarithmic. The regular/irregular nature of the input synaptic intervals was only reflected in the regularity of output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input–output relationships. Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input–output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats. 
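Models in such standardized formats (SBML being the most widely used for biochemical models) are ordinary XML and can be inspected with generic tooling. A minimal sketch, assuming an SBML Level 2 document; the toy model string is invented for illustration, and real work would use a dedicated library such as libSBML rather than this bare-bones parse:

```python
import xml.etree.ElementTree as ET

SBML_NS = "{http://www.sbml.org/sbml/level2}"   # SBML Level 2 namespace

def list_species(sbml_text):
    """List species ids from an SBML Level 2 document. Minimal sketch:
    no validation, single namespace, Level 2 only."""
    root = ET.fromstring(sbml_text)
    return [s.get("id") for s in root.iter(SBML_NS + "species")]

toy_model = """<sbml xmlns="http://www.sbml.org/sbml/level2" level="2" version="1">
  <model id="toy">
    <listOfSpecies>
      <species id="Ca" compartment="cell"/>
      <species id="CaM" compartment="cell"/>
    </listOfSpecies>
  </model>
</sbml>"""
```

Because the format is declarative and tool-independent, the same file can be simulated, annotated, or cross-referenced by any compliant software, which is precisely what makes repository-level curation feasible.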
In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database (http://www.ebi.ac.uk/biomodels/) is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems, and to study the clustering of models based upon their annotations. Model deposition to the database today is advised by several publishers of scientific journals. The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge (https://sourceforge.net/projects/biomodels/) under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive? 
Schema theory is a computational framework that allows the simulation of perceptuomotor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We present first its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performance of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world-knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performance of agrammatic aphasics measured using sentence–picture matching tasks. This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Abstract Background The “inverse” problem is related to the determination of unknown causes on the basis of the observation of their effects. This is the opposite of the corresponding “direct” problem, which relates to the prediction of the effects generated by a complete description of some agencies. 
The solution of an inverse problem entails the construction of a mathematical model and starts from a set of experimental data. In this respect, inverse problems are often ill-conditioned, as the experimental data available are often insufficient to unambiguously solve the mathematical model. Several approaches to solving inverse problems are possible, both computational and experimental, some of which are mentioned in this article. In this work, we describe in detail an attempt to solve an inverse problem which arose in the study of an intracellular signaling pathway. Results Using a genetic algorithm to find a suboptimal solution to the optimization problem, we have estimated a set of unknown parameters describing a kinetic model of a signaling pathway in the neuronal cell. The model is composed of mass-action ordinary differential equations, where the kinetic parameters describe protein-protein interactions, protein synthesis, and degradation. The algorithm has been implemented on a parallel platform. Several potential solutions of the problem have been computed, each solution being a set of model parameters. A subset of parameters has been selected on the basis of their small coefficient of variation across the ensemble of solutions. Conclusion Despite the lack of sufficiently reliable and homogeneous experimental data, the genetic algorithm approach has allowed us to estimate the approximate values of a number of parameters in a kinetic model of a signaling pathway; these parameters have been assessed to be relevant for reproducing the available experimental data. Abstract Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown.
Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms. First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axo-dendritic feedback on pyramidal cells is only effective in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase, the theta/gamma ratio. Moreover, electric fields differentially affect average theta frequency depending on specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. Abstract The brain is extraordinarily complex, containing 10^11 neurons linked by 10^14 connections. We can improve our understanding of individual neurons and neuronal networks by describing their behavior in mathematical and computational models.
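As a concrete example of such a mathematical description, here is a minimal forward-Euler simulation of the classic Hodgkin–Huxley membrane model with standard squid-axon parameters; this is an illustrative sketch, not code from the chapter:

```python
import math

# Standard squid-axon parameters; forward Euler with dt = 0.01 ms.
C = 1.0                                   # membrane capacitance, uF/cm^2
g_na, g_k, g_l = 120.0, 36.0, 0.3         # maximal conductances, mS/cm^2
e_na, e_k, e_l = 50.0, -77.0, -54.4       # reversal potentials, mV

def rates(v):
    # Voltage-dependent opening/closing rates for gates m, h and n
    am = 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
    bm = 4.0 * math.exp(-(v + 65.0) / 18.0)
    ah = 0.07 * math.exp(-(v + 65.0) / 20.0)
    bh = 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
    an = 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
    bn = 0.125 * math.exp(-(v + 65.0) / 80.0)
    return am, bm, ah, bh, an, bn

v, m, h, n = -65.0, 0.053, 0.596, 0.318   # resting state
dt, i_app = 0.01, 10.0                    # time step (ms), current (uA/cm^2)
v_peak = v
for _ in range(int(50.0 / dt)):           # 50 ms of simulated time
    am, bm, ah, bh, an, bn = rates(v)
    i_ion = (g_na * m ** 3 * h * (v - e_na)
             + g_k * n ** 4 * (v - e_k) + g_l * (v - e_l))
    v += dt * (i_app - i_ion) / C
    m += dt * (am * (1.0 - m) - bm * m)
    h += dt * (ah * (1.0 - h) - bh * h)
    n += dt * (an * (1.0 - n) - bn * n)
    v_peak = max(v_peak, v)
```

With a suprathreshold current of 10 uA/cm^2 the membrane fires action potentials, so the peak voltage rises well above 0 mV.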
This chapter provides an introduction to neural modeling, laying the foundation for several basic models and surveying key topics. After some discussion on the motivations of modelers and the uses of neural models, we explore the properties of electrically excitable membranes. We describe in some detail the Hodgkin–Huxley model, the first neural model to describe biophysically the behavior of biological membranes. We explore how this model can be extended to describe a variety of excitable membrane behaviors, including axonal propagation, dendritic processing, and synaptic communication. This chapter also covers mathematical models that replicate basic neural behaviors through more abstract mechanisms. We briefly explore efforts to extend single-neuron models to the network level and provide several examples of insights gained through this process. Finally, we list common resources, including modeling environments and repositories, that provide the guidance and parameter sets necessary to begin building neural models. Abstract We have developed a program, NeuroText, to populate the neuroscience databases in SenseLab (http://senselab.med.yale.edu/senselab) by mining the natural-language text of neuroscience articles. NeuroText uses a two-step approach to identify relevant articles. The first step (preprocessing), aimed at 100% sensitivity, identifies abstracts containing database keywords. In the second step, potentially relevant abstracts identified in the first step are processed for specificity dictated by database architecture and by neuroscience, lexical, and semantic contexts. NeuroText results were presented to the experts for validation using a dynamically generated interface that also allows expert-validated articles to be automatically deposited into the databases. Of the test set of 912 articles, 735 were rejected at the preprocessing step. For the remaining articles, the accuracy of predicting database-relevant articles was 85%.
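The two-step strategy described above, a high-sensitivity keyword pass followed by a context-dependent specificity pass, might be caricatured as follows; the keywords, context weights, and thresholds below are invented for illustration and are not NeuroText's actual rules:

```python
# Toy two-step relevance filter: step 1 is a broad keyword screen,
# step 2 scores domain context. All vocabularies here are hypothetical.
KEYWORDS = {"receptor", "neuron", "channel"}
CONTEXT = {"hippocampus": 2, "cortex": 2, "olfactory": 2, "patient": -1}

def preprocess(abstract):
    # Step 1: keep anything containing at least one database keyword
    return bool(KEYWORDS & set(abstract.lower().split()))

def relevant(abstract):
    if not preprocess(abstract):
        return False                       # rejected at preprocessing
    words = abstract.lower().split()
    score = sum(CONTEXT.get(w, 0) for w in words)
    return score >= 2                      # step 2: context specificity

accepted = relevant("Olfactory receptor expression in hippocampus neuron cultures")
rejected = relevant("Database of hearing thresholds in adult patient cohorts")
```

The first abstract passes both steps; the second contains no database keyword and is rejected at preprocessing, mirroring the sensitivity-then-specificity flow described above.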
Twenty-two articles were erroneously identified. NeuroText deferred decisions on 29 articles to the expert. A comparison of NeuroText results versus the experts’ analyses revealed that the program failed to correctly identify articles’ relevance due to concepts that did not yet exist in the knowledge base or due to vaguely presented information in the abstracts. NeuroText uses two “evolution” techniques (supervised and unsupervised) that play an important role in the continual improvement of the retrieval results. Software that uses the NeuroText approach can facilitate the creation of curated, special-interest bibliography databases. Abstract Dendrites play an important role in neuronal function and connectivity. This chapter introduces the first section of the book, focusing on the morphological features of dendritic tree structures and the role of dendritic trees in the circuit. We provide an overview of quantitative procedures for data collection, analysis, and modeling of dendrite shape. Our main focus lies on the description of morphological complexity and how one can use this description to unravel neuronal function in dendritic trees and neural circuits. Abstract The chapter is organised in two parts: in the first part, the focus is on a combined power spectral and nonlinear behavioural analysis of a neural mass model of the thalamocortical circuitry. The objective is to study the effectiveness of such “multimodal” analytical techniques in model-based studies investigating the neural correlates of abnormal brain oscillations in Alzheimer’s disease (AD). The power spectral analysis presented here is a study of the “slowing” (decreasing dominant frequency of oscillation) within the alpha frequency band (8–13 Hz), a hallmark of electroencephalogram (EEG) dynamics in AD. Analysis of the nonlinear dynamical behaviour focuses on the bifurcating property of the model.
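Band-limited spectral power of the kind analysed here can be computed directly from a signal; the sketch below uses a naive DFT on a synthetic 10 Hz "alpha" tone and is purely illustrative (the sampling rate and band edges are assumptions, not the model's output):

```python
import math
import cmath

FS = 256                  # assumed sampling rate, Hz
N = FS * 2                # two seconds of signal
signal = [math.sin(2 * math.pi * 10 * n / FS) for n in range(N)]  # 10 Hz tone

def band_power(x, lo, hi):
    # Sum naive-DFT power over frequency bins falling inside [lo, hi] Hz
    n = len(x)
    total = 0.0
    for k in range(1, n // 2):
        f = k * FS / n
        if lo <= f <= hi:
            coef = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                       for t in range(n))
            total += abs(coef) ** 2
    return total

BANDS = {"delta": (1, 3), "theta": (4, 7), "alpha": (8, 13), "beta": (14, 30)}
powers = {name: band_power(signal, lo, hi) for name, (lo, hi) in BANDS.items()}
dominant = max(powers, key=powers.get)
```

For the 10 Hz test signal, virtually all power falls in the alpha band; "slowing" of the kind discussed here would show up as power shifting toward the theta and delta bands.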
The results show that the alpha rhythmic content is maximal at close proximity to the bifurcation point—an observation made possible by the “multimodal” approach adopted herein. Furthermore, a slowing in alpha rhythm is observed for increasing inhibitory connectivity—a consistent feature of our research into neuropathological oscillations associated with AD. In the second part, we present power spectral analysis on a model that implements multiple feedforward and feedback connectivities in the thalamo-cortico-thalamic circuitry, and is thus more advanced in terms of biological plausibility. This study looks at the effects of synaptic connectivity variation on the power spectra within the delta (1–3 Hz), theta (4–7 Hz), alpha (8–13 Hz) and beta (14–30 Hz) bands. An overall slowing of EEG with decreasing synaptic connectivity is observed, indicated by a decrease of power within the alpha and beta bands and an increase in power within the theta and delta bands. Thus, the model behaviour conforms to longitudinal studies in AD indicating an overall slowing of EEG. Abstract Neuronal processes grow under a variety of constraints, both immediate and evolutionary. Their pattern of growth provides insight into their function. This chapter begins by reviewing morphological metrics used in analyses and computational models. Molecular mechanisms underlying growth and plasticity are then discussed, followed by several types of modeling approaches. Computer simulation of morphology can be used to describe and reproduce the statistics of neuronal types or to evaluate growth and functional hypotheses. For instance, models in which branching is probabilistically determined by diameter produce realistic virtual dendrites of most neuronal types, though more complicated statistical models are required for other types. Virtual dendrites grown under environmental and/or functional constraints are also discussed, offering a broad perspective on dendritic morphology.
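A diameter-dependent probabilistic branching rule of the kind mentioned above can be caricatured in a few lines; the branching probability, taper factor, and stopping diameter below are invented for illustration, not fitted to any neuronal type:

```python
import random

def grow(diameter, rng, min_diam=0.2):
    # Terminate if the branch is too thin or the diameter-dependent
    # branching test fails; otherwise split into two thinner daughters.
    if diameter < min_diam or rng.random() > min(1.0, diameter):
        return 1                                   # one terminal tip
    return (grow(diameter * 0.7, rng, min_diam)    # daughter branches
            + grow(diameter * 0.7, rng, min_diam))

rng = random.Random(0)
tips = [grow(1.0, rng) for _ in range(100)]        # 100 virtual dendrites
mean_tips = sum(tips) / len(tips)
```

Because thick branches almost always split while thin ones usually terminate, the resulting tip counts vary stochastically between trees, which is the sense in which such models "reproduce the statistics" of a neuronal type rather than any single reconstruction.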
Abstract Chopper neurons in the cochlear nucleus are characterized by intrinsic oscillations with short average interspike intervals (ISIs) and relative level independence of their response (Pfeiffer, Exp Brain Res 1:220–235, 1966; Blackburn and Sachs, J Neurophysiol 62:1303–1329, 1989), properties which are unattained by models of single chopper neurons (e.g., Rothman and Manis, J Neurophysiol 89:3070–3082, 2003a). In order to achieve short ISIs, we optimized the time constants of the Rothman and Manis single-neuron model with genetic algorithms. Some parameters in the optimization, such as the temperature and the capacitance of the cell, turned out to be crucial for the required acceleration of the response. In order to achieve the relative level independence, we simulated an interconnected network consisting of Rothman and Manis neurons. The results indicate that by stabilization of intrinsic oscillations, it is possible to simulate the physiologically observed level independence of ISIs. As previously reviewed and demonstrated (Bahmer and Langner, Biol Cybern 95:371–379, 2006a), chopper neurons show a preference for ISIs which are multiples of 0.4 ms. It was also demonstrated that a network consisting of two optimized Rothman and Manis neurons which activate each other with synaptic delays of 0.4 ms shows a preference for ISIs of 0.8 ms. Oscillations with various multiples of 0.4 ms as ISIs may be derived from neurons in a more complex network that is activated by simultaneous input from an onset neuron and several auditory nerve fibers. Abstract Recently, a class of two-dimensional integrate-and-fire models has been used to faithfully model spiking neurons. This class includes the Izhikevich model, the adaptive exponential integrate-and-fire model, and the quartic integrate-and-fire model. The bifurcation types for the individual neurons have been thoroughly analyzed by Touboul (SIAM J Appl Math 68(4):1045–1079, 2008).
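For reference, a minimal forward-Euler simulation of a single Izhikevich neuron (standard regular-spiking parameters from Izhikevich's 2003 paper) looks like this; it is a sketch of the model class, not of any network discussed here, and the input current and time step are assumptions:

```python
# Regular-spiking parameters (a, b, c, d) from Izhikevich (2003);
# injected current I and Euler step dt are chosen for illustration.
a, b, c, d = 0.02, 0.2, -65.0, 8.0
v, u = -65.0, b * (-65.0)                 # membrane and recovery variables
I, dt = 10.0, 0.25
spikes = 0
for _ in range(int(200 / dt)):            # 200 ms of simulated time
    v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
    u += dt * a * (b * v - u)
    if v >= 30.0:                         # spike detected: reset and adapt
        v, u, spikes = c, u + d, spikes + 1
```

The two-variable quadratic form plus reset is what makes this class both cheap to simulate and rich enough to display the single-neuron bifurcations analyzed by Touboul.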
However, when the models are coupled together to form networks, the networks can display bifurcations that an uncoupled oscillator cannot. For example, the networks can transition from firing with a constant rate to burst firing. This paper introduces a technique to reduce a full network of this class of neurons to a mean field model, in the form of a system of switching ordinary differential equations. The reduction uses population density methods and a quasi-steady-state approximation to arrive at the mean field system. Reduced models are derived for networks with different topologies and different model neurons with biologically derived parameters. The mean field equations are able to qualitatively and quantitatively describe the bifurcations that the full networks display. Extensions and higher-order approximations are discussed. Conclusions Our proposed database schema for managing heterogeneous data is a significant departure from conventional approaches. It is suitable only when the following conditions hold: • The number of classes of entity is numerous, while the number of actual instances in most classes is expected to be very modest. • The number (and nature) of the axes describing an arbitrary fact (as an N-ary association) varies greatly. We believe that nervous system data is an appropriate problem domain to test such an approach. Abstract Stereotactic human brain atlases, either in print or electronic form, are useful not only in functional neurosurgery, but also in neuroradiology, human brain mapping, and neuroscience education. The existing atlases represent structures on 2D plates taken at variable, often large intervals, which limits their applications. To overcome this problem, we propose a hybrid interpolation approach to build high-resolution brain atlases from the existing ones. In this approach, all section regions of each object are grouped into two types of components: simple and complex.
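The distance-map idea used below for complex components can be illustrated on a toy pair of binary sections: compute a signed distance map for each, average the maps, and re-threshold to obtain an intermediate section. The grid size and disk shapes here are invented for illustration:

```python
# 17x17 grid; disks of radius 3 and 7 stand in for two atlas sections.
SIZE, CX = 17, 8

def disk(r):
    return [[(x - CX) ** 2 + (y - CX) ** 2 <= r * r
             for x in range(SIZE)] for y in range(SIZE)]

def signed_distance(mask):
    # Brute-force distance to the nearest pixel of the opposite class;
    # negative inside the shape, positive outside.
    sd = [[0.0] * SIZE for _ in range(SIZE)]
    for y in range(SIZE):
        for x in range(SIZE):
            d = min(((x - qx) ** 2 + (y - qy) ** 2) ** 0.5
                    for qy in range(SIZE) for qx in range(SIZE)
                    if mask[qy][qx] != mask[y][x])
            sd[y][x] = -d if mask[y][x] else d
    return sd

d1 = signed_distance(disk(3))
d2 = signed_distance(disk(7))
# Midway section: average the two signed distance maps, keep "inside" pixels
mid = [[(d1[y][x] + d2[y][x]) / 2.0 < 0.0 for x in range(SIZE)]
       for y in range(SIZE)]
area = sum(row.count(True) for row in mid)
```

The interpolated section is approximately a disk of intermediate radius, so its area falls strictly between the areas of the two input sections, which is the behavior one wants from a shape-based interpolant.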
A NURBS-based method is designed for interpolation of the simple components, and a distance-map-based method for the complex components. Once all individual objects in the atlas are interpolated, the results are combined hierarchically in a bottom-up manner to produce the interpolation of the entire atlas. In the procedure, different knowledge-based and heuristic strategies are used to preserve various topological relationships. The proposed approach has been validated quantitatively and used for interpolation of two stereotactic brain atlases: the Talairach–Tournoux atlas and the Schaltenbrand–Wahren atlas. The interpolations produced are of high resolution and feature high accuracy, 3D consistency, smooth surfaces, and preserved topology. They potentially open new applications for electronic stereotactic brain atlases, such as atlas reformatting, accurate 3D display, and 3D nonlinear warping against normal and pathological scans. The proposed approach is also potentially useful in other applications which require interpolation and 3D modeling from sparse data and/or data with variable inter-section intervals. An example of 3D modeling of an infarct from MR diffusion images is presented. Abstract Quantitative neuroanatomical data are important for the study of many areas of neuroscience, and the complexity of problems associated with neuronal structure requires that research from multiple groups across many disciplines be combined. However, existing neuron-tracing systems, simulation environments, and tools for the visualization and analysis of neuronal morphology data use a variety of data formats, making it difficult to exchange data in a readily usable way. The NeuroML project was initiated to address these issues, and here we describe an extensible markup language standard, MorphML, which defines a common data format for neuronal morphology data and associated metadata to facilitate data and model exchange, database creation, model publication, and data archiving.
We describe the elements of the standard in detail and outline the mappings between this format and those used by a number of popular applications for reconstruction, simulation, and visualization of neuronal morphology. Abstract A major part of biology has become a class of physical and mathematical sciences. We have started to feel, though still a little suspiciously, that it will become possible to predict biological events that will happen in the future of one's life, and to control some of them if so desired, based upon the understanding of the genomic information of individuals and the physical and chemical principles governing physiological functions of living organisms at multiple scales and levels, from molecules to cells and organs. Abstract A half-center oscillator (HCO) is a common circuit building block of central pattern generator networks that produce rhythmic motor patterns in animals. Here we constructed an efficient relational database table with the resulting characteristics of the simple conductance-based HCO model of Hill et al. (J Comput Neurosci 10:281–302, 2001). The model consists of two reciprocally inhibitory neurons and replicates the electrical activity of the oscillator interneurons of the leech heartbeat central pattern generator under a variety of experimental conditions. Our long-range goal is to understand how this basic circuit building block produces functional activity under a variety of parameter regimes and how different parameter regimes influence stability and modulatability. By using the latest developments in computer technology, we simulated and stored large amounts of data (on the order of terabytes). We systematically explored the parameter space of the HCO and corresponding isolated-neuron models using a brute-force approach. We varied a set of selected parameters (maximal conductances of intrinsic and synaptic currents) in all combinations, resulting in about 10 million simulations.
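A drastically scaled-down sketch of this workflow (sweep parameter combinations, classify each outcome, store rows in a relational table for querying) might look like the following; the conductance values and the stand-in classifier are invented for illustration, not the HCO model itself:

```python
import itertools
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE runs (g_syn REAL, g_h REAL, activity TEXT)")

def classify(g_syn, g_h):
    # Hypothetical rule standing in for a full conductance-based simulation
    if g_syn > 0 and g_h > 0:
        return "bursting"
    return "spiking" if g_syn > 0 else "silent"

# Brute-force sweep: every combination of the selected maximal conductances
for g_syn, g_h in itertools.product([0.0, 30.0, 60.0], [0.0, 2.0, 4.0]):
    db.execute("INSERT INTO runs VALUES (?, ?, ?)",
               (g_syn, g_h, classify(g_syn, g_h)))

n_bursting = db.execute(
    "SELECT COUNT(*) FROM runs WHERE activity = 'bursting'").fetchone()[0]
```

Once the activity labels are in a table, questions like "what fraction of parameter space produces functional bursting?" become simple SQL queries, which is the point of storing the sweep relationally.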
We classified these HCO and isolated-neuron model simulations by their activity characteristics into identifiable groups and quantified their prevalence. By querying the database, we compared the activity characteristics of the identified groups of our simulated HCO models with those of our simulated isolated-neuron models and found that regularly bursting neurons compose only a small minority of functional HCO models; the vast majority were composed of spiking neurons. SynapticDB, Effective Web-based Management and Sharing of Data from Serial Section Electron Microscopy. Neuroinformatics. Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy.
The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a nonredundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. Programming language: Perl, BioPerl (program); mySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI BLAST package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by nonacademics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience.
We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information's (NCBI's) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can link out directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to NCBI Entrez) to link out to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena.
As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages, or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the "NeuroIT Interoperability of Simulators" workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell includes organelles and intracellular solutions, and it is separated from the outer environment of extracellular liquid surrounding the cell by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charges and concentrations give rise, respectively, to electrical and chemical potentials, driving transport of materials across the membrane. Here we look at the core mathematical modeling associated with the dynamic behaviors of single cells, as well as the bases of numerical simulations.
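The link between concentration differences and electrical potential just described is captured by the Nernst equation; a small worked example for potassium at body temperature (typical mammalian concentrations, standard physical constants) is:

```python
import math

# Gas constant (J/(mol*K)), 37 degrees C in kelvin, Faraday constant (C/mol)
R, T, F = 8.314, 310.15, 96485.0

def nernst_mv(z, c_out, c_in):
    # Equilibrium potential in millivolts for an ion of valence z
    return 1000.0 * (R * T) / (z * F) * math.log(c_out / c_in)

# Typical mammalian K+ concentrations (mM): about 5 outside, 140 inside
e_k = nernst_mv(1, 5.0, 140.0)
```

Because potassium is far more concentrated inside the cell, its equilibrium potential comes out strongly negative (around -89 mV for these concentrations), which is why the resting membrane potential sits close to that value.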
Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program with which users can simulate and/or create a model using graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command-line interface for manipulating large-scale models, based on Python, a powerful scripting language. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results be reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the information necessary and tools that enable others to reuse this information.
In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases have started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method, and make available computer scripts, which automate the process of citation entry. We use an Open Citation project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron's ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. In order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation.
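One standard ingredient of such reductions is the set of Hankel singular values of a linearized (quasi-active) system, which indicate how many states matter for the input-output behavior; the sketch below computes them for a toy two-state system (the matrices are invented for illustration, not derived from any cable model):

```python
import numpy as np

A = np.array([[-1.0, 0.5], [0.0, -3.0]])   # stable state matrix (toy)
B = np.array([[1.0], [1.0]])               # input map (e.g. injected current)
C = np.array([[1.0, 0.0]])                 # output map (e.g. somatic voltage)

def lyap(A, Q):
    # Solve A P + P A^T + Q = 0 via a Kronecker-product linear system
    n = A.shape[0]
    M = np.kron(A, np.eye(n)) + np.kron(np.eye(n), A)
    return np.linalg.solve(M, -Q.reshape(-1)).reshape(n, n)

P = lyap(A, B @ B.T)                # controllability Gramian
Q = lyap(A.T, C.T @ C)              # observability Gramian
# Hankel singular values: sqrt of the eigenvalues of P Q, sorted descending
hsv = np.sort(np.sqrt(np.abs(np.linalg.eigvals(P @ Q))))[::-1]
```

In Balanced Truncation one keeps only the states associated with the largest Hankel singular values; a steep drop-off in `hsv` is what licenses the aggressive dimension reductions reported below.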
The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain ( ${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in the granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of the available data resources, the high number of heterogeneous query methods and front ends requires considerable bioinformatics skills. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians.
Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells, positioned and synaptically connected in 3D, can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments, and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded in empirical evidence, how they explain (or fail to explain) existing empirical results, and how they make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of the model's operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data.
We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp. 741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks.
SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu . Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and to build on them for further research. Its use can help the field of computational neuroscience enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABAA receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABAA reversal potential (E_GABA) and by alterations in the intrinsic excitability of granule cells.
In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux).
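Reconstruction files of this kind (for instance the widely used SWC convention of one cylinder per line with a parent pointer) support simple morphometric computations directly. A minimal sketch, with the seven-field row layout assumed from the common SWC convention rather than taken from Neuron_Morpho itself:

```python
import math

def morphometrics(swc_rows):
    """Total cable length and branch-point count from SWC-style rows:
    (id, type, x, y, z, radius, parent_id); parent_id of -1 marks the root."""
    pts = {r[0]: r for r in swc_rows}
    total_len = 0.0
    child_count = {}
    for rid, _type, x, y, z, _radius, parent in swc_rows:
        if parent == -1:
            continue  # root point has no incoming segment
        px, py, pz = pts[parent][2:5]
        total_len += math.dist((x, y, z), (px, py, pz))
        child_count[parent] = child_count.get(parent, 0) + 1
    branch_points = sum(1 for n in child_count.values() if n > 1)
    return total_len, branch_points
```

On a toy Y-shaped tree this returns the summed segment lengths and one branch point; real morphometric suites add many more measures (asymmetry, Sholl profiles, taper) on the same tree structure.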
We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic type (pyramidal cell dendrites and olivary axonal projection terminals), and labeling method (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstruction systems. The aim of this study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions arriving at the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductivities of their dendritic membrane. This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs.
In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in Biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models.
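A minimal sketch of the compartmental approach in its simplest form: two passive compartments (a "soma" and a "dendrite") coupled by an axial conductance, integrated with forward Euler. All parameter values are illustrative and not drawn from any model discussed here.

```python
import numpy as np

def two_compartment(i_inj, dt=0.025, t_max=100.0,
                    c_m=1.0, g_l=0.1, e_l=-65.0, g_c=0.5):
    """Two passive compartments coupled by conductance g_c; current i_inj is
    injected into the first ('soma'). Returns membrane voltages over time
    (rows: time steps; columns: soma, dendrite). Units are nominal."""
    n = int(t_max / dt)
    v = np.full((n, 2), e_l)
    for k in range(1, n):
        vs, vd = v[k - 1]
        dvs = (-g_l * (vs - e_l) + g_c * (vd - vs) + i_inj) / c_m
        dvd = (-g_l * (vd - e_l) + g_c * (vs - vd)) / c_m
        v[k] = vs + dt * dvs, vd + dt * dvd
    return v
```

The steady state follows from balancing leak and coupling currents: with the values above and i_inj = 0.55, the soma settles 3.0 units above rest and the dendrite, attenuated by g_c/(g_l + g_c), at 2.5 above rest. Full multicompartmental models replace the two leak terms with dozens of voltage-gated conductances and thousands of coupled compartments.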
Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS-US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training.
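The TD error at the heart of such models can be sketched in its simplest tabular form. This is deliberately not the LSTM model described above: it uses an explicit per-time-step state (the very delay-line-style representation the abstract argues against) purely to show the reward-prediction error vanishing as the US becomes predicted; all parameters are illustrative.

```python
import numpy as np

def td_trial_errors(n_steps=10, us_time=8, trials=200, alpha=0.1, gamma=1.0):
    """Tabular TD(0) over repeated conditioning trials (US = reward at us_time).
    Returns per-step TD errors (the dopamine-like signal), one row per trial."""
    V = np.zeros(n_steps + 1)              # value estimate per step in a trial
    history = np.zeros((trials, n_steps))
    for trial in range(trials):
        for t in range(n_steps):
            r = 1.0 if t == us_time else 0.0
            delta = r + gamma * V[t + 1] - V[t]   # TD (prediction) error
            V[t] += alpha * delta
            history[trial, t] = delta
    return history
```

On the first trial the TD error is maximal at the US; after training it is near zero there, mirroring the classic dopamine result that the phasic response to a fully predicted reward disappears.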
Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. Using mathematical models of pyramidal neurons of neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses distributed uniformly along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and function of effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization as a whole and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and with interburst intervals decreasing with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5.
Under analogous conditions of activation, we observed in the latter cells a variety of patterns of output discharges of different complexities, including stochastic ones. The relatively short length of the apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites, and a reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in the processes of formation of the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta-modulated and space-modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using nonhomogeneous Poisson processes.
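Nonhomogeneous Poisson inputs of this kind can be generated with the standard thinning (Lewis–Shedler) algorithm: draw candidate events at a ceiling rate and keep each with probability rate(t)/rate_max. A minimal sketch with an illustrative theta-modulated rate function (the rate parameters are not those of the model above):

```python
import math
import random

def inhom_poisson(rate_fn, t_max, rate_max, seed=0):
    """Spike times from an inhomogeneous Poisson process by thinning.
    rate_fn(t) must never exceed the ceiling rate_max (events per unit time)."""
    rng = random.Random(seed)
    spikes, t = [], 0.0
    while True:
        t += rng.expovariate(rate_max)        # candidate inter-event interval
        if t > t_max:
            return spikes
        if rng.random() < rate_fn(t) / rate_max:
            spikes.append(t)                  # accept candidate

# illustrative theta-modulated rate: 20 Hz mean, fully modulated at 8 Hz
theta = lambda t: 20.0 * (1.0 + math.cos(2.0 * math.pi * 8.0 * t))
```

Over a 10 s window the accepted spike count fluctuates around the integral of the rate (about 200 here), and the spikes cluster at the peaks of the theta cycle.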
Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent from rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of the involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in Computational Neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org.
In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed data on morphologies when generating those cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators, such as the NEURON Simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new and available data can be included into NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation.
New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data requires both standard data formats and public repositories. Standardization and a public storage system are also important for modeling because they make it possible to share models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks like numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor.
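For an unbranched cable, the elimination just described reduces to the classic tridiagonal (Thomas) algorithm; branched trees add only extra bookkeeping at branch points. A minimal sketch of the unbranched case (an illustration of the numerical idea, not NEURON's implementation):

```python
import numpy as np

def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system by Gaussian elimination without fill-in,
    as arises at each implicit time step of a discretized cable equation.
    a: sub-diagonal (a[0] unused), b: diagonal, c: super-diagonal (c[-1] unused)."""
    n = len(b)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                       # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):              # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

Both sweeps are O(n), which is why compartmental simulators can take implicit time steps on trees of thousands of compartments at essentially linear cost.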
It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources, and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum.
The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. However, the range of current strength needed for a depolarization block could easily be reached with random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block. We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, impeding conventional analysis methods from keeping pace.
Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable. PANDORA is free to use and modify because it is open-source (http://software.incf.org/software/pandora/home). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then we examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop. Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics.
First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test the sufficiency of multiple, independent spike initiation zones in the C-interneuron to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made to understand cerebellar cortical processing, but neural coding at the output stage of the cerebellum in the deep cerebellar nuclei (DCN) remains poorly understood. The DCN are unlikely to be just a relay nucleus, because Purkinje cell inhibition has to be turned into an excitatory output signal, and DCN neurons exhibit complex intrinsic properties. In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals.
Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. We obtained a good match of our simulations with DCN current clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow deinactivation of rebound currents. Fast rebound bursts due to the T-type calcium current and slow rebounds due to the persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by the HCN current. Our findings suggest that active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior.
This open source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation channel gating, the mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform. Abstract We have developed a simulation tool within the NEURON simulator to assist in the organization, verification, and analysis of simulations. This tool, named the Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB. Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems.
This is similar to how single cells function as systems of protein networks, where, for example, the ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from how it behaves alone. The dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. Interactions occur through different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts to produce heartbeats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics representing neural information. In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of the limbs can be performed. This chapter illustrates several mathematical techniques through examples from the modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data. Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation.
For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice that of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits, as expected in mature animals. Predictions were made for passive-cell current clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting is not required. Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron.
Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials. Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet, the precise input–output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites. We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature.
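The kernel-filtering step described above (membrane potential modeled as a presynaptic spike train filtered by a leaky-integrator kernel) can be sketched in a few lines. This is an illustrative stand-in, not the CuBIC implementation; the time constant, synaptic weight, and spike times below are arbitrary choices.

```python
import math

def filtered_potential(spike_times, t_max, dt=0.1, tau=10.0, weight=1.0):
    """Membrane-potential proxy: presynaptic spikes convolved with an
    exponential (leaky-integrator) kernel exp(-t/tau)."""
    n = int(t_max / dt)
    v = [0.0] * n
    decay = math.exp(-dt / tau)
    spikes = sorted(spike_times)
    idx = 0
    for i in range(1, n):
        v[i] = v[i - 1] * decay            # passive exponential decay
        t = i * dt
        while idx < len(spikes) and spikes[idx] <= t:
            v[i] += weight                 # instantaneous kick per spike
            idx += 1
    return v

# two near-coincident spikes summate; an isolated spike later decays away
v = filtered_potential([5.0, 5.1, 20.0], t_max=50.0)
```

Higher-order correlations in the presynaptic population would show up as excess coincident kicks in such a trace, which is what the inference method exploits.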
To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns and the corresponding firing-frequency input–output function was logarithmic. The regular/irregular nature of the input synaptic intervals was only reflected in the regularity of output inter-burst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input–output relationships. Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input–output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats.
In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database http://www.ebi.ac.uk/biomodels/ is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller sub-models. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems, and to study the clustering of models based upon their annotations. Model deposition to the database is now recommended by several publishers of scientific journals. The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive?
Schema theory is a computational framework that allows the simulation of perceptuo-motor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We first present its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performances of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world-knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performance of agrammatic aphasics measured using sentence–picture matching tasks. This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Abstract Background The "inverse" problem is related to the determination of unknown causes on the basis of the observation of their effects. This is the opposite of the corresponding "direct" problem, which relates to the prediction of the effects generated by a complete description of some agencies.
The solution of an inverse problem entails the construction of a mathematical model and starts from a set of experimental data. In this respect, inverse problems are often ill-conditioned, as the experimental data available are often insufficient to unambiguously solve the mathematical model. Several approaches to solving inverse problems are possible, both computational and experimental, some of which are mentioned in this article. In this work, we describe in detail an attempt to solve an inverse problem which arose in the study of an intracellular signaling pathway. Results Using a genetic algorithm to find a suboptimal solution to the optimization problem, we have estimated a set of unknown parameters describing a kinetic model of a signaling pathway in the neuronal cell. The model is composed of mass-action ordinary differential equations, where the kinetic parameters describe protein–protein interactions, protein synthesis, and degradation. The algorithm has been implemented on a parallel platform. Several potential solutions of the problem have been computed, each solution being a set of model parameters. A subset of parameters has been selected on the basis of their small coefficient of variation across the ensemble of solutions. Conclusion Despite the lack of sufficiently reliable and homogeneous experimental data, the genetic algorithm approach has allowed us to estimate the approximate values of a number of model parameters in a kinetic model of a signaling pathway: these parameters have been assessed to be relevant for the reproduction of the available experimental data. Abstract Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown.
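The genetic-algorithm parameter estimation described above can be sketched on a toy problem. This is not the authors' parallel implementation: here a minimal GA (truncation selection, averaging crossover, Gaussian mutation) recovers the two parameters of a simple exponential-decay "model" from synthetic data, standing in for the mass-action ODE system; population size, mutation width, and the target model are all illustrative choices.

```python
import math
import random

random.seed(1)

# synthetic "experimental" data generated from known parameters a=2.0, b=0.5
T = [0.0, 1.0, 2.0, 3.0, 4.0]
DATA = [2.0 * math.exp(-0.5 * t) for t in T]

def model(params, t):
    a, b = params
    return a * math.exp(-b * t)

def cost(params):
    # sum of squared residuals between model prediction and data
    return sum((model(params, t) - y) ** 2 for t, y in zip(T, DATA))

def ga(pop_size=40, generations=60):
    pop = [(random.uniform(0.0, 5.0), random.uniform(0.0, 2.0))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[: pop_size // 2]        # elitist truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = random.sample(survivors, 2)
            child = [(x + y) / 2.0 for x, y in zip(p1, p2)]            # crossover
            child = [max(0.0, g + random.gauss(0.0, 0.05)) for g in child]  # mutation
            children.append(tuple(child))
        pop = survivors + children
    return min(pop, key=cost)

best = ga()
```

Running the GA many times and keeping parameters with a small coefficient of variation across the solution ensemble, as the abstract describes, filters out parameters the data do not constrain.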
Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: one addressing the network structure capable of generating theta-modulated gamma rhythms, and the other demonstrating electric field effects on those rhythms. First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axo-dendritic feedback on pyramidal cells is only effective in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase, the theta/gamma ratio. Moreover, electric fields differentially affect average theta frequency depending on specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. Abstract The brain is extraordinarily complex, containing 10^11 neurons linked by 10^14 connections. We can improve our understanding of individual neurons and neuronal networks by describing their behavior in mathematical and computational models.
This chapter provides an introduction to neural modeling, laying the foundation for several basic models and surveying key topics. After some discussion of the motivations of modelers and the uses of neural models, we explore the properties of electrically excitable membranes. We describe in some detail the Hodgkin–Huxley model, the first neural model to describe biophysically the behavior of biological membranes. We explore how this model can be extended to describe a variety of excitable membrane behaviors, including axonal propagation, dendritic processing, and synaptic communication. This chapter also covers mathematical models that replicate basic neural behaviors through more abstract mechanisms. We briefly explore efforts to extend single-neuron models to the network level and provide several examples of insights gained through this process. Finally, we list common resources, including modeling environments and repositories, that provide the guidance and parameter sets necessary to begin building neural models. Abstract We have developed a program, NeuroText, to populate the neuroscience databases in SenseLab (http://senselab.med.yale.edu/senselab) by mining the natural-language text of neuroscience articles. NeuroText uses a two-step approach to identify relevant articles. The first step (preprocessing), aimed at 100% sensitivity, identifies abstracts containing database keywords. In the second step, potentially relevant abstracts identified in the first step are processed for specificity dictated by database architecture and by neuroscience, lexical, and semantic contexts. NeuroText results were presented to the experts for validation using a dynamically generated interface that also allows expert-validated articles to be automatically deposited into the databases. Of the test set of 912 articles, 735 were rejected at the preprocessing step. For the remaining articles, the accuracy of predicting database-relevant articles was 85%.
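As a minimal taste of the single-neuron modeling introduced above, the sketch below integrates a leaky integrate-and-fire neuron with forward Euler. This is one of the abstract models the chapter mentions, far simpler than the full Hodgkin–Huxley equations, and all parameter values here are illustrative defaults rather than values from the chapter.

```python
def lif_spikes(i_ext, t_max=100.0, dt=0.1, tau_m=10.0, r_m=1.0,
               v_rest=-65.0, v_thresh=-50.0, v_reset=-65.0):
    """Forward-Euler integration of a leaky integrate-and-fire neuron:
    tau_m * dV/dt = -(V - v_rest) + r_m * i_ext,
    with a spike recorded and V reset whenever V crosses v_thresh."""
    v = v_rest
    spikes = []
    for i in range(int(t_max / dt)):
        dv = (-(v - v_rest) + r_m * i_ext) / tau_m
        v += dv * dt
        if v >= v_thresh:
            spikes.append(i * dt)   # spike time in ms
            v = v_reset
    return spikes
```

With these defaults the steady-state voltage is v_rest + r_m * i_ext, so a drive of 10 (steady state −55 mV) stays subthreshold while a drive of 20 (steady state −45 mV) fires repeatedly, illustrating the threshold behavior the chapter builds on.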
Twenty-two articles were erroneously identified. NeuroText deferred decisions on 29 articles to the expert. A comparison of NeuroText results versus the experts' analyses revealed that the program failed to correctly identify articles' relevance due to concepts that did not yet exist in the knowledge base or due to vaguely presented information in the abstracts. NeuroText uses two "evolution" techniques (supervised and unsupervised) that play an important role in the continual improvement of the retrieval results. Software that uses the NeuroText approach can facilitate the creation of curated, special-interest bibliography databases. Abstract Dendrites play an important role in neuronal function and connectivity. This chapter introduces the first section of the book, focusing on the morphological features of dendritic tree structures and the role of dendritic trees in the circuit. We provide an overview of quantitative procedures for data collection, analysis, and modeling of dendrite shape. Our main focus lies on the description of morphological complexity and how one can use this description to unravel neuronal function in dendritic trees and neural circuits. Abstract The chapter is organised in two parts: In the first part, the focus is on a combined power spectral and nonlinear behavioural analysis of a neural mass model of the thalamocortical circuitry. The objective is to study the effectiveness of such "multimodal" analytical techniques in model-based studies investigating the neural correlates of abnormal brain oscillations in Alzheimer's disease (AD). The power spectral analysis presented here is a study of the "slowing" (decreasing dominant frequency of oscillation) within the alpha frequency band (8–13 Hz), a hallmark of electroencephalogram (EEG) dynamics in AD. Analysis of the nonlinear dynamical behaviour focuses on the bifurcating property of the model.
The results show that the alpha rhythmic content is maximal in close proximity to the bifurcation point, an observation made possible by the "multimodal" approach adopted herein. Furthermore, a slowing of the alpha rhythm is observed with increasing inhibitory connectivity, a consistent feature of our research into neuropathological oscillations associated with AD. In the second part, we present a power spectral analysis of a model that implements multiple feedforward and feedback connectivities in the thalamo-cortico-thalamic circuitry, and is thus more advanced in terms of biological plausibility. This study looks at the effects of synaptic connectivity variation on the power spectra within the delta (1–3 Hz), theta (4–7 Hz), alpha (8–13 Hz) and beta (14–30 Hz) bands. An overall slowing of the EEG with decreasing synaptic connectivity is observed, indicated by a decrease of power within the alpha and beta bands and an increase in power within the theta and delta bands. Thus, the model behaviour conforms to longitudinal studies in AD indicating an overall slowing of the EEG. Abstract Neuronal processes grow under a variety of constraints, both immediate and evolutionary. Their pattern of growth provides insight into their function. This chapter begins by reviewing morphological metrics used in analyses and computational models. Molecular mechanisms underlying growth and plasticity are then discussed, followed by several types of modeling approaches. Computer simulation of morphology can be used to describe and reproduce the statistics of neuronal types or to evaluate growth and functional hypotheses. For instance, models in which branching is probabilistically determined by diameter produce realistic virtual dendrites of most neuronal types, though more complicated statistical models are required for other types. Virtual dendrites grown under environmental and/or functional constraints are also discussed, offering a broad perspective on dendritic morphology.
Abstract Chopper neurons in the cochlear nucleus are characterized by intrinsic oscillations with short average interspike intervals (ISIs) and relative level independence of their response (Pfeiffer, Exp Brain Res 1:220–235, 1966; Blackburn and Sachs, J Neurophysiol 62:1303–1329, 1989), properties which are unattained by models of single chopper neurons (e.g., Rothman and Manis, J Neurophysiol 89:3070–3082, 2003a). In order to achieve short ISIs, we optimized the time constants of the Rothman and Manis single-neuron model with genetic algorithms. Some parameters in the optimization, such as the temperature and the capacitance of the cell, turned out to be crucial for the required acceleration of the response. In order to achieve the relative level independence, we simulated an interconnected network consisting of Rothman and Manis neurons. The results indicate that by stabilization of intrinsic oscillations, it is possible to simulate the physiologically observed level independence of ISIs. As previously reviewed and demonstrated (Bahmer and Langner, Biol Cybern 95:371–379, 2006a), chopper neurons show a preference for ISIs which are multiples of 0.4 ms. It was also demonstrated that a network consisting of two optimized Rothman and Manis neurons which activate each other with synaptic delays of 0.4 ms shows a preference for ISIs of 0.8 ms. Oscillations with various multiples of 0.4 ms as ISIs may be derived from neurons in a more complex network that is activated by simultaneous input from an onset neuron and several auditory nerve fibers. Abstract Recently, a class of two-dimensional integrate-and-fire models has been used to faithfully model spiking neurons. This class includes the Izhikevich model, the adaptive exponential integrate-and-fire model, and the quartic integrate-and-fire model. The bifurcation types for the individual neurons have been thoroughly analyzed by Touboul (SIAM J Appl Math 68(4):1045–1079, 2008).
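Of the two-dimensional integrate-and-fire class mentioned above, the Izhikevich model is the simplest to sketch. Below is a forward-Euler integration using Izhikevich's published regular-spiking parameters (a=0.02, b=0.2, c=-65, d=8); the drive currents, step size, and duration are illustrative choices, not values from the paper.

```python
def izhikevich(i_ext, t_max=300.0, dt=0.25, a=0.02, b=0.2, c=-65.0, d=8.0):
    """Forward-Euler integration of the two-variable Izhikevich model:
    v' = 0.04 v^2 + 5 v + 140 - u + I,  u' = a (b v - u);
    on v >= 30 mV: record a spike, then reset v <- c and u <- u + d."""
    v, u = -65.0, b * -65.0
    spikes = []
    for i in range(int(t_max / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + i_ext)
        u += dt * a * (b * v - u)
        if v >= 30.0:
            spikes.append(i * dt)   # spike time in ms
            v, u = c, u + d
    return spikes
```

At I = 10 the v- and u-nullclines no longer intersect, so the model has no rest state and fires tonically, the kind of single-cell regime whose network-level bifurcations the paper's mean field reduction describes.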
However, when the models are coupled together to form networks, the networks can display bifurcations that an uncoupled oscillator cannot. For example, the networks can transition from firing with a constant rate to burst firing. This paper introduces a technique to reduce a full network of this class of neurons to a mean field model, in the form of a system of switching ordinary differential equations. The reduction uses population density methods and a quasi-steady-state approximation to arrive at the mean field system. Reduced models are derived for networks with different topologies and different model neurons with biologically derived parameters. The mean field equations are able to qualitatively and quantitatively describe the bifurcations that the full networks display. Extensions and higher-order approximations are discussed. Conclusions Our proposed database schema for managing heterogeneous data is a significant departure from conventional approaches. It is suitable only when the following conditions hold: • The number of entity classes is large, while most classes are expected to contain very few actual instances. • The number (and nature) of the axes describing an arbitrary fact (as an N-ary association) varies greatly. We believe that nervous system data is an appropriate problem domain to test such an approach. Abstract Stereotactic human brain atlases, either in print or electronic form, are useful not only in functional neurosurgery, but also in neuroradiology, human brain mapping, and neuroscience education. The existing atlases represent structures on 2D plates taken at variable, often large intervals, which limits their applications. To overcome this problem, we propose a hybrid interpolation approach to build high-resolution brain atlases from the existing ones. In this approach, all section regions of each object are grouped into two types of components: simple and complex.
A NURBS-based method is designed for interpolation of the simple components, and a distance-map-based method for the complex components. Once all individual objects in the atlas are interpolated, the results are combined hierarchically in a bottom-up manner to produce the interpolation of the entire atlas. In the procedure, different knowledge-based and heuristic strategies are used to preserve various topological relationships. The proposed approach has been validated quantitatively and used for interpolation of two stereotactic brain atlases: the Talairach–Tournoux atlas and the Schaltenbrand–Wahren atlas. The interpolations produced are of high resolution and feature high accuracy, 3D consistency, smooth surfaces, and preserved topology. They potentially open new applications for electronic stereotactic brain atlases, such as atlas reformatting, accurate 3D display, and 3D nonlinear warping against normal and pathological scans. The proposed approach is also potentially useful in other applications which require interpolation and 3D modeling from sparse data and/or data with variable inter-section intervals. An example of 3D modeling of an infarct from MR diffusion images is presented. Abstract Quantitative neuroanatomical data are important for the study of many areas of neuroscience, and the complexity of problems associated with neuronal structure requires that research from multiple groups across many disciplines be combined. However, existing neuron-tracing systems, simulation environments, and tools for the visualization and analysis of neuronal morphology data use a variety of data formats, making it difficult to exchange data in a readily usable way. The NeuroML project was initiated to address these issues, and here we describe an extensible markup language standard, MorphML, which defines a common data format for neuronal morphology data and associated metadata to facilitate data and model exchange, database creation, model publication, and data archiving.
We describe the elements of the standard in detail and outline the mappings between this format and those used by a number of popular applications for reconstruction, simulation, and visualization of neuronal morphology. Abstract A major part of biology has become a physical and mathematical science. We have started to feel, though still with some caution, that it will become possible to predict biological events that will happen in the future of one's life, and to control some of them if so desired, based upon an understanding of the genomic information of individuals and the physical and chemical principles governing the physiological functions of living organisms at multiple scales and levels, from molecules to cells and organs. Abstract A half-center oscillator (HCO) is a common circuit building block of central pattern generator networks that produce rhythmic motor patterns in animals. Here we constructed an efficient relational database table with the resulting characteristics of the Hill et al. (J Comput Neurosci 10:281–302, 2001) simple conductance-based HCO model. The model consists of two reciprocally inhibitory neurons and replicates the electrical activity of the oscillator interneurons of the leech heartbeat central pattern generator under a variety of experimental conditions. Our long-range goal is to understand how this basic circuit building block produces functional activity under a variety of parameter regimes and how different parameter regimes influence stability and modulatability. By using the latest developments in computer technology, we simulated and stored large amounts of data (on the order of terabytes). We systematically explored the parameter space of the HCO and corresponding isolated neuron models using a brute-force approach. We varied a set of selected parameters (maximal conductances of intrinsic and synaptic currents) in all combinations, resulting in about 10 million simulations.
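The brute-force sweep of maximal conductances stored in a relational table, as described above, can be sketched with the standard library. The classify function below is a toy stand-in for an actual conductance-based simulation, and the parameter names, grid, and classification rule are hypothetical; the point is the exhaustive itertools.product sweep plus SQL queries over the stored results.

```python
import itertools
import sqlite3

def classify(g_na, g_k, g_syn):
    # hypothetical rule standing in for running a real simulation
    # and labeling its activity pattern
    if g_na > g_k + g_syn:
        return "spiking"
    if g_na > g_k:
        return "bursting"
    return "silent"

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE sims (
    g_na REAL, g_k REAL, g_syn REAL, activity TEXT)""")

# vary every selected parameter in all combinations
grid = [0.0, 0.5, 1.0, 1.5]
rows = [(g_na, g_k, g_syn, classify(g_na, g_k, g_syn))
        for g_na, g_k, g_syn in itertools.product(grid, repeat=3)]
db.executemany("INSERT INTO sims VALUES (?, ?, ?, ?)", rows)

# query the stored sweep for the prevalence of each activity group
(n_total,) = db.execute("SELECT COUNT(*) FROM sims").fetchone()
(n_burst,) = db.execute(
    "SELECT COUNT(*) FROM sims WHERE activity = 'bursting'").fetchone()
```

With a 4-point grid over three conductances the sweep is 4^3 = 64 runs; the study's roughly 10 million simulations correspond to finer grids over more conductances, with the same product-then-query structure.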
We classified these HCO and isolated neuron model simulations by their activity characteristics into identifiable groups and quantified their prevalence. By querying the database, we compared the activity characteristics of the identified groups of our simulated HCO models with those of our simulated isolated neuron models and found that regularly bursting neurons compose only a small minority of functional HCO models; the vast majority was composed of spiking neurons. Abstract This paper describes how an emerging standard neural network modelling language can be used to configure a general-purpose neural multi-chip system, by describing the process of writing and loading neural network models on the SpiNNaker neuromimetic hardware. It focuses on the implementation of a SpiNNaker module for PyNN, a simulator-independent language for neural network modelling. We successfully extend PyNN to deal with different non-standard (e.g., Izhikevich) cell types, rapidly switch between them, and load applications on parallel hardware by orchestrating the software layers below it, so that they are abstracted from the final user. Finally, we run some simulations in PyNN and compare them against other simulators, successfully reproducing single neuron and network dynamics and validating the implementation. Abstract The present study examines the biophysical properties and functional implications of I h in hippocampal area CA3 interneurons with somata in strata radiatum and lacunosum-moleculare. Characterization studies showed a small maximum h-conductance (2.6 ± 0.3 nS, n = 11), shallow voltage dependence with a hyperpolarized half-maximal activation (V 1/2 = −91 mV), and kinetics characterized by double-exponential functions. The functional consequences of I h were examined with regard to temporal summation and impedance measurements. For temporal summation experiments, 5-pulse mossy fiber input trains were activated.
Blocking I h with 50 μM ZD7288 resulted in an increase in temporal summation, suggesting that I h supports sensitivity of response amplitude to relative input timing. Impedance was assessed by applying sinusoidal current commands. From impedance measurements, we found that I h did not confer theta-band resonance, but flattened the impedance–frequency relations instead. Double immunolabeling for hyperpolarization-activated cyclic nucleotide-gated proteins and glutamate decarboxylase 67 suggests that all four subunits are present in GABAergic interneurons from the strata considered for electrophysiological studies. Finally, a model of I h was employed in computational analyses to confirm and elaborate upon the contributions of I h to impedance and temporal summation. Abstract Modelling and simulation methods gain increasing importance for the understanding of biological systems. The growing number of available computational models makes support in the maintenance and retrieval of those models essential to the community. This article discusses which model information is helpful for efficient retrieval and how existing similarity measures and ranking techniques can be used to enhance the retrieval process, i.e., model reuse. With the development of new tools and modelling formalisms, there is also an increasing demand for performing search independent of the models' encoding. Therefore, the presented approach is not restricted to certain model storage formats. Instead, the model meta-information is used for retrieval and ranking of the search results. Meta-information includes general information about the model, its encoded species and reactions, but also information about the model behaviour and related simulation experiment descriptions. Abstract To understand the details of brain function, a large-scale system model that reflects anatomical and neurophysiological characteristics needs to be implemented.
Though numerous computational models of different brain areas have been proposed, their integration into a large-scale model has not yet been accomplished, because these models were described in different programming languages and, above all, used different data formats. This paper introduces a platform for collaborative brain system modeling (PLATO) on which one can construct computational models in several programming languages and connect them at the I/O level with a common data format. As an example, a whole visual system model including eye movement, eye optics, the retinal network and visual cortex is being developed. Preliminary results demonstrate that the integrated model successfully simulates the signal processing flow at the different stages of the visual system. Abstract Brain rhythms are the most prominent signals measured non-invasively in humans with magneto-/electroencephalography (MEG/EEG). MEG/EEG-measured rhythms have been shown to be functionally relevant, and signature changes are used as markers of disease states. Despite the importance of understanding the underlying neural mechanisms creating these rhythms, relatively little is known about their in vivo origin in humans. There are obvious challenges in linking the extracranially measured signals directly to neural activity with invasive studies in humans, and although animal models are well suited for such studies, the connection to human brain function under cognitively relevant tasks is often lacking. Biophysically principled computational neural modeling provides an attractive means to bridge this critical gap. Here, we describe a method for creating a computational neural model capturing the laminar structure of cortical columns, and how this model can be used to make predictions on the cellular- and circuit-level mechanisms of brain oscillations measured with MEG/EEG.
Specifically, we describe how the model can be used to simulate current dipole activity, the common macroscopic signal inferred from MEG/EEG data. We detail the development and application of the model to study the spontaneous somatosensory mu-rhythm, containing mu-alpha (7–14 Hz) and mu-beta (15–29 Hz) components. We describe a novel prediction on the neural origin of the mu-rhythm that accurately reproduces many characteristic features of MEG data and accounts for changes in the rhythm with attention, detection, and healthy aging. While the details of the model are specific to the somatosensory system, the model design and application are based on general principles of cortical circuitry and MEG/EEG physics, and are thus amenable to the study of rhythms in other frequency bands and sensory systems. Abstract GABAergic interneurons in cortical circuits control the activation of principal cells and orchestrate network activity patterns, including oscillations in different frequency ranges. Recruitment of interneurons depends on the integration of convergent synaptic inputs along the dendrosomatic axis; however, dendritic processing in these cells is still poorly understood. In this chapter, we summarise our results on the cable properties, electrotonic structure and dendritic processing of “basket cells” (BCs; Nörenberg et al. 2010), one of the most prevalent types of cortical interneurons mediating perisomatic inhibition. In order to investigate integrative properties, we performed two-electrode whole-cell patch-clamp recordings, visualised and reconstructed the recorded interneurons, and created passive single-cell models with biophysical properties derived from the experiments. Our results indicate that membrane properties, in particular membrane resistivity, are inhomogeneous along the somatodendritic axis of the cell. The derived values and gradient of membrane resistivity differ from those obtained for excitatory principal cells.
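How membrane resistivity shapes dendrite-to-soma transfer can be illustrated with the smallest possible passive model: two compartments joined by an axial resistance. The resistance values below are arbitrary toy numbers chosen only to show the direction of the effect, not values fitted to basket cells:

```python
def transfer_ratio(rm_dend, rm_soma, ra):
    """Soma/dendrite steady-state voltage ratio for current injected into the
    dendritic compartment of a passive two-compartment model.
    rm_*: membrane (leak) resistances of each compartment; ra: axial resistance.
    Node equation at the soma, V_s/rm_soma + (V_s - V_d)/ra = 0, gives the ratio
    directly: V_s/V_d = rm_soma / (rm_soma + ra)."""
    return rm_soma / (rm_soma + ra)
```

In this caricature a leakier (lower-rm) somatic end attenuates slow dendritic signals more strongly, which is why inhomogeneous resistivity along the somatodendritic axis matters for synapse-to-soma transfer.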
The divergent passive membrane properties of BCs facilitate rapid signalling from proximal basal dendritic inputs but at the same time increase synapse-to-soma transfer for slow signals from the distal apical dendrites. Our results demonstrate that BCs possess distinct integrative properties. Future computational models investigating the diverse functions of neuronal circuits need to consider this diversity and incorporate realistic dendritic properties not only of excitatory principal cells but also of the various types of inhibitory interneurons. Abstract New surgical and localization techniques allow for precise and personalized evaluation and treatment of intractable epilepsies. These techniques include the use of subdural and depth electrodes for localization, and the potential use of cell-targeted stimulation with optogenetics as part of treatment. Computer modeling of seizures, also individualized to the patient, will be important in order to make full use of the potential of these new techniques. This is because epilepsy is a complex dynamical disease involving multiple scales across both time and space. These complex dynamics make prediction extremely difficult. Cause and effect are not cleanly separable, as multiple embedded causal loops allow for many scales of unintended consequence. We demonstrate here a small model of sensory neocortex which can be used to look at the effects of microablations or microstimulation. We show that ablations in this network can either prevent spread or prevent occurrence of the seizure. In this example, focal electrical stimulation was not able to terminate a seizure, but selective stimulation of inhibitory cells, a future possibility through the use of optogenetics, was efficacious. Abstract The basal ganglia form a complex network of nuclei often assumed to perform selection, yet their individual roles and how they influence each other are still largely unclear.
In particular, the ties between the external and internal parts of the globus pallidus are paradoxical, as anatomical data suggest a potent inhibitory projection between them while electrophysiological recordings indicate that they have similar activities. Here we introduce a theoretical study that reconciles both views of the intrapallidal projection by providing a plausible characterization of the relationship between the external and internal globus pallidus. Specifically, we developed a mean-field model of the whole basal ganglia, whose parameterization is optimized to best respect a collection of numerous anatomical and electrophysiological data. We first obtained models respecting all our constraints, showing that the anatomical and electrophysiological data on the intrapallidal projection are globally consistent. The model furthermore predicts that both aforementioned views about the intrapallidal projection may be reconciled when this projection is weakly inhibitory, making it possible for both nuclei to support similar neural activity and for the entire basal ganglia to select between actions. Second, the model predicts that afferent projections are substantially unbalanced towards the external segment, as it receives the strongest excitation from the STN and the weakest inhibition from the striatum. Finally, our study strongly suggests that the intrapallidal connection pattern is not focused but diffuse, as the latter pattern is more efficient for the overall selection performed in the basal ganglia. Abstract Background The information coming from biomedical ontologies and computational pathway models is expanding continuously: research communities keep this process up, and their advances are generally shared by means of dedicated resources published on the web. In fact, such models are shared to provide the characterization of molecular processes, while biomedical ontologies detail a semantic context for the majority of those pathways.
Recent advances in both fields pave the way for scalable information integration based on aggregate knowledge repositories, but the lack of overall standard formats impedes this progress. Indeed, having different objectives and different abstraction levels, most of these resources "speak" different languages. Semantic web technologies are explored here as a means to address some of these problems. Methods Employing an extensible collection of interpreters, we developed OREMP (Ontology Reasoning Engine for Molecular Pathways), a system that abstracts the information from different resources and combines it into a coherent ontology. Continuing this effort, we present OREMPdb: once different pathways are fed into OREMP, species are linked to the external ontologies referenced and to the reactions in which they participate. Exploiting these links, the system builds species-sets, which encapsulate species that operate together. Composing all of the reactions together, the system computes all of the reaction paths from and to all of the species-sets. Results OREMP has been applied to the curated branch of BioModels (2011/04/15 release), which overall contains 326 models, 9244 reactions, and 5636 species. OREMPdb is the resulting semantic dictionary, which comprises 7360 species-sets. For each of these sets, OREMPdb links the original pathway and the original paper where this information first appeared. Abstract Conductance-based neuron models are frequently employed to study the dynamics of biological neural networks. For speed and ease of use, these models are often reduced in morphological complexity. Simplified dendritic branching structures may process inputs differently than full branching structures, however, and could thereby fail to reproduce important aspects of biological neural processing. It is not yet well understood which processing capabilities require detailed branching structures.
Therefore, we analyzed the processing capabilities of fully and partially branched reduced models. These models were created by collapsing the dendritic tree of a full morphological model of a globus pallidus (GP) neuron while preserving its total surface area and electrotonic length, as well as its passive and active parameters. Dendritic trees were either collapsed into single cables (unbranched models) or the full complement of branch points was preserved (branched models). Both reduction strategies allowed us to compare dynamics between all models using the same channel density settings. Full model responses to somatic inputs were generally preserved by both types of reduced model, while dendritic input responses could be more closely preserved by branched than by unbranched reduced models. However, features strongly influenced by local dendritic input resistance, such as active dendritic sodium spike generation and propagation, could not be accurately reproduced by any reduced model. Based on our analyses, we suggest that there are intrinsic differences in processing capabilities between unbranched and branched models. We also indicate suitable applications for different levels of reduction, including fast searches of full model parameter space. Summary Processing text from the scientific literature has become a necessity due to the burgeoning amounts of information that are fast becoming available, stemming from advances in electronic information technology. We created a program, NeuroText ( http://senselab.med.yale.edu/textmine/neurotext.pl ), designed specifically to extract information relevant to neuroscience-specific databases, NeuronDB and CellPropDB ( http://senselab.med.yale.edu/senselab/ ), housed at the Yale University School of Medicine. NeuroText extracts relevant information from the neuroscience literature in a two-step process: each step parses text at a different level of granularity.
NeuroText uses an expert-mediated knowledge base and combines the techniques of indexing, contextual parsing, semantic and lexical parsing, and supervised and unsupervised learning to extract information. The constraints, metadata elements, and rules for information extraction are stored in the knowledge base. NeuroText was created as a pilot project to process 3 years of publications in the Journal of Neuroscience and was subsequently tested on 40,000 PubMed abstracts. We also present here a template for creating a domain-nonspecific knowledge base that, when linked to a text-processing tool like NeuroText, can be used to extract knowledge in other fields of research. Abstract Background We present a software tool called SENB, which allows the geometric and biophysical neuronal properties in a simple computational model of a Hodgkin-Huxley (HH) axon to be changed. The aim of this work is to develop a didactic and easy-to-use computational tool in the NEURON simulation environment, which allows graphical visualization of both the passive and active conduction parameters and the geometric characteristics of a cylindrical axon with HH properties. Results The SENB software offers several advantages for teaching and learning electrophysiology. First, SENB offers ease and flexibility in determining the number of stimuli. Second, SENB allows immediate and simultaneous visualization, in the same window and time frame, of the evolution of the electrophysiological variables. Third, SENB calculates parameters such as time and space constants, stimulus frequency, cellular area and volume, sodium and potassium equilibrium potentials, and propagation velocity of the action potentials. Furthermore, it allows the user to see all this information immediately in the main window. Finally, with just one click SENB can save an image of the main window as evidence.
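SENB itself runs inside NEURON, but the squid-axon mechanism it teaches is easy to reproduce. Below is a minimal single-compartment Hodgkin-Huxley integration in plain Python (forward Euler, classic squid parameters, voltage measured as depolarization from rest), a didactic sketch of the action potential SENB visualizes, not SENB's actual code; the stimulus amplitude and timing are arbitrary choices:

```python
import math

def hh_spike(i_amp=10.0, t_on=5.0, t_off=25.0, dt=0.01, t_end=30.0):
    """Single-compartment Hodgkin-Huxley model (classic squid-axon parameters,
    v in mV relative to rest). Returns the membrane potential trace."""
    g_na, g_k, g_l = 120.0, 36.0, 0.3        # max conductances, mS/cm^2
    e_na, e_k, e_l = 115.0, -12.0, 10.6      # reversal potentials, mV from rest
    c_m = 1.0                                # membrane capacitance, uF/cm^2
    v, m, h, n = 0.0, 0.053, 0.596, 0.318    # resting steady state
    trace, t = [], 0.0
    while t < t_end:
        # Classic HH rate functions (1/ms)
        am = 0.1 * (25.0 - v) / (math.exp((25.0 - v) / 10.0) - 1.0)
        bm = 4.0 * math.exp(-v / 18.0)
        ah = 0.07 * math.exp(-v / 20.0)
        bh = 1.0 / (math.exp((30.0 - v) / 10.0) + 1.0)
        an = 0.01 * (10.0 - v) / (math.exp((10.0 - v) / 10.0) - 1.0)
        bn = 0.125 * math.exp(-v / 80.0)
        i_stim = i_amp if t_on <= t < t_off else 0.0
        i_ion = (g_na * m**3 * h * (v - e_na) + g_k * n**4 * (v - e_k)
                 + g_l * (v - e_l))
        v += dt * (i_stim - i_ion) / c_m
        m += dt * (am * (1.0 - m) - bm * m)
        h += dt * (ah * (1.0 - h) - bh * h)
        n += dt * (an * (1.0 - n) - bn * n)
        trace.append(v)
        t += dt
    return trace
```

With a sustained suprathreshold current the trace shows the full spike upstroke and repolarization; the equilibrium potentials and time constants SENB reports fall out of the same parameter set.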
Conclusions The SENB software is didactic and versatile, and can be used to improve and facilitate the teaching and learning of the underlying mechanisms in the electrical activity of an axon using the biophysical properties of the squid giant axon. Abstract Grid cells (GCs) in the medial entorhinal cortex (mEC) have the property that their firing activity is spatially tuned to a regular triangular lattice. Several theoretical models of grid field formation have been proposed, but most assume that place cells (PCs) are a product of the grid cell system. There is, however, an alternative possibility that is supported by various strands of experimental data. Here we present a novel model for the emergence of grid-like firing patterns that rests on two key hypotheses: (1) spatial information in GCs is provided by PC activity, and (2) grid fields result from a combined synaptic plasticity mechanism involving inhibitory and excitatory neurons mediating the connections between PCs and GCs. Depending on the spatial location, each PC can contribute excitatory or inhibitory inputs to GC activity. The nature and magnitude of the PC input is a function of the distance to the place field center, which is inferred by rate decoding. A biologically plausible learning rule drives the evolution of the connection strengths from PCs to a GC. In this model, PCs compete for GC activation, and the plasticity rule favors efficient packing of the space representation. This leads to grid-like firing patterns. In a new environment, GCs continuously recruit new PCs to cover the entire space. The model described here makes important predictions and can represent the feedforward connections from hippocampal CA1 to deeper mEC layers. Abstract Because of its highly branched dendrite, the Purkinje neuron requires significant computational resources if coupled electrical and biochemical activity are to be simulated.
To address this challenge, we developed a scheme for reducing the geometric complexity while preserving the essential features of activity in both the soma and a remote dendritic spine. We merged our previously published biochemical model of calcium dynamics and lipid signaling in the Purkinje neuron, developed in the Virtual Cell modeling and simulation environment, with an electrophysiological model based on a Purkinje neuron model available in NEURON. A novel reduction method was applied to the Purkinje neuron geometry to obtain a model with fewer compartments that is tractable in Virtual Cell. Most of the dendritic tree was subject to reduction, but we retained the neuron's explicit electrical and geometric features along a specified path from spine to soma. Further, unlike previous simplification methods, the dendrites that branch off along the preserved explicit path are retained as reduced branches. We conserved axial resistivity and adjusted passive properties and active channel conductances for the reduction in surface area, and cytosolic calcium for the reduction in volume. Rallpacks were used to validate the reduction algorithm and show that it can be generalized to other complex neuronal geometries. For the Purkinje cell, we found that current injections at the soma produced similar trains of action potentials and membrane potential propagation in the full and reduced models in NEURON; the reduced model produces identical spiking patterns in NEURON and Virtual Cell. Importantly, our reduced model can simulate communication between the soma and a distal spine; an alpha function applied at the spine to represent synaptic stimulation gave similar results in the full and reduced models for potential changes associated with both the spine and the soma. Finally, we combined phosphoinositol signaling and electrophysiology in the reduced model in Virtual Cell.
Thus, a strategy has been developed to combine electrophysiology and biochemistry as a step toward merging neuronal and systems biology modeling. Abstract The advent of techniques with the ability to scan massive changes in cellular makeup (genomics, proteomics, etc.) has revealed the compelling need for analytical methods to interpret and make sense of those changes. Computational models built on a sound physicochemical and mechanistic basis are indispensable when integrating, interpreting, and simulating high-throughput experimental data. Another powerful role of computational models is predicting new behavior, provided they are adequately validated. Mitochondrial energy transduction has traditionally been studied with thermodynamic models. More recently, kinetic or thermokinetic models have been proposed, leading the path toward an understanding of the control and regulation of mitochondrial energy metabolism and its interaction with cytoplasmic and other compartments. In this work, we outline, step by step, the methods that should be followed to build a computational model of mitochondrial energetics in isolation or integrated into a network of cellular processes. Depending on the question addressed by the modeler, the methodology explained herein can be applied with different levels of detail, from the mitochondrial energy-producing machinery in a network of cellular processes to the dynamics of a single enzyme during its catalytic cycle. Abstract The voltage and time dependence of ion channels can be regulated, notably by phosphorylation, interaction with phospholipids, and binding to auxiliary subunits. Many parameter variation studies have set conductance densities free while leaving kinetic channel properties fixed, as the experimental constraints on the latter are usually better than on the former.
Because individual cells can tightly regulate their ion channel properties, we suggest that kinetic parameters may be profitably set free during model optimization in order both to improve matches to data and to refine kinetic parameters. To this end, we analyzed the parameter optimization of reduced models of three electrophysiologically characterized and morphologically reconstructed globus pallidus neurons. We performed two automated searches with different types of free parameters. First, conductance density parameters were set free. Even the best resulting models exhibited unavoidable problems that were due to limitations in our channel kinetics. We next set channel kinetics free for the optimized density matches and obtained significantly improved model performance. Some kinetic parameters consistently shifted to similar new values in multiple runs across all three models, suggesting the possibility of tailored improvements to channel models. These results suggest that optimized channel kinetics can improve model matches to experimental voltage traces, particularly for channels characterized under different experimental conditions than the recorded data to be matched by the model. The resulting shifts in channel kinetics from the original template provide valuable guidance for future experimental efforts to determine the detailed kinetics of channel isoforms and possible modulated states in particular types of neurons. Abstract Electrical synapses continuously transfer signals bidirectionally from one cell to another, directly or indirectly via intermediate cells. Electrical synapses are common in many brain structures, such as the inferior olive, the subcoeruleus nucleus and the neocortex, between neurons and between glial cells. In the cortex, interneurons have been shown to be electrically coupled and proposed to participate in large, continuous cortical syncytia, as opposed to smaller spatial domains of electrically coupled cells.
However, to explore the significance of these findings it is imperative to map the electrical synaptic microcircuits, in analogy with in vitro studies on monosynaptic and disynaptic chemical coupling. Since "walking" from cell to cell over large distances with a glass pipette is challenging, microinjection of (fluorescent) dyes diffusing through gap junctions remains so far the only method available to decipher such microcircuits, even though technical limitations exist. Based on circuit theory, we derive analytical descriptions of the AC electrical coupling in networks of isopotential cells. We then suggest an operative electrophysiological protocol to distinguish between direct electrical connections and connections involving one or more intermediate cells. This method allows inferring the number of intermediate cells, generalizing the conventional coupling coefficient, which provides limited information. We validate our method through computer simulations, theoretical and numerical methods, and electrophysiological paired recordings. Abstract Because electrical coupling among the neurons of the brain is much faster than chemical synaptic coupling, it is natural to hypothesize that gap junctions may play a crucial role in mechanisms underlying very fast oscillations (VFOs), i.e., oscillations at more than 80 Hz. There is now a substantial body of experimental and modeling literature supporting this hypothesis. A series of modeling papers, starting with work by Roger Traub and collaborators, have suggested that VFOs may arise from expanding waves propagating through an “axonal plexus”, a large random network of electrically coupled axons. Traub et al. also proposed a cellular automaton (CA) model to study the mechanisms of VFOs in the axonal plexus. In this model, the expanding waves take the appearance of topologically circular “target patterns”. Random external stimuli initiate each wave. We therefore call this kind of VFO “externally driven”.
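The flavor of such a cellular automaton can be captured in a few lines: excitable nodes on a random directed graph, a single stimulated node, and a wave that spreads while refractory nodes recover. Everything below (network size, out-degree, refractory period) is an arbitrary toy choice, not the actual parameterization of Traub et al.:

```python
import random

def run_ca(n=200, k=3, steps=20, refractory=5, seed=1):
    """Excitable cellular automaton on a random k-out digraph of 'axons'.
    state: 0 resting, 1 active (fires for one step), >1 refractory countdown.
    Returns how many distinct nodes the wave from a single stimulus reached."""
    rng = random.Random(seed)
    targets = [rng.sample([j for j in range(n) if j != i], k) for i in range(n)]
    state = [0] * n
    state[0] = 1                      # single external stimulus
    activated = {0}
    for _ in range(steps):
        nxt = [0] * n
        for i in range(n):
            if state[i] == 1:         # active: excite neighbours, go refractory
                nxt[i] = refractory + 1
                for j in targets[i]:
                    if state[j] == 0 and nxt[j] == 0:
                        nxt[j] = 1
                # refractory: count down toward resting
            elif state[i] > 1:
                nxt[i] = state[i] - 1
        state = nxt
        activated |= {i for i, s in enumerate(state) if s == 1}
    return len(activated)
```

Because nodes recover after the refractory period, activity can in principle re-enter previously active parts of the graph, which is the seed of the "reentrant" regime discussed next.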
Using a computational model, we show that an axonal plexus can also exhibit a second, distinctly different kind of VFO across a wide parameter range. These VFOs arise from activity propagating around cycles in the network. Once triggered, they persist without any source of excitation. With idealized, regular connectivity, they take the appearance of spiral waves. We call these VFOs “reentrant”. The behavior of the axonal plexus depends on the reliability with which action potentials propagate from one axon to the next, which, in turn, depends on the somatic membrane potential V_s and the gap junction conductance g_gj. To study these dependencies, we hold V_s fixed within each simulation, then study the effects of varying V_s and g_gj across simulations. Not surprisingly, propagation becomes more reliable with rising V_s and g_gj. Externally driven VFOs occur when V_s and g_gj are so high that propagation never fails. For lower V_s or g_gj, propagation is nearly reliable, but fails in rare circumstances. Surprisingly, the parameter regime where this occurs is fairly large. Even a single propagation failure can trigger reentrant VFOs in this regime. Lowering V_s and g_gj further, one finds a third parameter regime in which propagation is unreliable, and no VFOs arise. We analyze these three parameter regimes by means of computations using model networks adapted from Traub et al., as well as much smaller model networks. Abstract Research with barn owls suggested that sound source location is represented topographically in the brain by an array of neurons each tuned to a narrow range of locations. However, research with small-headed mammals has offered an alternative view, in which location is represented by the balance of activity in two opponent channels broadly tuned to the left and right auditory space. Both channels may be present in each auditory cortex, although the channel representing contralateral space may be dominant.
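The essence of an opponent-channel code can be sketched with two broadly tuned sigmoidal channels whose balance encodes azimuth; the tuning width below is an assumed toy value, not one inferred from the EEG data:

```python
import math

def channel_pair(azimuth_deg, sigma=30.0):
    """Activity of left- and right-preferring channels with broad
    sigmoidal tuning across azimuth (0 deg = straight ahead)."""
    right = 1.0 / (1.0 + math.exp(-azimuth_deg / sigma))
    left = 1.0 / (1.0 + math.exp(azimuth_deg / sigma))
    return left, right

def decode_azimuth(left, right, sigma=30.0):
    """Invert the opponent code: with these sigmoids, right/left = exp(az/sigma),
    so azimuth is recovered from the log channel ratio."""
    return sigma * math.log(right / left)
```

Because both sigmoids flatten at lateral angles, a fixed change in the channel balance corresponds to a larger angular step away from the midline, which is one intuition for the decrease in spatial acuity with reference azimuth noted below.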
Recent studies have suggested that opponent channel coding of space may also apply in humans, although these studies have used a restricted set of spatial cues or probed a restricted set of spatial locations, and there have been contradictory reports as to the relative dominance of the ipsilateral and contralateral channels in each cortex. The current study used electroencephalography (EEG) in conjunction with sound field stimulus presentation to address these issues and to inform the development of an explicit computational model of human sound source localization. Neural responses were compatible with the opponent channel account of sound source localization and with contralateral channel dominance in the left, but not the right, auditory cortex. A computational opponent channel model reproduced every important aspect of the EEG data and allowed inferences about the width of tuning in the spatial channels. Moreover, the model predicted the oft-reported decrease in spatial acuity measured psychophysically with increasing reference azimuth. Predictions of spatial acuity closely matched those measured psychophysically by previous authors. Abstract Calretinin is thought to be the main endogenous calcium buffer in cerebellar granule cells (GrCs). However, little is known about the impact of cooperative Ca2+ binding to calretinin on highly localized and more global (regional) Ca2+ signals in these cells. Using numerical simulations, we show that an essential property of calretinin is delayed equilibration with Ca2+. Therefore, the amount of Ca2+ that calretinin can accumulate with respect to equilibrium levels depends on stimulus conditions.
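Delayed equilibration can be illustrated with the simplest possible buffer scheme, a single binding site with a slow on-rate; all concentrations and rate constants below are arbitrary toy values, and the cooperativity of real calretinin is deliberately omitted. After a step rise in total Ca2+, free Ca2+ stays far above its equilibrium level until binding catches up:

```python
def simulate_buffering(ca_total=10.0, b_total=100.0, kon=0.01, koff=0.1,
                       dt=0.01, steps=5000):
    """Free Ca2+ after a step increase, buffered by a slow buffer B
    (Ca + B <-> CaB, forward Euler). Returns the free-Ca2+ trace:
    the slow on-rate delays equilibration, so free Ca2+ transiently
    overshoots its equilibrium value."""
    ca, cab = ca_total, 0.0
    trace = [ca]
    for _ in range(steps):
        flux = kon * ca * (b_total - cab) - koff * cab  # net binding rate
        ca -= flux * dt
        cab += flux * dt
        trace.append(ca)
    return trace
```

In this caricature the fraction of Ca2+ the buffer has captured depends on how long the stimulus lasts relative to the binding kinetics, which is the sense in which accumulation "depends on stimulus conditions".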
Based on our simulations of buffered Ca2+ diffusion near a single Ca2+ channel or a large cluster of Ca2+ channels, and previous experimental findings that 150 μM 1,2-bis(o-aminophenoxy)ethane-N,N,N′,N′-tetraacetic acid (BAPTA) and endogenous calretinin have similar effects on GrC excitability, we estimated the concentration of mobile calretinin in GrCs to be in the range of 0.7–1.2 mM. Our results suggest that this estimate can provide a starting point for further analysis. We find that calretinin prominently reduces the action potential-associated increase in cytosolic free Ca2+ concentration ([Ca2+]i) even at a distance of 30 nm from a single Ca2+ channel. In spite of a buildup of residual Ca2+, it maintains almost constant maximal [Ca2+]i levels during repetitive channel openings at frequencies below 80 Hz. This occurs because of accelerated Ca2+ binding as calretinin binds more Ca2+. Unlike the buffering of high Ca2+ levels within Ca2+ nano/microdomains sensed by large-conductance Ca2+-activated K+ channels, the buffering of regional Ca2+ signals by calretinin cannot be mimicked by any single concentration of BAPTA across all experimental conditions. Abstract The field of Computational Systems Neurobiology is maturing quickly. If it is to fulfil its central role in the new Integrative Neurobiology, the reuse of quantitative models needs to be facilitated. The community has to develop standards and guidelines in order to maximise the diffusion of its scientific production, but also to render it more trustworthy. In recent years, various projects have tackled the problems of the syntax and semantics of quantitative models. More recently, the international initiative BioModels.net launched three projects: (1) MIRIAM is a standard for curating and annotating models, in order to facilitate their reuse.
(2) The Systems Biology Ontology is a set of controlled vocabularies intended to be used in conjunction with models, in order to characterise their components. (3) BioModels Database is a resource that allows biologists to store, search and retrieve published mathematical models of biological interest. We expect that these resources, together with the use of formal languages such as SBML, will support the fruitful exchange and reuse of quantitative models. Abstract Understanding the direction and quantity of information flowing in neuronal networks is a fundamental problem in neuroscience. Brains and neuronal networks must at the same time store information about the world and react to information in the world. We sought to measure how the activity of the network alters information flow from inputs to output patterns. Using neocortical column neuronal network simulations, we demonstrated that networks with greater internal connectivity reduced input/output correlations from excitatory synapses and decreased negative correlations from inhibitory synapses, as measured by Kendall's τ correlation. Both of these changes were associated with a reduction in information flow, measured by normalized transfer entropy (nTE). Information handling by the network reflected the degree of internal connectivity. With no internal connectivity, the feedforward network transformed inputs through nonlinear summation and thresholding. With greater connectivity strength, the recurrent network translated activity and information due to the contribution of activity from intrinsic network dynamics. This dynamic contribution amounts to added information drawn from that stored in the network. At still higher internal synaptic strength, the network corrupted the external information, producing a state in which little external information came through.
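Normalized transfer entropy builds on the plain transfer entropy TE(X→Y) = Σ p(y_{t+1}, y_t, x_t) log2 [ p(y_{t+1} | y_t, x_t) / p(y_{t+1} | y_t) ]. A minimal histogram-based estimator for binary spike sequences (order-1 histories; the binning and normalization choices here are generic, not the exact procedure of the study) looks like this:

```python
import math
from collections import Counter

def transfer_entropy(x, y):
    """TE(X -> Y) in bits for equal-length binary sequences, order-1 histories:
    how much knowing x_t reduces uncertainty about y_{t+1} beyond y_t alone."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t) counts
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_{t+1}, y_t) counts
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_t, x_t) counts
    singles = Counter(y[:-1])                       # y_t counts
    n = len(x) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_y1_given_both = c / pairs_yx[(y0, x0)]
        p_y1_given_y = pairs_yy[(y1, y0)] / singles[y0]
        te += p_joint * math.log2(p_y1_given_both / p_y1_given_y)
    return te
```

Dividing this quantity by the conditional entropy H(y_{t+1} | y_t) gives one common normalization to the [0, 1] range used for nTE-style measures.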
The association of increased information retrieved from the network with increased gamma power supports the notion of gamma oscillations playing a role in information processing. Abstract Intracellular Ca2+ concentrations play a crucial role in the physiological interaction between Ca2+ channels and Ca2+-activated K+ channels. The commonly used model, a Ca2+ pool with a short relaxation time, fails to simulate interactions occurring at multiple time scales. On the other hand, detailed computational models including various Ca2+ buffers and pumps can incur large computational cost due to radial diffusion in large compartments, which may be undesirable when simulating morphologically detailed Purkinje cell models. We present a method using a compensating mechanism to replace radial diffusion, and we compared the dynamics of different Ca2+ buffering models during the generation of a dendritic Ca2+ spike in a single-compartment model of a Purkinje cell dendritic segment with Ca2+ channels of P- and T-type and Ca2+-activated K+ channels of BK- and SK-type. The Ca2+ dynamics models used are (1) a single Ca2+ pool; (2) two Ca2+ pools, for the fast and slow transients respectively; (3) detailed Ca2+ dynamics with buffers, pump, and diffusion; and (4) detailed Ca2+ dynamics with buffers, pump, and diffusion compensation. Our results show that the detailed Ca2+ dynamics models have significantly better control over Ca2+-activated K+ channels and lead to physiologically more realistic simulations of Ca2+ spikes and bursting. Furthermore, the compensating mechanism largely eliminates the effect on Ca2+ dynamics of removing diffusion from the model over multiple time scales. Abstract This paper describes the capabilities of DISCO, an extensible approach that supports integrative Web-based information dissemination.
DISCO is a component of the Neuroscience Information Framework (NIF), an NIH Neuroscience Blueprint initiative that facilitates integrated access to diverse neuroscience resources via the Internet. DISCO facilitates the automated maintenance of several distinct capabilities using a collection of files 1) that are maintained locally by the developers of participating neuroscience resources and 2) that are “harvested” on a regular basis by a central DISCO server. This approach allows central NIF capabilities to be updated as each resource’s content changes over time. DISCO currently supports the following capabilities: 1) resource descriptions, 2) “LinkOut” to a resource’s data items from NCBI Entrez resources such as PubMed, 3) Web-based interoperation with a resource, 4) sharing a resource’s lexicon and ontology, 5) sharing a resource’s database schema, and 6) participation by the resource in neuroscience-related RSS news dissemination. The developers of a resource are free to choose which DISCO capabilities their resource will participate in. Although DISCO is used by NIF to facilitate neuroscience data integration, its capabilities have general applicability to other areas of research. Abstract Spiking neural network simulations incorporating variable transmission delays require synaptic events to be scheduled prior to delivery. Conventional methods have memory requirements that scale with the total number of synapses in a network. We introduce novel scheduling algorithms for both discrete and continuous event delivery, where the memory requirement scales instead with the number of neurons. Superior algorithmic performance is demonstrated using large-scale, benchmarking network simulations. Abstract Serial section electron microscopy (ssEM) is rapidly expanding as a primary tool to investigate synaptic circuitry and plasticity.
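The neuron-scaling idea in the scheduling abstract above can be illustrated with per-neuron ring buffers: memory is O(neurons × max delay) regardless of synapse count, because a synaptic event is stored by adding its weight to the target neuron's buffer slot for the delivery step. This is a generic reconstruction of that style of scheduler (class name and API are hypothetical), not the authors' algorithm:

```python
class RingBufferScheduler:
    def __init__(self, n_neurons, max_delay_steps):
        # one circular buffer of summed weights per neuron
        self.D = max_delay_steps
        self.buffers = [[0.0] * self.D for _ in range(n_neurons)]
        self.t = 0

    def schedule(self, target, delay_steps, weight):
        """Queue a synaptic event for `target`, due delay_steps from now."""
        assert 0 < delay_steps < self.D
        self.buffers[target][(self.t + delay_steps) % self.D] += weight

    def advance(self):
        """Deliver everything due at the current step, then clear the slot."""
        slot = self.t % self.D
        due = [buf[slot] for buf in self.buffers]
        for buf in self.buffers:
            buf[slot] = 0.0
        self.t += 1
        return due
```

By contrast, a heap-based event queue would grow with the number of pending synaptic events, i.e. with synapse count.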
The ultrastructural images collected through ssEM are content-rich, and their comprehensive analysis is beyond the capacity of an individual laboratory. Hence, sharing ultrastructural data is becoming crucial to visualize, analyze, and discover the structural basis of synaptic circuitry and function in the brain. We devised a web-based management system called SynapticDB ( http://synapses.clm.utexas.edu/synapticdb/ ) that catalogues, extracts, analyzes, and shares experimental data from ssEM. The management strategy involves a library with check-in, check-out and experimental tracking mechanisms. We developed a series of spreadsheet templates (MS Excel, Open Office spreadsheet, etc.) that guide users in methods of data collection, structural identification, and quantitative analysis through ssEM. SynapticDB provides flexible access to complete templates, or to individual columns with instructional headers that can be selected to create user-defined templates. New templates can also be generated and uploaded. Research progress is tracked via experimental note management and dynamic PDF forms that allow new investigators to follow standard protocols and experienced researchers to expand the range of data collected and shared. The combined use of templates and tracking notes ensures that the supporting experimental information is populated into the database and associated with the appropriate ssEM images and analyses. We anticipate that SynapticDB will serve future meta-analyses towards new discoveries about the composition and circuitry of neurons and glia, and new understanding about structural plasticity during development, behavior, learning, memory, and neuropathology.
The NIF DISCO Framework: Facilitating Automated Integration of Neuroscience Content on the Web Neuroinformatics Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis.
Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. Programming language: Perl/BioPerl (program); mySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI Blast package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have encountered problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies.
Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse.
Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell includes organelles and intracellular solutions, and it is separated from the outer environment of extracellular liquid surrounding the cell by its cell membrane (plasma membrane), generating differences in concentrations of ions and molecules, including enzymes. The differences in ionic charges and concentrations give rise, respectively, to electrical and chemical potentials, driving transport of materials across the membrane. Here we look at the cores of mathematical modeling associated with the dynamic behaviors of single cells, as well as the bases of numerical simulations. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms.
This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program on which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command-line interface for manipulating large-scale models, based on Python, a powerful scripting language. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp. Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results.
In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method and make available computer scripts which automate the process of citation entry. We use an Open Citation project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. For, in order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain ( ${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance.
We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front ends requires high bioinformatic skills. Consequently, the manual inspection of database entries and citations is a time-consuming task for which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques become increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in life sciences. Enriched by a high number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field.
We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or not) existing empirical results and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. 
Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp. 741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu .
Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and build on them for further research. Its use can aid the field of computational neuroscience to enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA-A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA-A reversal potential (E_GABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity.
Ideally, such interfaces should provide high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). Such a form of the reconstruction files is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic type (pyramidal cell dendrites and olivar axonal projection terminals), and labeling method (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida.
None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions coming to the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductivities of their dendritic membrane. This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by their spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli.
These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single-neuron models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives.
One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS–US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive, biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. On mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses uniformly distributed along the dendrites.
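The TD framework underlying the dopamine model above reduces, in the simplest tabular TD(0) case, to the prediction error δ = r + γV(s′) − V(s), the quantity commonly identified with the phasic dopamine signal. The sketch below trains values over a fixed CS→US state sequence; it is a generic TD(0) illustration, not the paper's LSTM network:

```python
def td_episode(V, rewards, alpha=0.1, gamma=1.0):
    """One CS->US trial over states 0..len(rewards)-1.
    Returns the per-step prediction errors (putative dopamine)."""
    deltas = []
    for s, r in enumerate(rewards):
        v_next = V[s + 1] if s + 1 < len(V) else 0.0
        delta = r + gamma * v_next - V[s]   # reward-prediction error
        V[s] += alpha * delta               # move value toward target
        deltas.append(delta)
    return deltas
```

With repeated trials, the error at the US shrinks and value accrues back toward the CS, reproducing the classic backward transfer of the dopamine response that such models aim to explain.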
For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and function of effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization in general and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals decreasing with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two patterns of activity, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of patterns of output discharges of different complexities, including stochastic ones. A relatively short length of the apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are morphological factors responsible for the predominance of periodic spike doublets.
As a result, there were two combinations of different electrical states of the sites of dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites, and a reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in the processes of formation of the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta-modulated and space-modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using non-homogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of involvement in spatial tasks.
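The PC and IC units above are integrate-and-fire neurons with conductance synapses; stripped to its core (constant current-based drive rather than conductance synapses, and purely illustrative parameter values), a leaky integrate-and-fire unit can be sketched as:

```python
def lif_spike_count(i_ext, t_stop=1.0, dt=1e-4, tau=0.02,
                    v_rest=-65.0, v_th=-50.0, v_reset=-65.0, r_m=10.0):
    """Leaky integrate-and-fire: tau*dv/dt = -(v - v_rest) + r_m*i_ext.
    Forward-Euler integration; returns the number of spikes in t_stop."""
    v, spikes = v_rest, 0
    for _ in range(int(t_stop / dt)):
        v += dt / tau * (-(v - v_rest) + r_m * i_ext)
        if v >= v_th:          # threshold crossing: emit spike, reset
            v = v_reset
            spikes += 1
    return spikes
```

In the full model, the theta- and space-modulated EC input roughly plays the role of i_ext, and the IC's activity shapes when within each theta cycle the PC reaches threshold.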
Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed-rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky to have used the squid axon as a model for analyzing the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in other channels, and in other models of the channels described here, should consult http://senselab.med.yale.edu/modeldb/default.asp, a database of neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in computational neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org. In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed morphological data when generating cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to incorporate known, published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators such as the NEURON simulator of Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any newly available data can be included in NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. 
Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionality to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation, in the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently of the means of running the model simulation. This allows sharing not only of models but also of research tools. Other promising developments of the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from genomics, transcriptomics, proteomics, and metabolomics experiments and for gaining insights into the interactions of the constituents of biological systems. 
However, sharing such large amounts of frequently heterogeneous and distributed experimental data requires both standard data formats and public repositories. Standardization and a public storage system are also important for modeling, because they make it possible to share models irrespective of the software tools used. Furthermore, rapid model development benefits strongly from available software packages that relieve the modeler of recurring tasks such as numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for load balancing in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. 
However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using rectangular multielectrode arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility of including system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors compared with the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing and enter a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. 
However, the range of current strength needed for a depolarization block can easily be reached with the random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block. We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, preventing conventional analysis methods from keeping pace. Such large datasets can be analyzed automatically by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting interesting characteristics (e.g., firing rate). Compared with storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed, and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. The toolbox can acquire data from common recording and simulation platforms and exchange data with external database engines and other analysis toolboxes, making analysis simpler and highly interoperable. 
PANDORA is freely available to use and modify because it is open source ( http://software.incf.org/software/pandora/home ). Abstract This chapter is devoted to the detailed discussion of several numerical simulations in which we use a model to generate data, and then examine how well we can use L = 1, 2, … of the time series of the model's state variables to estimate the fixed parameters of the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral in the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop. Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics. First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental computational model was developed to test whether multiple independent spike initiation zones in the C-interneuron are sufficient to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. 
Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made toward understanding cerebellar cortical processing, but neural coding at the output stage of the cerebellum, in the deep cerebellar nuclei (DCN), remains poorly understood. The DCN are unlikely to be merely a relay nucleus, because Purkinje cell inhibition has to be turned into an excitatory output signal and DCN neurons exhibit complex intrinsic properties. In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals. Computer modeling provides an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. Our simulations matched well the DCN current-clamp data we recorded in acute slices, including the heterogeneity of the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates, and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV, allowing de-inactivation of rebound currents. 
Fast rebound bursts due to the T-type calcium current and slow rebounds due to the persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by the HCN current. Our findings suggest that the active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Many software packages have tried to resolve this mismatch in granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution for bridging the gap between data and model behavior. This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulating networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, gap junctions and ionic diffusion, neuromodulatory channel gating, mechanisms for habituating or depressing synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plug-in development is also presented. Further development of KInNeSS is ongoing, with the ultimate goal of creating a modular framework that will help researchers across different disciplines to collaborate effectively using a modern neural simulation platform. 
Abstract We have developed a simulation tool within the NEURON simulator to assist in the organization, verification, and analysis of simulations. This tool, called the Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single-cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB. Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems, much as single cells function as systems of protein networks in which, for example, the ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from how it behaves alone. The dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other, and these interactions are mediated by different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts to produce heart beats. Neuronal cells in the nervous system interact via chemical synapses, through which neuronal networks exhibit spatiotemporal spiking dynamics representing neural information. 
In a neuronal network in charge of the movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to the coordinated contractions of a number of skeletal muscles so that a desired limb motion can be performed. This chapter illustrates several mathematical techniques through examples from the modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data. Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation. For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability of the peak amplitude, time-to-peak, and half-height width of these responses were compared with the equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice that of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits, as expected in mature animals. 
Predictions were made for passive-cell current-clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods for addressing this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousand neurons can be studied, and spike sorting becomes unnecessary. Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron. Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made in its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials. 
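The core representation used by this approach, a subthreshold membrane potential modeled as binned presynaptic population activity filtered with a fixed exponential (leaky-integrator) kernel, can be sketched as follows. Population size, rate, synaptic weight, and time constant are illustrative assumptions, and for brevity the presynaptic population is simulated without correlations:

```python
import math
import random

def population_counts(n_neurons, rate_hz, dt_ms, n_bins, rng):
    """Binned spike counts of an (uncorrelated) presynaptic population."""
    p = rate_hz * dt_ms / 1000.0                 # spike probability per neuron per bin
    return [sum(1 for _ in range(n_neurons) if rng.random() < p)
            for _ in range(n_bins)]

def filter_counts(counts, dt_ms, tau_ms, w_mv):
    """Leaky-integrator filtering: each presynaptic spike adds a PSP of
    amplitude w_mv that decays exponentially with time constant tau_ms."""
    decay = math.exp(-dt_ms / tau_ms)
    v, trace = 0.0, []
    for c in counts:
        v = v * decay + w_mv * c                 # recursive exponential filter
        trace.append(v)
    return trace

rng = random.Random(1)
counts = population_counts(n_neurons=200, rate_hz=5.0, dt_ms=1.0,
                           n_bins=5000, rng=rng)
v = filter_counts(counts, dt_ms=1.0, tau_ms=20.0, w_mv=0.1)
```

A CuBIC-style analysis then works on the cumulants of such a trace: injecting shared (correlated) spikes into several neurons inflates the higher-order cumulants of `v`, which is what the inference detects.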
Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in characterizing the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet the precise input–output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites. We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of the available biophysical properties from the experimental literature. To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns, and the corresponding firing-frequency input–output function was logarithmic. The regular or irregular nature of the input synaptic intervals was reflected only in the regularity of the output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input–output relationships. 
Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input–output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily, thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate, and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats. In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database http://www.ebi.ac.uk/biomodels/ is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. 
BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems and to study the clustering of models based upon their annotations. Model deposition to the database is today advised by several publishers of scientific journals. The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive? Schema theory is a computational framework that allows the simulation of perceptuomotor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We first present its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. 
We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performance of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world-knowledge, and visual information. In particular, it reveals interesting implications for understanding the various patterns of comprehension performance of agrammatic aphasics measured using sentence–picture matching tasks. This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Abstract Background The “inverse” problem is the determination of unknown causes on the basis of observations of their effects. This is the opposite of the corresponding “direct” problem, which predicts the effects generated by a complete description of their causes. The solution of an inverse problem entails the construction of a mathematical model starting from a number of experimental data. In this respect, inverse problems are often ill-conditioned, as the available experimental data are often insufficient to unambiguously constrain the mathematical model. Several approaches to solving inverse problems are possible, both computational and experimental, some of which are mentioned in this article. In this work, we describe in detail an attempt to solve an inverse problem that arose in the study of an intracellular signaling pathway. 
Results Using a genetic algorithm to find a suboptimal solution to the optimization problem, we estimated a set of unknown parameters describing a kinetic model of a signaling pathway in the neuronal cell. The model is composed of mass-action ordinary differential equations in which the kinetic parameters describe protein–protein interactions, protein synthesis, and degradation. The algorithm has been implemented on a parallel platform. Several potential solutions of the problem have been computed, each solution being a set of model parameters. A subset of parameters has been selected on the basis of their small coefficient of variation across the ensemble of solutions. Conclusion Despite the lack of sufficiently reliable and homogeneous experimental data, the genetic algorithm approach has allowed us to estimate the approximate values of a number of model parameters in a kinetic model of a signaling pathway; these parameters have been assessed as relevant for the reproduction of the available experimental data. Abstract Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown. Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms. 
First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axodendritic feedback on pyramidal cells is effective only in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase, the theta/gamma ratio. Moreover, electric fields differentially affect average theta frequency depending on the specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. Abstract The brain is extraordinarily complex, containing 10^11 neurons linked by 10^14 connections. We can improve our understanding of individual neurons and neuronal networks by describing their behavior in mathematical and computational models. This chapter provides an introduction to neural modeling, laying the foundation for several basic models and surveying key topics. After some discussion of the motivations of modelers and the uses of neural models, we explore the properties of electrically excitable membranes. We describe in some detail the Hodgkin–Huxley model, the first neural model to describe biophysically the behavior of biological membranes. We explore how this model can be extended to describe a variety of excitable membrane behaviors, including axonal propagation, dendritic processing, and synaptic communication. This chapter also covers mathematical models that replicate basic neural behaviors through more abstract mechanisms. 
We briefly explore efforts to extend single-neuron models to the network level and provide several examples of insights gained through this process. Finally, we list common resources, including modeling environments and repositories, that provide the guidance and parameter sets necessary to begin building neural models. Abstract We have developed a program, NeuroText, to populate the neuroscience databases in SenseLab (http://senselab.med.yale.edu/senselab) by mining the natural language text of neuroscience articles. NeuroText uses a two-step approach to identify relevant articles. The first step (preprocessing), aimed at 100% sensitivity, identifies abstracts containing database keywords. In the second step, potentially relevant abstracts identified in the first step are processed for specificity dictated by database architecture, and by neuroscience, lexical and semantic contexts. NeuroText results were presented to the experts for validation using a dynamically generated interface that also allows expert-validated articles to be automatically deposited into the databases. Of the test set of 912 articles, 735 were rejected at the preprocessing step. For the remaining articles, the accuracy of predicting database-relevant articles was 85%. Twenty-two articles were erroneously identified. NeuroText deferred decisions on 29 articles to the expert. A comparison of NeuroText results versus the experts’ analyses revealed that the program failed to correctly identify articles’ relevance due to concepts that did not yet exist in the knowledge base or due to vaguely presented information in the abstracts. NeuroText uses two “evolution” techniques (supervised and unsupervised) that play an important role in the continual improvement of the retrieval results. Software that uses the NeuroText approach can facilitate the creation of curated, special-interest bibliography databases. Abstract Dendrites play an important role in neuronal function and connectivity.
This chapter introduces the first section of the book focusing on the morphological features of dendritic tree structures and the role of dendritic trees in the circuit. We provide an overview of quantitative procedures for data collection, analysis, and modeling of dendrite shape. Our main focus lies on the description of morphological complexity and how one can use this description to unravel neuronal function in dendritic trees and neural circuits. Abstract The chapter is organised in two parts: In the first part, the focus is on a combined power spectral and nonlinear behavioural analysis of a neural mass model of the thalamocortical circuitry. The objective is to study the effectiveness of such “multimodal” analytical techniques in model-based studies investigating the neural correlates of abnormal brain oscillations in Alzheimer’s disease (AD). The power spectral analysis presented here is a study of the “slowing” (decreasing dominant frequency of oscillation) within the alpha frequency band (8–13 Hz), a hallmark of electroencephalogram (EEG) dynamics in AD. Analysis of the nonlinear dynamical behaviour focuses on the bifurcating property of the model. The results show that the alpha rhythmic content is maximal in close proximity to the bifurcation point—an observation made possible by the “multimodal” approach adopted herein. Furthermore, a slowing in alpha rhythm is observed for increasing inhibitory connectivity—a consistent feature of our research into neuropathological oscillations associated with AD. In the second part, we present power spectral analysis on a model that implements multiple feedforward and feedback connectivities in the thalamo-cortico-thalamic circuitry, and is thus more advanced in terms of biological plausibility. This study looks at the effects of synaptic connectivity variation on the power spectra within the delta (1–3 Hz), theta (4–7 Hz), alpha (8–13 Hz) and beta (14–30 Hz) bands.
An overall slowing of EEG with decreasing synaptic connectivity is observed, indicated by a decrease of power within the alpha and beta bands and an increase in power within the theta and delta bands. Thus, the model behaviour conforms to longitudinal studies in AD indicating an overall slowing of EEG. Abstract Neuronal processes grow under a variety of constraints, both immediate and evolutionary. Their pattern of growth provides insight into their function. This chapter begins by reviewing morphological metrics used in analyses and computational models. Molecular mechanisms underlying growth and plasticity are then discussed, followed by several types of modeling approaches. Computer simulation of morphology can be used to describe and reproduce the statistics of neuronal types or to evaluate growth and functional hypotheses. For instance, models in which branching is probabilistically determined by diameter produce realistic virtual dendrites of most neuronal types, though more complicated statistical models are required for other types. Virtual dendrites grown under environmental and/or functional constraints are also discussed, offering a broad perspective on dendritic morphology. Abstract Chopper neurons in the cochlear nucleus are characterized by intrinsic oscillations with short average interspike intervals (ISIs) and relative level independence of their response (Pfeiffer, Exp Brain Res 1:220–235, 1966; Blackburn and Sachs, J Neurophysiol 62:1303–1329, 1989), properties which are unattained by models of single chopper neurons (e.g., Rothman and Manis, J Neurophysiol 89:3070–3082, 2003a). In order to achieve short ISIs, we optimized the time constants of the Rothman and Manis single-neuron model with genetic algorithms. Some parameters in the optimization, such as the temperature and the capacitance of the cell, turned out to be crucial for the required acceleration of their response.
In order to achieve the relative level independence, we have simulated an interconnected network consisting of Rothman and Manis neurons. The results indicate that by stabilization of intrinsic oscillations, it is possible to simulate the physiologically observed level independence of ISIs. As previously reviewed and demonstrated (Bahmer and Langner, Biol Cybern 95:371–379, 2006a), chopper neurons show a preference for ISIs which are multiples of 0.4 ms. It was also demonstrated that a network consisting of two optimized Rothman and Manis neurons which activate each other with synaptic delays of 0.4 ms shows a preference for ISIs of 0.8 ms. Oscillations with various multiples of 0.4 ms as ISIs may be derived from neurons in a more complex network that is activated by simultaneous input from an onset neuron and several auditory nerve fibers. Abstract Recently, a class of two-dimensional integrate-and-fire models has been used to faithfully model spiking neurons. This class includes the Izhikevich model, the adaptive exponential integrate-and-fire model, and the quartic integrate-and-fire model. The bifurcation types for the individual neurons have been thoroughly analyzed by Touboul (SIAM J Appl Math 68(4):1045–1079, 2008). However, when the models are coupled together to form networks, the networks can display bifurcations that an uncoupled oscillator cannot. For example, the networks can transition from firing with a constant rate to burst firing. This paper introduces a technique to reduce a full network of this class of neurons to a mean field model, in the form of a system of switching ordinary differential equations. The reduction uses population density methods and a quasi-steady-state approximation to arrive at the mean field system. Reduced models are derived for networks with different topologies and different model neurons with biologically derived parameters.
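The two-dimensional integrate-and-fire class discussed above pairs a voltage equation with a slow adaptation variable and a discrete reset. A minimal sketch of the Izhikevich member of that class follows; the regular-spiking parameters are the published defaults, while the input current and durations are arbitrary choices for illustration:

```python
def izhikevich(a=0.02, b=0.2, c=-65.0, d=8.0, I=10.0, t_stop=200.0, dt=0.5):
    """Izhikevich's two-variable model:
    v' = 0.04 v^2 + 5 v + 140 - u + I,  u' = a (b v - u),
    with reset v -> c, u -> u + d whenever v crosses 30 mV."""
    v, u, spikes = -65.0, -13.0, []
    t = 0.0
    while t < t_stop:
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:          # spike detected: record and apply the reset map
            spikes.append(t)
            v, u = c, u + d
        t += dt
    return spikes

tonic = izhikevich()                      # regular-spiking parameters
burster = izhikevich(c=-50.0, d=2.0)      # reset tuned toward bursting
```

The reset map is exactly the "switching" nonlinearity that survives into the mean field reduction: the population density methods track the flux of neurons through the v = 30 mV boundary rather than individual trajectories.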
The mean field equations are able to qualitatively and quantitatively describe the bifurcations that the full networks display. Extensions and higher order approximations are discussed. Conclusions Our proposed database schema for managing heterogeneous data is a significant departure from conventional approaches. It is suitable only when the following conditions hold: • The number of classes of entity is numerous, while the number of actual instances in most classes is expected to be very modest. • The number (and nature) of the axes describing an arbitrary fact (as an N-ary association) varies greatly. We believe that nervous system data is an appropriate problem domain to test such an approach. Abstract Stereotactic human brain atlases, either in print or electronic form, are useful not only in functional neurosurgery, but also in neuroradiology, human brain mapping, and neuroscience education. The existing atlases represent structures on 2D plates taken at variable, often large intervals, which limits their applications. To overcome this problem, we propose a hybrid interpolation approach to build high-resolution brain atlases from the existing ones. In this approach, all section regions of each object are grouped into two types of components: simple and complex. A NURBS-based method is designed for interpolation of the simple components, and a distance-map-based method for the complex components. Once all individual objects in the atlas are interpolated, the results are combined hierarchically in a bottom-up manner to produce the interpolation of the entire atlas. In the procedure, different knowledge-based and heuristic strategies are used to preserve various topological relationships. The proposed approach has been validated quantitatively and used for interpolation of two stereotactic brain atlases: the Talairach–Tournoux atlas and the Schaltenbrand–Wahren atlas.
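The distance-map-based interpolation mentioned above can be illustrated on binary sections: compute a signed distance map for each section, blend the two maps linearly, and threshold at zero to obtain the intermediate shape. A brute-force sketch on tiny invented grids (a real atlas pipeline would use fast distance transforms):

```python
def signed_distance(mask):
    """Brute-force signed distance to the region boundary: negative
    inside the region, positive outside (fine for tiny grids)."""
    h, w = len(mask), len(mask[0])
    inside = [(i, j) for i in range(h) for j in range(w) if mask[i][j]]
    outside = [(i, j) for i in range(h) for j in range(w) if not mask[i][j]]
    def dist(p, pts):
        return min(((p[0]-q[0])**2 + (p[1]-q[1])**2) ** 0.5 for q in pts)
    return [[-dist((i, j), outside) if mask[i][j] else dist((i, j), inside)
             for j in range(w)] for i in range(h)]

def interpolate(mask_a, mask_b, t):
    """Blend the two signed-distance maps at fraction t and threshold
    at zero to recover the intermediate section."""
    da, db = signed_distance(mask_a), signed_distance(mask_b)
    return [[1 if (1-t)*da[i][j] + t*db[i][j] < 0 else 0
             for j in range(len(da[0]))] for i in range(len(da))]

# Invented sections: a single pixel growing into a 5x5 block.
small = [[0]*7 for _ in range(7)]
big = [[0]*7 for _ in range(7)]
small[3][3] = 1
for i in range(1, 6):
    for j in range(1, 6):
        big[i][j] = 1
mid = interpolate(small, big, 0.5)  # a region between the two sizes
```

At t = 0 and t = 1 the blend reproduces the input sections exactly, and intermediate t values morph one contour smoothly into the other, which is what makes the method suitable for the complex components where spline fitting fails.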
The interpolations produced are of high resolution and feature high accuracy, 3D consistency, smooth surfaces, and preserved topology. They potentially open new applications for electronic stereotactic brain atlases, such as atlas reformatting, accurate 3D display, and 3D nonlinear warping against normal and pathological scans. The proposed approach is also potentially useful in other applications that require interpolation and 3D modeling from sparse and/or variable inter-section interval data. An example of 3D modeling of an infarct from MR diffusion images is presented. Abstract Quantitative neuroanatomical data are important for the study of many areas of neuroscience, and the complexity of problems associated with neuronal structure requires that research from multiple groups across many disciplines be combined. However, existing neuron-tracing systems, simulation environments, and tools for the visualization and analysis of neuronal morphology data use a variety of data formats, making it difficult to exchange data in a readily usable way. The NeuroML project was initiated to address these issues, and here we describe an extensible markup language standard, MorphML, which defines a common data format for neuronal morphology data and associated metadata to facilitate data and model exchange, database creation, model publication, and data archiving. We describe the elements of the standard in detail and outline the mappings between this format and those used by a number of popular applications for reconstruction, simulation, and visualization of neuronal morphology. Abstract A major part of biology has become a class of physical and mathematical sciences.
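A format-mapping layer of the kind MorphML enables can be sketched as follows; note that the element and attribute names below are illustrative stand-ins, not the actual NeuroML/MorphML schema:

```python
import xml.etree.ElementTree as ET

# A hypothetical, minimal MorphML-like fragment (names are invented).
doc = """
<morphology cell="pyr_1">
  <segment id="0" name="soma" proximal="0,0,0,10" distal="0,10,0,10"/>
  <segment id="1" name="dend" parent="0" proximal="0,10,0,2" distal="0,60,0,2"/>
</morphology>
"""

def load_segments(xml_text):
    """Parse segments into plain dicts so any simulator or visualizer
    can consume them from one common in-memory representation."""
    root = ET.fromstring(xml_text)
    segments = []
    for seg in root.iter("segment"):
        x0, y0, z0, d0 = map(float, seg.get("proximal").split(","))
        x1, y1, z1, d1 = map(float, seg.get("distal").split(","))
        length = ((x1-x0)**2 + (y1-y0)**2 + (z1-z0)**2) ** 0.5
        segments.append({"id": seg.get("id"), "name": seg.get("name"),
                         "parent": seg.get("parent"), "length": length})
    return segments

for s in load_segments(doc):
    print(s["name"], s["length"])  # soma 10.0, dend 50.0
```

Once every tracing tool and simulator can read and write one such schema, the per-application mappings the abstract describes reduce to converters into and out of this neutral form.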
We have started to feel, though still with some suspicion, that it will become possible to predict biological events that will happen in the future of one’s life and to control some of them if so desired, based upon the understanding of the genomic information of individuals and the physical and chemical principles governing the physiological functions of living organisms at multiple scales and levels, from molecules to cells and organs. Abstract A half-center oscillator (HCO) is a common circuit building block of central pattern generator networks that produce rhythmic motor patterns in animals. Here we constructed an efficient relational database table with the resulting characteristics of the Hill et al. (J Comput Neurosci 10:281–302, 2001) simple conductance-based HCO model. The model consists of two reciprocally inhibitory neurons and replicates the electrical activity of the oscillator interneurons of the leech heartbeat central pattern generator under a variety of experimental conditions. Our long-range goal is to understand how this basic circuit building block produces functional activity under a variety of parameter regimes and how different parameter regimes influence stability and modulatability. By using the latest developments in computer technology, we simulated and stored large amounts of data (on the order of terabytes). We systematically explored the parameter space of the HCO and corresponding isolated neuron models using a brute-force approach. We varied a set of selected parameters (maximal conductances of intrinsic and synaptic currents) in all combinations, resulting in about 10 million simulations. We classified these HCO and isolated neuron model simulations by their activity characteristics into identifiable groups and quantified their prevalence.
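The brute-force sweep-and-classify workflow in the HCO abstract above can be sketched as follows; the classifier is a toy stand-in (the actual study classified simulated voltage traces, not parameter values), and the conductance grid is invented:

```python
import itertools
import sqlite3

def classify(g_syn, g_h):
    """Toy activity classifier standing in for the real burst analysis,
    which labeled each simulated voltage trace by its activity type."""
    if g_syn == 0:
        return "silent" if g_h == 0 else "tonic spiking"
    return "bursting" if g_h >= g_syn else "irregular"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sims (g_syn REAL, g_h REAL, activity TEXT)")

# Vary the selected maximal conductances in all combinations.
for g_syn, g_h in itertools.product([0, 2, 4, 6], repeat=2):
    conn.execute("INSERT INTO sims VALUES (?, ?, ?)",
                 (g_syn, g_h, classify(g_syn, g_h)))

# Query the database to quantify the prevalence of each activity class.
for activity, n in conn.execute(
        "SELECT activity, COUNT(*) FROM sims GROUP BY activity ORDER BY 2 DESC"):
    print(activity, n)
```

Storing every simulation's classification in a relational table is what makes the later ensemble questions (e.g. "what fraction of functional HCOs are built from bursting neurons?") a single SQL query rather than a re-simulation.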
By querying the database, we compared the activity characteristics of the identified groups of our simulated HCO models with those of our simulated isolated neuron models and found that regularly bursting neurons compose only a small minority of functional HCO models; the vast majority was composed of spiking neurons. Abstract This paper describes how an emerging standard neural network modelling language can be used to configure a general-purpose neural multichip system, by describing the process of writing and loading neural network models on the SpiNNaker neuromimetic hardware. It focuses on the implementation of a SpiNNaker module for PyNN, a simulator-independent language for neural network modelling. We successfully extend PyNN to deal with different non-standard (e.g. Izhikevich) cell types, rapidly switch between them and load applications on parallel hardware by orchestrating the software layers below it, so that they are abstracted from the final user. Finally, we run some simulations in PyNN and compare them against other simulators, successfully reproducing single neuron and network dynamics and validating the implementation. Abstract The present study examines the biophysical properties and functional implications of I_h in hippocampal area CA3 interneurons with somata in strata radiatum and lacunosum-moleculare. Characterization studies showed a small maximum h-conductance (2.6 ± 0.3 nS, n = 11), shallow voltage dependence with a hyperpolarized half-maximal activation (V_1/2 = −91 mV), and kinetics characterized by double-exponential functions. The functional consequences of I_h were examined with regard to temporal summation and impedance measurements. For temporal summation experiments, 5-pulse mossy fiber input trains were activated. Blocking I_h with 50 μM ZD7288 resulted in an increase in temporal summation, suggesting that I_h supports sensitivity of response amplitude to relative input timing.
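The effect of I_h on temporal summation described above can be illustrated with a toy leaky membrane driven by a 5-pulse train. The Ih-like restoring term below uses invented kinetics and magnitudes; it is only meant to show the qualitative point that removing the restoring current (as ZD7288 does) increases summation:

```python
import math

def epsp_train(g_h=0.0, n_pulses=5, isi=20.0, dt=0.1, t_stop=150.0):
    """Leaky membrane (tau_m = 20 ms) driven by a train of alpha-function
    EPSCs; g_h scales a slow Ih-like restoring current (toy kinetics).
    Voltages are deviations from rest; returns the peak depolarization."""
    tau_m, tau_syn, tau_h = 20.0, 2.0, 50.0
    V, h = 0.0, 0.0
    peak, t = 0.0, 0.0
    while t < t_stop:
        # Sum of alpha-function synaptic currents, one per pulse.
        I_syn = sum(((t - k*isi)/tau_syn) * math.exp(1 - (t - k*isi)/tau_syn)
                    for k in range(n_pulses) if t >= k*isi)
        h += dt * (V - h) / tau_h            # h slowly tracks V
        V += dt * (-V - g_h*h + I_syn) / tau_m
        peak = max(peak, V)
        t += dt
    return peak

passive = epsp_train(g_h=0.0)   # "ZD7288" condition: restoring term removed
with_ih = epsp_train(g_h=2.0)   # restoring term opposes the accumulating EPSP
```

Because the slow variable opposes sustained depolarization, successive EPSPs ride on a partially cancelled baseline, keeping the response sensitive to input timing rather than to accumulated charge.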
Impedance was assessed by applying sinusoidal current commands. From impedance measurements, we found that I_h did not confer theta-band resonance, but flattened the impedance–frequency relations instead. Double immunolabeling for hyperpolarization-activated cyclic nucleotide-gated proteins and glutamate decarboxylase 67 suggests that all four subunits are present in GABAergic interneurons from the strata considered for electrophysiological studies. Finally, a model of I_h was employed in computational analyses to confirm and elaborate upon the contributions of I_h to impedance and temporal summation. Abstract Modelling and simulation methods gain increasing importance for the understanding of biological systems. The growing number of available computational models makes support in maintenance and retrieval of those models essential to the community. This article discusses which model information is helpful for efficient retrieval and how existing similarity measures and ranking techniques can be used to enhance the retrieval process, i.e., model reuse. With the development of new tools and modelling formalisms, there is also an increasing demand for performing search independent of the models’ encoding. Therefore, the presented approach is not restricted to certain model storage formats. Instead, the model meta-information is used for retrieval and ranking of the search results. Meta-information includes general information about the model, its encoded species and reactions, but also information about the model behaviour and related simulation experiment descriptions. Abstract To understand the details of brain function, a large-scale system model that reflects anatomical and neurophysiological characteristics needs to be implemented.
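Encoding-independent retrieval over model meta-information, as described in the retrieval abstract above, can be sketched with a simple similarity ranking; the Jaccard measure, weights, and example models are all invented for illustration:

```python
def jaccard(a, b):
    """Set overlap in [0, 1]; 0 for two empty sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def rank_models(query, models, w_species=0.6, w_reactions=0.4):
    """Rank stored models by similarity of their meta-information
    (encoded species and reactions) to the query, independent of the
    format the models themselves are stored in."""
    scored = [(w_species * jaccard(query["species"], m["species"])
               + w_reactions * jaccard(query["reactions"], m["reactions"]),
               m["name"]) for m in models]
    return sorted(scored, reverse=True)

models = [
    {"name": "glycolysis_v2", "species": {"glucose", "ATP", "pyruvate"},
     "reactions": {"hexokinase", "pyruvate_kinase"}},
    {"name": "calcium_osc", "species": {"Ca2+", "IP3"},
     "reactions": {"IP3R_flux"}},
]
query = {"species": {"glucose", "ATP"}, "reactions": {"hexokinase"}}
for score, name in rank_models(query, models):
    print(f"{name}: {score:.2f}")
```

Because only extracted metadata enters the score, a model written in any storage format ranks identically once its species and reactions have been indexed.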
Though numerous computational models of different brain areas have been proposed, their integration into a large-scale model has not yet been accomplished, because these models were described in different programming languages and, above all, used different data formats. This paper introduces a platform for collaborative brain system modeling (PLATO) where one can construct computational models using several programming languages and connect them at the I/O level with a common data format. As an example, a whole visual system model including eye movement, eye optics, the retinal network and the visual cortex is being developed. Preliminary results demonstrate that the integrated model successfully simulates the signal processing flow at the different stages of the visual system. Abstract Brain rhythms are the most prominent signal measured noninvasively in humans with magneto/electroencephalography (MEG/EEG). MEG/EEG-measured rhythms have been shown to be functionally relevant, and signature changes are used as markers of disease states. Despite the importance of understanding the underlying neural mechanisms creating these rhythms, relatively little is known about their in vivo origin in humans. There are obvious challenges in linking the extracranially measured signals directly to neural activity with invasive studies in humans, and although animal models are well suited for such studies, the connection to human brain function under cognitively relevant tasks is often lacking. Biophysically principled computational neural modeling provides an attractive means to bridge this critical gap. Here, we describe a method for creating a computational neural model capturing the laminar structure of cortical columns, and how this model can be used to make predictions on the cellular- and circuit-level mechanisms of brain oscillations measured with MEG/EEG.
Specifically, we describe how the model can be used to simulate current dipole activity, the common macroscopic signal inferred from MEG/EEG data. We detail the development and application of the model to study the spontaneous somatosensory mu-rhythm, containing mu-alpha (7–14 Hz) and mu-beta (15–29 Hz) components. We describe a novel prediction on the neural origin of the mu-rhythm that accurately reproduces many characteristic features of MEG data and accounts for changes in the rhythm with attention, detection, and healthy aging. While the details of the model are specific to the somatosensory system, the model design and application are based on general principles of cortical circuitry and MEG/EEG physics, and are thus amenable to the study of rhythms in other frequency bands and sensory systems. Abstract GABAergic interneurons in cortical circuits control the activation of principal cells and orchestrate network activity patterns, including oscillations at different frequency ranges. Recruitment of interneurons depends on the integration of convergent synaptic inputs along the dendrosomatic axis; however, dendritic processing in these cells is still poorly understood. In this chapter, we summarise our results on the cable properties, electrotonic structure and dendritic processing in “basket cells” (BCs; Nörenberg et al. 2010), one of the most prevalent types of cortical interneurons mediating perisomatic inhibition. In order to investigate integrative properties, we performed two-electrode whole-cell patch-clamp recordings, visualised and reconstructed the recorded interneurons and created passive single-cell models with biophysical properties derived from the experiments. Our results indicate that membrane properties, in particular membrane resistivity, are inhomogeneous along the somatodendritic axis of the cell. Derived values and the gradient of membrane resistivity are different from those obtained for excitatory principal cells.
The divergent passive membrane properties of BCs facilitate rapid signalling from proximal basal dendritic inputs but at the same time increase synapse-to-soma transfer for slow signals from the distal apical dendrites. Our results demonstrate that BCs possess distinct integrative properties. Future computational models investigating the diverse functions of neuronal circuits need to consider this diversity and incorporate realistic dendritic properties not only of excitatory principal cells but also of the various types of inhibitory interneurons. Abstract New surgical and localization techniques allow for precise and personalized evaluation and treatment of intractable epilepsies. These techniques include the use of subdural and depth electrodes for localization, and the potential use of cell-targeted stimulation using optogenetics as part of treatment. Computer modeling of seizures, also individualized to the patient, will be important in order to make full use of the potential of these new techniques. This is because epilepsy is a complex dynamical disease involving multiple scales across both time and space. These complex dynamics make prediction extremely difficult. Cause and effect are not cleanly separable, as multiple embedded causal loops allow for many scales of unintended consequence. We demonstrate here a small model of sensory neocortex which can be used to look at the effects of microablations or microstimulation. We show that ablations in this network can either prevent spread or prevent occurrence of the seizure. In this example, focal electrical stimulation was not able to terminate a seizure, but selective stimulation of inhibitory cells, a future possibility through the use of optogenetics, was efficacious. Abstract The basal ganglia form a complex network of nuclei often assumed to perform selection, yet their individual roles and how they influence each other are still largely unclear.
In particular, the ties between the external and internal parts of the globus pallidus are paradoxical, as anatomical data suggest a potent inhibitory projection between them while electrophysiological recordings indicate that they have similar activities. Here we introduce a theoretical study that reconciles both views on the intrapallidal projection, by providing a plausible characterization of the relationship between the external and internal globus pallidus. Specifically, we developed a mean-field model of the whole basal ganglia, whose parameterization is optimized to best respect a collection of numerous anatomical and electrophysiological data. We first obtained models respecting all our constraints, showing that anatomical and electrophysiological data on the intrapallidal projection are globally consistent. This model furthermore predicts that both aforementioned views about the intrapallidal projection may be reconciled when this projection is weakly inhibitory, thus making it possible to support similar neural activity in both nuclei and for the entire basal ganglia to select between actions. Second, we predict that afferent projections are substantially unbalanced towards the external segment, as it receives the strongest excitation from the STN and the weakest inhibition from the striatum. Finally, our study strongly suggests that the intrapallidal connection pattern is not focused but diffuse, as the latter pattern is more efficient for the overall selection performed in the basal ganglia. Abstract Background The information coming from biomedical ontologies and computational pathway models is expanding continuously: research communities keep this process up, and their advances are generally shared by means of dedicated resources published on the web. In fact, such models are shared to provide the characterization of molecular processes, while biomedical ontologies detail a semantic context for the majority of those pathways.
Recent advances in both fields pave the way for a scalable information integration based on aggregate knowledge repositories, but the lack of overall standard formats impedes this progress. Indeed, having different objectives and different abstraction levels, most of these resources "speak" different languages. Semantic web technologies are explored here as a means to address some of these problems. Methods Employing an extensible collection of interpreters, we developed OREMP (Ontology Reasoning Engine for Molecular Pathways), a system that abstracts the information from different resources and combines them together into a coherent ontology. Continuing this effort we present OREMPdb; once different pathways are fed into OREMP, species are linked to the external ontologies referred to and to the reactions in which they participate. Exploiting these links, the system builds species-sets, which encapsulate species that operate together. Composing all of the reactions together, the system computes all of the reaction paths from and to all of the species-sets. Results OREMP has been applied to the curated branch of BioModels (2011/04/15 release), which overall contains 326 models, 9244 reactions, and 5636 species. OREMPdb is the semantic dictionary created as a result, which is made of 7360 species-sets. For each one of these sets, OREMPdb keeps the link to the original pathway and to the original paper where this information first appeared. Abstract Conductance-based neuron models are frequently employed to study the dynamics of biological neural networks. For speed and ease of use, these models are often reduced in morphological complexity. Simplified dendritic branching structures may process inputs differently than full branching structures, however, and could thereby fail to reproduce important aspects of biological neural processing. It is not yet well understood which processing capabilities require detailed branching structures.
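The species-set path computation described in the OREMPdb abstract can be sketched as a graph search; the species-sets and reactions below are invented placeholders, not entries from BioModels:

```python
from collections import deque

# Hypothetical species-sets (groups of species that operate together)
# and the reactions linking them.
reactions = [
    ("glucose_pool", "g6p_pool"),
    ("g6p_pool", "pyruvate_pool"),
    ("pyruvate_pool", "acetylcoa_pool"),
    ("g6p_pool", "pentose_pool"),
]

def reaction_paths(source, target):
    """Enumerate all reaction paths from one species-set to another
    (breadth-first search; cycles are cut by forbidding revisits)."""
    paths, queue = [], deque([[source]])
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            paths.append(path)
            continue
        for a, b in reactions:
            if a == path[-1] and b not in path:
                queue.append(path + [b])
    return paths

print(reaction_paths("glucose_pool", "acetylcoa_pool"))
```

Precomputing such paths over the aggregated repository is what lets a query like "which pathways connect these two species groups?" be answered without re-parsing the individual models.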
Therefore, we analyzed the processing capabilities of fully or partially branched reduced models. These models were created by collapsing the dendritic tree of a full morphological model of a globus pallidus (GP) neuron while preserving its total surface area and electrotonic length, as well as its passive and active parameters. Dendritic trees were either collapsed into single cables (unbranched models) or the full complement of branch points was preserved (branched models). Both reduction strategies allowed us to compare dynamics between all models using the same channel density settings. Full model responses to somatic inputs were generally preserved by both types of reduced model, while dendritic input responses could be more closely preserved by branched than unbranched reduced models. However, features strongly influenced by local dendritic input resistance, such as active dendritic sodium spike generation and propagation, could not be accurately reproduced by any reduced model. Based on our analyses, we suggest that there are intrinsic differences in processing capabilities between unbranched and branched models. We also indicate suitable applications for different levels of reduction, including fast searches of full model parameter space. Summary Processing text from scientific literature has become a necessity due to the burgeoning amounts of information that are fast becoming available, stemming from advances in electronic information technology. We created a program, NeuroText (http://senselab.med.yale.edu/textmine/neurotext.pl), designed specifically to extract information relevant to neuroscience-specific databases, NeuronDB and CellPropDB (http://senselab.med.yale.edu/senselab/), housed at the Yale University School of Medicine. NeuroText extracts relevant information from the neuroscience literature in a two-step process: each step parses text at a different level of granularity.
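The collapse of a dendritic tree into a single cable while preserving total surface area and electrotonic length, as in the GP-neuron reduction above, can be sketched as follows. This deliberately ignores branch topology, uses invented membrane parameters, and solves the two conservation constraints for the equivalent cylinder's length and diameter:

```python
import math

def collapse_branches(branches, R_m=20000.0, R_i=150.0):
    """Collapse dendritic branches [(length_um, diam_um), ...] into one
    equivalent cylinder preserving (i) total membrane area and
    (ii) summed electrotonic length L = l / lambda, with
    lambda = sqrt((d/4) * (R_m/R_i)). A simplified sketch of the
    unbranched-reduction idea; serial/parallel topology is ignored."""
    c = math.sqrt(R_m / (4.0 * R_i))                 # lambda = c * sqrt(d_cm)
    area_um2 = sum(math.pi * d * l for l, d in branches)
    L_total = sum((l * 1e-4) / (c * math.sqrt(d * 1e-4)) for l, d in branches)
    # Solve pi * d * l = area and l / (c * sqrt(d)) = L_total (cm units).
    d_cm = (area_um2 * 1e-8 / (math.pi * L_total * c)) ** (2.0 / 3.0)
    l_cm = L_total * c * math.sqrt(d_cm)
    return l_cm * 1e4, d_cm * 1e4                    # back to um

branches = [(120.0, 2.0), (80.0, 1.2), (60.0, 0.8)]  # invented tree
length, diam = collapse_branches(branches)
```

Because area and electrotonic length give two equations in two unknowns, the collapsed cable is uniquely determined; channel densities can then be kept unchanged, which is what allowed the study to compare full and reduced dynamics under identical settings.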
NeuroText uses an expert-mediated knowledge base and combines the techniques of indexing, contextual parsing, semantic and lexical parsing, and supervised and unsupervised learning to extract information. The constraints, metadata elements, and rules for information extraction are stored in the knowledge base. NeuroText was created as a pilot project to process 3 years of publications in the Journal of Neuroscience and was subsequently tested on 40,000 PubMed abstracts. We also present here a template to create a domain-nonspecific knowledge base that, when linked to a text-processing tool like NeuroText, can be used to extract knowledge in other fields of research. Abstract Background We present a software tool called SENB, which allows the geometric and biophysical neuronal properties in a simple computational model of a Hodgkin–Huxley (HH) axon to be changed. The aim of this work is to develop a didactic and easy-to-use computational tool in the NEURON simulation environment, which allows graphical visualization of both the passive and active conduction parameters and the geometric characteristics of a cylindrical axon with HH properties. Results The SENB software offers several advantages for teaching and learning electrophysiology. First, SENB offers ease and flexibility in determining the number of stimuli. Second, SENB allows immediate and simultaneous visualization, in the same window and time frame, of the evolution of the electrophysiological variables. Third, SENB calculates parameters such as the time and space constants, stimulus frequency, cellular area and volume, sodium and potassium equilibrium potentials, and propagation velocity of the action potentials. Furthermore, it allows the user to see all this information immediately in the main window. Finally, with just one click SENB can save an image of the main window as evidence.
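The time and space constants that SENB reports follow from standard passive cable theory; here is a sketch with illustrative textbook-style values (not SENB's actual defaults):

```python
import math

def cable_constants(diam_um, R_m=2000.0, R_i=35.4, C_m=1.0):
    """Passive constants for a cylindrical axon.
    Units: R_m [ohm*cm^2], R_i [ohm*cm], C_m [uF/cm^2]; the defaults are
    illustrative squid-like values, not measured parameters."""
    a_cm = diam_um * 1e-4 / 2.0                    # radius in cm
    lam_cm = math.sqrt(R_m * a_cm / (2.0 * R_i))   # space constant lambda
    tau_ms = R_m * C_m * 1e-3                      # membrane time constant
    return lam_cm, tau_ms

# A 500-um-diameter axon, roughly squid giant axon caliber.
lam, tau = cable_constants(500.0)
print(f"lambda = {lam:.2f} cm, tau = {tau:.1f} ms")
```

The space constant sets how far a subthreshold voltage change spreads along the axon, and the time constant sets how quickly the membrane charges; both appear directly in the propagation-velocity estimates such a teaching tool displays.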
Conclusions The SENB software is didactic and versatile, and can be used to improve and facilitate the teaching and learning of the underlying mechanisms in the electrical activity of an axon using the biophysical properties of the squid giant axon. Abstract Grid cells (GCs) in the medial entorhinal cortex (mEC) have the property of having their firing activity spatially tuned to a regular triangular lattice. Several theoretical models for grid field formation have been proposed, but most assume that place cells (PCs) are a product of the grid cell system. There is, however, an alternative possibility that is supported by various strands of experimental data. Here we present a novel model for the emergence of grid-like firing patterns that stands on two key hypotheses: (1) spatial information in GCs is provided by PC activity and (2) grid fields result from a combined synaptic plasticity mechanism involving inhibitory and excitatory neurons mediating the connections between PCs and GCs. Depending on the spatial location, each PC can contribute excitatory or inhibitory inputs to GC activity. The nature and magnitude of the PC input are a function of the distance to the place field center, which is inferred from rate decoding. A biologically plausible learning rule drives the evolution of the connection strengths from PCs to a GC. In this model, PCs compete for GC activation, and the plasticity rule favors efficient packing of the space representation. This leads to grid-like firing patterns. In a new environment, GCs continuously recruit new PCs to cover the entire space. The model described here makes important predictions and can represent the feedforward connections from hippocampal CA1 to deeper mEC layers. Abstract Because of its highly branched dendrite, the Purkinje neuron requires significant computational resources if coupled electrical and biochemical activity are to be simulated.
To address this challenge, we developed a scheme for reducing the geometric complexity while preserving the essential features of activity in both the soma and a remote dendritic spine. We merged our previously published biochemical model of calcium dynamics and lipid signaling in the Purkinje neuron, developed in the Virtual Cell modeling and simulation environment, with an electrophysiological model based on a Purkinje neuron model available in NEURON. A novel reduction method was applied to the Purkinje neuron geometry to obtain a model with fewer compartments that is tractable in Virtual Cell. Most of the dendritic tree was subject to reduction, but we retained the neuron’s explicit electrical and geometric features along a specified path from spine to soma. Further, unlike previous simplification methods, the dendrites that branch off along the preserved explicit path are retained as reduced branches. We conserved axial resistivity and adjusted passive properties and active channel conductances for the reduction in surface area, and cytosolic calcium for the reduction in volume. Rallpacks were used to validate the reduction algorithm and show that it can be generalized to other complex neuronal geometries. For the Purkinje cell, we found that current injections at the soma produced similar trains of action potentials and membrane potential propagation in the full and reduced models in NEURON; the reduced model produces identical spiking patterns in NEURON and Virtual Cell. Importantly, our reduced model can simulate communication between the soma and a distal spine; an alpha function applied at the spine to represent synaptic stimulation gave similar results in the full and reduced models for potential changes associated with both the spine and the soma. Finally, we combined phosphoinositol signaling and electrophysiology in the reduced model in Virtual Cell.
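The area-based adjustment described above can be illustrated with a toy calculation. This is only a sketch of the general principle (all numbers are assumed), not the authors' reduction algorithm: when a detailed tree is collapsed into a smaller compartment, specific densities are rescaled so that total membrane conductance and capacitance are conserved.

```python
def scaled_density(density_full, area_full, area_reduced):
    """Scale a specific density (e.g. S/cm^2 or uF/cm^2) so that the
    reduced compartment carries the same total amount as the full tree:
    density_red * area_red == density_full * area_full."""
    return density_full * (area_full / area_reduced)

area_full = 2.5e-3   # cm^2, full dendritic surface area (assumed)
area_red = 5.0e-4    # cm^2, reduced-model surface area (assumed)

gbar_red = scaled_density(1.0e-3, area_full, area_red)  # channel density, S/cm^2
cm_red = scaled_density(1.0, area_full, area_red)       # capacitance, uF/cm^2

# Total quantities are conserved despite the 5x smaller area:
total_g_full = 1.0e-3 * area_full
total_g_red = gbar_red * area_red
```

An analogous volume ratio would rescale cytosolic calcium concentrations, as the abstract notes for the volume reduction.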
Thus, a strategy has been developed to combine electrophysiology and biochemistry as a step toward merging neuronal and systems biology modeling. Abstract The advent of techniques with the ability to scan massive changes in cellular makeup (genomics, proteomics, etc.) has revealed the compelling need for analytical methods to interpret and make sense of those changes. Computational models built on a sound physicochemical and mechanistic basis are indispensable when integrating, interpreting, and simulating high-throughput experimental data. Another powerful role of computational models is predicting new behavior, provided they are adequately validated. Mitochondrial energy transduction has been traditionally studied with thermodynamic models. More recently, kinetic or thermokinetic models have been proposed, leading the way toward an understanding of the control and regulation of mitochondrial energy metabolism and its interaction with cytoplasmic and other compartments. In this work, we outline, step by step, the methods that should be followed to build a computational model of mitochondrial energetics in isolation or integrated into a network of cellular processes. Depending on the question addressed by the modeler, the methodology explained herein can be applied with different levels of detail, from the mitochondrial energy-producing machinery in a network of cellular processes to the dynamics of a single enzyme during its catalytic cycle. Abstract The voltage and time dependence of ion channels can be regulated, notably by phosphorylation, interaction with phospholipids, and binding to auxiliary subunits. Many parameter variation studies have set conductance densities free while leaving kinetic channel properties fixed, as the experimental constraints on the latter are usually better than on the former.
Because individual cells can tightly regulate their ion channel properties, we suggest that kinetic parameters may be profitably set free during model optimization in order both to improve matches to data and to refine kinetic parameters. To this end, we analyzed the parameter optimization of reduced models of three electrophysiologically characterized and morphologically reconstructed globus pallidus neurons. We performed two automated searches with different types of free parameters. First, conductance density parameters were set free. Even the best resulting models exhibited unavoidable problems, which were due to limitations in our channel kinetics. We next set channel kinetics free for the optimized density matches and obtained significantly improved model performance. Some kinetic parameters consistently shifted to similar new values in multiple runs across three models, suggesting the possibility of tailored improvements to channel models. These results suggest that optimized channel kinetics can improve model matches to experimental voltage traces, particularly for channels characterized under different experimental conditions than the recorded data to be matched by a model. The resulting shifts in channel kinetics from the original template provide valuable guidance for future experimental efforts to determine the detailed kinetics of channel isoforms and possible modulated states in particular types of neurons. Abstract Electrical synapses continuously transfer signals bidirectionally from one cell to another, directly or indirectly via intermediate cells. Electrical synapses are common in many brain structures such as the inferior olive, the subcoeruleus nucleus and the neocortex, between neurons and between glial cells. In the cortex, interneurons have been shown to be electrically coupled and proposed to participate in large, continuous cortical syncytia, as opposed to smaller spatial domains of electrically coupled cells.
However, to explore the significance of these findings it is imperative to map the electrical synaptic microcircuits, in analogy with in vitro studies on monosynaptic and disynaptic chemical coupling. Since “walking” from cell to cell over large distances with a glass pipette is challenging, microinjection of (fluorescent) dyes diffusing through gap junctions remains so far the only method available to decipher such microcircuits, even though technical limitations exist. Based on circuit theory, we derive analytical descriptions of the AC electrical coupling in networks of isopotential cells. We then suggest an operative electrophysiological protocol to distinguish between direct electrical connections and connections involving one or more intermediate cells. This method allows inferring the number of intermediate cells, generalizing the conventional coupling coefficient, which provides limited information. We validate our method through computer simulations, theoretical and numerical methods, and electrophysiological paired recordings. Abstract Because electrical coupling among the neurons of the brain is much faster than chemical synaptic coupling, it is natural to hypothesize that gap junctions may play a crucial role in mechanisms underlying very fast oscillations (VFOs), i.e., oscillations at more than 80 Hz. There is now a substantial body of experimental and modeling literature supporting this hypothesis. A series of modeling papers, starting with work by Roger Traub and collaborators, have suggested that VFOs may arise from expanding waves propagating through an “axonal plexus”, a large random network of electrically coupled axons. Traub et al. also proposed a cellular automaton (CA) model to study the mechanisms of VFOs in the axonal plexus. In this model, the expanding waves take the appearance of topologically circular “target patterns”. Random external stimuli initiate each wave. We therefore call this kind of VFO “externally driven”.
Using a computational model, we show that an axonal plexus can also exhibit a second, distinctly different kind of VFO in a wide parameter range. These VFOs arise from activity propagating around cycles in the network. Once triggered, they persist without any source of excitation. With idealized, regular connectivity, they take the appearance of spiral waves. We call these VFOs “reentrant”. The behavior of the axonal plexus depends on the reliability with which action potentials propagate from one axon to the next, which, in turn, depends on the somatic membrane potential V_s and the gap junction conductance g_gj. To study these dependencies, we impose a fixed value of V_s, then study the effects of varying V_s and g_gj. Not surprisingly, propagation becomes more reliable with rising V_s and g_gj. Externally driven VFOs occur when V_s and g_gj are so high that propagation never fails. For lower V_s or g_gj, propagation is nearly reliable, but fails in rare circumstances. Surprisingly, the parameter regime where this occurs is fairly large. Even a single propagation failure can trigger reentrant VFOs in this regime. Lowering V_s and g_gj further, one finds a third parameter regime in which propagation is unreliable, and no VFOs arise. We analyze these three parameter regimes by means of computations using model networks adapted from Traub et al., as well as much smaller model networks. Abstract Research with barn owls suggested that sound source location is represented topographically in the brain by an array of neurons each tuned to a narrow range of locations. However, research with small-headed mammals has offered an alternative view in which location is represented by the balance of activity in two opponent channels broadly tuned to the left and right auditory space. Both channels may be present in each auditory cortex, although the channel representing contralateral space may be dominant.
Recent studies have suggested that opponent channel coding of space may also apply in humans, although these studies have used a restricted set of spatial cues or probed a restricted set of spatial locations, and there have been contradictory reports as to the relative dominance of the ipsilateral and contralateral channels in each cortex. The current study used electroencephalography (EEG) in conjunction with sound field stimulus presentation to address these issues and to inform the development of an explicit computational model of human sound source localization. Neural responses were compatible with the opponent channel account of sound source localization and with contralateral channel dominance in the left, but not the right, auditory cortex. A computational opponent channel model reproduced every important aspect of the EEG data and allowed inferences about the width of tuning in the spatial channels. Moreover, the model predicted the oft-reported decrease in spatial acuity measured psychophysically with increasing reference azimuth. Predictions of spatial acuity closely matched those measured psychophysically by previous authors. Abstract Calretinin is thought to be the main endogenous calcium buffer in cerebellar granule cells (GrCs). However, little is known about the impact of cooperative Ca2+ binding to calretinin on highly localized and more global (regional) Ca2+ signals in these cells. Using numerical simulations, we show that an essential property of calretinin is a delayed equilibration with Ca2+. Therefore, the amount of Ca2+ that calretinin can accumulate with respect to equilibrium levels depends on stimulus conditions.
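Delayed equilibration of this kind already appears in a much simpler, single-site buffer model. The sketch below uses assumed, calretinin-like slow binding rates and is not the paper's cooperative model; it shows that during a brief Ca2+ step the bound fraction falls well short of its equilibrium value.

```python
def buffer_response(ca_free, kon, koff, b_total, t_end, dt=1e-5):
    """Forward-Euler integration of d[CaB]/dt = kon*Ca*[B] - koff*[CaB]
    for a step of free Ca held constant; returns bound fraction at t_end."""
    cab = 0.0
    for _ in range(int(t_end / dt)):
        cab += dt * (kon * ca_free * (b_total - cab) - koff * cab)
    return cab / b_total

kon, koff = 1.0e6, 10.0        # 1/(M*s), 1/s -- assumed slow-buffer rates
ca = 1.0e-6                     # 1 uM free Ca step
eq = ca / (ca + koff / kon)     # equilibrium bound fraction, Ca/(Ca + Kd)

early = buffer_response(ca, kon, koff, 1e-3, t_end=0.01)  # after 10 ms
late = buffer_response(ca, kon, koff, 1e-3, t_end=2.0)    # after 2 s
```

With these rates the relaxation time is 1/(kon*Ca + koff), roughly 90 ms, so `early` is only about a tenth of `eq` while `late` has essentially equilibrated.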
Based on our simulations of buffered Ca2+ diffusion near a single Ca2+ channel or a large cluster of Ca2+ channels, and previous experimental findings that 150 μM 1,2-bis(o-aminophenoxy)ethane-N,N,N′,N′-tetraacetic acid (BAPTA) and endogenous calretinin have similar effects on GrC excitability, we estimated the concentration of mobile calretinin in GrCs to be in the range of 0.7–1.2 mM. Our results suggest that this estimate can provide a starting point for further analysis. We find that calretinin prominently reduces the action potential-associated increase in cytosolic free Ca2+ concentration ([Ca2+]i) even at a distance of 30 nm from a single Ca2+ channel. In spite of a buildup of residual Ca2+, it maintains almost constant maximal [Ca2+]i levels during repetitive channel openings at frequencies below 80 Hz. This occurs because of accelerated Ca2+ binding as calretinin binds more Ca2+. Unlike the buffering of high Ca2+ levels within Ca2+ nano/microdomains sensed by large-conductance Ca2+-activated K+ channels, the buffering of regional Ca2+ signals by calretinin cannot be mimicked by any single concentration of BAPTA across different experimental conditions. Abstract The field of Computational Systems Neurobiology is maturing quickly. If one wants it to fulfil its central role in the new Integrative Neurobiology, the reuse of quantitative models needs to be facilitated. The community has to develop standards and guidelines in order to maximise the diffusion of its scientific production, but also to render it more trustworthy. In recent years, various projects have tackled the problems of the syntax and semantics of quantitative models. More recently the international initiative BioModels.net launched three projects: (1) MIRIAM is a standard for curating and annotating models, in order to facilitate their reuse.
(2) The Systems Biology Ontology is a set of controlled vocabularies intended to be used in conjunction with models, in order to characterise their components. (3) BioModels Database is a resource that allows biologists to store, search and retrieve published mathematical models of biological interest. We expect that those resources, together with the use of formal languages such as SBML, will support the fruitful exchange and reuse of quantitative models. Abstract Understanding the direction and quantity of information flowing in neuronal networks is a fundamental problem in neuroscience. Brains and neuronal networks must at the same time store information about the world and react to information in the world. We sought to measure how the activity of the network alters information flow from inputs to output patterns. Using neocortical column neuronal network simulations, we demonstrated that networks with greater internal connectivity reduced input/output correlations from excitatory synapses and decreased negative correlations from inhibitory synapses, as measured by Kendall’s τ correlation. Both of these changes were associated with a reduction in information flow, measured by normalized transfer entropy (nTE). Information handling by the network reflected the degree of internal connectivity. With no internal connectivity, the feedforward network transformed inputs through nonlinear summation and thresholding. With greater connectivity strength, the recurrent network translated activity and information due to the contribution of activity from intrinsic network dynamics. This dynamic contribution amounts to added information drawn from that stored in the network. At still higher internal synaptic strength, the network corrupted the external information, producing a state where little external information came through.
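Normalized transfer entropy for binary sequences can be estimated with a simple histogram estimator. This is a generic sketch of the nTE definition, not the exact estimator used in the study above: TE(X→Y) = Σ p(y1,y0,x0) log2[p(y1|y0,x0)/p(y1|y0)], normalized by H(Y1|Y0) so that 0 ≤ nTE ≤ 1.

```python
import math
import random
from collections import Counter

def nte(x, y):
    """Histogram estimator of normalized transfer entropy from x to y
    (binary sequences, history length 1)."""
    n = len(x) - 1
    triples = Counter((y[t + 1], y[t], x[t]) for t in range(n))
    pairs_yx = Counter((y[t], x[t]) for t in range(n))
    pairs_yy = Counter((y[t + 1], y[t]) for t in range(n))
    singles_y = Counter(y[t] for t in range(n))

    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_y1_given_y0x0 = c / pairs_yx[(y0, x0)]
        p_y1_given_y0 = pairs_yy[(y1, y0)] / singles_y[y0]
        te += (c / n) * math.log2(p_y1_given_y0x0 / p_y1_given_y0)

    # Normalize by the conditional entropy H(Y1 | Y0).
    h = -sum((c / n) * math.log2(c / singles_y[y0])
             for (y1, y0), c in pairs_yy.items())
    return te / h if h > 0 else 0.0

random.seed(0)
x = [random.randint(0, 1) for _ in range(5000)]
y_copy = [0] + x[:-1]                               # y is a one-step copy of x
y_ind = [random.randint(0, 1) for _ in range(5000)]  # y independent of x

nte_copy = nte(x, y_copy)  # full transfer: close to 1
nte_ind = nte(x, y_ind)    # no transfer: close to 0
```

For the deterministic copy, Y1 is fully determined by X0, so TE equals H(Y1|Y0) and the normalized value is 1; for independent sequences only a small positive estimation bias remains.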
The association of increased information retrieved from the network with increased gamma power supports the notion of gamma oscillations playing a role in information processing. Abstract Intracellular Ca2+ concentrations play a crucial role in the physiological interaction between Ca2+ channels and Ca2+-activated K+ channels. The commonly used model, a Ca2+ pool with a short relaxation time, fails to simulate interactions occurring at multiple time scales. On the other hand, detailed computational models including various Ca2+ buffers and pumps can result in large computational cost due to radial diffusion in large compartments, which may be undesirable when simulating morphologically detailed Purkinje cell models. We present a method using a compensating mechanism to replace radial diffusion and compare the dynamics of different Ca2+ buffering models during generation of a dendritic Ca2+ spike in a single-compartment model of a Purkinje cell dendritic segment with Ca2+ channels of P- and T-type and Ca2+-activated K+ channels of BK- and SK-type. The Ca2+ dynamics models used are (1) a single Ca2+ pool; (2) two Ca2+ pools, respectively, for the fast and slow transients; (3) detailed Ca2+ dynamics with buffers, pump, and diffusion; and (4) detailed Ca2+ dynamics with buffers, pump, and diffusion compensation. Our results show that detailed Ca2+ dynamics models have significantly better control over Ca2+-activated K+ channels and lead to physiologically more realistic simulations of Ca2+ spikes and bursting. Furthermore, the compensating mechanism largely eliminates the effect of removing diffusion from the model on Ca2+ dynamics over multiple time scales. Abstract This paper describes the capabilities of DISCO, an extensible approach that supports integrative Web-based information dissemination.
DISCO is a component of the Neuroscience Information Framework (NIF), an NIH Neuroscience Blueprint initiative that facilitates integrated access to diverse neuroscience resources via the Internet. DISCO facilitates the automated maintenance of several distinct capabilities using a collection of files 1) that are maintained locally by the developers of participating neuroscience resources and 2) that are “harvested” on a regular basis by a central DISCO server. This approach allows central NIF capabilities to be updated as each resource’s content changes over time. DISCO currently supports the following capabilities: 1) resource descriptions, 2) “LinkOut” to a resource’s data items from NCBI Entrez resources such as PubMed, 3) Web-based interoperation with a resource, 4) sharing a resource’s lexicon and ontology, 5) sharing a resource’s database schema, and 6) participation by the resource in neuroscience-related RSS news dissemination. The developers of a resource are free to choose which DISCO capabilities their resource will participate in. Although DISCO is used by NIF to facilitate neuroscience data integration, its capabilities have general applicability to other areas of research. BioModels Database: An enhanced, curated and annotated resource for published quantitative kinetic models BMC Systems Biology Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means of facilitating interoperability among simulators in computational neuroscience.
Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. Programming language: Perl, BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI BLAST package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free.
Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system.
The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org).
We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell includes organelles and intracellular solutions, and it is separated from the extracellular fluid surrounding it by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charges and concentrations give rise, respectively, to electrical and chemical potentials, driving the transport of materials across the membrane. Here we look at the core elements of mathematical modeling of the dynamic behaviors of single cells, as well as the bases of numerical simulation. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program with which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command line interface to manipulate large-scale models based on Python, which is a powerful scripting language.
ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases have started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method and make available computer scripts which automate the process of citation entry. We use an open citation project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article.
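The retrieval step can be illustrated with NCBI's E-utilities. The authors' tools are Perl scripts built on the PARSER module; the Python fragment below is only a sketch of the idea, composing an ESearch query from already-parsed citation fields (the example citation values are arbitrary):

```python
from urllib.parse import urlencode

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def esearch_url(journal, year, volume, first_page):
    """Compose a PubMed ESearch URL from parsed citation fields, using
    PubMed's field tags: [ta] journal title abbreviation, [dp] date of
    publication, [vi] volume, [pg] first page."""
    term = (f"{journal}[ta] AND {year}[dp] AND "
            f"{volume}[vi] AND {first_page}[pg]")
    return EUTILS + "?" + urlencode({"db": "pubmed", "term": term})

# Arbitrary example citation fields:
url = esearch_url("J Comput Neurosci", 2004, 17, 7)
# The XML returned for this URL would contain the matching PMID, which
# an EFetch request could then use to pull the validated citation fields.
```

Resolving the citation to a PMID first, then fetching the canonical record, is what makes the stored reference "validated" rather than a transcription of free text.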
Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. In order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain (${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front-ends requires considerable bioinformatics skills.
Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or fail to explain) existing empirical results and make testable predictions.
The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp. 741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al.
(Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu. Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and build on them for further research. Its use can aid the field of computational neuroscience to enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABAA receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties.
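Paired-pulse inhibition in such field recordings is conventionally quantified as the amplitude ratio of the second population spike to the first; a minimal sketch of that measure (illustrative only, not code from the study):

```python
def paired_pulse_ratio(first_spike_amplitude: float,
                       second_spike_amplitude: float) -> float:
    """Return PS2/PS1; values below 1 indicate net paired-pulse inhibition."""
    if first_spike_amplitude == 0:
        raise ValueError("first population-spike amplitude must be nonzero")
    return second_spike_amplitude / first_spike_amplitude

# A second population spike half the size of the first gives a ratio of 0.5,
# i.e. pronounced paired-pulse inhibition.
print(paired_pulse_ratio(2.0, 1.0))  # 0.5
```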
Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in GABAA reversal potential (EGABA) and by alterations in intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide a high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and of observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology.
Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic type (pyramidal cell dendrites and olivary axonal projection terminals), and labeling method (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions coming to the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductivities of their dendritic membrane. This work is a continuation of our previous studies that showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand.
This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in Biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling.
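As a toy illustration of the compartmental idea (not a model from this chapter), two passive compartments coupled by an axial conductance can be integrated with forward Euler; all parameter values here are invented for illustration:

```python
def simulate_two_compartments(i_inj=0.1, dt=0.01, steps=5000,
                              c_m=1.0, g_leak=0.05, e_leak=-65.0,
                              g_axial=0.2):
    """Forward-Euler integration of two passive compartments.

    Units are illustrative (nF, uS, mV, nA, ms). Current i_inj is injected
    into compartment 1; compartment 2 charges only via the axial coupling
    conductance g_axial, the discrete stand-in for cable coupling.
    """
    v1 = v2 = e_leak
    for _ in range(steps):
        i_axial = g_axial * (v1 - v2)          # axial current, 1 -> 2
        dv1 = (-g_leak * (v1 - e_leak) - i_axial + i_inj) / c_m
        dv2 = (-g_leak * (v2 - e_leak) + i_axial) / c_m
        v1 += dt * dv1
        v2 += dt * dv2
    return v1, v2

v1, v2 = simulate_two_compartments()
# Near steady state both compartments sit above rest, the injected
# compartment more depolarized than its neighbor.
assert v1 > v2 > -65.0
```

Real multicompartmental simulators solve the same coupled system implicitly for stability and with many more compartments per neurite.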
Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS-US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network.
The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. On mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses uniformly distributed along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and function of effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization in general and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron upon different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals decreasing with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials.
Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of output discharge patterns of different complexities, including stochastic ones. A relatively short length of the apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites, and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in the processes of formation of the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated.
The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using non-homogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of any involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in Computational Neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org.
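Morphologies in repositories such as NeuroMorpho.org are commonly exchanged in the SWC format, whose data lines carry seven whitespace-separated columns (sample id, structure type, x, y, z, radius, parent id, with -1 marking the root); a minimal parser sketch:

```python
def parse_swc(text):
    """Parse SWC morphology text into a list of point dicts.

    Comment lines start with '#'. Each data line holds:
    id, structure type, x, y, z, radius, parent id (-1 for the root).
    """
    points = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        sid, stype, x, y, z, radius, parent = line.split()
        points.append({
            "id": int(sid), "type": int(stype),
            "xyz": (float(x), float(y), float(z)),
            "radius": float(radius), "parent": int(parent),
        })
    return points

sample = """# toy two-point morphology
1 1 0.0 0.0 0.0 5.0 -1
2 3 0.0 10.0 0.0 1.0 1
"""
pts = parse_swc(sample)
print(len(pts), pts[1]["parent"])  # 2 1
```

Tools like NeuGen or NEURON import such trees and turn the parent pointers into the branching-cylinder representation used for simulation.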
In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed data of morphologies when generating those cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators, such as the NEURON Simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new and available data can be incorporated into NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation.
New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently of the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data needs both standard data formats and public repositories. Standardization and a public storage system are also important for modeling due to the possibility of sharing models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks like numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor.
It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multi-electrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multi-electrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources, and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum.
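For context, the traditional CSD estimate that inverse methods are compared against approximates the source density by the negative, conductivity-scaled second spatial difference of the recorded potential; a one-dimensional sketch (not the iCSD method itself):

```python
def csd_second_difference(phi, h=1.0, sigma=1.0):
    """Classic 1D CSD estimate: -sigma * second spatial difference of phi.

    phi   : potentials sampled at equidistant electrode positions
    h     : inter-electrode spacing
    sigma : extracellular conductivity (scale factor)
    Returns estimates at interior electrodes only; the lost boundary
    values are one of the shortcomings that iCSD methods address.
    """
    return [-sigma * (phi[i + 1] - 2.0 * phi[i] + phi[i - 1]) / h ** 2
            for i in range(1, len(phi) - 1)]

# A potential dip at the middle electrode yields the most negative CSD
# value there, conventionally interpreted as a current sink.
print(csd_second_difference([0.0, 0.0, -1.0, 0.0, 0.0]))  # [1.0, -2.0, 1.0]
```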
The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multi-electrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. However, the range of current strength needed for a depolarization block could easily be reached with the random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents both on an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block. We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, impeding conventional analysis methods from keeping pace.
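A common first step in taming such data volumes is to reduce each raw trace to a handful of scalar features, e.g. spike count and firing rate, before any further analysis; an illustrative sketch (hypothetical names, not any toolbox's actual API):

```python
def extract_features(trace, dt_ms, threshold=0.0):
    """Reduce a membrane-potential trace to a small feature record.

    A spike is counted at each upward crossing of `threshold` (mV).
    dt_ms is the sampling interval in milliseconds.
    """
    spikes = sum(
        1 for a, b in zip(trace, trace[1:]) if a < threshold <= b
    )
    duration_s = len(trace) * dt_ms / 1000.0
    return {"n_spikes": spikes,
            "rate_hz": spikes / duration_s if duration_s > 0 else 0.0}

# Two upward threshold crossings in a 4-sample, 1 ms-per-sample toy trace.
print(extract_features([-70.0, 10.0, -70.0, 10.0], dt_ms=1.0))
```

Records like these are what make the relational-database treatment of electrophysiology data compact enough to query and transform at scale.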
Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable. PANDORA is available to be freely used and modified because it is open-source ( http://software.incf.org/software/pandora/home ). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then we examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop. Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics.
First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test the sufficiency of multiple, independent spike initiation zones in the C-interneuron to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made to understand cerebellar cortical processing, but neural coding at the output stage of the cerebellum, in the deep cerebellar nuclei (DCN), remains poorly understood. The DCN are unlikely to be just a relay nucleus, because Purkinje cell inhibition has to be turned into an excitatory output signal and DCN neurons exhibit complex intrinsic properties. In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals.
Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. We obtained a good match of our simulations with DCN current clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow deinactivation of rebound currents. Fast rebound bursts due to the T-type calcium current and slow rebounds due to the persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by the HCN current. Our findings suggest that the active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior.
This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation channel gating, the mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing, with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform. Abstract No Abstract Available Abstract We have developed a simulation tool within the NEURON simulator to assist in the organization, verification, and analysis of simulations. This tool, named the Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single-cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB. Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems.
This is similar to how single cells function as systems of protein networks, where, for example, ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from how it behaves alone. The dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. Interactions are made through different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts to produce heartbeats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics representing neural information. In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of the limbs can be performed. This chapter illustrates several mathematical techniques through examples from the modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data. Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation. 
For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice that of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits as expected in mature animals. Predictions were made for passive-cell current clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting becomes unnecessary. Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron. 
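The three response statistics compared above (peak amplitude, time-to-peak, and half-height width) can be computed from a simulated trace roughly as follows; the alpha-function EPSP and its parameters are illustrative assumptions, not values from the study:

```python
import numpy as np

def response_stats(t, v):
    """Peak amplitude, time-to-peak, and half-height width of a synaptic
    response.  Baseline is taken as v[0]; illustrative sketch only."""
    amp = v - v[0]
    peak = amp.max()
    t_peak = t[amp.argmax()]
    above = np.where(amp >= peak / 2.0)[0]   # samples at or above half-height
    width = t[above[-1]] - t[above[0]]
    return peak, t_peak, width

# Alpha-function EPSP as a stand-in for a simulated somatic recording:
# peaks 2 mV above rest at t = tau.
t = np.linspace(0.0, 50.0, 5001)                      # ms
tau = 3.0
v = -65.0 + 2.0 * (t / tau) * np.exp(1.0 - t / tau)   # mV

peak, t_peak, width = response_stats(t, v)
```

Running the same function over many simulated and recorded traces yields the means and variabilities that the study contrasts.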
Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials. Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet the precise input-output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites. We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature. 
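The fixed-kernel view described above, in which the subthreshold membrane potential is presynaptic activity filtered by a leaky-integrator kernel, can be sketched as a simple convolution; the time step, time constant, and spike probability are illustrative assumptions:

```python
import numpy as np

# Subthreshold potential as pooled presynaptic spiking filtered with a
# fixed exponential (leaky-integrator) kernel, as described above.
# All parameter values are illustrative assumptions.
dt, tau = 0.1, 10.0                       # ms
t = np.arange(0.0, 200.0, dt)
rng = np.random.default_rng(0)
spikes = rng.random(t.size) < 0.02        # pooled presynaptic spike train

kernel = np.exp(-np.arange(0.0, 5 * tau, dt) / tau)      # fixed PSP kernel
v = np.convolve(spikes.astype(float), kernel)[: t.size]  # filtered activity
```

Because the kernel is fixed and linear, cumulants of the filtered trace can be related back to cumulants of the pooled presynaptic spike count, which is what makes the CuBIC-style inference possible.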
To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns, and the corresponding firing frequency input-output function was logarithmic. The regular/irregular nature of the input synaptic intervals was only reflected in the regularity of output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input-output relationships. Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input-output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats. 
In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database http://www.ebi.ac.uk/biomodels/ is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems and to study the clustering of models based upon their annotations. Model deposition to the database today is advised by several publishers of scientific journals. The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. 
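The controlled-vocabulary annotation described above can be illustrated with identifiers.org-style cross-references; the model-element names and term mappings below are hypothetical examples, not entries from a particular BioModels record:

```python
# Sketch of MIRIAM-style cross-referencing of model elements, in the spirit
# of BioModels Database annotation: each element is linked to a controlled-
# vocabulary term via a resolvable identifiers.org URI.  The element names
# below are invented for illustration.
def annotation_uri(curie: str) -> str:
    """Build an identifiers.org URI from a compact identifier (CURIE)."""
    return f"https://identifiers.org/{curie}"

annotations = {
    "species_glucose": annotation_uri("CHEBI:17234"),          # glucose
    "reaction_phosphorylation": annotation_uri("GO:0016310"),  # phosphorylation
}
```

Annotating every species and reaction this way is what allows models from different sources to be compared, merged, and clustered unambiguously.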
MorphML: Level 1 of the NeuroML Standards for Neuronal Morphology Data and Model Specification Neuroinformatics Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. 
Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. Programming language: Perl/BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI Blast package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. 
Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past 2 decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to create dynamically its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. 
Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell includes organelles and intracellular solutions, and it is separated from the outer environment of extracellular fluid surrounding the cell by its cell membrane (plasma membrane), generating differences in concentrations of ions and molecules, including enzymes. The differences in ionic charges and concentrations cause, respectively, electrical and chemical potentials, driving the transport of materials across the membrane. Here we look at the core mathematical modeling associated with the dynamic behaviors of single cells, as well as the bases of numerical simulations. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. 
This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program on which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command line interface, based on Python, a powerful scripting language, for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results be reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. 
In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method and make available computer scripts which automate the process of citation entry. We use an open citation project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. For, in order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain ( ${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. 
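The time-domain (Balanced Truncation) step mentioned above can be sketched for a generic stable linear system; the toy two-state system below stands in for a linearized quasi-active cable, whose actual matrices are an assumption here:

```python
import numpy as np
from scipy.linalg import cholesky, solve_continuous_lyapunov, svd

def balanced_truncation(A, B, C, k):
    """Square-root balanced truncation of a stable LTI system (A, B, C)
    to order k.  Minimal sketch only; a quasi-active neuron model would
    supply A, B, C from its linearized compartmental equations."""
    P = solve_continuous_lyapunov(A, -B @ B.T)    # controllability Gramian
    Q = solve_continuous_lyapunov(A.T, -C.T @ C)  # observability Gramian
    Lp = cholesky(P, lower=True)
    Lq = cholesky(Q, lower=True)
    U, hsv, Vt = svd(Lq.T @ Lp)                   # hsv: Hankel singular values
    T = Lp @ Vt[:k].T / np.sqrt(hsv[:k])          # right projection basis
    W = Lq @ U[:, :k] / np.sqrt(hsv[:k])          # left projection basis
    return W.T @ A @ T, W.T @ B, C @ T, hsv

# Toy 2-state "cable": keep only the dominant state.
A = np.array([[-1.0, 0.5], [0.5, -2.0]])
B = np.array([[1.0], [0.0]])
C = np.array([[0.0, 1.0]])
Ar, Br, Cr, hsv = balanced_truncation(A, B, C, 1)
```

The Hankel singular values in hsv indicate how many states are worth keeping: the classical bound guarantees the reduced transfer function deviates by at most twice the sum of the discarded values.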
We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front ends requires strong bioinformatic skills. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques become increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. 
We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or not) existing empirical results and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. 
Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib ( The Handbook of Brain Theory and Neural Networks , pp. 741–745, 2003); Arbib and Grethe ( Computing the Brain: A Guide to Neuroinformatics , 2001); Ascoli ( Computational Neuroanatomy: Principles and Methods , 2002); Bower and Bolouri ( Computational Modeling of Genetic and Biochemical Networks , 2001); Hines et al. ( J. Comput. Neurosci. 17 , 7–11, 2004); Shepherd et al. ( Trends Neurosci. 21 , 460–468, 1998); Sivakumaran et al. ( Bioinformatics 19 , 408–415, 2003); Smolen et al. ( Neuron 26 , 567–580, 2000); Vadigepalli et al. ( OMICS 7 , 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv ( J. Neurophysiol. 71 , 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu. 
Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and build on them for further research. Its use can aid the field of computational neuroscience to enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA-A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA-A reversal potential (E_GABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. 
Ideally, such interfaces should provide high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic type (pyramidal cell dendrites and olivary axonal projection terminals), and labeling method (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. 
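Reconstruction files of the kind described above represent a neuron as a tree of tapering cylinders; the widely used SWC convention (one node per line: id, type, x, y, z, radius, parent) makes simple morphometrics easy to script. The toy morphology below is invented for illustration and is not from either study:

```python
import math

# A tiny tree of branching cylinders in SWC-style columns:
# id, type, x, y, z, radius, parent (-1 marks the root).
SWC = """
1 1 0 0 0 5.0 -1
2 3 10 0 0 1.0 1
3 3 20 5 0 0.8 2
4 3 20 -5 0 0.8 2
"""

nodes = {}
for line in SWC.strip().splitlines():
    i, typ, x, y, z, r, parent = line.split()
    nodes[int(i)] = (float(x), float(y), float(z), float(r), int(parent))

# Total neurite length: sum of distances from each node to its parent.
total = 0.0
for x, y, z, r, parent in nodes.values():
    if parent != -1:
        px, py, pz, *_ = nodes[parent]
        total += math.dist((x, y, z), (px, py, pz))
```

Comparing measures like this total length (or branch counts, diameters, and angles) across two reconstruction systems is essentially what the Neuron_Morpho-versus-Neurolucida test above does at scale.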
None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions coming to the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductivities of their dendritic membrane. This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodical impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by their spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. 
These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single-neuron models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. 
One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS-US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive, biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. Using mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses uniformly distributed along the dendrites. 
For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and effectiveness of somatopetal transmission of current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization as a whole and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals decreasing with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two patterns of activity, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of patterns of output discharges of different complexities, including stochastic ones. The relatively short length of the apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets. 
As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites, and a reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in the formation of the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using nonhomogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of involvement in spatial tasks. 
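The PC/IC building block just described can be sketched as a pair of leaky integrate-and-fire neurons with conductance synapses, driven by a theta-modulated inhomogeneous Poisson input. The parameter values, weights, and update scheme below are illustrative choices of ours, not those of the published model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not the published values)
dt = 1e-4                      # time step (s)
T = 1.0                        # simulated duration (s)
n = int(T / dt)
t = np.arange(n) * dt
theta_f = 8.0                  # theta frequency (Hz)
tau_m, tau_syn = 20e-3, 5e-3   # membrane / synaptic time constants (s)
v_rest, v_th, v_reset = -65e-3, -50e-3, -65e-3
e_exc, e_inh = 0.0, -75e-3     # synaptic reversal potentials (V)

# Theta-modulated EC drive, sampled as an inhomogeneous Poisson train
ec_rate = 200.0 * (1 + np.cos(2 * np.pi * theta_f * t)) / 2   # spikes/s
ec_spikes = rng.random(n) < ec_rate * dt

def lif(exc_spikes, g_inh, w_exc=2.0):
    """Leaky integrate-and-fire neuron with conductance-based synapses.
    Conductances are expressed relative to the leak conductance."""
    v = v_rest
    g_exc = 0.0
    out = np.zeros(n, dtype=bool)
    for i in range(n):
        # exponentially decaying excitatory conductance, incremented per spike
        g_exc = g_exc * np.exp(-dt / tau_syn) + w_exc * exc_spikes[i]
        dv = (-(v - v_rest) - g_exc * (v - e_exc)
              - g_inh[i] * (v - e_inh)) * dt / tau_m
        v += dv
        if v > v_th:
            out[i] = True
            v = v_reset
    return out

# IC: driven by the EC input alone; PC: same EC input plus IC inhibition
ic_spikes = lif(ec_spikes, g_inh=np.zeros(n))
kernel = np.exp(-np.arange(0, 5 * tau_syn, dt) / tau_syn)
g_inh_pc = np.convolve(ic_spikes.astype(float), kernel)[:n]
pc_spikes = lif(ec_spikes, g_inh=g_inh_pc)
```

In the published model both cells would additionally receive space-modulated drive as the animal enters the place field, and phase precession would be read out from the phase of PC spikes relative to the theta cycle; the fragment above only shows the coupled PC/IC mechanics.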
Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in computational neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org. In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed data of morphologies when generating cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators such as the NEURON Simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new and available data can be included into NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. 
Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which will subsequently continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising recent developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. 
However, sharing such large amounts of frequently heterogeneous and distributed experimental data requires both standard data formats and public repositories. Standardization and a public storage system are also important for modeling, because they make it possible to share models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks such as numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed, and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. 
However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. 
However, the range of current strength needed for a depolarization block could easily be reached with random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block. We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, impeding conventional analysis methods from keeping pace. Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting their interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed, and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable. 
PANDORA is available to be freely used and modified because it is open source (http://software.incf.org/software/pandora/home). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then we examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop. Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics. First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test the sufficiency of multiple, independent spike initiation zones in the C-interneuron to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. 
Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made in understanding cerebellar cortical processing, but neural coding at the output stage of the cerebellum, in the deep cerebellar nuclei (DCN), remains poorly understood. The DCN are unlikely to be just a relay nucleus, because Purkinje cell inhibition has to be turned into an excitatory output signal and DCN neurons exhibit complex intrinsic properties. In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals. Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation, we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. We obtained a good match between our simulations and the DCN current-clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow deinactivation of rebound currents. 
Fast rebound bursts due to T-type calcium current and slow rebounds due to persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by HCN current. Our findings suggest that the active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior. This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation of channel gating, mechanisms for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing, with the ultimate goal of creating a modular framework that will help researchers across different disciplines to collaborate effectively using a modern neural simulation platform. 
Abstract We have developed a simulation tool within the NEURON simulator to assist in the organization, verification, and analysis of simulations. This tool, denominated Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single-cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB. Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems. This is similar to how single cells function as systems of protein networks, where, for example, the ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from how it behaves alone. The dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. Interactions are made by different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts mechanically to produce heart beats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics representing neural information. 
In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of the limbs can be performed. This chapter illustrates several mathematical techniques through examples from the modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data. Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation. For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice that of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits, as expected in mature animals. 
Predictions were made for passive-cell current clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting is not required. Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron. Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials. 
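The core idea above, treating the membrane potential as presynaptic population activity filtered with a fixed kernel, can be sketched numerically. The fragment below (illustrative parameters of ours, not the authors' implementation) builds a population spike count with and without injected synchronous events, filters it with an exponential kernel as a leaky integrator would, and shows that correlation inflates the fluctuations of the resulting "membrane potential": the statistical signature that cumulant-based inference such as CuBIC exploits.

```python
import numpy as np

rng = np.random.default_rng(1)

dt = 1e-3          # 1 ms bins
n_steps = 5_000
n_neurons = 500
rate = 5.0         # background rate (Hz) per presynaptic neuron

def population_spikes(correlated):
    """Presynaptic spike matrix; optionally inject rare synchronous
    events, which create higher-order correlations across the population."""
    s = rng.random((n_neurons, n_steps)) < rate * dt
    if correlated:
        events = rng.random(n_steps) < 2e-3                  # ~10 events
        s |= events[None, :] & (rng.random((n_neurons, n_steps)) < 0.3)
    return s

def membrane_potential(spike_matrix, tau=10e-3):
    """Subthreshold potential = population spike count filtered with a
    fixed exponential kernel (leaky-integrator neuron), arbitrary units."""
    count = spike_matrix.sum(axis=0).astype(float)
    kernel = np.exp(-np.arange(0, 5 * tau, dt) / tau)
    return np.convolve(count, kernel)[:n_steps] * dt

v_corr = membrane_potential(population_spikes(correlated=True))
v_indep = membrane_potential(population_spikes(correlated=False))

# Correlated input inflates the variance of V relative to the
# independent-Poisson case.
print(v_corr.var() > v_indep.var())
```

The published method goes further and infers the maximal order of correlation from cumulants of V; this sketch only demonstrates why the fluctuations of V carry that information at all.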
Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet, the precise input-output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites; CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites; and the entorhinal perforant pathway on distal apical dendrites. We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature. To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns, and the corresponding firing frequency input-output function was logarithmic. The regular/irregular nature of the input synaptic intervals was reflected only in the regularity of output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input-output relationships. 
Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input-output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats. In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database (http://www.ebi.ac.uk/biomodels/) is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. 
BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems, and to study the clustering of models based upon their annotations. Model deposition to the database today is advised by several publishers of scientific journals. The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive? Schema theory is a computational framework that allows the simulation of perceptuomotor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We present first its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions.
We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performances of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world-knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performances of agrammatic aphasics measured using sentence–picture matching tasks. This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Abstract Background The "inverse" problem is related to the determination of unknown causes on the basis of the observation of their effects. This is the opposite of the corresponding "direct" problem, which relates to the prediction of the effects generated by a complete description of some agencies. The solution of an inverse problem entails the construction of a mathematical model and starts from a number of experimental data. In this respect, inverse problems are often ill-conditioned, as the experimental conditions available are often insufficient to unambiguously solve the mathematical model. Several approaches to solving inverse problems are possible, both computational and experimental, some of which are mentioned in this article. In this work, we describe in detail an attempt to solve an inverse problem which arose in the study of an intracellular signaling pathway.
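The genetic-algorithm style of parameter estimation used for this kind of inverse problem can be sketched on a toy stand-in: fitting the two rate constants of a single mass-action ODE to synthetic "experimental" data. The model, population size, and mutation scheme below are arbitrary illustrations, not the pathway model from the article.

```python
import random

def simulate(k1, k2, x0=0.0, dt=0.01, steps=200):
    """Euler integration of the toy mass-action kinetics dx/dt = k1 - k2*x."""
    x, traj = x0, []
    for _ in range(steps):
        x += (k1 - k2 * x) * dt
        traj.append(x)
    return traj

def cost(params, data):
    """Sum-of-squares mismatch between simulated and target trajectories."""
    return sum((a - b) ** 2 for a, b in zip(simulate(*params), data))

random.seed(0)
target = simulate(2.0, 1.0)                # synthetic data from known parameters

# Truncation selection with Gaussian mutation; the elite survive unchanged.
pop = [(random.uniform(0, 5), random.uniform(0, 5)) for _ in range(40)]
for gen in range(60):
    pop.sort(key=lambda p: cost(p, target))
    elite = pop[:10]
    pop = elite + [(k1 + random.gauss(0, 0.2), k2 + random.gauss(0, 0.2))
                   for k1, k2 in elite * 3]
best = min(pop, key=lambda p: cost(p, target))
print(best)
```

Because the best individuals are carried over unmutated, the fitting error is monotonically non-increasing across generations; the recovered pair should land close to the generating values (2.0, 1.0).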
Results Using a genetic algorithm to find a suboptimal solution to the optimization problem, we have estimated a set of unknown parameters describing a kinetic model of a signaling pathway in the neuronal cell. The model is composed of mass-action ordinary differential equations, where the kinetic parameters describe protein–protein interactions, protein synthesis and degradation. The algorithm has been implemented on a parallel platform. Several potential solutions of the problem have been computed, each solution being a set of model parameters. A subset of parameters has been selected on the basis of their small coefficient of variation across the ensemble of solutions. Conclusion Despite the lack of sufficiently reliable and homogeneous experimental data, the genetic algorithm approach has allowed us to estimate the approximate values of a number of model parameters in a kinetic model of a signaling pathway; these parameters have been assessed to be relevant for the reproduction of the available experimental data. Abstract Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown. Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms.
First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axodendritic feedback on pyramidal cells is only effective in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase, the theta/gamma ratio. Moreover, electric fields differentially affect average theta frequency depending on specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. Abstract The brain is extraordinarily complex, containing 10^11 neurons linked with 10^14 connections. We can improve our understanding of individual neurons and neuronal networks by describing their behavior in mathematical and computational models. This chapter provides an introduction to neural modeling, laying the foundation for several basic models and surveying key topics. After some discussion of the motivations of modelers and the uses of neural models, we explore the properties of electrically excitable membranes. We describe in some detail the Hodgkin–Huxley model, the first neural model to describe biophysically the behavior of biological membranes. We explore how this model can be extended to describe a variety of excitable membrane behaviors, including axonal propagation, dendritic processing, and synaptic communication. This chapter also covers mathematical models that replicate basic neural behaviors through more abstract mechanisms.
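The Hodgkin–Huxley formalism mentioned above is compact enough to state in full. The following is a textbook-style single-compartment sketch with the classic squid-axon parameters and simple forward-Euler integration; it is an illustration, not the chapter's own code.

```python
import math

# Classic Hodgkin-Huxley squid-axon constants (mV, ms, mS/cm^2, uF/cm^2).
C, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3
E_Na, E_K, E_L = 50.0, -77.0, -54.4

# Voltage-dependent opening/closing rates for gating variables m, h, n.
def a_m(v): return 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
def b_m(v): return 4.0 * math.exp(-(v + 65.0) / 18.0)
def a_h(v): return 0.07 * math.exp(-(v + 65.0) / 20.0)
def b_h(v): return 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
def a_n(v): return 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
def b_n(v): return 0.125 * math.exp(-(v + 65.0) / 80.0)

def run(i_inj=10.0, t_max=50.0, dt=0.01):
    """Forward-Euler HH simulation; returns number of spikes (0 mV crossings)."""
    v, m, h, n = -65.0, 0.05, 0.6, 0.32       # resting state
    spikes, above = 0, False
    for _ in range(int(t_max / dt)):
        i_ion = (g_Na * m**3 * h * (v - E_Na) + g_K * n**4 * (v - E_K)
                 + g_L * (v - E_L))
        v += dt * (i_inj - i_ion) / C
        m += dt * (a_m(v) * (1 - m) - b_m(v) * m)
        h += dt * (a_h(v) * (1 - h) - b_h(v) * h)
        n += dt * (a_n(v) * (1 - n) - b_n(v) * n)
        if v > 0.0 and not above:              # upward zero-crossing = spike
            spikes += 1
        above = v > 0.0
    return spikes

print(run(10.0), run(0.0))
```

A sustained 10 µA/cm² injection drives repetitive firing, while with no injected current the membrane sits at rest — the two qualitative regimes any HH implementation should reproduce.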
We briefly explore efforts to extend single-neuron models to the network level and provide several examples of insights gained through this process. Finally, we list common resources, including modeling environments and repositories, that provide the guidance and parameter sets necessary to begin building neural models. Abstract We have developed a program, NeuroText, to populate the neuroscience databases in SenseLab (http://senselab.med.yale.edu/senselab) by mining the natural language text of neuroscience articles. NeuroText uses a two-step approach to identify relevant articles. The first step (preprocessing), aimed at 100% sensitivity, identifies abstracts containing database keywords. In the second step, potentially relevant abstracts identified in the first step are processed for specificity dictated by database architecture, and by neuroscience, lexical, and semantic contexts. NeuroText results were presented to the experts for validation using a dynamically generated interface that also allows expert-validated articles to be automatically deposited into the databases. Of the test set of 912 articles, 735 were rejected at the preprocessing step. For the remaining articles, the accuracy of predicting database-relevant articles was 85%. Twenty-two articles were erroneously identified. NeuroText deferred decisions on 29 articles to the expert. A comparison of NeuroText results versus the experts’ analyses revealed that the program failed to correctly identify articles’ relevance due to concepts that did not yet exist in the knowledge base or due to vaguely presented information in the abstracts. NeuroText uses two “evolution” techniques (supervised and unsupervised) that play an important role in the continual improvement of the retrieval results. Software that uses the NeuroText approach can facilitate the creation of curated, special-interest bibliography databases. Abstract Dendrites play an important role in neuronal function and connectivity.
This chapter introduces the first section of the book, focusing on the morphological features of dendritic tree structures and the role of dendritic trees in the circuit. We provide an overview of quantitative procedures for data collection, analysis, and modeling of dendrite shape. Our main focus lies on the description of morphological complexity and how one can use this description to unravel neuronal function in dendritic trees and neural circuits. Abstract The chapter is organised in two parts. In the first part, the focus is on a combined power spectral and nonlinear behavioural analysis of a neural mass model of the thalamocortical circuitry. The objective is to study the effectiveness of such “multimodal” analytical techniques in model-based studies investigating the neural correlates of abnormal brain oscillations in Alzheimer’s disease (AD). The power spectral analysis presented here is a study of the “slowing” (decreasing dominant frequency of oscillation) within the alpha frequency band (8–13 Hz), a hallmark of electroencephalogram (EEG) dynamics in AD. Analysis of the nonlinear dynamical behaviour focuses on the bifurcating property of the model. The results show that the alpha rhythmic content is maximal in close proximity to the bifurcation point—an observation made possible by the “multimodal” approach adopted herein. Furthermore, a slowing in alpha rhythm is observed for increasing inhibitory connectivity—a consistent feature of our research into neuropathological oscillations associated with AD. In the second part, we present power spectral analysis of a model that implements multiple feedforward and feedback connectivities in the thalamo-cortico-thalamic circuitry, and is thus more advanced in terms of biological plausibility. This study looks at the effects of synaptic connectivity variation on the power spectra within the delta (1–3 Hz), theta (4–7 Hz), alpha (8–13 Hz) and beta (14–30 Hz) bands.
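Band-limited power of the kind analyzed in these studies can be computed directly from a discrete Fourier transform. The sketch below uses a synthetic two-tone signal rather than actual neural mass model output; the sampling rate and amplitudes are arbitrary illustrations.

```python
import math, cmath

def band_power(signal, fs, f_lo, f_hi):
    """Sum of squared DFT magnitudes over bins with frequency in [f_lo, f_hi]."""
    n = len(signal)
    total = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f <= f_hi:
            coef = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                       for t in range(n))
            total += abs(coef) ** 2
    return total

fs, n = 128.0, 256                              # 2 s of "EEG" sampled at 128 Hz
sig = [math.sin(2 * math.pi * 10.0 * t / fs)    # dominant alpha component (10 Hz)
       + 0.3 * math.sin(2 * math.pi * 20.0 * t / fs)  # weaker beta (20 Hz)
       for t in range(n)]
bands = {"delta": (1, 3), "theta": (4, 7), "alpha": (8, 13), "beta": (14, 30)}
powers = {name: band_power(sig, fs, lo, hi) for name, (lo, hi) in bands.items()}
print(powers)
```

The "slowing" reported in the abstract corresponds to this alpha-band power shifting down into theta and delta; a real analysis would use an FFT and Welch averaging rather than this O(n²) direct DFT.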
An overall slowing of EEG with decreasing synaptic connectivity is observed, indicated by a decrease of power within the alpha and beta bands and an increase in power within the theta and delta bands. Thus, the model behaviour conforms to longitudinal studies in AD indicating an overall slowing of EEG. Abstract Neuronal processes grow under a variety of constraints, both immediate and evolutionary. Their pattern of growth provides insight into their function. This chapter begins by reviewing morphological metrics used in analyses and computational models. Molecular mechanisms underlying growth and plasticity are then discussed, followed by several types of modeling approaches. Computer simulation of morphology can be used to describe and reproduce the statistics of neuronal types or to evaluate growth and functional hypotheses. For instance, models in which branching is probabilistically determined by diameter produce realistic virtual dendrites of most neuronal types, though more complicated statistical models are required for other types. Virtual dendrites grown under environmental and/or functional constraints are also discussed, offering a broad perspective on dendritic morphology. Abstract Chopper neurons in the cochlear nucleus are characterized by intrinsic oscillations with short average interspike intervals (ISIs) and relative level independence of their response (Pfeiffer, Exp Brain Res 1:220–235, 1966; Blackburn and Sachs, J Neurophysiol 62:1303–1329, 1989), properties which are unattained by models of single chopper neurons (e.g., Rothman and Manis, J Neurophysiol 89:3070–3082, 2003a). In order to achieve short ISIs, we optimized the time constants of the Rothman and Manis single-neuron model with genetic algorithms. Some parameters in the optimization, such as the temperature and the capacitance of the cell, turned out to be crucial for the required acceleration of the response.
In order to achieve the relative level independence, we have simulated an interconnected network consisting of Rothman and Manis neurons. The results indicate that by stabilization of intrinsic oscillations, it is possible to simulate the physiologically observed level independence of ISIs. As previously reviewed and demonstrated (Bahmer and Langner, Biol Cybern 95:371–379, 2006a), chopper neurons show a preference for ISIs which are multiples of 0.4 ms. It was also demonstrated that the network consisting of two optimized Rothman and Manis neurons which activate each other with synaptic delays of 0.4 ms shows a preference for ISIs of 0.8 ms. Oscillations with various multiples of 0.4 ms as ISIs may be derived from neurons in a more complex network that is activated by simultaneous input of an onset neuron and several auditory nerve fibers. Abstract Recently, a class of two-dimensional integrate-and-fire models has been used to faithfully model spiking neurons. This class includes the Izhikevich model, the adaptive exponential integrate-and-fire model, and the quartic integrate-and-fire model. The bifurcation types for the individual neurons have been thoroughly analyzed by Touboul (SIAM J Appl Math 68(4):1045–1079, 2008). However, when the models are coupled together to form networks, the networks can display bifurcations that an uncoupled oscillator cannot. For example, the networks can transition from firing with a constant rate to burst firing. This paper introduces a technique to reduce a full network of this class of neurons to a mean field model, in the form of a system of switching ordinary differential equations. The reduction uses population density methods and a quasi-steady-state approximation to arrive at the mean field system. Reduced models are derived for networks with different topologies and different model neurons with biologically derived parameters.
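One member of this two-variable model class, the Izhikevich model, fits in a few lines. The sketch below simulates a single uncoupled unit with the published regular-spiking parameter set and plain Euler stepping — an illustration of the model class, not of the paper's mean-field reduction.

```python
def izhikevich(i_inj, a=0.02, b=0.2, c=-65.0, d=8.0, t_max=500.0, dt=0.5):
    """Izhikevich two-variable model (regular-spiking parameters); returns
    the spike count over t_max ms of constant current injection."""
    v, u, spikes = -65.0, -65.0 * b, 0
    for _ in range(int(t_max / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + i_inj)  # fast variable
        u += dt * a * (b * v - u)                               # slow recovery
        if v >= 30.0:                   # spike cutoff: reset and adapt
            v, u, spikes = c, u + d, spikes + 1
    return spikes

print(izhikevich(10.0), izhikevich(0.0))
```

With a constant input of 10 the unit fires tonically; with zero input it settles to its stable resting point. The network bifurcations discussed in the abstract (e.g., rate-to-burst transitions) only appear once many such units are coupled.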
The mean field equations are able to qualitatively and quantitatively describe the bifurcations that the full networks display. Extensions and higher order approximations are discussed. Conclusions Our proposed database schema for managing heterogeneous data is a significant departure from conventional approaches. It is suitable only when the following conditions hold: • The number of classes of entity is numerous, while the number of actual instances in most classes is expected to be very modest. • The number (and nature) of the axes describing an arbitrary fact (as an N-ary association) varies greatly. We believe that nervous system data is an appropriate problem domain to test such an approach. Abstract Stereotactic human brain atlases, either in print or electronic form, are useful not only in functional neurosurgery, but also in neuroradiology, human brain mapping, and neuroscience education. The existing atlases represent structures on 2D plates taken at variable, often large intervals, which limit their applications. To overcome this problem, we propose a hybrid interpolation approach to build high-resolution brain atlases from the existing ones. In this approach, all section regions of each object are grouped into two types of components: simple and complex. A NURBS-based method is designed for interpolation of the simple components, and a distance-map-based method for the complex components. Once all individual objects in the atlas are interpolated, the results are combined hierarchically in a bottom-up manner to produce the interpolation of the entire atlas. In the procedure, different knowledge-based and heuristic strategies are used to preserve various topological relationships. The proposed approach has been validated quantitatively and used for interpolation of two stereotactic brain atlases: the Talairach–Tournoux atlas and the Schaltenbrand–Wahren atlas.
The interpolations produced are of high resolution and feature high accuracy, 3D consistency, smooth surfaces, and preserved topology. They potentially open new applications for electronic stereotactic brain atlases, such as atlas reformatting, accurate 3D display, and 3D nonlinear warping against normal and pathological scans. The proposed approach is also potentially useful in other applications which require interpolation and 3D modeling from sparse and/or variable intersection interval data. An example of 3D modeling of an infarct from MR diffusion images is presented. Abstract Quantitative neuroanatomical data are important for the study of many areas of neuroscience, and the complexity of problems associated with neuronal structure requires that research from multiple groups across many disciplines be combined. However, existing neuron-tracing systems, simulation environments, and tools for the visualization and analysis of neuronal morphology data use a variety of data formats, making it difficult to exchange data in a readily usable way. The NeuroML project was initiated to address these issues, and here we describe an extensible markup language standard, MorphML, which defines a common data format for neuronal morphology data and associated metadata to facilitate data and model exchange, database creation, model publication, and data archiving. We describe the elements of the standard in detail and outline the mappings between this format and those used by a number of popular applications for reconstruction, simulation, and visualization of neuronal morphology. Text mining neuroscience journal articles to populate neuroscience databases. Neuroinformatics. Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model.
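The core idea behind a MorphML-style exchange format — morphology as a tree of segments in XML that any tool can parse — can be shown with a toy fragment. The element and attribute names below are simplified stand-ins, not the official MorphML schema.

```python
import xml.etree.ElementTree as ET

# Illustrative morphology fragment: a soma and one apical segment.
# Tag and attribute names here are invented for the example.
doc = """<morphology cell="pyramidal_cell_1">
  <segment id="0" name="soma" proximal="0,0,0,10" distal="0,10,0,10"/>
  <segment id="1" name="apical0" parent="0"
           proximal="0,10,0,4" distal="0,60,0,3"/>
</morphology>"""

root = ET.fromstring(doc)
segments = {s.get("id"): s for s in root.iter("segment")}

# Rebuild the parent -> children adjacency a simulator would need
# to reconstruct the dendritic tree.
children = {}
for seg in segments.values():
    parent = seg.get("parent")
    if parent is not None:
        children.setdefault(parent, []).append(seg.get("id"))
print(root.get("cell"), children)
```

Because the structure is explicit in the markup, reconstruction tools, simulators, and visualizers can each map it onto their native representations — the interoperability goal the abstract describes.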
Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a nonredundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent.
Programming language: Perl/BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI BLAST package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project.
The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity.
However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell includes organelles and intracellular solutions, and it is separated from the outer environment of extracellular liquid surrounding the cell by its cell membrane (plasma membrane), generating differences in concentrations of ions and molecules, including enzymes. The differences in charges of ions and in concentrations cause, respectively, electrical and chemical potentials, driving the transport of materials across the membrane. Here we look at the core of mathematical modeling associated with the dynamic behaviors of single cells, as well as the bases of numerical simulations. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al.
2008, 2009) is a software program on which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command line interface for manipulating large-scale models, based on Python, a powerful scripting language. ISIDE exports ISML models into C++ source code, the CellML format and the FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results be reproducible. This is naturally expected for wet-lab experiments and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors.
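Automating citation entry of the kind discussed here boils down to two steps: parse a free-text citation into fields, then query PubMed for the matching record. The sketch below does this with a toy regular expression and the public NCBI E-utilities query format; the pattern and the example citation are illustrative (the actual pipeline described in the article used a Perl citation parser).

```python
import re
from urllib.parse import urlencode

# Toy citation shape: "Author A, Author B. Title. Journal Year;Vol:Pages."
# A production pipeline would use a full citation-parsing library instead.
CITE = re.compile(r"^(?P<authors>[^.]+)\.\s+(?P<title>[^.]+)\.\s+"
                  r"(?P<journal>[^0-9]+?)\s+(?P<year>\d{4})")

def pubmed_query_url(citation):
    """Parse a citation string and build an NCBI E-utilities esearch URL."""
    m = CITE.match(citation)
    if m is None:
        return None
    fields = m.groupdict()
    term = "{title}[Title] AND {year}[pdat]".format(**fields)
    return ("https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?"
            + urlencode({"db": "pubmed", "term": term}))

url = pubmed_query_url(
    "Smith J, Doe A. A toy example citation. J Neurosci 2004;24:1-10.")
print(url)
```

Fetching the resulting URL would return candidate PubMed IDs, from which the validated reference can be filled in — replacing error-prone manual entry with a reproducible lookup.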
We describe a method and make available computer scripts which automate the process of citation entry. We use an Open Citation Project Perl module (PARSER) for parsing citation data, which is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. Indeed, in order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain ( ${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies.
These data repositories differ substantially in the granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front ends demands considerable bioinformatics skill. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function.
Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded in empirical evidence, explain (or fail to explain) existing empirical results, and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp.
741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu. Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and build on them for further research. Its use can aid the field of computational neuroscience to enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition.
PPI reflects GABA-A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA-A reversal potential (E_GABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy.
In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic type (pyramidal cell dendrites and olivary axonal projection terminals), and labeling method (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstruction systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions coming to the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductivities of their dendritic membrane.
This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high- and low-depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodical impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons.
Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS-US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive biologically plausible framework, without recourse to delay lines or other special-purpose circuits.
Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. Using mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of dendritic morphology on the passive electrical transfer characteristics of these cells and on the formation of patterns of spike discharges at the cell output under conditions of tonic activation via excitatory synapses distributed uniformly along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and function of effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization in general and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns.
The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals that decreased with rising intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably more limited than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of output discharge patterns of different complexities, including stochastic ones. A relatively short apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in shaping the activity of layer 2/3 pyramidal neurons and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best-known examples of the temporal coding hypothesis.
Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using nonhomogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture that is subject to a clocking rhythm, independently of involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed-rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties.
The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in computational neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org. In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed data of morphologies when generating those cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators, such as the NEURON simulator of Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new and available data can be included into NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today.
I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is offered in the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from genomics, transcriptomics, proteomics, and metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data needs both standard data formats and public repositories. Standardization and a public storage system are also important for modeling, due to the possibility of sharing models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks like numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given.
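Numerical integration of rate equations, one of the recurring tasks mentioned above, can be illustrated with a minimal pure-Python sketch: a classical fourth-order Runge-Kutta step applied to a single production-degradation rate equation dx/dt = k1 - k2*x. The rate constants are hypothetical, and a real modeling package would use an adaptive, library-grade solver rather than this fixed-step loop.

```python
def rk4_step(f, x, t, dt):
    """One classical fourth-order Runge-Kutta step for dx/dt = f(t, x)."""
    k1 = f(t, x)
    k2 = f(t + dt / 2, x + dt * k1 / 2)
    k3 = f(t + dt / 2, x + dt * k2 / 2)
    k4 = f(t + dt, x + dt * k3)
    return x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6

def integrate(f, x0, t_end, dt):
    """Fixed-step integration of dx/dt = f(t, x) from t=0 to t_end."""
    x, t = x0, 0.0
    while t < t_end - 1e-12:
        x = rk4_step(f, x, t, dt)
        t += dt
    return x

# Production-degradation rate equation dx/dt = k1 - k2*x with
# hypothetical rate constants; the analytic steady state is k1/k2 = 2.0.
k1, k2 = 1.0, 0.5
f = lambda t, x: k1 - k2 * x

x_final = integrate(f, 0.0, 20.0, 0.01)
print(x_final)  # approaches the steady state 2.0
```

This is exactly the kind of boilerplate that packages in this space take off the modeler's hands, so that only the rate laws themselves need to be specified.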
Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques.
We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. However, the range of current strength needed for a depolarization block could easily be reached with a random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block.
We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, preventing conventional analysis methods from keeping pace. Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed and analyzed. This process is relatively simple and easy to learn because it takes place entirely in MATLAB, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable. PANDORA is free to use and modify because it is open-source (http://software.incf.org/software/pandora/home). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then we examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop.
Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics. First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test whether multiple, independent spike initiation zones in the C-interneuron are sufficient to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made toward understanding cerebellar cortical processing, but neural coding at the output stage of the cerebellum, in the deep cerebellar nuclei (DCN), remains poorly understood. The DCN are unlikely to be just a relay nucleus, because Purkinje cell inhibition has to be turned into an excitatory output signal and DCN neurons exhibit complex intrinsic properties.
In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals. Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. We obtained a good match between our simulations and DCN current clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow deinactivation of rebound currents. Fast rebound bursts due to the T-type calcium current and slow rebounds due to the persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by the HCN current. Our findings suggest that the active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches.
This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior. This open source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation of channel gating, a mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing, with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform. Abstract No Abstract Available Abstract We have developed a simulation tool within the NEURON simulator to assist in the organization, verification, and analysis of simulations. This tool, named the Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB.
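NQS itself lives inside NEURON's hoc environment, but the pattern it implements, storing simulation parameters and outputs as relational tables and relating them with SELECT-style queries, can be sketched with Python's built-in sqlite3. The parameter names and values below are hypothetical, chosen only to illustrate a parameter-to-output query:

```python
import sqlite3

# Hypothetical parameter sweep: sodium conductance vs. resulting firing rate.
runs = [
    (0.01, 5.2),
    (0.02, 11.8),
    (0.04, 23.5),
    (0.08, 24.1),
]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sims (g_na REAL, rate_hz REAL)")
con.executemany("INSERT INTO sims VALUES (?, ?)", runs)

# SELECT-style query relating a model parameter to a simulation output,
# analogous to the queries NQS supports for NEURON models.
rows = con.execute(
    "SELECT g_na, rate_hz FROM sims WHERE rate_hz > 10 ORDER BY g_na"
).fetchall()
print(rows)  # [(0.02, 11.8), (0.04, 23.5), (0.08, 24.1)]
```

Once parameters and outputs share a schema like this, verifying a sweep, finding outlier runs, or joining outputs back to the parameters that produced them all reduce to one-line queries.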
Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems. This is similar to how single cells function as systems of protein networks, where, for example, the ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from how it behaves alone. The dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. These interactions arise through different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts to produce heartbeats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics representing neural information. In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of the limbs can be performed. This chapter illustrates several mathematical techniques through examples from the modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data. 
Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation. For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice that of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits as expected in mature animals. Predictions were made for passive-cell current clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting is unnecessary. 
Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron. Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials. Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet, the precise input-output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites. 
We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature. To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns, and the corresponding firing frequency input-output function was logarithmic. The regular/irregular nature of the input synaptic intervals was only reflected in the regularity of output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input-output relationships. Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input-output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. 
To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats. In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database http://www.ebi.ac.uk/biomodels/ is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems and to study the clustering of models based upon their annotations. Model deposition to the database is today advised by several publishers of scientific journals. 
The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive? Schema theory is a computational framework that allows the simulation of perceptuomotor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We first present its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performance of agrammatic aphasic patients. This extension introduces a distinction between "heavy" and "light" semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world-knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performance of agrammatic aphasics measured using sentence-picture matching tasks. 
This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Abstract Background The "inverse" problem is the determination of unknown causes on the basis of the observation of their effects. This is the opposite of the corresponding "direct" problem, which relates to the prediction of the effects generated by a complete description of some agencies. The solution of an inverse problem entails the construction of a mathematical model and starts from a number of experimental data. In this respect, inverse problems are often ill-conditioned, as the amount of experimental data available is often insufficient to unambiguously solve the mathematical model. Several approaches to solving inverse problems are possible, both computational and experimental, some of which are mentioned in this article. In this work, we describe in detail an attempt to solve an inverse problem which arose in the study of an intracellular signaling pathway. Results Using a genetic algorithm to find a suboptimal solution to the optimization problem, we have estimated a set of unknown parameters describing a kinetic model of a signaling pathway in the neuronal cell. The model is composed of mass-action ordinary differential equations, where the kinetic parameters describe protein-protein interactions, protein synthesis and degradation. The algorithm has been implemented on a parallel platform. Several potential solutions of the problem have been computed, each solution being a set of model parameters. A subset of parameters has been selected on the basis of their small coefficient of variation across the ensemble of solutions. 
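The selection criterion just described — keeping parameters whose estimates agree across the ensemble of genetic-algorithm solutions — amounts to computing a coefficient of variation per parameter. A minimal sketch; the parameter names and values below are made up for illustration:

```python
import statistics

def coefficient_of_variation(values):
    """CV = population standard deviation / |mean|."""
    mean = statistics.fmean(values)
    return statistics.pstdev(values) / abs(mean)

# Hypothetical ensemble: each dict is one GA solution (kinetic parameters).
solutions = [
    {"k_on": 1.1, "k_off": 0.50, "k_deg": 3.0},
    {"k_on": 1.0, "k_off": 0.48, "k_deg": 0.2},
    {"k_on": 0.9, "k_off": 0.52, "k_deg": 9.0},
]
cv = {p: coefficient_of_variation([s[p] for s in solutions])
      for p in solutions[0]}

# Parameters with small CV are consistently estimated across solutions;
# large-CV parameters are poorly constrained by the data.
well_constrained = sorted(p for p, v in cv.items() if v < 0.2)
print(well_constrained)
```

Here `k_deg` scatters wildly across solutions, so it would be flagged as unconstrained, while `k_on` and `k_off` would be retained.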
Conclusion Despite the lack of sufficiently reliable and homogeneous experimental data, the genetic algorithm approach has allowed us to estimate the approximate values of a number of model parameters in a kinetic model of a signaling pathway; these parameters have been assessed to be relevant for the reproduction of the available experimental data. Abstract Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown. Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms. First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axodendritic feedback on pyramidal cells is only effective in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase, the theta/gamma ratio. 
Moreover, electric fields differentially affect average theta frequency depending on specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. Abstract The brain is extraordinarily complex, containing 10^11 neurons linked by 10^14 connections. We can improve our understanding of individual neurons and neuronal networks by describing their behavior in mathematical and computational models. This chapter provides an introduction to neural modeling, laying the foundation for several basic models and surveying key topics. After some discussion of the motivations of modelers and the uses of neural models, we explore the properties of electrically excitable membranes. We describe in some detail the Hodgkin–Huxley model, the first neural model to biophysically describe the behavior of biological membranes. We explore how this model can be extended to describe a variety of excitable membrane behaviors, including axonal propagation, dendritic processing, and synaptic communication. This chapter also covers mathematical models that replicate basic neural behaviors through more abstract mechanisms. We briefly explore efforts to extend single-neuron models to the network level and provide several examples of insights gained through this process. Finally, we list common resources, including modeling environments and repositories, that provide the guidance and parameter sets necessary to begin building neural models. Abstract We have developed a program, NeuroText, to populate the neuroscience databases in SenseLab (http://senselab.med.yale.edu/senselab) by mining the natural-language text of neuroscience articles. NeuroText uses a two-step approach to identify relevant articles. The first step (preprocessing), aimed at 100% sensitivity, identifies abstracts containing database keywords. 
In the second step, potentially relevant abstracts identified in the first step are processed for specificity, as dictated by the database architecture and by neuroscience, lexical, and semantic contexts. NeuroText results were presented to the experts for validation using a dynamically generated interface that also allows expert-validated articles to be automatically deposited into the databases. Of the test set of 912 articles, 735 were rejected at the preprocessing step. For the remaining articles, the accuracy of predicting database-relevant articles was 85%. Twenty-two articles were erroneously identified. NeuroText deferred decisions on 29 articles to the expert. A comparison of NeuroText results versus the experts' analyses revealed that the program failed to correctly identify articles' relevance due to concepts that did not yet exist in the knowledge base or due to vaguely presented information in the abstracts. NeuroText uses two "evolution" techniques (supervised and unsupervised) that play an important role in the continual improvement of the retrieval results. Software that uses the NeuroText approach can facilitate the creation of curated, special-interest bibliography databases. BioModels Database: a free, centralized database of curated, published, quantitative kinetic models of biochemical and cellular systems. Nucleic acids research BioModels Database (http://www.ebi.ac.uk/biomodels/), part of the international initiative BioModels.net, provides access to published, peer-reviewed, quantitative models of biochemical and cellular systems. Each model is carefully curated to verify that it corresponds to the reference publication and gives the proper numerical results. Curators also annotate the components of the models with terms from controlled vocabularies and links to other relevant data resources. This allows the users to search accurately for the models they need. 
The models can currently be retrieved in the SBML format, and import/export facilities are being developed to extend the spectrum of formats supported by the resource. Biochemical Phenomena;Cell Physiological Phenomena;Databases, Factual;Genes;Internet;Kinetics;Models, Biological;User-Computer Interface;Vocabulary, Controlled A novel learning rule for long-term plasticity of short-term synaptic plasticity enhances temporal processing. Frontiers in integrative neuroscience It is well established that short-term synaptic plasticity (STP) of neocortical synapses is itself plastic; e.g., the induction of LTP and LTD tends to shift STP towards short-term depression and facilitation, respectively. What has not been addressed theoretically or experimentally is whether STP is "learned"; that is, is STP regulated by specific learning rules that are in place to optimize the computations performed at synapses, or are changes in STP essentially an epiphenomenon of long-term plasticity? Here we propose that STP is governed by specific learning rules that operate independently of, and in parallel with, the associative learning rules governing baseline synaptic strength. We describe a learning rule for STP and, using simulations, demonstrate that it significantly enhances the discrimination of spatiotemporal stimuli. Additionally, we generate a set of experimental predictions aimed at testing our hypothesis. Temporal sensitivity of protein kinase A activation in late-phase long-term potentiation. PLoS computational biology Protein kinases play critical roles in learning and memory and in long-term potentiation (LTP), a form of synaptic plasticity. The induction of late-phase LTP (L-LTP) in the CA1 region of the hippocampus requires several kinases, including CaMKII and PKA, which are activated by calcium-dependent signaling processes and other intracellular signaling pathways. The requirement for PKA is limited to L-LTP induced using spaced stimuli, but not massed stimuli. 
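The massed/spaced distinction comes down to the inter-tetanus interval, and the study's central prediction is a threshold on that interval. As a minimal sketch, the two protocols and the predicted PKA dependence can be written down explicitly (the 60 s threshold is the model prediction quoted below; the helper functions are our own illustration):

```python
def tetanus_onsets(n_tetani=4, inter_tetanus_s=300.0):
    """Onset times (s) of n tetani separated by a fixed interval."""
    return [i * inter_tetanus_s for i in range(n_tetani)]

# Four 100 Hz tetani, as in the experimental L-LTP induction protocols:
massed = tetanus_onsets(inter_tetanus_s=3.0)    # intervals of 3 s
spaced = tetanus_onsets(inter_tetanus_s=300.0)  # intervals of 300 s

def pka_dependent(interval_s, threshold_s=60.0):
    """Model prediction: L-LTP is PKA-dependent for intervals > ~60 s."""
    return interval_s > threshold_s

# Spaced induction crosses the threshold; massed does not.
print(pka_dependent(3.0), pka_dependent(300.0))
```

This matches the experimental confirmation cited below: 80 s intervals produce PKA-dependent L-LTP, 40 s intervals do not.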
To investigate this temporal sensitivity of PKA, a computational biochemical model of L-LTP induction in CA1 pyramidal neurons was developed. The model describes the interactions of calcium and cAMP signaling pathways and is based on published biochemical measurements of two key synaptic signaling molecules, PKA and CaMKII. The model is stimulated using four 100 Hz tetani separated by 3 sec (massed) or 300 sec (spaced), identical to experimental L-LTP induction protocols. Simulations show that spaced stimulation activates more PKA than massed stimulation, and make a key experimental prediction: that L-LTP is PKA-dependent for intervals longer than 60 sec. Experimental measurements of L-LTP demonstrate that intervals of 80 sec, but not 40 sec, produce PKA-dependent L-LTP, thereby confirming the model prediction. Examination of CaMKII reveals that its temporal sensitivity is opposite that of PKA, suggesting that PKA is required after spaced stimulation to compensate for a decrease in CaMKII. In addition to explaining the temporal sensitivity of PKA, these simulations suggest that the use of several kinases for memory storage allows each to respond optimally to different temporal patterns. Animals;CA1 Region, Hippocampal;Calcium;Calcium-Calmodulin-Dependent Protein Kinase Type 2;Computational Biology;Computer Simulation;Cyclic AMP;Cyclic AMP-Dependent Protein Kinases;Dopamine;Enzyme Activation;Long-Term Potentiation;Mice;Models, Neurological;Signal Transduction;Time Factors A hybrid human and machine resource curation pipeline for the Neuroscience Information Framework. Database : the journal of biological databases and curation The breadth of information resources available to researchers on the Internet continues to expand, particularly in light of recently implemented data-sharing policies required by funding agencies. 
However, the nature of dense, multifaceted neuroscience data and the design of contemporary search engine systems make efficient, reliable and relevant discovery of such information a significant challenge. This challenge is specifically pertinent for online databases, whose dynamic content is 'hidden' from search engines. The Neuroscience Information Framework (NIF; http://www.neuinfo.org) was funded by the NIH Blueprint for Neuroscience Research to address the problem of finding and utilizing neuroscience-relevant resources such as software tools, data sets, experimental animals and antibodies across the Internet. From the outset, NIF sought to provide an accounting of available resources, while developing technical solutions to finding, accessing and utilizing them. The curators, therefore, are tasked with identifying and registering resources, examining data, writing configuration files to index and display data, and keeping the contents current. In the initial phases of the project, all aspects of the registration and curation processes were manual. However, as the number of resources grew, manual curation became impractical. This report describes our experiences and successes with developing automated resource discovery and semi-automated type characterization with text-mining scripts that facilitate curation team efforts to discover, integrate and display new content. We also describe the DISCO framework, a suite of automated web services that significantly reduce manual curation efforts to periodically check for resource updates. Lastly, we discuss DOMEO, a semi-automated annotation tool that improves the discovery and curation of resources that are not necessarily website-based (i.e. reagents, software tools). 
Although the ultimate goal of automation was to reduce the workload of the curators, it has resulted in valuable analytic by-products that address accessibility, use and citation of resources that can now be shared with resource owners and the larger scientific community. DATABASE URL: http://neuinfo.org. Abstracting and Indexing as Topic;Computational Biology;Database Management Systems;Databases, Factual;Humans;Neurosciences;Software Moyamoya disease-associated protein mysterin/RNF213 is a novel AAA+ ATPase, which dynamically changes its oligomeric state. Scientific Reports Moyamoya disease is an idiopathic human cerebrovascular disorder that is characterized by progressive stenosis and abnormal collateral vessels. We recently identified mysterin/RNF213 as its first susceptibility gene, which encodes a 591-kDa protein containing enzymatically active P-loop ATPase and ubiquitin ligase domains and is involved in proper vascular development in zebrafish. Here we demonstrate that mysterin further contains two tandem AAA+ ATPase modules and forms a huge ring-shaped oligomeric complex. AAA+ ATPases are known to generally mediate various biophysical and mechanical processes with the characteristic ring-shaped structure. Fluorescence correlation spectroscopy and biochemical evaluation suggested that mysterin dynamically changes its oligomeric forms through ATP/ADP binding and hydrolysis cycles. Thus, the moyamoya disease-associated gene product is a unique protein that functions as both a ubiquitin ligase and an AAA+ ATPase, which possibly contributes to vascular development through mechanical processes in the cell. Colocalization of protein kinase A with adenylyl cyclase enhances protein kinase A activity during induction of long-lasting long-term potentiation. PLoS computational biology The ability of neurons to differentially respond to specific temporal and spatial input patterns underlies information storage in neural circuits. 
One means of achieving spatial specificity is to restrict signaling molecules to particular subcellular compartments using anchoring molecules such as A-Kinase Anchoring Proteins (AKAPs). Disruption of protein kinase A (PKA) anchoring to AKAPs impairs a PKA-dependent form of long term potentiation (LTP) in the hippocampus. To investigate the role of localized PKA signaling in LTP, we developed a stochastic reaction-diffusion model of the signaling pathways leading to PKA activation in CA1 pyramidal neurons. Simulations investigated whether the role of anchoring is to locate kinases near molecules that activate them, or near their target molecules. The results show that anchoring PKA with adenylyl cyclase (which produces cAMP that activates PKA) produces significantly greater PKA activity, and phosphorylation of both inhibitor-1 and AMPA receptor GluR1 subunit on S845, than when PKA is anchored apart from adenylyl cyclase. The spatial microdomain of cAMP was smaller than that of PKA suggesting that anchoring PKA near its source of cAMP is critical because inactivation by phosphodiesterase limits diffusion of cAMP. The prediction that the role of anchoring is to colocalize PKA near adenylyl cyclase was confirmed by experimentally rescuing the deficit in LTP produced by disruption of PKA anchoring using phosphodiesterase inhibitors. Additional experiments confirm the model prediction that disruption of anchoring impairs S845 phosphorylation produced by forskolin-induced synaptic potentiation. Collectively, these results show that locating PKA near adenylyl cyclase is a critical function of anchoring. 
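The claim that phosphodiesterase limits cAMP diffusion can be illustrated with the standard length constant for diffusion with first-order degradation, lambda = sqrt(D/k): the faster the degradation rate k relative to the diffusion coefficient D, the smaller the microdomain. The numbers below are illustrative order-of-magnitude values, not parameters from this model:

```python
import math

def length_constant_um(D_um2_per_s, k_per_s):
    """Steady-state space constant for diffusion with first-order
    degradation: lambda = sqrt(D / k).
    D in um^2/s, k in 1/s, result in um."""
    return math.sqrt(D_um2_per_s / k_per_s)

# Illustrative: cytosolic cAMP diffusion on the order of 100 um^2/s.
slow_pde = length_constant_um(100.0, 1.0)    # weak degradation
fast_pde = length_constant_um(100.0, 100.0)  # strong PDE activity
print(slow_pde, fast_pde)  # 10 um vs. 1 um microdomain
```

With strong phosphodiesterase activity the cAMP microdomain shrinks by an order of magnitude, which is why anchoring PKA next to its cAMP source matters.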
A Kinase Anchor Proteins;Adenylate Cyclase;Animals;CA1 Region, Hippocampal;Calcium;Colforsin;Computer Simulation;Cyclic AMP;Cyclic AMP-Dependent Protein Kinases;Diffusion;Dopamine;Long-Term Potentiation;Mice;Models, Biological;Proteins;Pyramidal Cells;Stochastic Processes;Substrate Specificity;Synaptic Potentials Modeling-independent elucidation of inactivation pathways in recombinant and native A-type Kv channels. The Journal of general physiology A-type voltage-gated K(+) (Kv) channels self-regulate their activity by inactivating directly from the open state (open-state inactivation [OSI]) or by inactivating before they open (closed-state inactivation [CSI]). To determine the inactivation pathways, it is often necessary to apply several pulse protocols, pore blockers, single-channel recording, and kinetic modeling. However, intrinsic hurdles may preclude the standardized application of these methods. Here, we implemented a simple method inspired by earlier studies of Na(+) channels to analyze macroscopic inactivation and conclusively deduce the pathways of inactivation of recombinant and native A-type Kv channels. We investigated two distinct A-type Kv channels expressed heterologously (Kv3.4 and Kv4.2 with accessory subunits) and their native counterparts in dorsal root ganglion and cerebellar granule neurons. This approach applies two conventional pulse protocols to examine inactivation induced by (a) a simple step (single-pulse inactivation) and (b) a conditioning step (double-pulse inactivation). Consistent with OSI, the rate of Kv3.4 inactivation (i.e., the negative first derivative of double-pulse inactivation) precisely superimposes on the profile of the Kv3.4 current evoked by a single pulse because the channels must open to inactivate. In contrast, the rate of Kv4.2 inactivation is asynchronous, already changing at earlier times relative to the profile of the Kv4.2 current evoked by a single pulse. 
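The open-state-inactivation test just described compares the rate of double-pulse inactivation (the negative first derivative of availability with respect to conditioning duration) against the single-pulse current time course; under pure OSI the two superimpose after normalization. A toy numerical sketch with a synthetic exponential current (not real Kv data) shows the superposition:

```python
import math

# Toy OSI check: for a channel that must open to inactivate, loss of
# availability tracks open probability (~ the macroscopic current).
dt = 0.1  # ms
t = [i * dt for i in range(200)]

tau_act, tau_inact = 1.0, 10.0  # ms, arbitrary illustrative values
# Synthetic single-pulse current: activation envelope times inactivation.
current = [(1 - math.exp(-x / tau_act)) * math.exp(-x / tau_inact)
           for x in t]

# Availability h after a conditioning step of duration t: under pure OSI,
# dh/dt is proportional to -current(t), so integrate the current.
h = [1.0]
for c in current[:-1]:
    h.append(h[-1] - 0.05 * c * dt)

# Negative first derivative of double-pulse availability, vs. the current,
# both peak-normalized; under OSI they superimpose.
rate = [-(h[i + 1] - h[i]) / dt for i in range(len(h) - 1)]
peak_r, peak_c = max(rate), max(current)
mismatch = max(abs(rate[i] / peak_r - current[i] / peak_c)
               for i in range(len(rate)))
print(mismatch)
```

For a CSI channel, by contrast, availability would already fall while the normalized current is still near zero, so the mismatch would be large — which is exactly the asynchrony reported above for Kv4.2.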
Thus, Kv4.2 inactivation occurs uncoupled from channel opening, indicating CSI. Furthermore, the inactivation time constant versus voltage relation of Kv3.4 decreases monotonically with depolarization and levels off, whereas that of Kv4.2 exhibits a J-shaped profile. We also manipulated the inactivation phenotype by changing the subunit composition and show how CSI, and CSI combined with OSI, might affect spiking properties in a full computational model of the hippocampal CA1 neuron. This work unambiguously elucidates contrasting inactivation pathways in neuronal A-type Kv channels and demonstrates how distinct pathways might impact neurophysiological activity. Animals;Ion Channel Gating;Kinetics;Male;Membrane Potentials;Neurons;Protein Subunits;Rats;Rats, Sprague-Dawley;Recombinant Proteins;Shal Potassium Channels;Shaw Potassium Channels;Xenopus Basic Concepts in Population Modeling, Simulation, and Model-Based Drug Development. CPT: Pharmacometrics & Systems Pharmacology Modeling is an important tool in drug development; population modeling is a complex process requiring robust underlying procedures for ensuring clean data, appropriate computing platforms, adequate resources, and effective communication. Although requiring an investment in resources, it can save time and money by providing a platform for integrating all information gathered on new therapeutic agents. This article provides a brief overview of aspects of modeling and simulation as applied to many areas in drug development. CPT: Pharmacometrics & Systems Pharmacology (2012) 1, e6; doi:10.1038/psp.2012.4; advance online publication 26 September 2012 Identification and insertion of 3-carbon bridges in protein disulfide bonds: a computational approach. Nature Protocols More than 42,000 3D structures of proteins are available on the Internet. 
We have shown that the chemical insertion of a 3-carbon bridge across the native disulfide bond of a protein or peptide can enable the site-specific conjugation of PEG to the protein without a loss of its structure or function. For success, it is necessary to select an appropriate and accessible disulfide bond in the protein for this chemical modification. We describe how to use public protein databases and molecular modeling programs to select a protein rationally and to identify the optimum disulfide bond for experimental studies. Our computational approach can substantially reduce the time required for the laboratory-based chemical modification. Identification of solvent-accessible disulfides using published structural information takes approximately 2 h. Predicting the structural effects of the disulfide-based modification can take 3 weeks. Network bursting dynamics in excitatory cortical neuron cultures results from the combination of different adaptive mechanisms. PloS one In the brain, synchronization among cells of an assembly is a common phenomenon, and is thought to be functionally relevant. Here we used an in vitro experimental model of cell assemblies, cortical cultures, combined with numerical simulations of a spiking neural network (SNN) to investigate how and why spontaneous synchronization occurs. In order to deal with excitation only, we pharmacologically blocked GABA(A)ergic transmission using bicuculline. Synchronous events in cortical cultures tend to involve almost every cell and to display relatively constant durations. We have thus named these "network spikes" (NS). The inter-NS intervals (INSIs) proved to be a more interesting phenomenon. In most cortical cultures NSs typically come in series or bursts ("bursts of NSs", BNS), with short (~1 s) INSIs, separated by long silent intervals (tens of seconds), which leads to bimodal INSI distributions. 
This suggests that a facilitating mechanism is at work, presumably short-term synaptic facilitation, as well as two fatigue mechanisms: one with a short timescale, presumably short-term synaptic depression, and another one with a longer timescale, presumably cellular adaptation. We thus incorporated these three mechanisms into the SNN, which, indeed, produced realistic BNSs. Next, we systematically varied the recurrent excitation for various adaptation timescales. Strong excitability led to frequent, quasi-periodic BNSs (CV~0), and weak excitability led to rare BNSs, approaching a Poisson process (CV~1). Experimental cultures appear to operate within an intermediate weakly-synchronized regime (CV~0.5), with an adaptation timescale in the 2-8 s range, and well described by a Poisson-with-refractory-period model. Taken together, our results demonstrate that the INSI statistics are indeed informative: they allowed us to infer the mechanisms at work, and many parameters that we cannot access experimentally. A journey to Semantic Web query federation in the life sciences. BMC bioinformatics BACKGROUND: As interest in adopting the Semantic Web in the biomedical domain continues to grow, Semantic Web technology has been evolving and maturing. A variety of technological approaches including triplestore technologies, SPARQL endpoints, Linked Data, and Vocabulary of Interlinked Datasets have emerged in recent years. In addition to the data warehouse construction, these technological approaches can be used to support dynamic query federation. As a community effort, the BioRDF task force, within the Semantic Web for Health Care and Life Sciences Interest Group, is exploring how these emerging approaches can be utilized to execute distributed queries across different neuroscience data sources. METHODS AND RESULTS: We have created two health care and life science knowledge bases. 
We have explored a variety of Semantic Web approaches to describe, map, and dynamically query multiple datasets. We have demonstrated several federation approaches that integrate diverse types of information about neurons and receptors that play an important role in basic, clinical, and translational neuroscience research. Particularly, we have created a prototype receptor explorer which uses OWL mappings to provide an integrated list of receptors and executes individual queries against different SPARQL endpoints. We have also employed the AIDA Toolkit, which is directed at groups of knowledge workers who cooperatively search, annotate, interpret, and enrich large collections of heterogeneous documents from diverse locations. We have explored a tool called "FeDeRate", which enables a global SPARQL query to be decomposed into subqueries against the remote databases offering either SPARQL or SQL query interfaces. Finally, we have explored how to use the Vocabulary of Interlinked Datasets (voiD) to create metadata for describing datasets exposed as Linked Data URIs or SPARQL endpoints. CONCLUSION: We have demonstrated the use of a set of novel and state-of-the-art Semantic Web technologies in support of a neuroscience query federation scenario. We have identified both the strengths and weaknesses of these technologies. While the Semantic Web offers a global data model including the use of Uniform Resource Identifiers (URIs), the proliferation of semantically equivalent URIs hinders large-scale data integration. Our work helps direct research and tool development, which will be of benefit to this community.
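The FeDeRate-style decomposition described above can be sketched as a toy federator: a "global" set of triple patterns is split into per-endpoint subqueries according to which remote database holds each predicate. The routing table, endpoint URLs, and predicate names below are hypothetical illustrations, not FeDeRate's actual interface.

```python
# Toy sketch of query federation: route each triple pattern of a global
# query to the endpoint that serves its predicate, then emit one
# subquery per endpoint. All names/URLs here are made up.

ENDPOINT_FOR_PREDICATE = {
    "hasReceptor": "http://example.org/neurondb/sparql",
    "hasLigand": "http://example.org/liganddb/sparql",
}

def decompose(triple_patterns):
    """Group triple patterns into per-endpoint subqueries."""
    subqueries = {}
    for subj, pred, obj in triple_patterns:
        endpoint = ENDPOINT_FOR_PREDICATE[pred]
        subqueries.setdefault(endpoint, []).append((subj, pred, obj))
    return subqueries

global_query = [
    ("?neuron", "hasReceptor", "?receptor"),
    ("?receptor", "hasLigand", "?ligand"),
]

plan = decompose(global_query)
for endpoint, patterns in sorted(plan.items()):
    where = " . ".join("%s <%s> %s" % t for t in patterns)
    print(endpoint, "=>", "SELECT * WHERE { %s }" % where)
```

A real federator would additionally ship the bindings of shared variables (here `?receptor`) between endpoints and join the partial results.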
Biological Science Disciplines;Computational Biology;Databases, Factual;Information Dissemination;Information Storage and Retrieval;Internet;Semantics Mutation of fibulin-1 causes a novel syndrome involving the central nervous system and connective tissues European Journal of Human Genetics Fibulin-1 is an extracellular matrix protein that has an important role in the structure of elastic fibers and basement membranes of various tissues. Using homozygosity mapping and exome sequencing, we discovered a missense mutation, p.(Cys397Phe), in fibulin-1 in three patients from a consanguineous family who presented with a novel syndrome of syndactyly, undescended testes, delayed motor milestones, mental retardation and signs of brain atrophy. The mutation discovered segregated with the phenotype and was not found in 374 population-matched alleles. The affected cysteine is highly conserved across vertebrates and its mutation is predicted to abolish a disulfide bond that defines the tertiary structure of fibulin-1. Our findings emphasize the crucial role fibulin-1 has in development of the central nervous system and various connective tissues. European Journal of Human Genetics advance online publication, 2 October 2013; doi:10.1038/ejhg.2013.210 Linogliride fumarate, representing a new class of oral hypoglycemic agent for diabetes Clinical Pharmacology and Therapeutics This study presents the first multiday therapy trial of linogliride fumarate, a representative of a new class of oral hypoglycemic agents. Linogliride demonstrated significant hypoglycemic activity in 26 patients with non-insulin-dependent diabetes mellitus receiving 1 week of therapy. In a dose range of 150 to 400 mg b.i.d., fasting glucose levels fell from 237 ± 52 mg to 199 ± 59 mg by day 7 (P A Component-Based Extension Framework for Large-Scale Parallel Simulations in NEURON.
Frontiers in neuroinformatics As neuronal simulations approach larger scales with increasing levels of detail, the neurosimulator software represents only a part of a chain of tools ranging from setup, simulation, interaction with virtual environments to analysis and visualizations. Previously published approaches to abstracting simulator engines have not received widespread acceptance, which in part may be due to the fact that they tried to address the challenge of solving the model specification problem. Here, we present an approach that uses a neurosimulator, in this case NEURON, to describe and instantiate the network model in the simulator's native model language but then replaces the main integration loop with its own. Existing parallel network models are easily adapted to run in the presented framework. The presented approach is thus an extension to NEURON but uses a component-based architecture to allow for replaceable spike exchange components and pluggable components for monitoring, analysis, or control that can run in this framework alongside the simulation. The refined structure of nascent HDL reveals a key functional domain for particle maturation and dysfunction Nature Structural & Molecular Biology The cardioprotective function of high-density lipoprotein (HDL) is largely attributed to its ability to facilitate transport of cholesterol from peripheral tissues to the liver. However, HDL may become dysfunctional through oxidative modification, impairing cellular cholesterol efflux. Here we report a refined molecular model of nascent discoidal HDL, determined using hydrogen-deuterium exchange mass spectrometry. The model reveals two apolipoprotein A1 (apoA1) molecules arranged in an antiparallel double-belt structure, with residues 159–180 of each apoA1 forming a protruding solvent-exposed loop.
We further show that this loop, including Tyr166, a preferred target for site-specific oxidative modification within atheroma, directly interacts with and activates lecithin cholesterol acyl transferase. These studies identify previously uncharacterized structural features of apoA1 in discoidal HDL that are crucial for particle maturation, and elucidate a structural and molecular mechanism for generating a dysfunctional form of HDL in atherosclerosis. Growth/differentiation factor-15: prostate cancer suppressor or promoter? Prostate Cancer and Prostatic Diseases Deregulation of expression and function of cytokines belonging to the transforming growth factor-β (TGF-β) family is often associated with various pathologies. For example, this cytokine family has been considered a promising target for cancer therapy. However, the detailed functions of several cytokines from the TGF-β family that could have a role in cancer progression and therapy remain unclear. One of these molecules is growth/differentiation factor-15 (GDF-15), a divergent member of the TGF-β family. This stress-induced cytokine has been proposed to possess immunomodulatory functions and its high expression is often associated with cancer progression, including prostate cancer (PCa). However, studies clearly demonstrating the mechanisms for signal transduction and functions in cell interaction, cancer progression and therapy are still lacking. New GDF-15 roles have recently been identified for modulating osteoclast differentiation and for therapy for PCa bone metastases. Moreover, GDF-15 is an abundant cytokine in seminal plasma with immunosuppressive properties. We discuss studies that focus on the regulation of GDF-15 expression and its role in tissue homeostasis, repair and the immune response with an emphasis on the role in PCa development. Local field potential modeling predicts dense activation in cerebellar granule cells clusters under LTP and LTD control.
PloS one Local field potentials (LFPs) are generated by neuronal ensembles and contain information about the activity of single neurons. Here, the LFPs of the cerebellar granular layer and their changes during long-term synaptic plasticity (LTP and LTD) were recorded in response to punctate facial stimulation in the rat in vivo. The LFP comprised a trigeminal (T) and a cortical (C) wave. T and C, which derived from independent granule cell clusters, co-varied during LTP and LTD. To extract information about the underlying cellular activities, the LFP was reconstructed using a repetitive convolution (ReConv) of the extracellular potential generated by a detailed multicompartmental model of the granule cell. The mossy fiber input patterns were determined using a Blind Source Separation (BSS) algorithm. The major component of the LFP was generated by the granule cell spike Na(+) current, which caused a powerful sink in the axon initial segment with the source located in the soma and dendrites. Reproducing the LFP changes observed during LTP and LTD required modifications in both release probability and intrinsic excitability at the mossy fiber-granule cell relay. Synaptic plasticity and Golgi cell feed-forward inhibition proved critical for controlling the percentage of active granule cells, which was 11% in standard conditions but ranged from 3% during LTD to 21% during LTP and rose above 50% when inhibition was reduced. The emerging picture is that of independent (but neighboring) trigeminal and cortical channels, in which synaptic plasticity and feed-forward inhibition effectively regulate the number of discharging granule cells and emitted spikes, generating "dense" activity clusters in the cerebellar granular layer.
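In its simplest form, the repetitive-convolution (ReConv) reconstruction described above amounts to convolving binned spike counts with the extracellular waveform of a single cell. The kernel values and spike counts below are invented for illustration; in the study the waveform comes from the detailed multicompartmental granule-cell model.

```python
# Sketch of the ReConv idea: the population LFP is approximated by the
# discrete convolution of a binned spike train with a single-cell
# extracellular waveform kernel (toy values).

def convolve(signal, kernel):
    """Full discrete convolution, pure Python."""
    n, m = len(signal), len(kernel)
    out = [0.0] * (n + m - 1)
    for i, s in enumerate(signal):
        for j, k in enumerate(kernel):
            out[i + j] += s * k
    return out

spike_counts = [0, 3, 0, 0, 5, 0]   # spikes per time bin (fabricated)
waveform = [0.0, -1.0, 0.4]         # toy extracellular kernel

lfp = convolve(spike_counts, waveform)
print(lfp)   # → [0.0, 0.0, -3.0, 1.2, 0.0, -5.0, 2.0, 0.0]
```

Each burst of spikes stamps a scaled copy of the waveform into the reconstructed trace, which is why the Na(+)-current sink dominates the model LFP.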
Action Potentials;Algorithms;Animals;Cell Aggregation;Cerebellum;Computer Simulation;Cytoplasmic Granules;Long-Term Potentiation;Long-Term Synaptic Depression;Models, Neurological;Mossy Fibers, Hippocampal;Neural Pathways;Rats Modeling highly pathogenic avian influenza transmission in wild birds and poultry in West Bengal, India Scientific Reports Wild birds are suspected to have played a role in highly pathogenic avian influenza (HPAI) H5N1 outbreaks in West Bengal. Cluster analysis showed that H5N1 was introduced in West Bengal at least 3 times between 2008 and 2010. We simulated the introduction of H5N1 by wild birds and their contact with poultry through a stochastic continuous-time mathematical model. Results showed that reducing contact between wild birds and domestic poultry, and increasing the culling rate of infected domestic poultry communities, will reduce the probability of outbreaks. Poultry communities that shared habitat with wild birds or those in districts with previous outbreaks were more likely to suffer an outbreak. These results indicate that wild birds can introduce HPAI to domestic poultry and that limiting their contact at shared habitats, together with swift culling of infected domestic poultry, can greatly reduce the likelihood of HPAI outbreaks. PEGylation of native disulfide bonds in proteins Nature Protocols PEGylation has turned proteins into important new biopharmaceuticals. The fundamental problems with the existing approaches to PEGylation are inefficient conjugation and the formation of heterogeneous mixtures. This is because poly(ethylene glycol) (PEG) is usually conjugated to nucleophilic amine residues. Our PEGylation protocol solves these problems by exploiting the chemical reactivity of both of the sulfur atoms in the disulfide bond of many biologically relevant proteins. An accessible disulfide bond is mildly reduced to liberate the two cysteine sulfur atoms without disturbing the protein's tertiary structure.
Site-specific PEGylation is achieved with a bis-thiol alkylating PEG reagent that sequentially undergoes conjugation to form a three-carbon bridge. The two sulfur atoms are re-linked with PEG selectively conjugated to the bridge. PEGylation of a protein can be completed in 24 h and purification of the PEG-protein conjugate in another 3 h. We have successfully applied this approach to PEGylation of cytokines, enzymes, antibody fragments and peptides, without destroying their tertiary structure or abolishing their biological activity. Detecting differential usage of exons from RNA-Seq data Smoothing of, and parameter estimation from, noisy biophysical recordings. PLoS computational biology Biophysically detailed models of single cells are difficult to fit to real data. Recent advances in imaging techniques allow simultaneous access to various intracellular variables, and these data can be used to significantly facilitate the modelling task. These data, however, are noisy, and current approaches to building biophysically detailed models are not designed to deal with this. We extend previous techniques to take the noisy nature of the measurements into account. Sequential Monte Carlo ("particle filtering") methods, in combination with a detailed biophysical description of a cell, are used for principled, model-based smoothing of noisy recording data. We also provide an alternative formulation of smoothing where the neural nonlinearities are estimated in a non-parametric manner. Biophysically important parameters of detailed models (such as channel densities, intercompartmental conductances, input resistances, and observation noise) are inferred automatically from noisy data via expectation-maximization. Overall, we find that model-based smoothing is a powerful, robust technique for smoothing of noisy biophysical data and for inference of biophysical parameters in the face of recording noise. 
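The sequential Monte Carlo ("particle filtering") approach described above can be illustrated with a minimal bootstrap particle filter on a one-dimensional toy state-space model. Real applications replace the random-walk dynamics with a detailed biophysical cell model; all constants and noise levels here are illustrative only.

```python
# Minimal bootstrap particle filter: a 1-D latent random walk is
# estimated from observations corrupted by heavy Gaussian noise.
import math, random

random.seed(0)

def particle_filter(obs, n_particles=500, proc_sd=0.1, obs_sd=1.0):
    particles = [0.0] * n_particles
    estimates = []
    for y in obs:
        # propagate particles through the (assumed) dynamics
        particles = [p + random.gauss(0, proc_sd) for p in particles]
        # weight each particle by the observation likelihood
        weights = [math.exp(-0.5 * ((y - p) / obs_sd) ** 2) for p in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # posterior-mean estimate of the state at this step
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        # multinomial resampling
        particles = random.choices(particles, weights=weights, k=n_particles)
    return estimates

# simulate a slowly drifting true state observed in heavy noise
truth, x = [], 0.0
for _ in range(100):
    x += random.gauss(0, 0.1)
    truth.append(x)
obs = [t + random.gauss(0, 1.0) for t in truth]
est = particle_filter(obs)
```

The filtered trace tracks the hidden state far more closely than the raw observations do, which is the sense in which model-based smoothing "cleans" noisy recordings; expectation-maximization then wraps such a filter to infer the model parameters themselves.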
Algorithms;Artificial Intelligence;Biophysical Processes;Cell Physiological Phenomena;Computational Biology;Computer Simulation;Electrophysiological Phenomena;Image Processing, Computer-Assisted;Models, Biological;Monte Carlo Method;Statistics, Nonparametric Know your current I(h): interaction with a shunting current explains the puzzling effects of its pharmacological or pathological modulations. PloS one The non-specific, hyperpolarization-activated I(h) current is particularly involved in epilepsy and it exhibits an excitatory or inhibitory action on synaptic integration in an apparently inconsistent way. It has been suggested that most of the inconsistencies could be reconciled by invoking an indirect interaction with the M-type K(+) current, another current involved in epilepsy. However, here we show that the original experiments, and the simplified model used to explain and support them, cannot explain in a conclusive way the puzzling I(h) actions observed in different experimental preparations. Using a realistic model, we show instead how and why a shunting current, such as that carried by TASK-like channels and dependent on the I(h) channel, is able to explain virtually all experimental findings on I(h) up- or down-regulation by modulators or pathological conditions. The model results suggest several experimentally testable predictions to characterize in more detail this elusive and peculiar interaction, which may be of fundamental importance in the development of new treatments for all those pathological and cognitive dysfunctions caused, mediated, or affected by I(h). Action Potentials;Animals;CA1 Region, Hippocampal;Computer Simulation;Electrophysiological Phenomena;Epilepsy;Humans;Models, Neurological;Potassium Channels;Pyrimidines;Triazines Cortical plasticity induced by spike-triggered microstimulation in primate somatosensory cortex.
PloS one Electrical stimulation of the nervous system for therapeutic purposes, such as deep brain stimulation in the treatment of Parkinson's disease, has been used for decades. Recently, increased attention has focused on using microstimulation to restore functions as diverse as somatosensation and memory. However, how microstimulation changes the neural substrate is still not fully understood. Microstimulation may cause cortical changes that could either compete with or complement natural neural processes, and could result in neuroplastic changes rendering the region dysfunctional or even epileptic. As part of our efforts to produce neuroprosthetic devices and to further study the effects of microstimulation on the cortex, we stimulated and recorded from microelectrode arrays in the hand area of the primary somatosensory cortex (area 1) in two awake macaque monkeys. We applied a simple neuroprosthetic microstimulation protocol to a pair of electrodes in the area 1 array, using either random pulses or pulses time-locked to the recorded spiking activity of a reference neuron. This setup was replicated using a computer model of the thalamocortical system, which consisted of 1980 spiking neurons distributed among six cortical layers and two thalamic nuclei. Experimentally, we found that spike-triggered microstimulation induced cortical plasticity, as shown by increased unit-pair mutual information, while random microstimulation did not. In addition, there was an increased response to touch following spike-triggered microstimulation, along with decreased neural variability. The computer model successfully reproduced both qualitative and quantitative aspects of the experimental findings. The physiological findings of this study suggest that even simple microstimulation protocols can be used to increase somatosensory information flow. 
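The unit-pair mutual information used above as a plasticity readout can be computed with a plug-in estimator over the joint histogram of two discretized spike trains. This is a generic estimator, not necessarily the authors' exact analysis pipeline, and the example sequences are fabricated.

```python
# Plug-in mutual information (in bits) between two discrete sequences,
# e.g. binarized spike trains of a unit pair.
import math
from collections import Counter

def mutual_information(x, y):
    """MI in bits between two equal-length discrete sequences."""
    n = len(x)
    pxy = Counter(zip(x, y))
    px, py = Counter(x), Counter(y)
    mi = 0.0
    for (a, b), c in pxy.items():
        p_joint = c / n
        # p_joint * log2(p_joint / (p(a) * p(b)))
        mi += p_joint * math.log2(p_joint * n * n / (px[a] * py[b]))
    return mi

identical = [0, 1, 0, 1] * 25     # perfectly coupled pair
independent = [0, 0, 1, 1] * 25   # statistically independent of the above

print(mutual_information(identical, identical))    # → 1.0 (one full bit)
print(mutual_information(identical, independent))  # → 0.0
```

An increase in this quantity for a stimulated electrode pair, relative to baseline, is the kind of signature the study reports after spike-triggered (but not random) microstimulation.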
Animals;Brain Mapping;Computer Simulation;Electric Stimulation;Female;Macaca;Male;Microelectrodes;Neuronal Plasticity;Neurons;Principal Component Analysis;Signal Processing, Computer-Assisted;Somatosensory Cortex;Touch Interaction of NMDA receptor and pacemaking mechanisms in the midbrain dopaminergic neuron. PloS one Dopamine neurotransmission has been found to play a role in addictive behavior and is altered in psychiatric disorders. Dopaminergic (DA) neurons display two functionally distinct modes of electrophysiological activity: low- and high-frequency firing. A puzzling feature of the DA neuron is the following combination of its responses: N-methyl-D-aspartate receptor (NMDAR) activation evokes high-frequency firing, whereas other tonic excitatory stimuli (α-amino-3-hydroxyl-5-methyl-4-isoxazolepropionate receptor (AMPAR) activation or applied depolarization) block firing instead. We suggest a new computational model that reproduces this combination of responses and explains recent experimental data. Namely, somatic NMDAR stimulation evokes high-frequency firing and is more effective than distal dendritic stimulation. We further reduce the model to a single compartment and analyze the mechanism of the distinct high-frequency response to NMDAR activation vs. other stimuli. Standard nullcline analysis shows that the mechanism is based on a decrease in the amplitude of calcium oscillations. The analysis confirms that the nonlinear voltage dependence provided by the magnesium block of the NMDAR determines its capacity to elevate the firing frequency. We further predict that the moderate slope of the voltage dependence plays the central role in the frequency elevation. Additionally, we suggest a repolarizing current that sustains calcium-independent firing or firing in the absence of calcium-dependent repolarizing currents. We predict that the ether-a-go-go current (ERG), which has been observed in the DA neuron, is the best fit for this critical role.
We show that a calcium-dependent and a calcium-independent oscillatory mechanism form a structure of interlocked negative feedback loops in the DA neuron. The structure connects research on DA neuron firing with circadian biology and determines common minimal models for investigation of the robustness of oscillations, which is critical for normal function of both systems. Action Potentials;Algorithms;Calcium Signaling;Computer Simulation;Dopaminergic Neurons;Ether-A-Go-Go Potassium Channels;Mesencephalon;Models, Neurological;Receptors, N-Methyl-D-Aspartate Correlated conductance parameters in leech heart motor neurons contribute to motor pattern formation. PloS one Neurons can have widely differing intrinsic membrane properties, in particular the density of specific conductances, but how these contribute to characteristic neuronal activity or pattern formation is not well understood. To explore the relationship between conductances, and in particular how they influence the activity of motor neurons in the well characterized leech heartbeat system, we developed a new multi-compartmental Hodgkin-Huxley style leech heart motor neuron model. To do so, we evolved a population of model instances, which differed in the density of specific conductances, capable of achieving specific output activity targets given an associated input pattern. We then examined the sensitivity of measures of output activity to conductances and how the model instances responded to hyperpolarizing current injections. We found that the strengths of many conductances, including those with differing dynamics, had strong partial correlations and that these relationships appeared to be linked by their influence on heart motor neuron activity. Conductances that had positive correlations opposed one another and had the opposite effects on activity metrics when perturbed, whereas conductances that had negative correlations could compensate for one another and had similar effects on activity metrics.
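The pairwise conductance relationships described above can be probed with plain Pearson correlations of conductance densities across the evolved population of model instances (partial correlations additionally control for the remaining conductances). The "population" below is fabricated so that one conductance exactly compensates for the other; the conductance names are illustrative.

```python
# Pearson correlation between two conductance densities measured across
# a population of model instances. Values are fabricated: g_A = 3 - g_K,
# mimicking a perfectly compensating (negatively correlated) pair.
import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

g_K = [1.0, 1.2, 0.8, 1.5, 0.9]   # toy maximal conductances, instance by instance
g_A = [2.0, 1.8, 2.2, 1.5, 2.1]   # compensating partner

r = pearson(g_K, g_A)
print(r)   # → -1.0 (perfect compensation in this fabricated population)
```

In the study, negatively correlated conductance pairs behaved like this toy pair: perturbing one could be offset by the other, and both produced similar effects on activity metrics.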
Vision egg: an open-source library for realtime visual stimulus generation. Frontiers in neuroinformatics Modern computer hardware makes it possible to produce visual stimuli in ways not previously possible. Arbitrary scenes, from traditional sinusoidal gratings to naturalistic 3D scenes, can now be specified on a frame-by-frame basis in realtime. A programming library called the Vision Egg aims to make it easy to take advantage of these innovations. The Vision Egg is a free, open-source library making use of OpenGL and written in the high-level language Python with extensions in C. Careful attention has been paid to the issues of luminance and temporal calibration, and several interfacing techniques to input devices such as mice, movement tracking systems, and digital triggers are discussed. Together, these make the Vision Egg suitable for many psychophysical, electrophysiological, and behavioral experiments. This software is available for free download at visionegg.org. Reaction-diffusion in the NEURON simulator. Frontiers in neuroinformatics In order to support research on the role of cell biological principles (genomics, proteomics, signaling cascades and reaction dynamics) in the dynamics of neuronal response in health and disease, NEURON's Reaction-Diffusion (rxd) module in Python provides specification and simulation for these dynamics, coupled with the electrophysiological dynamics of the cell membrane. Arithmetic operations on species and parameters are overloaded, allowing arbitrary reaction formulas to be specified using Python syntax. These expressions are then transparently compiled into bytecode that uses NumPy for fast vectorized calculations. At each time step, rxd combines NEURON's integrators with SciPy's sparse linear algebra library. Dampening of hyperexcitability in CA1 pyramidal neurons by polyunsaturated fatty acids acting on voltage-gated ion channels. PloS one A ketogenic diet is an alternative treatment of epilepsy in infants.
The diet, rich in fat and low in carbohydrates, elevates the level of polyunsaturated fatty acids (PUFAs) in plasma. These substances have therefore been suggested to contribute to the anticonvulsive effect of the diet. PUFAs modulate the properties of a range of ion channels, including K and Na channels, and it has been hypothesized that these changes may be part of a mechanistic explanation of the ketogenic diet. Using computational modelling, we here study how experimentally observed PUFA-induced changes of ion channel activity affect neuronal excitability in CA1, in particular responses to synaptic input of high synchronicity. The PUFA effects were studied in two pathological models of cellular hyperexcitability associated with epileptogenesis. We found that experimentally derived PUFA modulation of the A-type K (K(A)) channel, but not the delayed-rectifier K channel, restored healthy excitability by selectively reducing the response to inputs of high synchronicity. We also found that PUFA modulation of the transient Na channel was effective in this respect if the channel's steady-state inactivation was selectively affected. Furthermore, PUFA-induced hyperpolarization of the resting membrane potential was an effective approach to prevent hyperexcitability. When the combined effect of PUFAs on the K(A) channel, the Na channel, and the resting membrane potential was simulated, a lower concentration of PUFA was needed to restore healthy excitability. We therefore propose that one explanation of the beneficial effect of PUFAs lies in their simultaneous action on a range of ion-channel targets. Furthermore, this work suggests that a pharmacological cocktail acting on the voltage dependence of the Na-channel inactivation, the voltage dependences of K(A) channels, and the resting potential can be an effective treatment of epilepsy.
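The effect of shifting the Na channel's steady-state inactivation, discussed above, can be sketched with a Boltzmann curve: a hyperpolarizing shift of the half-inactivation voltage reduces the fraction of channels available at rest. The half-inactivation voltage, slope factor, and the size of the shift below are illustrative values, not the paper's fitted parameters.

```python
# Boltzmann steady-state inactivation: fraction of Na channels available
# as a function of membrane voltage. A hyperpolarizing shift of V_half
# (mimicking a PUFA effect) lowers availability at the resting potential.
import math

def h_inf(v, v_half=-65.0, k=7.0):
    """Steady-state inactivation (fraction of available channels)."""
    return 1.0 / (1.0 + math.exp((v - v_half) / k))

v_rest = -70.0                                  # illustrative resting potential (mV)
control = h_inf(v_rest)                         # availability without modulation
with_shift = h_inf(v_rest, v_half=-75.0)        # 10 mV hyperpolarizing shift

print(control, with_shift)
```

Fewer available transient Na channels means a smaller regenerative response to a synchronous barrage of input, which is the proposed route by which this modulation dampens hyperexcitability.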
CA1 Region, Hippocampal;Computer Simulation;Epilepsy;Fatty Acids, Unsaturated;Humans;Infant;Ion Channel Gating;Membrane Potentials;Models, Biological;Neurons;Patch-Clamp Techniques;Potassium Channels, Voltage-Gated;Synapses;Voltage-Gated Sodium Channels The developmental dynamics of the maize leaf transcriptome Nature Genetics We have analyzed the maize leaf transcriptome using Illumina sequencing. We mapped more than 120 million reads to define gene structure and alternative splicing events and to quantify transcript abundance along a leaf developmental gradient and in mature bundle sheath and mesophyll cells. We detected differential mRNA processing events for most maize genes. We found that 64% and 21% of genes were differentially expressed along the developmental gradient and between bundle sheath and mesophyll cells, respectively. We implemented Gbrowse, an electronic fluorescent pictograph browser, and created a two-cell biochemical pathway viewer to visualize datasets. Cluster analysis of the data revealed a dynamic transcriptome, with transcripts for primary cell wall and basic cellular metabolism at the leaf base transitioning to transcripts for secondary cell wall biosynthesis and C4 photosynthetic development toward the tip. This dataset will serve as the foundation for a systems biology approach to the understanding of photosynthetic development. Subcellular location of PKA controls striatal plasticity: stochastic simulations in spiny dendrites. PLoS computational biology Dopamine release in the striatum has been implicated in various forms of reward dependent learning. Dopamine leads to production of cAMP and activation of protein kinase A (PKA), which are involved in striatal synaptic plasticity and learning. PKA and its protein targets are not diffusely located throughout the neuron, but are confined to various subcellular compartments by anchoring molecules such as A-Kinase Anchoring Proteins (AKAPs). 
Experiments have shown that blocking the interaction of PKA with AKAPs disrupts its subcellular location and prevents LTP in the hippocampus and striatum; however, these experiments have not revealed whether the critical function of anchoring is to locate PKA near the cAMP that activates it or near its targets, such as AMPA receptors located in the post-synaptic density. We have developed a large scale stochastic reaction-diffusion model of signaling pathways in a medium spiny projection neuron dendrite with spines, based on published biochemical measurements, to investigate this question and to evaluate whether dopamine signaling exhibits spatial specificity post-synaptically. The model was stimulated with dopamine pulses mimicking those recorded in response to reward. Simulations show that PKA colocalization with adenylate cyclase, either in the spine head or in the dendrite, leads to greater phosphorylation of DARPP-32 Thr34 and AMPA receptor GluA1 Ser845 than when PKA is anchored away from adenylate cyclase. Simulations further demonstrate that though cAMP exhibits a strong spatial gradient, diffusible DARPP-32 facilitates the spread of PKA activity, suggesting that additional inactivation mechanisms are required to produce spatial specificity of PKA activity. Animals;Computational Biology;Computer Simulation;Cyclic AMP;Cyclic AMP-Dependent Protein Kinases;Dendritic Spines;Dopamine;Dopamine and cAMP-Regulated Phosphoprotein 32;Humans;Intracellular Space;Models, Neurological;Monte Carlo Method;Neuronal Plasticity;Reproducibility of Results;Signal Transduction Ih tunes theta/gamma oscillations and cross-frequency coupling in an in silico CA3 model. PloS one Ih channels are uniquely positioned to act as neuromodulatory control points for tuning hippocampal theta (4-12 Hz) and gamma (25 Hz) oscillations, oscillations which are thought to have importance for organization of information flow. 
Ih contributes to neuronal membrane resonance and resting membrane potential, and is modulated by second messengers. We investigated oscillatory control using a multiscale computer model of hippocampal CA3, where each cell class (pyramidal, basket, and oriens-lacunosum moleculare cells) contained type-appropriate isoforms of Ih. Our model demonstrated that modulation of pyramidal and basket Ih allows tuning theta and gamma oscillation frequency and amplitude. Pyramidal Ih also controlled cross-frequency coupling (CFC) and allowed shifting gamma generation towards particular phases of the theta cycle, effected via Ih's ability to set pyramidal excitability. Our model predicts that in vivo neuromodulatory control of Ih allows flexibly controlling CFC and the timing of gamma discharges at particular theta phases. Assimilating seizure dynamics. PLoS computational biology Observability of a dynamical system requires an understanding of its state: the collective values of its variables. However, existing techniques are too limited to measure all but a small fraction of the physical variables and parameters of neuronal networks. We constructed models of the biophysical properties of neuronal membrane, synaptic, and microenvironment dynamics, and incorporated them into a model-based predictor-controller framework from modern control theory. We demonstrate that it is now possible to meaningfully estimate the dynamics of small neuronal networks using as few as a single measured variable. Specifically, we assimilate noisy membrane potential measurements from individual hippocampal neurons to reconstruct the dynamics of networks of these cells, their extracellular microenvironment, and the activities of different neuronal types during seizures. We use reconstruction to account for unmeasured parts of the neuronal system, relating micro-domain metabolic processes to cellular excitability, and validate the reconstruction of cellular dynamical interactions against actual measurements.
Data assimilation, the fusing of measurement with computational models, has significant potential to improve the way we observe and understand brain dynamics. Animals;CA1 Region, Hippocampal;Intracellular Membranes;Membrane Potentials;Models, Neurological;Neural Networks (Computer);Patch-Clamp Techniques;Potassium;Rats;Reproducibility of Results;Seizures;Sodium Sparse distributed representation of odors in a large-scale olfactory bulb circuit. PLoS computational biology In the olfactory bulb, lateral inhibition mediated by granule cells has been suggested to modulate the timing of mitral cell firing, thereby shaping the representation of input odorants. Current experimental techniques, however, do not enable a clear study of how the mitral-granule cell network sculpts odor inputs to represent odor information spatially and temporally. To address this critical step in the neural basis of odor recognition, we built a biophysical network model of mitral and granule cells, corresponding to 1/100th of the real system in the rat, and used direct experimental imaging data of glomeruli activated by various odors. The model allows the systematic investigation and generation of testable hypotheses of the functional mechanisms underlying odor representation in the olfactory bulb circuit. Specifically, we demonstrate that lateral inhibition emerges within the olfactory bulb network through recurrent dendrodendritic synapses when constrained by a range of balanced excitatory and inhibitory conductances. We find that the spatio-temporal dynamics of lateral inhibition plays a critical role in building the glomerular-related cell clusters observed in experiments, through the modulation of synaptic weights during odor training. Lateral inhibition also mediates the development of sparse and synchronized spiking patterns of mitral cells related to odor inputs within the network, with the frequency of these synchronized spiking patterns also modulated by the sniff cycle. 
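The sparsening role of lateral inhibition summarized in the olfactory bulb abstract above can be caricatured in a few lines. The following is only a toy rate model with illustrative sizes and gains (cell counts, thresholds, and the pooled-inhibition shortcut are all assumptions, not the authors' biophysical mitral-granule network): stronger shared inhibitory feedback leaves fewer mitral units active for the same odor input.

```python
import numpy as np

# Toy rate-model sketch (illustrative constants, not the authors' model) of
# how recurrent granule-mediated inhibition can sparsen an odor code.
rng = np.random.default_rng(0)
n_mitral = 50
odor_input = rng.random(n_mitral)          # glomerular drive to each mitral cell

def steady_state(inhibition_gain, steps=200, dt=0.05):
    """Relax mitral rates under pooled (granule-mediated) inhibition."""
    rates = np.zeros(n_mitral)
    for _ in range(steps):
        inhibition = inhibition_gain * rates.mean()      # shared feedback
        drive = odor_input - inhibition
        rates += dt * (np.maximum(drive, 0.0) - rates)   # rectified relaxation
    return rates

weak = steady_state(inhibition_gain=0.0)
strong = steady_state(inhibition_gain=3.0)

# Stronger lateral inhibition silences the weakly driven cells,
# leaving a sparser set of active mitral units.
print((weak > 0.01).sum(), (strong > 0.01).sum())
```

With zero gain the rates simply converge to the input pattern; raising the gain turns the pooled feedback into a soft winner-share-all threshold, which is the qualitative effect the abstract attributes to dendrodendritic inhibition.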
Animals;Computational Biology;Computer Simulation;Feedback, Physiological;Male;Models, Neurological;Nerve Net;Neuronal Plasticity;Neurons;Odors;Olfactory Bulb;Rats;Synapses Multivariate autoregressive modeling and Granger causality analysis of multiple spike trains. Computational intelligence and neuroscience Recent years have seen the emergence of microelectrode arrays and optical methods allowing simultaneous recording of spiking activity from populations of neurons in various parts of the nervous system. The analysis of multiple neural spike train data could benefit significantly from existing methods for multivariate time-series analysis, which have proven to be very powerful in the modeling and analysis of continuous neural signals like EEG signals. However, those methods have not generally been well adapted to point processes. Here, we use our recent results on correlation distortions in multivariate Linear-Nonlinear-Poisson spiking neuron models to derive generalized Yule-Walker-type equations for fitting "hidden" Multivariate Autoregressive models. We use this new framework to perform Granger causality analysis in order to extract the directed information flow pattern in networks of simulated spiking neurons. We discuss the relative merits and limitations of the new method. Action Potentials;Algorithms;Animals;Automation;Computer Simulation;Linear Models;Models, Neurological;Models, Statistical;Multivariate Analysis;Neural Pathways;Neurons;Nonlinear Dynamics;Normal Distribution;Poisson Distribution;Regression Analysis;Signal Processing, Computer-Assisted Chemical glycobiology: why now? Nature Chemical Biology Understanding the structure and function of carbohydrates remains a key challenge for chemical biologists. Developments in carbohydrate synthesis and analysis together with the advent of high-throughput methods such as carbohydrate microarrays have helped shed light on the function of glycoconjugates. 
Similarly, consortia have provided technology platforms and focus to a burgeoning field. Now, recruitment of scientists from related fields and further integration of chemistry and biology to achieve technical goals are needed for rapid advancements. A generic framework for real-time multi-channel neuronal signal analysis, telemetry control, and sub-millisecond latency feedback generation. Frontiers in neuroscience Distinct modules of the neural circuitry interact with each other and (through the motor-sensory loop) with the environment, forming a complex dynamic system. Neuro-prosthetic devices seeking to modulate or restore CNS function need to interact with the information flow at the level of neural modules electrically, bi-directionally and in real-time. A set of freely available generic tools is presented that allow computationally demanding multi-channel short-latency bi-directional interactions to be realized in in vivo and in vitro preparations using standard PC data acquisition and processing hardware and software (Mathworks Matlab and Simulink). A commercially available 60-channel extracellular multi-electrode recording and stimulation set-up connected to an ex vivo developing cortical neuronal culture is used as a model system to validate the method. We demonstrate how complex high-bandwidth (>10 MBit/s) neural recording data can be analyzed in real-time while simultaneously generating specific complex electrical stimulation feedback with deterministically timed responses at sub-millisecond resolution. Broaden the discussion Nature Nanotechnology To the Editor We agree with your Editorial 'Join the dialogue' (Nature Nanotech. 7, 545; 2012) that there is a need for guidelines on materials characterization requirements when reporting nanotoxicology research. 
However, defining a minimum set of requirements will not substitute for a rigorous peer-review process because materials characterization per se does not necessarily mean that measurements have been performed using state-of-the-art methods, or that the parameters are directly associated with the observed biological effects. The relationship between respiration-related membrane potential slow oscillations and discharge patterns in mitral/tufted cells: what are the rules? PloS one BACKGROUND: A slow respiration-related rhythm strongly shapes the activity of the olfactory bulb. This rhythm appears as a slow oscillation that is detectable in the membrane potential, the respiration-related spike discharge of the mitral/tufted cells and the bulbar local field potential. Here, we investigated the rules that govern the manifestation of membrane potential slow oscillations (MPSOs) and respiration-related discharge activities under various afferent input conditions and cellular excitability states. METHODOLOGY AND PRINCIPAL FINDINGS: We recorded the intracellular membrane potential signals in the mitral/tufted cells of freely breathing anesthetized rats. We first demonstrated the existence of multiple types of MPSOs, which were influenced by odor stimulation and discharge activity patterns. Complementary studies using changes in the intracellular excitability state and a computational model of the mitral cell demonstrated that slow oscillations in the mitral/tufted cell membrane potential were also modulated by the intracellular excitability state, whereas the respiration-related spike activity primarily reflected the afferent input. Based on our data regarding MPSOs and spike patterns, we found that cells exhibiting an unsynchronized discharge pattern never exhibited an MPSO. In contrast, cells with a respiration-synchronized discharge pattern always exhibited an MPSO. In addition, we demonstrated that the association between spike patterns and MPSO types appeared complex. 
CONCLUSION: We propose that both the intracellular excitability state and input strength underlie specific MPSOs, which, in turn, constrain the types of spike patterns exhibited. Animals;Intracellular Space;Kinetics;Male;Membrane Potentials;Odors;Olfactory Bulb;Periodicity;Rats;Rats, Wistar;Respiration Frequency dependence of signal power and spatial reach of the local field potential. PLoS computational biology Despite its century-old use, the interpretation of local field potentials (LFPs), the low-frequency part of electrical signals recorded in the brain, is still debated. In cortex the LFP appears to mainly stem from transmembrane neuronal currents following synaptic input, and obvious questions regarding the 'locality' of the LFP are: What is the size of the signal-generating region, i.e., the spatial reach, around a recording contact? How far does the LFP signal extend outside a synaptically activated neuronal population? And how do the answers depend on the temporal frequency of the LFP signal? Experimental inquiries have given conflicting results, and we here pursue a modeling approach based on a well-established biophysical forward-modeling scheme incorporating detailed reconstructed neuronal morphologies in precise calculations of population LFPs including thousands of neurons. The two key factors determining the frequency dependence of LFP are the spatial decay of the single-neuron LFP contribution and the conversion of synaptic input correlations into correlations between single-neuron LFP contributions. Both factors are seen to give low-pass filtering of the LFP signal power. For uncorrelated input only the first factor is relevant, and here a modest reduction (<50%) in the spatial reach is observed for higher frequencies (>100 Hz) compared to the near-DC ([Formula: see text]) value of about [Formula: see text]. 
Much larger frequency-dependent effects are seen when populations of pyramidal neurons receive correlated and spatially asymmetric inputs: the low-frequency ([Formula: see text]) LFP power can be an order of magnitude or more larger than at 60 Hz. Moreover, the low-frequency LFP components have larger spatial reach and extend further outside the active population than high-frequency components. Further, the spatial LFP profiles for such populations typically span the full vertical extent of the dendrites of neurons in the population. Our numerical findings are backed up by an intuitive simplified model for the generation of population LFP. Action Potentials;Brain;Models, Neurological Insertional mutagenesis identifies multiple networks of cooperating genes driving intestinal tumorigenesis Nature Genetics The evolution of colorectal cancer suggests the involvement of many genes. To identify new drivers of intestinal cancer, we performed insertional mutagenesis using the Sleeping Beauty transposon system in mice carrying germline or somatic Apc mutations. By analyzing common insertion sites (CISs) isolated from 446 tumors, we identified many hundreds of candidate cancer drivers. Comparison to human data sets suggested that 234 CIS-targeted genes are also dysregulated in human colorectal cancers. In addition, we found 183 CIS-containing genes that are candidate Wnt targets and showed that 20 CIS-containing genes are newly discovered modifiers of canonical Wnt signaling. We also identified mutations associated with a subset of tumors containing an expanded number of Paneth cells, a hallmark of deregulated Wnt signaling, and genes associated with more severe dysplasia included those encoding members of the FGF signaling cascade. Some 70 genes had co-occurrence of CIS pairs, clustering into 38 sub-networks that may regulate tumor development. Improved focalization of electrical microstimulation using microelectrode arrays: a modeling study. 
PloS one Extracellular electrical stimulation (EES) of the central nervous system (CNS) has been used empirically for decades, with both fundamental and clinical goals. Currently, microelectrode arrays (MEAs) offer new possibilities for CNS microstimulation. However, although focal CNS activation is of critical importance to achieve efficient stimulation strategies, the precise spatial extent of EES remains poorly understood. The aim of the present work is twofold. First, we validate a finite element model to compute accurately the electrical potential field generated throughout the extracellular medium by an EES delivered with MEAs. This model uses Robin boundary conditions that take into account the surface conductance of electrode/medium interfaces. Using this model, we determine how the potential field is influenced by the stimulation and ground electrode impedances, and by the electrical conductivity of the neural tissue. We confirm that current-controlled stimulations should be preferred to voltage-controlled stimulations in order to control the amplitude of the potential field. Second, we evaluate the focality of the potential field and threshold-distance curves for different electrode configurations. We propose a new configuration to improve the focality, using a ground surface surrounding all the electrodes of the array. We show that the lower the impedance of this surface, the more focal the stimulation. In conclusion, this study proposes new boundary conditions for the design of precise computational models of extracellular stimulation, and a new electrode configuration that can be easily incorporated into future MEA devices, either in vitro or in vivo, for a better spatial control of CNS microstimulation. Electric Stimulation;Finite Element Analysis;Microelectrodes;Models, Theoretical Passive dendrites enable single neurons to compute linearly non-separable functions. 
PLoS computational biology Local supra-linear summation of excitatory inputs occurring in pyramidal cell dendrites, the so-called dendritic spikes, results in independent spiking dendritic sub-units, which turn pyramidal neurons into two-layer neural networks capable of computing linearly non-separable functions, such as the exclusive OR. Other neuron classes, such as interneurons, may possess only a few independent dendritic sub-units, or only passive dendrites where input summation is purely sub-linear, and where dendritic sub-units are only saturating. To determine if such neurons can also compute linearly non-separable functions, we enumerate, for a given parameter range, the Boolean functions implementable by a binary neuron model with a linear sub-unit and either a single spiking or a saturating dendritic sub-unit. We then analytically generalize these numerical results to an arbitrary number of non-linear sub-units. First, we show that a single non-linear dendritic sub-unit, in addition to the somatic non-linearity, is sufficient to compute linearly non-separable functions. Second, we analytically prove that, with a sufficient number of saturating dendritic sub-units, a neuron can compute all functions computable with purely excitatory inputs. Third, we show that these linearly non-separable functions can be implemented with at least two strategies: one where a dendritic sub-unit is sufficient to trigger a somatic spike; another where somatic spiking requires the cooperation of multiple dendritic sub-units. We formally prove that implementing the latter architecture is possible with both types of dendritic sub-units whereas the former is only possible with spiking dendrites. Finally, we show how linearly non-separable functions can be computed by a generic two-compartment biophysical model and a realistic neuron model of the cerebellar stellate cell interneuron. 
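The core claim of this abstract — that even saturating dendritic sub-units suffice for linearly non-separable computation — admits a compact illustration. The sketch below is illustrative only (the sub-unit grouping, unit weights, and threshold are assumptions, not the authors' enumeration procedure): a binary neuron with two saturating sub-units and purely excitatory inputs computes f = (x1 OR x2) AND (x3 OR x4), which no single weighted sum with one threshold can reproduce.

```python
from itertools import product

def subunit(inputs):
    """Saturating dendritic sub-unit: sums its inputs, output caps at 1."""
    return min(sum(inputs), 1)

def neuron(x1, x2, x3, x4, soma_threshold=2):
    """Two-layer binary neuron: soma fires iff both sub-units are active."""
    dendrite_a = subunit((x1, x2))   # sub-unit pooling x1, x2
    dendrite_b = subunit((x3, x4))   # sub-unit pooling x3, x4
    return int(dendrite_a + dendrite_b >= soma_threshold)

# Verify against the linearly non-separable target over all 16 inputs.
# Non-separability: w1+w3>=t and w2+w4>=t but w1+w2<t and w3+w4<t is
# contradictory for any single linear threshold unit (cf. XOR).
for x in product((0, 1), repeat=4):
    target = int((x[0] or x[1]) and (x[2] or x[3]))
    assert neuron(*x) == target
print("two saturating sub-units reproduce (x1|x2)&(x3|x4)")
```

The same four input patterns that defeat a single linear threshold (1010, 0101 must fire; 1100, 0011 must not) are handled here because saturation makes each sub-unit report feature presence rather than input count.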
Taken together our results demonstrate that passive dendrites are sufficient to enable neurons to compute linearly non-separable functions. Action Potentials;Animals;Cerebellum;Computational Biology;Dendrites;Linear Models;Mice;Models, Neurological;Neurons Non-topographical contrast enhancement in the olfactory bulb. BMC neuroscience BACKGROUND: Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli. Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement. RESULTS: We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require a built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. 
CONCLUSION: Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement and underlies the concentration-independence of odor quality representations. Algorithms;Computer Simulation;Discrimination (Psychology);Models, Neurological;Neurons, Afferent;Odors;Olfactory Bulb;Olfactory Receptor Neurons An online spike detection and spike classification algorithm capable of instantaneous resolution of overlapping spikes. Journal of computational neuroscience For the analysis of neuronal cooperativity, simultaneously recorded extracellular signals from neighboring neurons need to be sorted reliably by a spike sorting method. Many algorithms have been developed to this end, however, to date, none of them manages to fulfill a set of demanding requirements. In particular, it is desirable to have an algorithm that operates online, detects and classifies overlapping spikes in real time, and that adapts to non-stationary data. Here, we present a combined spike detection and classification algorithm, which explicitly addresses these issues. Our approach makes use of linear filters to find a new representation of the data and to optimally enhance the signal-to-noise ratio. We introduce a method called "Deconfusion" which de-correlates the filter outputs and provides source separation. Finally, a set of well-defined thresholds is applied and leads to simultaneous spike detection and spike classification. By incorporating a direct feedback, the algorithm adapts to non-stationary data and is, therefore, well suited for acute recordings. 
We evaluate our method on simulated and experimental data, including simultaneous intra/extra-cellular recordings made in slices of a rat cortex and recordings from the prefrontal cortex of awake behaving macaques. We compare the results to existing spike detection as well as spike sorting methods. We conclude that our algorithm meets all of the mentioned requirements and outperforms other methods under realistic signal-to-noise ratios and in the presence of overlapping spikes. Action Potentials;Adaptation, Biological;Algorithms;Animals;Animals, Newborn;Databases, Factual;Models, Neurological;Neurons;Noise;Online Systems;ROC Curve;Rats;Rats, Long-Evans;Signal Processing, Computer-Assisted Mutations in the pleckstrin homology domain of dynamin 2 cause dominant intermediate Charcot-Marie-Tooth disease Nature Genetics Charcot-Marie-Tooth (CMT) disease is a clinically and genetically heterogeneous group of peripheral neuropathies. Different chromosomal loci have been linked with three autosomal dominant, 'intermediate' types of CMT: DI-CMTA, DI-CMTB and DI-CMTC. We refined the locus associated with DI-CMTB on chromosome 19p12–13.2 to 4.2 Mb in three unrelated families with CMT originating from Australia, Belgium and North America. After screening candidate genes, we identified unique mutations in dynamin 2 (DNM2) in all families. DNM2 belongs to the family of large GTPases and is part of the cellular fusion-fission apparatus. In transiently transfected cell lines, mutations of DNM2 substantially diminish binding of DNM2 to membranes by altering the conformation of the β3/β4 loop of the pleckstrin homology domain. Additionally, in the Australian and Belgian pedigrees, which carry two different mutations affecting the same amino acid, Lys558, CMT cosegregated with neutropenia, which has not previously been associated with CMT neuropathies. How well do the substrates KISS the enzyme? 
Molecular docking program selection for feruloyl esterases Scientific Reports Molecular docking is the most commonly used technique in the modern drug discovery process, where computational approaches involving docking algorithms are used to dock small molecules into macromolecular target structures. Over recent years, several evaluation studies have been reported by independent scientists comparing the performance of docking programs using the default 'black box' protocols supplied by the software companies. Such studies have to be considered carefully, as docking programs can be tweaked towards optimum performance by selecting the parameters suitable for the target of interest. In this study we address the problem of selecting an appropriate docking and scoring function combination (88 docking algorithm-scoring function combinations) for substrate specificity predictions for feruloyl esterases, an industrially relevant enzyme family. We also propose the 'Key Interaction Score System' (KISS), a more biochemically meaningful measure for evaluation of docking programs based on pose prediction accuracy. Teaching basic principles of neuroscience with computer simulations. Journal of undergraduate neuroscience education : JUNE : a publication of FUN, Faculty for Undergraduate Neuroscience It is generally believed that students learn best through activities that require their direct participation. By using simulations as a tool for learning neuroscience, students are directly engaged in the activity and obtain immediate feedback and reinforcement. This paper describes a series of biophysical models and computer simulations that can be used by educators and students to explore a variety of basic principles in neuroscience. The paper also suggests 'virtual laboratory' exercises that students may conduct to further examine biophysical processes underlying neural function. First, the Hodgkin and Huxley (HH) model is presented. 
The HH model is used to illustrate the action potential, threshold phenomena, and nonlinear dynamical properties of neurons (e.g., oscillations, postinhibitory rebound excitation). Second, the Morris-Lecar (ML) model is presented. The ML model is used to develop a model of a bursting neuron and to illustrate modulation of neuronal activity by intracellular ions. Lastly, principles of synaptic transmission are presented in small neural networks, which illustrate oscillatory behavior, excitatory and inhibitory postsynaptic potentials, and temporal summation. Induction and modulation of persistent activity in a layer V PFC microcircuit model. Frontiers in neural circuits Working memory refers to the temporary storage of information and is strongly associated with the prefrontal cortex (PFC). Persistent activity of cortical neurons, namely the activity that persists beyond the stimulus presentation, is considered the cellular correlate of working memory. Although past studies suggested that this type of activity is characteristic of large-scale networks, recent experimental evidence implies that small, tightly interconnected clusters of neurons in the cortex may support similar functionalities. However, very little is known about the biophysical mechanisms giving rise to persistent activity in small-sized microcircuits in the PFC. Here, we present a biophysically detailed, yet morphologically simplified, microcircuit model of layer V PFC neurons that incorporates connectivity constraints and is validated against a multitude of experimental data. We show that (a) a small-sized network can exhibit persistent activity under realistic stimulus conditions. (b) Its emergence depends strongly on the interplay of dADP, NMDA, and GABAB currents. (c) Although increases in stimulus duration increase the probability of persistent activity induction, variability in the stimulus firing frequency does not consistently influence it. 
(d) Modulation of ionic conductances (Ih, ID, IsAHP, ICaL, ICaN, ICaR) differentially controls persistent activity properties in a location dependent manner. These findings suggest that modulation of the microcircuit's firing characteristics is achieved primarily through changes in its intrinsic mechanism makeup, supporting the hypothesis of multiple bi-stable units in the PFC. Overall, the model generates a number of experimentally testable predictions that may lead to a better understanding of the biophysical mechanisms of persistent activity induction and modulation in the PFC. Action Potentials;Computer Simulation;Humans;Memory, Short-Term;Models, Neurological;Nerve Net;Neurons;Prefrontal Cortex;Synapses Neuroinformatics: from bioinformatics to databasing the brain. Bioinformatics and biology insights Neuroinformatics seeks to create and maintain web-accessible databases of experimental and computational data, together with innovative software tools, essential for understanding the nervous system in its normal function and in neurological disorders. Neuroinformatics includes traditional bioinformatics of gene and protein sequences in the brain; atlases of brain anatomy and localization of genes and proteins; imaging of brain cells; brain imaging by positron emission tomography (PET), functional magnetic resonance imaging (fMRI), electroencephalography (EEG), magnetoencephalography (MEG) and other methods; many electrophysiological recording methods; and clinical neurological data, among others. Building neuroinformatics databases and tools presents difficult challenges because they span a wide range of spatial scales and types of data stored and analyzed. Traditional bioinformatics, by comparison, focuses primarily on genomic and proteomic data (which of course also presents difficult challenges). Much of bioinformatics analysis focuses on sequences (DNA, RNA, and protein molecules) as the type of data that are stored, compared, and sometimes modeled. 
Bioinformatics is undergoing explosive growth with the addition, for example, of databases that catalog interactions between proteins, of databases that track the evolution of genes, and of systems biology databases which contain models of all aspects of organisms. This commentary briefly reviews neuroinformatics with clarification of its relationship to traditional and modern bioinformatics. Which elements of the mammalian central nervous system are excited by low current stimulation with microelectrodes? Neuroscience Low current cortex stimulation produces a sparse and distributed set of activated cells, often with distances of several hundred micrometers between cell bodies and the microelectrode. A modeling study based on recently measured densities of high threshold sodium channels Nav1.2 in dendrites and soma and low threshold sodium channels Nav1.6 in the axon shall identify spike initiation sites, including a discussion on dendritic spikes. Varying excitability along the neural axis has been observed while studying different electrode positions and configurations. Although the axon initial segment (AIS) and nodes of Ranvier are most excitable, many thin axons and dendrites which are likely to be close to the electrode in the densely packed cortical regions are also proper candidates for spike initiation sites. The cathodic threshold ratio for thin axons and dendrites is about 1:3; 0.2 μm diameter axons passing the electrode tip at a distance of 10 μm can be activated by 100 μs pulses of 2.6 μA. Direct cathodic excitation of dendrites requires a minimum electrode-fiber distance, which increases with dendrite diameter. Therefore thin dendrites can profit from the stronger electrical field close to the electrode, but low current stimulation cannot activate large diameter dendrites, contrary to the inverse recruitment order known from peripheral nerve stimulation. 
When local depolarization fails to generate a dendritic spike, stimulation is possible via intracellular current flow that initiates an action potential, for example 200 μm distant in the low threshold AIS or, in certain cases, at the distal dendrite ending. Besides these exceptions, the spike initiation site for cathodic low current stimulation appears rather close to the electrode. Action Potentials;Animals;Axons;Central Nervous System;Cerebral Cortex;Dendrites;Electric Stimulation;Humans;Mammals;Microelectrodes;Models, Neurological;Neurons Neural dynamics during anoxia and the "wave of death". PloS one Recent experiments in rats have shown the occurrence of a high amplitude slow brain wave in the EEG approximately 1 minute after decapitation, with a duration of 5-15 s (van Rijn et al, PLoS One 6, e16514, 2011), that was presumed to signify the death of brain neurons. We present a computational model of a single neuron and its intra- and extracellular ion concentrations, which shows the physiological mechanism for this observation. The wave is caused by membrane potential oscillations that occur after the cessation of sodium-potassium pump activity has led to an excess of extracellular potassium. These oscillations can be described by the Hodgkin-Huxley equations for the sodium and potassium channels, and result in a sudden change in mean membrane voltage. In combination with a high-pass filter, this sudden depolarization leads to a wave in the EEG. We discuss that this process is not necessarily irreversible. Animals;Anoxia;Brain;Humans;Models, Neurological;Models, Theoretical;Neurons;Potassium Channels;Rats;Sodium Channels Artificial intelligence for drug design Nature Biotechnology Combining rational and irrational approaches to bring drug design to the desktop. Spiking neural network simulation: numerical integration with the Parker-Sochacki method. 
Journal of computational neuroscience Mathematical neuronal models are normally expressed using differential equations. The Parker-Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration timestep. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker-Sochacki method to the Izhikevich 'simple' model and a Hodgkin-Huxley type neuron, comparing the results with those obtained using the Runge-Kutta and Bulirsch-Stoer methods. Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Action Potentials;Algorithms;Animals;Computer Simulation;Membrane Potentials;Models, Neurological;Neural Networks (Computer);Neurons;Time Factors Complementary positional proteomics for screening substrates of endo- and exoproteases Nature Methods We describe a positional proteomics approach to simultaneously analyze N- and C-terminal peptides and used it to screen for human protein substrates of granzyme B and carboxypeptidase A4 in human cell lysates. This approach allowed comprehensive proteome studies, and we report the identification of 965 database-annotated protein C termini, 334 neo–C termini resulting from granzyme B processing and 16 neo–C termini resulting from carboxypeptidase A4 processing. Information please Nature Structural Biology Structural biologists—perhaps more than adherents to any other biological subdiscipline—rely on computer databases and research tools to keep track of what the community already knows and to organize that information in a meaningful way. 
One who plunders the trove of known biological structures for enlightening similarities can make as good a living as the one who actually generated the results in the first place. Towards reproducible descriptions of neuronal network models. PLoS computational biology Progress in science depends on the effective exchange of ideas among scientists. New ideas can be assessed and criticized in a meaningful manner only if they are formulated precisely. This applies to simulation studies as well as to experiments and theories. But after more than 50 years of neuronal network simulations, we still lack a clear and common understanding of the role of computational models in neuroscience as well as established practices for describing network models in publications. This hinders the critical evaluation of network models as well as their re-use. We analyze here 14 research papers proposing neuronal network models of different complexity and find widely varying approaches to model descriptions, with regard to both the means of description and the ordering and placement of material. We further observe great variation in the graphical representation of networks and the notation used in equations. Based on our observations, we propose a good model description practice, composed of guidelines for the organization of publications, a checklist for model descriptions, templates for tables presenting model structure, and guidelines for diagrams of networks. The main purpose of this good practice is to trigger a debate about the communication of neuronal network models in a manner comprehensible to humans, as opposed to machine-readable model description languages. We believe that the good model description practice proposed here, together with a number of other recent initiatives on data-, model-, and software-sharing, may lead to a deeper and more fruitful exchange of ideas among computational neuroscientists in years to come. 
We further hope that work on standardized ways of describing--and thinking about--complex neuronal networks will lead the scientific community to a clearer understanding of high-level concepts in network dynamics, and will thus lead to deeper insights into the function of the brain. Animals;Brain;Computational Biology;Humans;Membrane Potentials;Models, Neurological;Nerve Net;Neurons;Neurosciences;Reproducibility of Results;Synapses Reciprocal inhibition and slow calcium decay in perigeniculate interneurons explain changes of spontaneous firing of thalamic cells caused by cortical inactivation. Journal of computational neuroscience The role of cortical feedback in the thalamocortical processing loop has been extensively investigated over the last decades. With a few exceptions, these studies have focused on the cortical feedback exerted onto thalamo-cortical relay (TC) cells of the dorsal lateral geniculate nucleus (LGN). In a previous physiological study of the cat visual system, we showed that cessation of cortical input decreased the spontaneous activity of TC cells but increased the spontaneous firing of their recurrent inhibitory interneurons located in the perigeniculate nucleus (PGN). To identify the mechanisms underlying these functional changes, we conducted a modeling study in NEURON on several networks of point neurons with varied model parameters, such as membrane properties, synaptic weights and axonal delays. We considered six network topologies of the retino-geniculo-cortical system. All models were robust against changes of axonal delays except for the delay between the LGN feed-forward interneuron and the TC cell. The best representation of physiological results was obtained with models containing reciprocally connected PGN cells driven by the cortex and with relatively slow decay of intracellular calcium. 
This strongly indicates that the thalamic reticular nucleus plays an essential role in the cortical influence over thalamo-cortical relay cells while the thalamic feed-forward interneurons are not essential in this process. Further, we suggest that the dependence of the activity of PGN cells on the rate of calcium removal can be one of the key factors determining individual cell response to elimination of cortical input. Action Potentials;Animals;Calcium;Cerebral Cortex;Computer Simulation;Feedback, Physiological;Geniculate Bodies;Interneurons;Models, Neurological;Neural Inhibition;Neural Pathways;Synapses Spike timing regulation on the millisecond scale by distributed synaptic plasticity at the cerebellum input stage: a simulation study. Frontiers in computational neuroscience The way long-term synaptic plasticity regulates neuronal spike patterns is not completely understood. This issue is especially relevant for the cerebellum, which is endowed with several forms of long-term synaptic plasticity and has been predicted to operate as a timing and a learning machine. Here we have used a computational model to simulate the impact of multiple distributed synaptic weights in the cerebellar granular-layer network. In response to mossy fiber (MF) bursts, synaptic weights at multiple connections played a crucial role to regulate spike number and positioning in granule cells. The weight at MF to granule cell synapses regulated the delay of the first spike and the weight at MF and parallel fiber to Golgi cell synapses regulated the duration of the time-window during which the first-spike could be emitted. Moreover, the weights of synapses controlling Golgi cell activation regulated the intensity of granule cell inhibition and therefore the number of spikes that could be emitted. First-spike timing was regulated with millisecond precision and the number of spikes ranged from zero to three. 
Interestingly, different combinations of synaptic weights optimized either first-spike timing precision or spike number, efficiently controlling transmission and filtering properties. These results predict that distributed synaptic plasticity regulates the emission of quasi-digital spike patterns on the millisecond time-scale and allows the cerebellar granular layer to flexibly control burst transmission along the MF pathway. Web Resources for Mice in Drug Discovery Lab Animal The internet contains a wealth of information on the use of mice as models for drug discovery. The author lists and describes some of these useful sites. Discriminating Vital Tumor from Necrotic Tissue in Human Glioblastoma Tissue Samples by Raman Spectroscopy Laboratory Investigation Vital and necrotic glioblastoma tissues were studied by Raman microspectroscopy to identify possibilities for the development of an in vivo Raman method for real-time intraoperative brain biopsy guidance. The histologic malignancy grade of gliomas depends on the presence of parameters such as endothelial proliferation and necrosis, which are often not evenly distributed within the tumor. Because tissue samples obtained by stereotactic surgery are relatively small, sampling errors may easily occur by missing these crucial features. Although necrosis is important for grading, specimens containing only necrosis are diagnostically useless. Raman microspectroscopic mapping experiments were performed on unfixed cryosections of glioblastoma, obtained from 20 patients. After spectral acquisition, a clustering analysis was performed, resulting in groups of similar spectra. Each cluster was assigned a color, and pseudo-color Raman maps of the tissue sections were constructed. After the Raman experiments, the tissue sections were stained for histopathologic analysis, enabling identification of the histologic origin of the Raman spectra and assignment of the Raman spectral clusters to either vital or necrotic tissue. 
A classification model for discrimination between vital and necrotic tumor tissue based on linear discriminant analysis was developed. The classification model was evaluated using independent Raman data obtained from nine other tissue sections and yielded 100% accuracy. Information about the biochemical differences between necrosis and vital tumor was obtained by the analysis of difference spectra. Necrotic tissue was found to consistently contain higher levels of cholesterol(-esters). This in vitro result indicates that Raman spectra contain the information to distinguish vital glioblastoma from necrosis and makes Raman spectroscopy a powerful candidate for guidance of stereotactic brain biopsy. Brian: a simulator for spiking neural networks in python. Frontiers in neuroinformatics "Brian" is a new simulator for spiking neural networks, written in Python (http://brian.di.ens.fr). It is an intuitive and highly flexible tool for rapidly developing new models, especially networks of single-compartment neurons. In addition to using standard types of neuron models, users can define models by writing arbitrary differential equations in ordinary mathematical notation. Python scientific libraries can also be used for defining models and analysing data. Vectorisation techniques allow efficient simulations despite the overheads of an interpreted language. Brian will be especially valuable for working on non-standard neuron models not easily covered by existing software, and as an alternative to using Matlab or C for simulations. With its easy and intuitive syntax, Brian is also very well suited for teaching computational neuroscience. Reinforcement learning of targeted movement in a spiking neuronal model of motor cortex. PloS one Sensorimotor control has traditionally been considered from a control theory perspective, without relation to neurobiology. 
In contrast, here we utilized a spiking-neuron model of motor cortex and trained it to perform a simple movement task, which consisted of rotating a single-joint "forearm" to a target. Learning was based on a reinforcement mechanism analogous to that of the dopamine system. This provided a global reward or punishment signal in response to decreasing or increasing distance from hand to target, respectively. Output was partially driven by Poisson motor babbling, creating stochastic movements that could then be shaped by learning. The virtual forearm consisted of a single segment rotated around an elbow joint, controlled by flexor and extensor muscles. The model consisted of 144 excitatory and 64 inhibitory event-based neurons, each with AMPA, NMDA, and GABA synapses. Proprioceptive cell input to this model encoded the 2 muscle lengths. Plasticity was only enabled in feedforward connections between input and output excitatory units, using spike-timing-dependent eligibility traces for synaptic credit or blame assignment. Learning resulted from a global 3-valued signal: reward (+1), no learning (0), or punishment (-1), corresponding to phasic increases, lack of change, or phasic decreases of dopaminergic cell firing, respectively. Successful learning only occurred when both reward and punishment were enabled. In this case, 5 target angles were learned successfully within 180 s of simulation time, with a median error of 8 degrees. Motor babbling allowed exploratory learning, but decreased the stability of the learned behavior, since the hand continued moving after reaching the target. Our model demonstrated that a global reinforcement signal, coupled with eligibility traces for synaptic plasticity, can train a spiking sensorimotor network to perform goal-directed motor behavior. 
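The credit-assignment rule in the model above (spike-timing-dependent eligibility traces converted into weight changes by a global three-valued reinforcement signal) can be sketched as follows. This is a hedged reconstruction, not the published code; the class name, time constants, and the single-exponential STDP window are assumptions made for illustration.

```python
import math

TAU_ELIG = 1.0   # eligibility-trace decay time constant (s), assumed
TAU_STDP = 0.02  # STDP window time constant (s), assumed
ETA = 0.05       # learning rate, assumed

class Synapse:
    """Feedforward synapse with a spike-timing-dependent eligibility trace.
    Pre-before-post spiking tags the synapse with positive eligibility
    (post-before-pre with negative); the tag decays until a global
    reward/punishment signal converts it into a weight change."""
    def __init__(self, w=0.5):
        self.w = w
        self.elig = 0.0
        self.t_pre = None
        self.t_post = None

    def on_pre(self, t):
        self.t_pre = t
        self._tag()

    def on_post(self, t):
        self.t_post = t
        self._tag()

    def _tag(self):
        if self.t_pre is None or self.t_post is None:
            return
        dt = self.t_post - self.t_pre
        # Exponential STDP window: potentiating tag if pre precedes post.
        self.elig += math.copysign(math.exp(-abs(dt) / TAU_STDP), dt)

    def apply_reward(self, r, dt):
        """r in {+1, 0, -1}: phasic increase, no change, or phasic decrease
        of the dopaminergic reinforcement signal. Weight is clipped to [0, 1]."""
        self.w = min(1.0, max(0.0, self.w + ETA * r * self.elig))
        self.elig *= math.exp(-dt / TAU_ELIG)
```

A pre-then-post pairing followed by r = +1 strengthens the synapse, the same pairing followed by r = -1 weakens it, and a synapse with no pairing is untouched, which mirrors the finding above that learning required both reward and punishment acting on tagged synapses.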
Action Potentials;Animals;Computer Simulation;Dopamine;Humans;Models, Neurological;Motor Cortex;Movement;Neural Networks (Computer);Neuronal Plasticity;Neurons;Reinforcement (Psychology);Reward;Stochastic Processes;Synapses;Synaptic Transmission The periglomerular cell of the olfactory bulb and its role in controlling mitral cell spiking: a computational model. PloS one Interneurons in the olfactory bulb are key elements of odor processing but their roles have not yet been fully understood. Two types of inhibitory interneurons, periglomerular and granule cells, act at two different levels within the olfactory bulb and may have different roles in coordinating the spiking of mitral cells, which are the principal output neurons of the olfactory bulb. In this work we introduce a reduced compartmental model of the periglomerular cell and use it to investigate its role on mitral cell spiking in a model of an elementary cell triad composed of these two cell types plus a granule cell. Our simulation results show that the periglomerular cell is more effective in inhibiting the mitral cell than the granule cell. Based on our results we predict that periglomerular and granule cells have different roles in the control of mitral cell spiking. The periglomerular cell would be the only one capable of completely inhibiting the mitral cell, and the activity decrease of the mitral cell through this inhibitory action would occur in a stepwise fashion depending on parameters of the periglomerular and granule cells as well as on the relative times of arrival of external stimuli to the three cells. The major role of the granule cell would be to facilitate the inhibitory action of the periglomerular cell by enlarging the range of parameters of the periglomerular cell which correspond to complete inhibition of the mitral cell. The combined action of the two interneurons would thus provide an efficient way of controlling the instantaneous value of the firing rate of the mitral cell. 
Animals;Computer Simulation;Electrophysiology;Evoked Potentials;Interneurons;Olfactory Bulb;Olfactory Nerve;Rats The ion channel inverse problem: neuroinformatics meets biophysics. PLoS computational biology Ion channels are the building blocks of the information processing capability of neurons: any realistic computational model of a neuron must include reliable and effective ion channel components. Sophisticated statistical and computational tools have been developed to study the ion channel structure-function relationship, but this work is rarely incorporated into the models used for single neurons or small networks. The disjunction is partly a matter of convention. Structure-function studies typically use a single Markov model for the whole channel whereas until recently whole-cell modeling software has focused on serial, independent, two-state subunits that can be represented by the Hodgkin-Huxley equations. More fundamentally, there is a difference in purpose that prevents models being easily reused. Biophysical models are typically developed to study one particular aspect of channel gating in detail, whereas neural modelers require broad coverage of the entire range of channel behavior that is often best achieved with approximate representations that omit structural features that cannot be adequately constrained. To bridge the gap so that more recent channel data can be used in neural models requires new computational infrastructure for bringing together diverse sources of data to arrive at best-fit models for whole-cell modeling. We review the current state of channel modeling and explore the developments needed for its conclusions to be integrated into whole-cell modeling. Biophysics;Cell Physiological Phenomena;Computational Biology;Computer Simulation;Ion Channel Gating;Ion Channels;Models, Biological;Models, Chemical;Neurosciences Structure-dynamics relationships in bursting neuronal networks revealed using a prediction framework. 
PloS one The question of how the structure of a neuronal network affects its functionality has gained a lot of attention in neuroscience. However, the vast majority of studies on structure-dynamics relationships consider only a few types of network structures and assess limited numbers of structural measures. In this in silico study, we employ a wide diversity of network topologies and search among many possibilities for the aspects of structure that have the greatest effect on network excitability. The network activity is simulated using two point-neuron models, where the neurons are activated by noisy fluctuations of the membrane potential and their connections are described by chemical synapse models, and statistics on the number and quality of the emergent network bursts are collected for each network type. We apply a prediction framework to the obtained data in order to find the most relevant aspects of network structure. In this framework, predictors that use different sets of graph-theoretic measures are trained to estimate the activity properties, such as burst count or burst length, of the networks. The performances of these predictors are compared with each other. We show that the best performance in the prediction of activity properties for networks with a sharp in-degree distribution is obtained when the prediction is based on the clustering coefficient. By contrast, for networks with a broad in-degree distribution, the maximum eigenvalue of the connectivity graph gives the most accurate prediction. The results shown for small ([Formula: see text]) networks hold with few exceptions when different neuron models, different choices of neuron population and different average degrees are applied. We confirm our conclusions using larger ([Formula: see text]) networks as well. 
Our findings reveal the relevance of different aspects of network structure from the viewpoint of network excitability, and our integrative method could serve as a general framework for structure-dynamics studies in biosciences. Algorithms;Animals;Cluster Analysis;Computer Simulation;Humans;Models, Neurological;Nerve Net;Neural Networks (Computer);Neurons;Synapses Toward a full-scale computational model of the rat dentate gyrus. Frontiers in neural circuits Recent advances in parallel computing, including the creation of the parallel version of the NEURON simulation environment, have allowed for a previously unattainable level of complexity and detail in neural network models. Previously, we published a functional NEURON model of the rat dentate gyrus with over 50,000 biophysically realistic, multicompartmental neurons, but network simulations could only utilize a single processor. By converting the model to take advantage of parallel NEURON, we are now able to utilize greater computational resources and are able to simulate the full-scale dentate gyrus, containing over a million neurons. This has eliminated the previous necessity for scaling adjustments and allowed for a more direct comparison to experimental techniques and results. The translation to parallel computing has provided a superlinear speedup of computation time and dramatically increased the overall computer memory available to the model. The incorporation of additional computational resources has allowed for more detail and elements to be included in the model, bringing the model closer to a more complete and accurate representation of the biological dentate gyrus. As an example of a major step toward an increasingly accurate representation of the biological dentate gyrus, we discuss the incorporation of realistic granule cell dendrites into the model. Our previous model contained simplified, two-dimensional dendritic morphologies that were identical for neurons of the same class. 
Using the software tools L-Neuron and L-Measure, we are able to introduce cell-to-cell variability by generating detailed, three-dimensional granule cell morphologies that are based on biological reconstructions. Through these and other improvements, we aim to construct a more complete full-scale model of the rat dentate gyrus, to provide a better tool to delineate the functional role of cell types within the dentate gyrus and their pathological changes observed in epilepsy. Mitral cell spike synchrony modulated by dendrodendritic synapse location. Frontiers in computational neuroscience On their long lateral dendrites, mitral cells of the olfactory bulb form dendrodendritic synapses with large populations of granule cell interneurons. The mitral-granule cell microcircuit operating through these reciprocal synapses has been implicated in inducing synchrony between mitral cells. However, the specific mechanisms of mitral cell synchrony operating through this microcircuit are largely unknown and are complicated by the finding that distal inhibition on the lateral dendrites does not modulate mitral cell spikes. In order to gain insight into how this circuit synchronizes mitral cells within its spatial constraints, we built on a reduced circuit model of biophysically realistic multi-compartment mitral and granule cells to explore systematically the roles of dendrodendritic synapse location and mitral cell separation on synchrony. The simulations showed that mitral cells can synchronize when separated at arbitrary distances through a shared set of granule cells, but synchrony is optimally attained when shared granule cells form two balanced subsets, each subset clustered near to a soma of the mitral cell pairs. Another constraint for synchrony is that the input magnitude must be balanced. 
When adjusting the input magnitude driving a particular mitral cell relative to another, the mitral-granule cell circuit served to normalize spike rates of the mitral cells while inducing a phase shift or delay in the more weakly driven cell. This shift in phase is absent when the granule cells are removed from the circuit. Our results indicate that the specific distribution of dendrodendritic synaptic clusters is critical for optimal synchronization of mitral cell spikes in response to their odor input. Input-to-output transformation in a model of the rat hippocampal CA1 network. Frontiers in computational neuroscience Here we use computational modeling to gain new insights into the transformation of inputs in hippocampal field CA1. We considered input-output transformation in CA1 principal cells of the rat hippocampus, with activity synchronized by population gamma oscillations. Prior experiments have shown that such synchronization is especially strong for cells within one millimeter of each other. We therefore simulated a one-millimeter patch of CA1 with 23,500 principal cells. We used morphologically and biophysically detailed neuronal models, each with more than 1000 compartments and thousands of synaptic inputs. Inputs came from binary patterns of spiking neurons from field CA3 and entorhinal cortex (EC). On average, each presynaptic pattern initiated action potentials in the same number of CA1 principal cells in the patch. We considered pairs of similar and pairs of distinct patterns. In all the cases CA1 strongly separated input patterns. However, CA1 cells were considerably more sensitive to small alterations in EC patterns compared to CA3 patterns. Our results can be used for comparison of input-to-output transformations in normal and pathological hippocampal networks. 
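The claim above that CA1 "strongly separated input patterns" is typically quantified by comparing the overlap of output spike patterns with the overlap of the binary input patterns that produced them. The sketch below is an assumption about one common such metric, not a description of the paper's analysis: separation is measured as the ratio of normalized output Hamming distance to normalized input Hamming distance, with a ratio above 1 meaning the network amplified the difference between similar inputs.

```python
def hamming(p, q):
    """Number of cells whose active/inactive (1/0) state differs."""
    return sum(a != b for a, b in zip(p, q))

def separation_ratio(in_a, in_b, out_a, out_b):
    """Pattern separation index: normalized output distance divided by
    normalized input distance. A value > 1 means the transformation pushed
    similar inputs further apart, as reported for CA1 above."""
    d_in = hamming(in_a, in_b) / len(in_a)
    d_out = hamming(out_a, out_b) / len(out_a)
    return d_out / d_in if d_in else float("inf")
```

For example, two input patterns differing in 1 of 8 cells whose outputs differ in 3 of 8 cells yield a separation ratio of 3.0; greater sensitivity to EC than to CA3 alterations would appear as a larger ratio for EC input pairs at the same input distance.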
Derivation of cable parameters for a reduced model that retains asymmetric voltage attenuation of reconstructed spinal motor neuron dendrites Journal of Computational Neuroscience Summary This chapter constitutes miniproceedings of the Workshop on Physiology Databases and Analysis Software that was a part of the Annual Computational Neuroscience Meeting CNS*2007 that took place in July 2007 in Toronto, Canada (http://www.cnsorg.org). The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Selected discussed topics, including the review of some current and potential applications of Computational Intelligence (CI) in electrophysiology, database and electrophysiological data exchange platforms, languages, and formats, as well as exemplary analysis problems, are presented in this chapter. The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers, whose interest will be sparked by its contents. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. We are particularly interested in the slower timescale dynamics that determine overall excitability, and set the stage for transient episodes of persistent oscillations, working memory, or seizures. In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics on the stability of persistent activity. 
We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions. Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in neonatal brain or spinal cord, up states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potentials (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from the EAP data sampled by an extracellular probe at multiple independent recording locations. We used a dipole for the model source because there is extensive evidence it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell-probe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole moment optimization where the dipole location is fixed, and a second, nonlinear, global optimization of the source location. We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. 
The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis. We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among various model error contributions to the residual, we address especially the error in probe geometry, and the extent to which it biases estimates of dipole parameters. This dipole characterization method can be applied to any recording technique that has the capability of taking multiple independent measurements of the same single units. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics. We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodium-potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. These models reveal several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy. 
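The linear half of the dipole characterization described in the tetrode study above (fitting the dipole moment with the location held fixed) reduces to ridge, i.e. Tikhonov-regularized, least squares: minimize ||L m - d||^2 + lam ||m||^2, where L is the lead-field matrix of the probe, d the measured EAP amplitudes, and m the 3-component moment. The sketch below solves the closed-form normal equations under those assumptions; it is illustrative only, and the lead-field and measurement values are toy numbers (the actual study computed L by the finite element method).

```python
def tikhonov_dipole_moment(L, d, lam):
    """Solve (L^T L + lam*I) m = L^T d for the 3-component dipole moment m.
    L: n x 3 lead-field matrix (one row per recording site), d: n measurements,
    lam: regularization weight penalizing dipole size."""
    n = len(L)
    # Normal-equation matrix A = L^T L + lam*I and right-hand side b = L^T d.
    A = [[sum(L[r][i] * L[r][j] for r in range(n)) + (lam if i == j else 0.0)
          for j in range(3)] for i in range(3)]
    b = [sum(L[r][i] * d[r] for r in range(n)) for i in range(3)]
    # Gaussian elimination with partial pivoting on the 3x3 system.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    m = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        m[r] = (b[r] - sum(A[r][c] * m[c] for c in range(r + 1, 3))) / A[r][r]
    return m
```

With lam = 0 and consistent data this recovers the exact moment; increasing lam shrinks ||m||, which is how jointly minimizing model error and dipole size, as in the abstract, biases the fit toward small dipoles. The outer nonlinear search over candidate locations would simply repeat this linear solve on each grid point.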
Abstract This paper introduces dyadic brain modeling – the simultaneous, computational modeling of the brains of two interacting agents – to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the ‘dyad’). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists and modelers that will help speed the emergence of what may be called comparative neuroprimatology. Abstract The phase response curve (PRC) reflects the dynamics of the interplay between diverse intrinsic conductances that lead to spike generation. PRCs measure the spike time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddle-node) dynamics, while type II (subcritical Hopf) dynamics yield a biphasic PRC with both negative and positive lobes. Previous computational work hypothesized that cholinergic modulation of the M-type potassium current can switch a neuron with type II dynamics to type I dynamics. 
We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with downregulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential specific K-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to downregulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent I(AHP) is downregulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified at low ACh levels, while the PRC became type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may shift neuronal response properties between “resonator” and “integrator” modes. Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double precision values by each subtree at each time step. Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For compute-bound simulations load balance results in almost ideal runtime scaling. Application of the cell splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing. Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts. 
It has been recently reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. Ca 2+ -activated K + currents ( I K(Ca) ) of cultured human cardiac fibroblasts were characterized in this study. In whole-cell configuration, depolarizing pulses evoked I K(Ca) with outward rectification in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance, Ca 2+ -activated K + (BK Ca ) channel with a single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BK Ca channels. The dynamic Luo–Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; however, they were electrically inert when there was no gap-junctional coupling. The simulation predicts that changes in gap-junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. I K(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes. This study demonstrates that a BK Ca channel is functionally expressed in human cardiac fibroblasts. The activity of these BK Ca channels present in human cardiac fibroblasts may contribute to the functional activities of heart cells through transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of their solutions and the transitions between them as parameters are varied. To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multi-compartment models.
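The myocyte–fibroblast gap-junction coupling described above can be sketched with two passive membrane patches joined by a coupling conductance. This is a minimal illustration, not the dynamic Luo–Rudy model used in the study; all parameter values are made up for the example:

```python
# Minimal sketch: a "myocyte" and a "fibroblast" modeled as passive
# membrane patches coupled by a gap-junction conductance g_gap.
# Units: pF, nS, mV, pA, ms. Parameters are illustrative only.
def couple(g_gap, t_end=200.0, dt=0.01):
    C_m, C_f = 100.0, 25.0          # membrane capacitances
    g_m, g_f = 5.0, 1.0             # leak conductances
    E_m, E_f = -80.0, -25.0         # leak reversal potentials
    V_m, V_f = E_m, E_f             # start each cell at its own rest
    for _ in range(int(t_end / dt)):
        I_gap = g_gap * (V_m - V_f)  # current flowing myocyte -> fibroblast
        V_m += dt / C_m * (-g_m * (V_m - E_m) - I_gap)
        V_f += dt / C_f * (-g_f * (V_f - E_f) + I_gap)
    return V_m, V_f

V_un = couple(g_gap=0.0)    # uncoupled: each cell rests at its own E_leak
V_co = couple(g_gap=10.0)   # coupled: resting potentials are pulled together
```

Even this toy version reproduces the qualitative point of the abstract: the coupling conductance reshapes both resting potentials, so it can be expected to influence action potential configuration in the full model.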
We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation—tonic spiking, intermediate bursting, and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multi-compartment conductance-based model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli. Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement.
Results We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. Conclusion Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement, and underlies the concentration-independence of odor quality representations. Abstract Mathematical neuronal models are normally expressed using differential equations. The Parker–Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker–Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin–Huxley type neuron, comparing the results with those obtained using the Runge–Kutta and Bulirsch–Stoer methods.
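The Parker–Sochacki idea — build the Maclaurin series of the solution term by term from the ODE itself, and raise the order instead of shrinking the step — is easiest to see on a toy equation. A minimal sketch for dy/dt = −y (not the Izhikevich or Hodgkin–Huxley application from the paper):

```python
import math

def ps_step(y0, dt, order):
    """One Parker-Sochacki step for dy/dt = -y.

    The ODE gives the series coefficients directly: a[k+1] = -a[k] / (k + 1),
    so a step is a truncated Maclaurin series evaluated at t = dt. Raising
    `order` tightens the error without changing the step size.
    """
    a = y0
    y = a
    for k in range(order):
        a = -a / (k + 1)          # next series coefficient from the ODE
        y += a * dt ** (k + 1)    # accumulate the series at t = dt
    return y

# Fixed step dt = 0.5; only the order changes.
exact = math.exp(-0.5)
err_low = abs(ps_step(1.0, 0.5, order=2) - exact)
err_high = abs(ps_step(1.0, 0.5, order=12) - exact)
```

For a linear test equation the error falls factorially with order, which is the sense in which adaptive order substitutes for adaptive step size.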
Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC–granule cell (GC) circuit with and without gap junctions between GoCs. Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change the oscillation frequency or the mean firing rate of either GoCs or GCs. Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs. The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours or days of supercomputing time.
Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. In consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics, and intrinsic noise. We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data. Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes from development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models. Recent studies show that both the conductance and the kinetics of these receptors change voltage-dependently in some parts of the brain. Therefore, several models have been introduced to simulate their current. However, on the one hand, kinetic models—which are able to simulate these voltage-dependent phenomena—are computationally expensive for the modeling of large neural networks. On the other hand, classic exponential models, which are computationally less expensive, are not able to simulate the voltage-dependency of these receptors accurately.
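The "classic exponential model" family referred to above is typically a double-exponential conductance gated by the standard Jahr–Stevens Mg2+ block factor. A minimal sketch of that baseline (parameter values are illustrative, not the modified model proposed in the abstract):

```python
import math

def nmda_current(t, V, g_max=1.0, tau_rise=2.0, tau_decay=100.0,
                 mg=1.0, E_rev=0.0):
    """Classic double-exponential NMDA conductance with the standard
    Jahr-Stevens voltage-dependent Mg2+ block. Units: ms, mV, mM;
    all parameter values here are illustrative."""
    gate = math.exp(-t / tau_decay) - math.exp(-t / tau_rise)
    block = 1.0 / (1.0 + (mg / 3.57) * math.exp(-0.062 * V))
    return g_max * gate * block * (V - E_rev)

# The block relieves with depolarization: the unblocked fraction grows with V.
b_rest = 1.0 / (1.0 + (1.0 / 3.57) * math.exp(-0.062 * -70.0))
b_dep = 1.0 / (1.0 + (1.0 / 3.57) * math.exp(-0.062 * -20.0))
```

Note what this baseline captures and what it misses: the block factor makes the instantaneous conductance voltage-dependent, but tau_rise and tau_decay are fixed constants, which is exactly the voltage-dependent-kinetics limitation the abstract sets out to fix.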
In this study, we have modified these classic models to endow them with voltage-dependent conductance and time constants. Temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of the NMDA receptor’s behavior using only three to four differential equations, significantly fewer than in previous kinetic models. Consequently, our model is both fast and physiologically plausible, and is therefore a suitable candidate for the modeling of large neural networks. Abstract Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons. This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal cell firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that there is a clear rhythmic gating effect of the γ-oscillating interneuron network on the pyramidal neuron’s signal transmission. This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of the coherence and oscillation frequency of network activities.
Abstract Background Propagation of simulated action potentials (APs) was previously studied in short single chains and in two-dimensional sheets of myocardial cells [1–3]. The present study was undertaken to examine propagation in a long single chain of cells of various lengths, and with varying numbers of gap-junction (gj) channels, and to compare propagation velocity with cable properties such as the length constant ( λ ). Methods and Results Simulations were carried out using the PSpice program as previously described. When the electric field (EF) mechanism was dominant (0, 1, and 10 gj channels), the longer the chain length, the faster the overall velocity ( θ ov ). There seems to be no simple explanation for this phenomenon. In contrast, when the local-circuit current mechanism was dominant (100 gj channels or more), θ ov was slightly slowed with lengthening of the chain. Increasing the number of gj channels produced an increase in θ ov and caused the firing order to become more uniform. The end effect was more pronounced at longer chain lengths and at greater numbers of gj channels. When there were no or only few gj channels (namely 0, 10, or 30), the voltage change (ΔV m ) in the two cells (#50 & #52) contiguous to the cell injected with current (#51) was nearly zero, i.e., there was a sharp discontinuity in voltage between the adjacent cells. When there were many gj channels (e.g., 300, 1000, 3000), there was an exponential decay of voltage on either side of the injected cell, with the length constant ( λ ) increasing at higher numbers of gj channels. The effect of increasing the number of gj channels on increasing λ was relatively small compared to the larger effect on θ ov . θ ov became very non-physiological at 300 gj channels or higher. Conclusion Thus, when there were only 0, 1, or 10 gj channels, θ ov increased with increase in chain length, whereas at 100 gj channels or higher, θ ov did not increase with chain length.
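The exponential voltage decay and length constant λ described above follow the standard cable relation V(x) = V0·exp(−x/λ), and λ can be recovered from a simulated profile by a log-linear fit. A minimal sketch (cable-like behavior in a well-coupled chain, not the PSpice model itself):

```python
import math

def decay_profile(v0, lam, cells, spacing=1.0):
    """Steady-state voltage spread from an injected cell along one flank of
    a well-coupled chain: V(x) = v0 * exp(-x / lam), with distance measured
    in cell lengths. A sketch of cable behavior, not the PSpice model."""
    return [v0 * math.exp(-abs(i) * spacing / lam) for i in cells]

def estimate_lambda(profile, spacing=1.0):
    """Recover lam by ordinary least squares on log(V) versus distance."""
    xs = list(range(len(profile)))
    ys = [math.log(v) for v in profile]
    n = len(xs)
    slope = (n * sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys)) / \
            (n * sum(x * x for x in xs) - sum(xs) ** 2)
    return -spacing / slope    # slope of log V is -1/lambda

flank = decay_profile(20.0, lam=3.0, cells=range(0, 8))
```

The sharp-discontinuity regime reported at 0–30 gj channels is precisely the case where this fit fails, because the voltage in the neighboring cells is near zero rather than exponentially graded.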
When there were only 0, 10, or 30 gj channels, there was a very sharp decrease in ΔV m in the two contiguous cells on either side of the injected cell, whereas at 300, 1000, or 3000 gj channels, the voltage decay was exponential along the length of the chain. The effect of increasing the number of gj channels on the spread of current was relatively small compared to the large effect on θ ov . Abstract This article provides a demonstration of an analytical technique that can be used to investigate the causes of perceptual phenomena. The technique is based on the concept of the ideal observer, an optimal signal classifier that makes decisions that maximize the probability of a correct response. To demonstrate the technique, an analysis was conducted to investigate the role of the auditory periphery in the production of temporal masking effects. The ideal observer classified output from four models of the periphery. Since the ideal observer is the best of all possible observers, if it demonstrates masking effects, then all other observers must as well. If it does not demonstrate masking effects, then nothing about the periphery requires masking to occur, and therefore masking would occur somewhere else. The ideal observer exhibited several forward masking effects but did not exhibit backward masking, implying that the periphery has a causal role in forward but not backward masking. A general discussion of the strengths of the technique and supplementary equations are also included. Abstract Understanding the human brain and its function in health and disease represents one of the greatest scientific challenges of our time. In the post-genomic era, an overwhelming accumulation of new data, at all levels of exploration from DNA to human brain imaging, has been acquired.
This accumulation of facts has not given rise to a corresponding increase in the understanding of integrated functions in this vast area of research, which involves a large number of fields extending from genetics to psychology. Neuroinformatics (NI) is uniquely placed at the intersection between neuroscience and information technology, and emerges as an area of critical importance for facilitating future conceptual development in neuroscience by creating databases which transcend different organizational levels and allow for the development of computational models ranging from the subcellular to the global brain level. Abstract This paper studied synaptic and dendritic integration with different spatial distributions of synapses on the dendrites of a biophysically detailed layer 5 pyramidal neuron model. It has been observed that temporally synchronous and spatially clustered synaptic inputs make dendrites perform a highly nonlinear integration. The effect of the clustering degree of the synaptic distribution on neuronal responsiveness is investigated by changing the number of top apical dendrites to which active synapses are allocated. The neuron shows maximum responsiveness to synaptic inputs with an intermediate clustering degree of spatial distribution, indicating complex interactions among dendrites in the presence of nonlinear synaptic and dendritic integration. Abstract This paper describes a pilot query interface that has been constructed to help us explore a “concept-based” approach for searching the Neuroscience Information Framework (NIF). The query interface is concept-based in the sense that the search terms submitted through the interface are selected from a standardized vocabulary of terms (concepts) that are structured in the form of an ontology. The NIF contains three primary resources: the NIF Resource Registry, the NIF Document Archive, and the NIF Database Mediator.
These NIF resources are very different in nature and therefore pose challenges when designing a single interface from which searches can be automatically launched against all three resources simultaneously. The paper first discusses briefly several background issues involving the use of standardized biomedical vocabularies in biomedical information retrieval, and then presents a detailed example that illustrates how the pilot concept-based query interface operates. The paper concludes by discussing certain lessons learned in the development of the current version of the interface. Abstract Simulations of orientation selectivity in visual cortex have shown that layer 4 complex cells lacking orientation tuning are ideal for providing global inhibition that scales with contrast in order to produce simple cells with contrast-invariant orientation tuning (Lauritzen and Miller in J Neurosci 23:10201–10213, 2003). Inhibitory cortical cells have been shown to be electrically coupled by gap junctions (Fukuda and Kosaka in J Neurosci 120:5–20, 2003). Such coupling promotes, among other effects, spike synchronization and coordination of postsynaptic IPSPs (Beierlein et al. in Nat Neurosci 3:904–910, 2000; Galarreta and Hestrin in Nat Rev Neurosci 2:425–433, 2001). Consequently, it was expected (Miller in Cereb Cortex 13:73–82, 2003) that electrical coupling would promote nonspecific functional responses consistent with the complex inhibitory cells seen in layer 4, which provide broad inhibition in response to stimuli of all orientations (Miller et al. in Curr Opin Neurobiol 11:488–497, 2001). This was tested using a mechanistic modeling approach. The orientation selectivity model of Lauritzen and Miller (J Neurosci 23:10201–10213, 2003) was reproduced with and without electrical coupling between complex inhibitory neurons.
Although extensive coupling promotes uniform firing in complex cells, there were no detectable improvements in contrast-invariant orientation selectivity unless there were coincident changes in complex cell firing rates to offset the untuned excitatory component that grows with contrast. Thus, changes in firing rates alone (with or without coupling) could improve contrast-invariant orientation tuning of simple cells, but synchronization of complex inhibitory neurons alone could not. Abstract Coral polyps contract when electrically stimulated, and a wave of contraction travels from the site of stimulation at a constant speed. Models of coral nerve networks were optimized to match one of three different experimentally observed behaviors. To search for model parameters that reproduce the experimental observations, we applied genetic algorithms to increasingly more complex models of a coral nerve net. In a first stage of optimization, individual neurons responded with spikes to multiple, but not single, pulses of activation. In a second stage, we used these neurons as the starting point for the optimization of a two-dimensional nerve net. This strategy yielded a network with parameters that reproduced the experimentally observed spread of excitation. Abstract Spike-wave discharges are a distinctive feature of epileptic seizures. So far, they have not been reported in spatially extended neural field models. We study a space-independent version of the Amari neural field model with two competing inhibitory populations. We show that this competition leads to robust spike-wave dynamics if the inhibitory populations operate on different timescales. The spike-wave oscillations present a fold/homoclinic-type bursting. From this result we predict parameters of the extended Amari system where spike-wave oscillations produce a spatially homogeneous pattern. We propose this mechanism as a prototype of macroscopic epileptic spike-wave discharges.
To our knowledge this is the first example of robust spike-wave patterns in a spatially extended neural field model. Abstract Cortical gamma-frequency (30–80 Hz) oscillations have been suggested to underlie many aspects of cognitive function. In this paper we compare the f–I curves modulated by gamma-frequency-modulated stimuli and Poisson synaptic input at the distal dendrites of a layer V pyramidal neuron model. The results show that gamma-frequency distal input amplifies the sensitivity of the neural response to basal input, and enhances gain modulation of the neuron. Abstract Inward rectifying potassium (K IR ) currents in medium spiny (MS) neurons of the nucleus accumbens inactivate significantly in ~40% of the neurons but not in the rest, which may lead to differences in input processing by these two groups. Using a 189-compartment computational model of the MS neuron, we investigate the influence of this property using injected current as well as spatiotemporally distributed synaptic inputs. Our study demonstrates that K IR current inactivation facilitates depolarization, firing frequency, and firing onset in these neurons. These effects may be attributed to the higher input resistance of the cell as well as a more depolarized resting/down-state potential induced by the inactivation of this current. In view of reports that dendritic intracellular calcium levels depend closely on burst strength and spike onset time, our findings suggest that inactivation of K IR currents may offer a means of modulating both excitability and synaptic plasticity in MS neurons. Abstract Epileptic seizures in diabetic hyperglycemia (DH) are not uncommon. This study aimed to determine the acute behavioral, pathological, and electrophysiological effects of status epilepticus (SE) on diabetic animals.
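The f–I curve comparison in the gamma-modulation abstract above is easiest to ground with the one neuron model whose f–I relation is known in closed form: the leaky integrate-and-fire neuron. A minimal sketch for intuition about gain (analytic LIF rate, not the layer V pyramidal model from the study; parameters are illustrative):

```python
import math

def lif_rate(I, tau=10.0, R=1.0, v_th=15.0, t_ref=2.0):
    """Analytic f-I curve of a leaky integrate-and-fire neuron.

    tau: membrane time constant (ms), R: input resistance, v_th: threshold
    above reset (mV), t_ref: refractory period (ms). Returns rate in Hz.
    All parameter values are illustrative."""
    drive = R * I
    if drive <= v_th:
        return 0.0                 # below rheobase: no firing
    isi = t_ref + tau * math.log(drive / (drive - v_th))
    return 1000.0 / isi

# Zero below rheobase, then monotonically rising; the local slope of this
# curve is the "gain" that modulatory input can rescale.
rates = [lif_rate(I) for I in (10.0, 16.0, 20.0, 30.0)]
```

In this framing, "enhanced gain modulation" means the distal gamma input steepens the effective slope of such a curve as a function of the basal drive I.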
Adult male Sprague–Dawley rats were first divided into groups with and without streptozotocin (STZ)-induced diabetes, and then into treatment groups given normal saline (NS) (STZ-only and NS-only) or a lithium–pilocarpine injection to induce status epilepticus (STZ + SE and NS + SE). Seizure susceptibility, severity, and mortality were evaluated. Serial Morris water maze tests and hippocampal histopathology were examined before and 24 h after SE. Tetanic-stimulation-induced long-term potentiation (LTP) in hippocampal slices was recorded in a multi-electrode dish system. We also used a simulation model to evaluate intracellular adenosine triphosphate (ATP) and neuroexcitability. The STZ + SE group had a significantly higher percentage of severe seizures and SE-related death, and worse learning and memory performance, than the other three groups 24 h after SE. The STZ + SE group, followed by the NS + SE group, showed the most severe neuronal loss and mossy fiber sprouting in the hippocampal CA3 area. In addition, LTP was markedly attenuated in the STZ + SE group, followed by the NS + SE group. In the simulation, increased intracellular ATP concentration promoted action potential firing. The finding that rats with DH had more brain damage after SE than rats without diabetes suggests the importance of intensively treating hyperglycemia and seizures in diabetic patients with epilepsy. Neuroinformatics is a multifaceted field. It is as broad as the field of neuroscience. The various domains of NI may also share some common features such as databases, data mining systems, and data modeling tools. NI projects are often coordinated by user groups or research organizations. Large-scale infrastructure supporting NI development is also a vital aspect of the field. Abstract Channelrhodopsin-2 (ChR2) is a class of light-sensitive proteins that offer the ability to use light stimulation to regulate neural activity with millisecond precision.
In order to address the limitations in the efficacy of wild-type ChR2 (ChRwt) in achieving this objective, new variants of ChR2 that exhibit fast mono-exponential photocurrent decay characteristics have recently been developed and validated. In this paper, we investigate whether the framework of a transition-rate model with 4 states, primarily developed to mimic the bi-exponential photocurrent decay kinetics of ChRwt, as opposed to the lower-complexity 3-state model, is warranted to mimic the mono-exponential photocurrent decay kinetics of the newly developed fast ChR2 variants: ChETA (Gunaydin et al., Nature Neurosci. 13:387–392, 2010) and ChRET/TC (Berndt et al., Proc. Natl. Acad. Sci. 108:7595–7600, 2011). We begin by estimating the parameters of the 3-state and 4-state models from experimental data on the photocurrent kinetics of ChRwt, ChETA, and ChRET/TC. We then incorporate these models into a fast-spiking interneuron model (Wang and Buzsaki, J. Neurosci. 16:6402–6413, 1996) and a hippocampal pyramidal cell model (Golomb et al., J. Neurophysiol. 96:1912–1926, 2006) and investigate the extent to which the experimentally observed neural response to various optostimulation protocols can be captured by these models. We demonstrate that, for all ChR2 variants investigated, the 4-state model implementation is better able to capture neural responses consistent with experiments across a wide range of optostimulation protocols. We conclude by analytically investigating the conditions under which the characteristic specific to the 3-state model, namely the mono-exponential photocurrent decay of the newly developed variants of ChR2, can occur in the framework of the 4-state model. Abstract In cerebellar Purkinje cells, the β4-subunit of voltage-dependent Na + channels has been proposed to serve as an open-channel blocker giving rise to a “resurgent” Na + current ( I NaR ) upon membrane repolarization.
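The 3-state framework discussed above (closed → open → desensitized → closed) makes the mono-exponential off-decay easy to see: once the light turns off the opening rate is zero, so the open fraction obeys dO/dt = −Gd·O. A minimal sketch with made-up rate constants (not the fitted parameters from the paper):

```python
import math

# Minimal 3-state ChR2 rate model: closed C -> open O (light-driven),
# O -> desensitized D (rate g_d), D -> C (slow recovery g_r).
# The photocurrent is proportional to O. Rates are illustrative, not
# the values fitted to ChRwt/ChETA/ChRET/TC in the study.
def chr2_3state(t_on, t_total, eps=0.5, g_d=0.1, g_r=0.004, dt=0.01):
    C, O, D = 1.0, 0.0, 0.0
    trace = []
    for i in range(int(t_total / dt)):
        light = eps if i * dt < t_on else 0.0   # light pulse of length t_on
        dC = g_r * D - light * C
        dO = light * C - g_d * O
        dD = g_d * O - g_r * D
        C, O, D = C + dt * dC, O + dt * dO, D + dt * dD
        trace.append(O)
    return trace

# 50 ms pulse, 150 ms total: after light-off, O decays as exp(-g_d * t).
trace = chr2_3state(t_on=50.0, t_total=150.0)
```

The 4-state model adds a second open/desensitized pair, which is what buys the bi-exponential off-decay of ChRwt at the cost of extra parameters; the paper's question is whether that extra machinery is needed for variants whose decay is already mono-exponential.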
Notably, the β4-subunit was recently identified as a novel substrate of the β-secretase BACE1, a key enzyme of the amyloidogenic pathway in Alzheimer's disease. Here, we asked whether BACE1-mediated cleavage of the β4-subunit has an impact on I NaR and, consequently, on the firing properties of Purkinje cells. In cerebellar tissue of BACE1−/− mice, mRNA levels of Na + channel α-subunits 1.1, 1.2, and 1.6 and of β-subunits 1–4 remained unchanged, but processing of the β4 peptide was profoundly altered. Patch-clamp recordings from acutely isolated Purkinje cells of BACE1−/− and WT mice did not reveal any differences in steady-state properties or in current densities of transient, persistent, and resurgent Na + currents. However, I NaR was found to decay significantly faster in BACE1-deficient Purkinje cells than in WT cells. In modeling studies, the altered time course of I NaR decay could be replicated when we decreased the efficiency of open-channel block. In current-clamp recordings, BACE1−/− Purkinje cells displayed a lower spontaneous firing rate than normal cells. Computer simulations supported the hypothesis that the accelerated decay kinetics of I NaR are responsible for the slower firing rate. Our study elucidates a novel function of BACE1 in the regulation of neuronal excitability that serves to tune the firing pattern of Purkinje cells and presumably other neurons endowed with I NaR . Abstract The role of cortical feedback in the thalamocortical processing loop has been extensively investigated over the last decades. With the exception of several cases, these searches focused on the cortical feedback exerted onto thalamocortical relay (TC) cells of the dorsal lateral geniculate nucleus (LGN). In a previous, physiological study in the cat visual system, we showed that cessation of cortical input, despite decreasing the spontaneous activity of TC cells, increased the spontaneous firing of their recurrent inhibitory interneurons located in the perigeniculate nucleus (PGN).
To identify mechanisms underlying such functional changes, we conducted a modeling study in NEURON on several networks of point neurons with varied model parameters, such as membrane properties, synaptic weights, and axonal delays. We considered six network topologies of the retino-geniculo-cortical system. All models were robust against changes of axonal delays except for the delay between the LGN feedforward interneuron and the TC cell. The best representation of the physiological results was obtained with models containing reciprocally connected PGN cells driven by the cortex and with a relatively slow decay of intracellular calcium. This strongly indicates that the thalamic reticular nucleus plays an essential role in the cortical influence over thalamocortical relay cells, while the thalamic feedforward interneurons are not essential in this process. Further, we suggest that the dependence of the activity of PGN cells on the rate of calcium removal can be one of the key factors determining individual cell responses to elimination of cortical input. Abstract The nucleus accumbens (NAc), a critical structure of the brain reward circuit, is implicated in normal goal-directed behaviour and learning as well as pathological conditions like schizophrenia and addiction. Its major cellular substrates, the medium spiny (MS) neurons, possess a wide variety of dendritic active conductances that may modulate excitatory postsynaptic potentials (EPSPs) and cell excitability. We examine this issue using a biophysically detailed 189-compartment stylized model of the NAc MS neuron, incorporating all the known active conductances. We find that, of all the active channels, inward rectifying K + (K IR ) channels play the primary role in modulating the resting membrane potential (RMP) and EPSPs in the down-state of the neuron. Reduction in the conductance of K IR channels evokes facilitatory effects on EPSPs, accompanied by rises in local input resistance and membrane time constant.
At depolarized membrane potentials closer to up-state levels, the slowly inactivating A-type potassium channel (K As ) conductance also plays a strong role in determining synaptic potential parameters and cell excitability. We discuss the implications of our results for the regulation of accumbal MS neuron biophysics and synaptic integration by intrinsic factors and extrinsic agents such as dopamine. Abstract The computer-assisted three-dimensional reconstruction of neuronal morphology is becoming an increasingly popular technique to quantify the arborization patterns of dendrites and axons. The resulting digital files are suitable for comprehensive morphometric analyses as well as for building anatomically realistic compartmental models of membrane biophysics and neuronal electrophysiology. The digital tracings acquired in a lab for a specific purpose can often be reused by a different research group to address a completely unrelated scientific question, if the original investigators are willing to share the data. Since reconstructing neuronal morphology is a labor-intensive process, data sharing and reanalysis is particularly advantageous for the neuroscience and biomedical communities. Here we present numerous cases of “success stories” in which digital reconstructions of neuronal morphology were shared and reused, leading to additional, independent discoveries and publications, and thus amplifying the impact of the “source” study for which the data set was first collected. In particular, we overview four main applications of this kind of data: comparative morphometric analyses, statistical estimation of potential synaptic connectivity, morphologically accurate electrophysiological simulations, and computational models of neuronal shape and development. Abstract The chapter describes a novel computational approach to modeling cortex dynamics that integrates gene–protein regulatory networks with a neural network model.
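Digital tracings of the kind described above are commonly exchanged in the SWC format (one node per line: id, type, x, y, z, radius, parent), and simple morphometrics fall out of a few lines of parsing. A minimal sketch, with a tiny made-up tracing for illustration:

```python
import math

# A made-up four-node tracing in SWC layout: id type x y z radius parent.
SWC = """\
1 1 0 0 0 5 -1
2 3 10 0 0 1 1
3 3 10 10 0 1 2
4 3 10 10 5 1 3
"""

def total_length(swc_text):
    """Total cable length: sum of Euclidean parent-child edge lengths."""
    nodes = {}
    for line in swc_text.splitlines():
        if not line or line.startswith('#'):
            continue
        i, t, x, y, z, r, parent = line.split()
        nodes[int(i)] = (float(x), float(y), float(z), int(parent))
    length = 0.0
    for x, y, z, parent in nodes.values():
        if parent in nodes:                 # root (parent -1) has no edge
            px, py, pz, _ = nodes[parent]
            length += math.dist((x, y, z), (px, py, pz))
    return length
```

The same parsed node table is the starting point for the other reuse applications the abstract lists, from branch-order statistics to building compartmental models.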
Interaction of genes and proteins in neurons affects the dynamics of the whole neural network. We adopted an exploratory approach of investigating many randomly generated gene regulatory matrices, out of which we kept those that generated interesting dynamics. This naïve brute-force approach served to explore the potential application of computational neurogenetic models in relation to gene-knockout neurogenetics experiments. The knockout of a hypothetical gene for fast inhibition in our artificial genome led to interesting neural activity. Although the artificial gene/protein network was altered by this single gene knockout, the dynamics of the spiking neural network (SNN), in terms of spiking activity, was most of the time very similar to that obtained with the complete gene/protein network. However, from time to time the neurons spontaneously and temporarily synchronized their spiking into coherent global oscillations. In our model, fluctuations in the values of neuronal parameters lead to spontaneous development of seizure-like global synchronizations. These very same fluctuations also lead to termination of the seizure-like neural activity and maintenance of the interictal periods of normal activity. Based on our model, we suggest the hypothesis that parameter changes due to gene–protein dynamics should also be included as a serious factor determining transitions in neural dynamics, especially when the cause of disease is known to be genetic. Abstract The local field potential (LFP) is among the most important experimental measures when probing neural population activity, but a proper understanding of the link between the underlying neural activity and the LFP signal is still missing. Here we investigate this link by mathematical modeling of contributions to the LFP from a single layer-5 pyramidal neuron and a single layer-4 stellate neuron receiving synaptic input.
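Forward models of this kind build the LFP from volume-conductor point-source contributions, phi = I/(4*pi*sigma*r). A small sketch (conductivity, source strength, and geometry are assumed values) compares the exact potential of a synaptic source/sink pair on its axis with the far-field dipole approximation:

```python
import numpy as np

SIGMA = 0.3          # S/m, assumed extracellular conductivity
I, D = 1e-9, 1e-4    # A source strength; m sink-source separation (assumed)

def two_monopole(r):
    """Exact on-axis potential from a sink/source pair straddling the
    origin, built from the point-source formula phi = I/(4*pi*sigma*r)."""
    return I / (4 * np.pi * SIGMA) * (1 / (r - D / 2) - 1 / (r + D / 2))

def dipole(r):
    """Far-field dipole approximation on the axis: phi = p/(4*pi*sigma*r^2),
    with dipole moment p = I*D."""
    return I * D / (4 * np.pi * SIGMA * r**2)

near, far = 5 * D, 50 * D
err_near = abs(dipole(near) - two_monopole(near)) / two_monopole(near)
err_far = abs(dipole(far) - two_monopole(far)) / two_monopole(far)
# the dipole approximation improves with distance from the source pair
```

On the axis the relative error works out to D^2/(4 r^2), so at five separations it is about 1%, illustrating why dipole-style approximations suit far-field EEG/ECoG predictions better than near-field recordings.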
An intrinsic dendritic low-pass filtering effect of the LFP signal, previously demonstrated for extracellular signatures of action potentials, is seen to strongly affect the LFP power spectra, even for frequencies as low as 10 Hz for the example pyramidal neuron. Further, the LFP signal is found to depend sensitively on both the recording position and the position of the synaptic input: the LFP power spectra recorded close to the active synapse are typically found to be less low-pass filtered than spectra recorded further away. Some recording positions display striking band-pass characteristics of the LFP. The frequency dependence of the properties of the current dipole moment set up by the synaptic input current is found to qualitatively account for several salient features of the observed LFP. Two approximate schemes for calculating the LFP, the dipole approximation and the two-monopole approximation, are tested and found to be potentially useful for translating results from large-scale neural network models into predictions for results from electroencephalographic (EEG) or electrocorticographic (ECoG) recordings. Abstract Dopaminergic (DA) neurons of the mammalian midbrain exhibit unusually low firing frequencies in vitro. Furthermore, injection of depolarizing current induces depolarization block before high frequencies are achieved. The maximum steady and transient rates are about 10 and 20 Hz, respectively, despite the ability of these neurons to generate bursts at higher frequencies in vivo. We use a three-compartment model calibrated to reproduce DA neuron responses to several pharmacological manipulations to uncover mechanisms of frequency limitation. The model exhibits a slow oscillatory potential (SOP) dependent on the interplay between the L-type Ca2+ current and the small-conductance K+ (SK) current that is unmasked by fast Na+ current block. Contrary to previous theoretical work, the SOP does not pace the steady spiking frequency in our model.
The main currents that determine the spontaneous firing frequency are the subthreshold L-type Ca2+ and A-type K+ currents. The model identifies the channel densities for the fast Na+ and delayed-rectifier K+ currents as critical parameters limiting the maximal steady frequency evoked by a depolarizing pulse. We hypothesize that the low maximal steady frequencies result from a low safety factor for action potential generation. In the model, the rate of Ca2+ accumulation in the distal dendrites controls the transient initial frequency in response to a depolarizing pulse. Similar results are obtained when the same model parameters are used in a multicompartmental model with a realistic reconstructed morphology, indicating that the salient contributions of the dendritic architecture have been captured by the simpler model. Abstract Background As interest in adopting the Semantic Web in the biomedical domain continues to grow, Semantic Web technology has been evolving and maturing. A variety of technological approaches, including triple-store technologies, SPARQL endpoints, Linked Data, and the Vocabulary of Interlinked Datasets, have emerged in recent years. In addition to data warehouse construction, these technological approaches can be used to support dynamic query federation. As a community effort, the BioRDF task force, within the Semantic Web for Health Care and Life Sciences Interest Group, is exploring how these emerging approaches can be utilized to execute distributed queries across different neuroscience data sources. Methods and results We have created two health care and life science knowledge bases. We have explored a variety of Semantic Web approaches to describe, map, and dynamically query multiple datasets. We have demonstrated several federation approaches that integrate diverse types of information about neurons and receptors that play an important role in basic, clinical, and translational neuroscience research.
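The federation pattern described above (decompose a global query into per-endpoint sub-queries, then join the partial results on shared URIs) can be shown with a toy in-memory sketch. All URIs and records here are hypothetical stand-ins; a real deployment would issue SPARQL over HTTP to the respective endpoints.

```python
# Mock result sets standing in for two independent SPARQL endpoints.
RECEPTOR_ENDPOINT = [
    {"receptor": "ex:5HT1A", "family": "serotonin"},
    {"receptor": "ex:D2", "family": "dopamine"},
]
NEURON_ENDPOINT = [
    {"neuron": "ex:MSN", "expresses": "ex:D2"},
    {"neuron": "ex:PyramidalCA1", "expresses": "ex:5HT1A"},
]

def federated_query(family):
    """Sub-query 1: receptors of a given family from endpoint 1.
    Sub-query 2: neurons expressing those receptors from endpoint 2.
    Join on the shared receptor URI, as a FeDeRate-style planner would."""
    receptors = {row["receptor"] for row in RECEPTOR_ENDPOINT
                 if row["family"] == family}
    return sorted(row["neuron"] for row in NEURON_ENDPOINT
                  if row["expresses"] in receptors)
```

The join key being a URI is what makes this work at all, and it is also why the proliferation of semantically equivalent URIs noted in the Conclusion below is so damaging: two endpoints naming the same receptor differently would silently produce an empty join.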
In particular, we have created a prototype receptor explorer which uses OWL mappings to provide an integrated list of receptors and executes individual queries against different SPARQL endpoints. We have also employed the AIDA Toolkit, which is directed at groups of knowledge workers who cooperatively search, annotate, interpret, and enrich large collections of heterogeneous documents from diverse locations. We have explored a tool called "FeDeRate", which enables a global SPARQL query to be decomposed into subqueries against the remote databases offering either SPARQL or SQL query interfaces. Finally, we have explored how to use the Vocabulary of Interlinked Datasets (voiD) to create metadata for describing datasets exposed as Linked Data URIs or SPARQL endpoints. Conclusion We have demonstrated the use of a set of novel and state-of-the-art Semantic Web technologies in support of a neuroscience query federation scenario. We have identified both the strengths and weaknesses of these technologies. While the Semantic Web offers a global data model including the use of Uniform Resource Identifiers (URIs), the proliferation of semantically equivalent URIs hinders large-scale data integration. Our work helps direct research and tool development, which will be of benefit to this community. Abstract Injury to neural tissue renders voltage-gated Na+ (Nav) channels leaky. Even mild axonal trauma initiates Na+ loading, leading to secondary Ca2+ loading and white matter degeneration. The nodal isoform is Nav1.6, and for Nav1.6-expressing HEK cells, traumatic whole-cell stretch causes an immediate tetrodotoxin-sensitive Na+ leak. In stretch-damaged oocyte patches, Nav1.6 current undergoes damage-intensity-dependent hyperpolarizing (left) shifts, but whether left-shift underlies injured-axon Nav leak is uncertain.
Nav1.6 inactivation (availability) is kinetically limited by (coupled to) Nav activation, yielding coupled left-shift (CLS) of the two processes: CLS should move the steady-state Nav1.6 “window conductance” closer to typical firing thresholds. Here we simulated excitability and ion homeostasis in free-running nodes of Ranvier to assess whether hallmark injured-axon behaviors (Na+ loading, ectopic excitation, propagation block) would occur with Nav-CLS. Intact/traumatized axolemma ratios were varied, and for some simulations Na/K pumps were included, with varied inside/outside volumes. We simulated saltatory propagation with one mid-axon node variously traumatized. While dissipating the [Na+] gradient and hyperactivating the Na/K pump, Nav-CLS generated neuropathic pain-like ectopic bursts. Depending on CLS magnitude, the fraction of Nav channels affected, and pump intensity, tonic or burst firing or nodal inexcitability occurred, with [Na+] and [K+] fluctuating. Severe CLS-induced inexcitability did not preclude Na+ loading; in fact, the steady-state Na+ leaks elicited large pump currents. At a mid-axon node, mild CLS perturbed normal anterograde propagation, and severe CLS blocked saltatory propagation. These results suggest that in damaged excitable cells, Nav-CLS could initiate cellular deterioration with attendant hyper- or hypoexcitability. Healthy-cell versions of Nav-CLS, however, could contribute to physiological rhythmic firing. Abstract Lateral inhibition of cells surrounding an excited area is a key property of sensory systems, sharpening the preferential tuning of individual cells in the presence of closely related input signals. In the olfactory pathway, a dendrodendritic synaptic microcircuit between mitral and granule cells in the olfactory bulb has been proposed to mediate this type of interaction through granule cell inhibition of surrounding mitral cells.
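The window-conductance consequence of Nav coupled left-shift from the axon trauma study above can be sketched numerically: shifting both the activation and availability curves leftward enlarges their steady-state overlap at resting potential. The Boltzmann midpoints and slopes below are generic HH-style values, not fitted Nav1.6 parameters.

```python
import numpy as np

def boltzmann(v, v_half, k):
    """Sigmoid increasing with depolarization."""
    return 1.0 / (1.0 + np.exp((v_half - v) / k))

def window_open_fraction(v, shift=0.0):
    """Steady-state Nav open fraction m_inf^3 * h_inf. A coupled left-shift
    (CLS) moves activation and availability curves by the same amount.
    Midpoints/slopes are illustrative, not Nav1.6 fits."""
    m = boltzmann(v, -38.0 + shift, 7.0)        # activation
    h = 1.0 - boltzmann(v, -67.0 + shift, 7.0)  # availability
    return m**3 * h

v_rest = -65.0
leak_intact = window_open_fraction(v_rest)
leak_shifted = window_open_fraction(v_rest, shift=-20.0)  # 20 mV left-shift
# the trauma-like left-shift greatly enlarges the steady Na+ window at rest
```

With these placeholder curves a 20 mV left-shift raises the resting open fraction by roughly two orders of magnitude, illustrating why CLS produces Na+ loading even in nodes rendered inexcitable.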
However, it is becoming evident that odor inputs result in broad activation of the olfactory bulb, with interactions that go beyond neighboring cells. Using a realistic modeling approach, we show how backpropagating action potentials in the long lateral dendrites of mitral cells, together with granule cell actions on mitral cells within narrow columns forming glomerular units, can provide a mechanism to activate strong local inhibition between arbitrarily distant mitral cells. The simulations predict a new role for the dendrodendritic synapses in the multicolumnar organization of the granule cells. This new paradigm gives insight into the functional significance of the patterns of connectivity revealed by recent viral tracing studies. Together they suggest a functional wiring of the olfactory bulb that could greatly expand the computational roles of the mitral–granule cell network. Abstract Spinal motor neurons have voltage-gated ion channels localized in their dendrites that generate plateau potentials. The physical separation of ion channels for spiking from plateau-generating channels can result in nonlinear bistable firing patterns. The physical separation and geometry of the dendrites result in asymmetric coupling between dendrites and soma that has not been addressed in reduced models of nonlinear phenomena in motor neurons. We measured voltage attenuation properties of six anatomically reconstructed and type-identified cat spinal motor neurons to characterize asymmetric coupling between the dendrites and soma. We showed that the voltage attenuation at any distance from the soma was direction-dependent and could be described as a function of the input resistance at the soma. An analytical solution for the lumped cable parameters in a two-compartment model was derived based on this finding.
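The direction dependence of attenuation falls out of simple steady-state algebra for a generic two-compartment model. With membrane conductances g_s (soma) and g_d (dendrite) and coupling conductance g_c, injecting current into one compartment gives V_other/V_injected = g_c/(g_c + g_other), so attenuation differs by direction whenever g_s != g_d. This is the textbook two-compartment result, not the paper's derived lumped parameters, and the conductance values are arbitrary illustrations.

```python
def attenuation(g_c, g_other):
    """Steady-state voltage ratio V_other / V_injected in a two-compartment
    model: g_c / (g_c + g_membrane_of_receiving_compartment)."""
    return g_c / (g_c + g_other)

g_s, g_d, g_c = 20.0, 5.0, 10.0   # nS, assumed illustrative values

soma_to_dend = attenuation(g_c, g_d)   # 10/15, modest attenuation
dend_to_soma = attenuation(g_c, g_s)   # 10/30, much stronger attenuation
# asymmetric coupling: the two directions attenuate differently
```

Because the soma (with its attached dendritic load) usually presents a larger conductance than a distal dendritic compartment, somatopetal attenuation exceeds somatofugal attenuation, consistent with the direction dependence measured above.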
This is the first two-compartment modeling approach that directly derives lumped cable parameters from the geometrical and passive electrical properties of anatomically reconstructed neurons. Computational modeling of cellular signaling processes embedded into dynamic spatial contexts (Nature Methods). Cellular signaling processes depend on spatiotemporal distributions of molecular components. Multicolor, high-resolution microscopy permits detailed assessment of such distributions, providing input for fine-grained computational models that explore mechanisms governing dynamic assembly of multimolecular complexes and their role in shaping cellular behavior. However, it is challenging to incorporate into such models both complex molecular reaction cascades and the spatial localization of signaling components in dynamic cellular morphologies. Here we introduce an approach to address these challenges by automatically generating computational representations of complex reaction networks based on simple bimolecular interaction rules embedded into detailed, adaptive models of cellular morphology. Using examples of receptor-mediated cellular adhesion and signal-induced localized mitogen-activated protein kinase (MAPK) activation in yeast, we illustrate the capacity of this simulation technique to provide insights into cell biological processes. The modeling algorithms, implemented in a new version of the Simmune toolset, are accessible through intuitive graphical interfaces and programming libraries. Intrinsic dendritic filtering gives low-pass power spectra of local field potentials (Journal of Computational Neuroscience). Summary This chapter constitutes mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was part of the Annual Computational Neuroscience Meeting CNS*2007, which took place in July 2007 in Toronto, Canada (http://www.cnsorg.org).
The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Selected topics discussed, including a review of some current and potential applications of Computational Intelligence (CI) in electrophysiology, databases and electrophysiological data exchange platforms, languages, and formats, as well as exemplary analysis problems, are presented in this chapter. The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers, whose interest will be sparked by its contents. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. We are particularly interested in the slower-timescale dynamics that determine overall excitability and set the stage for transient episodes of persistent oscillations, working memory, or seizures. In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley-type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics on the stability of persistent activity. We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions.
Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in the neonatal brain or spinal cord, up-states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potentials (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from the EAP data sampled by an extracellular probe at multiple independent recording locations. We used a dipole for the model source because there is extensive evidence that it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell–probe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole moment optimization, where the dipole location is fixed, and a second, nonlinear, global optimization of the source location. We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis.
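The linear step of such a dipole fit, with the location held fixed, reduces to Tikhonov-regularized least squares on the lead-field matrix. A sketch with a synthetic random lead field standing in for the finite-element-computed one (the moment, noise level, and regularization weight are arbitrary test values):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic lead-field matrix: potentials at 8 recording sites produced by
# the 3 components of a dipole moment at one fixed candidate location.
# In the real pipeline, L comes from a finite-element probe + brain model.
L = rng.normal(size=(8, 3))
m_true = np.array([2.0, -1.0, 0.5])           # arbitrary test moment
v = L @ m_true + 0.01 * rng.normal(size=8)    # noisy EAP amplitudes

def fit_moment(L, v, alpha=1e-3):
    """Regularized linear dipole step: minimize
    ||L m - v||^2 + alpha * ||m||^2 with the location held fixed."""
    A = L.T @ L + alpha * np.eye(L.shape[1])
    return np.linalg.solve(A, L.T @ v)

m_hat = fit_moment(L, v)
# m_hat recovers m_true up to the noise level; the penalty on ||m||
# is what "jointly minimizes model error and dipole size" in the text
```

Scanning this linear solve over a grid of candidate locations, and keeping the location with the smallest penalized residual, gives the outer nonlinear optimization described above.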
We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among various model error contributions to the residual, we address especially the error in probe geometry, and the extent to which it biases estimates of dipole parameters. This dipole characterization method can be applied to any recording technique that has the capability of taking multiple independent measurements of the same single units. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics. We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodium-potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. These models reveal several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy.
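One concrete link in the ion-concentration feedback loop studied in these companion papers is that reversal potentials track the concentrations through the Nernst equation, so a rise in extracellular K+ directly depolarizes E_K. A quick sketch (intracellular concentration and temperature are typical assumed values):

```python
import math

def nernst_K(K_out, K_in=140.0, temp_c=37.0):
    """Potassium reversal potential E_K = (RT/F) * ln([K]o/[K]i), in mV.
    Concentrations in mM; [K]i and temperature are assumed typical values."""
    rt_over_f = 8.314 * (273.15 + temp_c) / 96485.0 * 1000.0  # mV
    return rt_over_f * math.log(K_out / K_in)

e_k_baseline = nernst_K(4.0)    # normal [K]o, E_K near -95 mV
e_k_seizure = nernst_K(12.0)    # impaired glial buffering raises [K]o
# elevated extracellular K+ depolarizes E_K, weakening K+ currents and
# pushing the neuron toward the hyperexcitable, seizure-prone regime
```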
Abstract This paper introduces dyadic brain modeling – the simultaneous, computational modeling of the brains of two interacting agents – to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the ‘dyad’). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists, and modelers that will help speed the emergence of what may be called comparative neuroprimatology. Abstract The phase response curve (PRC) reflects the dynamics of the interplay between diverse intrinsic conductances that lead to spike generation. PRCs measure the spike time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddle-node) dynamics, while type II (subcritical Hopf) dynamics yield a biphasic PRC with both negative and positive lobes. Previous computational work hypothesized that cholinergic modulation of the M-type potassium current can switch a neuron with type II dynamics to type I dynamics.
We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with down-regulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential specific K-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to down-regulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent I_AHP is down-regulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified at low ACh levels, while the PRC became type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may shift neuronal responses between “resonator” and “integrator” modes. Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double-precision values by each subtree at each time step. Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For compute-bound simulations, load balance results in almost ideal runtime scaling. Application of the cell splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing. Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts.
It has recently been reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. The Ca2+-activated K+ currents (I_K(Ca)) of cultured human cardiac fibroblasts were characterized in this study. In whole-cell configuration, depolarizing pulses evoked outwardly rectifying I_K(Ca) in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance, Ca2+-activated K+ (BKCa) channel with a single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BKCa channels. The dynamic Luo–Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; however, they were electrically inert with no gap-junctional coupling. The simulation predicts that changes in gap-junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. I_K(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes. This study demonstrates that a BKCa channel is functionally expressed in human cardiac fibroblasts. The activity of these BKCa channels present in human cardiac fibroblasts may contribute to the functional activities of heart cells through transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of their solutions and the transitions between them as parameters are varied. To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multicompartment models.
We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation: tonic spiking, intermediate bursting, and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multicompartment conductance-based model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli. Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement.
Results We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. Conclusion Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement, and underlies the concentration-independence of odor quality representations. Abstract Mathematical neuronal models are normally expressed using differential equations. The Parker–Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker–Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin–Huxley-type neuron, comparing the results with those obtained using the Runge–Kutta and Bulirsch–Stoer methods.
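The core of the Parker–Sochacki method can be illustrated on the polynomial ODE y' = y^2 (exact solution y = y0/(1 - y0*t)), whose Maclaurin coefficients follow from a Cauchy product: c_{n+1} = (1/(n+1)) * sum_j c_j * c_{n-j}. This is a toy sketch of the adaptive-order idea, not the paper's Izhikevich or Hodgkin–Huxley implementation; the tolerance and step size are arbitrary choices.

```python
def ps_step(y0, h, tol=1e-12, max_order=40):
    """Advance y' = y^2 by one step of size h, growing the series order
    adaptively until the last added term falls below tol."""
    c = [y0]               # Maclaurin coefficients about the current time
    y = y0
    for n in range(max_order):
        cauchy = sum(c[j] * c[n - j] for j in range(n + 1))
        c.append(cauchy / (n + 1))        # c_{n+1} from the Cauchy product
        term = c[-1] * h ** (n + 1)
        y += term
        if abs(term) < tol:               # adaptive order, fixed step
            break
    return y

def integrate(y0, t_end, h):
    y = y0
    for _ in range(round(t_end / h)):
        y = ps_step(y, h)
    return y

approx = integrate(1.0, 0.5, 0.05)   # exact value: 1/(1 - 0.5) = 2
```

Note that error control comes from truncating the power series at a data-dependent order, while the step size h stays fixed, which is exactly the trade-off contrasted with Runge–Kutta-style step-size adaptation above.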
Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC–granule cell (GC) circuit with and without gap junctions between GoCs. Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change the oscillation frequency or the mean firing rate of either GoCs or GCs. Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs. The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley-type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours or days of supercomputing time.
Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. In consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast-spiking, regular-spiking, and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics, and intrinsic noise. We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data. Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes from development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models. Recent studies show that both the conductance and kinetics of these receptors change voltage-dependently in some parts of the brain. Therefore, several models have been introduced to simulate their current. However, on the one hand, kinetic models, which are able to simulate these voltage-dependent phenomena, are computationally expensive for modeling large neural networks. On the other hand, classic exponential models, which are computationally less expensive, are not able to simulate the voltage dependency of these receptors accurately.
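A common middle ground between the two model classes contrasted above, and presumably the kind of extension meant here, attaches the Jahr and Stevens (1990) voltage-dependent Mg2+-block factor, B(V) = 1/(1 + [Mg]*exp(-0.062*V)/3.57), to a classic dual-exponential conductance. The maximal conductance and kinetic constants below are placeholders, not the paper's fitted values.

```python
import math

def nmda_conductance(v, t, g_max=1.0, mg=1.0, tau_rise=2.0, tau_decay=100.0):
    """Classic exponential NMDA model with Jahr-Stevens Mg2+ block.
    v in mV, t in ms since receptor activation, mg in mM; g_max and the
    time constants are illustrative placeholder values."""
    block = 1.0 / (1.0 + mg / 3.57 * math.exp(-0.062 * v))
    timecourse = math.exp(-t / tau_decay) - math.exp(-t / tau_rise)
    return g_max * block * timecourse

g_rest = nmda_conductance(-70.0, 20.0)   # strong Mg2+ block near rest
g_depol = nmda_conductance(0.0, 20.0)    # block largely relieved
# relieving the Mg2+ block at depolarized potentials boosts the conductance
```

This captures the voltage dependence of the open conductance with two exponentials and one algebraic factor, though unlike the full kinetic schemes it leaves the time constants themselves voltage-independent, which is the limitation the abstract's three-to-four-equation model addresses.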
In this study, we have modified these classic models to endow them with voltage-dependent conductance and time constants. Temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of the NMDA receptor's behavior using only three to four differential equations, significantly fewer than in previous kinetic models. Consequently, our model is both fast and physiologically plausible, and is therefore a suitable candidate for the modeling of large neural networks. Abstract Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons. This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal cell firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that there is an obvious rhythmic gating effect of the γ-oscillating interneuron networks on the pyramidal neuron's signal transmission. This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and appropriate inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of coherence and oscillation frequency of network activities. 
Abstract Background Propagation of simulated action potentials (APs) was previously studied in short single chains and in two-dimensional sheets of myocardial cells [1–3]. The present study was undertaken to examine propagation in a long single chain of cells of various lengths, and with varying numbers of gap-junction (gj) channels, and to compare propagation velocity with cable properties such as the length constant (λ). Methods and Results Simulations were carried out using the PSpice program as previously described. When the electric field (EF) mechanism was dominant (0, 1, and 10 gj channels), the longer the chain length, the faster the overall velocity (θov). There seems to be no simple explanation for this phenomenon. In contrast, when the local-circuit current mechanism was dominant (100 gj channels or more), θov was slightly slowed with lengthening of the chain. Increasing the number of gj channels produced an increase in θov and caused the firing order to become more uniform. The end-effect was more pronounced at longer chain lengths and at greater numbers of gj channels. When there were no or only few gj channels (namely 0, 10, or 30), the voltage change (ΔVm) in the two cells (#50 and #52) contiguous to the cell injected with current (#51) was nearly zero, i.e., there was a sharp discontinuity in voltage between the adjacent cells. When there were many gj channels (e.g., 300, 1000, 3000), there was an exponential decay of voltage on either side of the injected cell, with the length constant (λ) increasing at higher numbers of gj channels. The effect of increasing the number of gj channels on increasing λ was relatively small compared to the larger effect on θov. θov became very non-physiological at 300 gj channels or higher. Conclusion Thus, when there were only 0, 1, or 10 gj channels, θov increased with increase in chain length, whereas at 100 gj channels or higher, θov did not increase with chain length. 
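The exponential voltage decay described above, V(n) = V₀·exp(−n/λ), lets the length constant be read off from the slope of log V along the chain. A numerical illustration with hypothetical values (not the PSpice results):

```python
import math

lam_true = 4.0   # length constant, in units of cell lengths
v0 = 10.0        # voltage deflection in the injected cell (mV)
cells = range(1, 9)
# Steady-state deflection in cells at increasing distance from the injection site.
v = [v0 * math.exp(-n / lam_true) for n in cells]

# Estimate lambda from the log-ratio of deflections in successive cells.
ratios = [math.log(v[i] / v[i + 1]) for i in range(len(v) - 1)]
lam_est = 1.0 / (sum(ratios) / len(ratios))
print(round(lam_est, 2))  # recovers 4.0
```

The sharp discontinuity seen with few gj channels corresponds to λ far below one cell length, i.e., the deflection in the neighboring cell is already near zero.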
When there were only 0, 10, or 30 gj channels, there was a very sharp decrease in ΔVm in the two contiguous cells on either side of the injected cell, whereas at 300, 1000, or 3000 gj channels, the voltage decay was exponential along the length of the chain. The effect of increasing the number of gj channels on the spread of current was relatively small compared to the large effect on θov. Abstract This article provides a demonstration of an analytical technique that can be used to investigate the causes of perceptual phenomena. The technique is based on the concept of the ideal observer, an optimal signal classifier that makes decisions that maximize the probability of a correct response. To demonstrate the technique, an analysis was conducted to investigate the role of the auditory periphery in the production of temporal masking effects. The ideal observer classified output from four models of the periphery. Since the ideal observer is the best of all possible observers, if it demonstrates masking effects, then all other observers must as well. If it does not demonstrate masking effects, then nothing about the periphery requires masking to occur, and therefore masking must arise somewhere else. The ideal observer exhibited several forward masking effects but did not exhibit backward masking, implying that the periphery has a causal role in forward but not backward masking. A general discussion of the strengths of the technique and supplementary equations are also included. Abstract Understanding the human brain and its function in health and disease represents one of the greatest scientific challenges of our time. In the post-genomic era, an overwhelming accumulation of new data, at all levels of exploration from DNA to human brain imaging, has been acquired. 
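The ideal-observer rule described above, for the simplest case of two known Gaussian alternatives with equal variance, reduces to a likelihood-ratio (midpoint-threshold) decision, and its probability correct upper-bounds every other observer. A minimal sketch with hypothetical parameters, not the peripheral-model outputs used in the article:

```python
import math

def gauss_loglik(x, mu, sigma):
    # Log-likelihood of x under N(mu, sigma^2), up to a shared constant.
    return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma)

def ideal_decide(x, mu0, mu1, sigma):
    """Return 1 if 'signal' (mean mu1) is more likely than 'noise' (mu0)."""
    return 1 if gauss_loglik(x, mu1, sigma) > gauss_loglik(x, mu0, sigma) else 0

mu0, mu1, sigma = 0.0, 2.0, 1.0
# With equal priors and variances the optimal rule is x > (mu0 + mu1) / 2.
assert ideal_decide(0.9, mu0, mu1, sigma) == 0
assert ideal_decide(1.1, mu0, mu1, sigma) == 1

# Probability correct of the ideal observer: Phi(d'/2), an upper bound
# that any real observer's performance cannot exceed.
d_prime = (mu1 - mu0) / sigma
p_correct = 0.5 * (1.0 + math.erf(d_prime / (2.0 * math.sqrt(2.0))))
print(round(p_correct, 3))  # 0.841 for d' = 2
```

In the article the "observation" is the full output of a peripheral model rather than a scalar, but the logic is the same: if even this optimal classifier shows a masking effect, the effect must originate in the periphery.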
This accumulation of facts has not given rise to a corresponding increase in the understanding of integrated functions in this vast area of research, which involves a large number of fields extending from genetics to psychology. Neuroinformatics (NI) is uniquely placed at the intersection between neuroscience and information technology, and emerges as an area of critical importance for facilitating future conceptual development in neuroscience by creating databases that transcend different organizational levels and allow for the development of computational models from the subcellular to the global brain level. Abstract This paper studied synaptic and dendritic integration with different spatial distributions of synapses on the dendrites of a biophysically detailed layer 5 pyramidal neuron model. It has been observed that temporally synchronous and spatially clustered synaptic inputs make dendrites perform a highly nonlinear integration. The effect of the clustering degree of the synaptic distribution on neuronal responsiveness is investigated by changing the number of top apical dendrites to which active synapses are allocated. The neuron shows maximum responsiveness to synaptic inputs with an intermediate clustering degree of spatial distribution, indicating complex interactions among dendrites in the presence of nonlinear synaptic and dendritic integration. Abstract This paper describes a pilot query interface that has been constructed to help us explore a “concept-based” approach for searching the Neuroscience Information Framework (NIF). The query interface is concept-based in the sense that the search terms submitted through the interface are selected from a standardized vocabulary of terms (concepts) that are structured in the form of an ontology. The NIF contains three primary resources: the NIF Resource Registry, the NIF Document Archive, and the NIF Database Mediator. 
These NIF resources are very different in their nature and therefore pose challenges when designing a single interface from which searches can be automatically launched against all three resources simultaneously. The paper first briefly discusses several background issues involving the use of standardized biomedical vocabularies in biomedical information retrieval, and then presents a detailed example that illustrates how the pilot concept-based query interface operates. The paper concludes by discussing certain lessons learned in the development of the current version of the interface. Abstract Simulations of orientation selectivity in visual cortex have shown that layer 4 complex cells lacking orientation tuning are ideal for providing global inhibition that scales with contrast in order to produce simple cells with contrast-invariant orientation tuning (Lauritzen and Miller in J Neurosci 23:10201–10213, 2003). Inhibitory cortical cells have been shown to be electrically coupled by gap junctions (Fukuda and Kosaka in J Neurosci 120:5–20, 2003). Such coupling promotes, among other effects, spike synchronization and coordination of postsynaptic IPSPs (Beierlein et al. in Nat Neurosci 3:904–910, 2000; Galarreta and Hestrin in Nat Rev Neurosci 2:425–433, 2001). Consequently, it was expected (Miller in Cereb Cortex 13:73–82, 2003) that electrical coupling would promote nonspecific functional responses consistent with the complex inhibitory cells seen in layer 4, which provide broad inhibition in response to stimuli of all orientations (Miller et al. in Curr Opin Neurobiol 11:488–497, 2001). This was tested using a mechanistic modeling approach. The orientation selectivity model of Lauritzen and Miller (J Neurosci 23:10201–10213, 2003) was reproduced with and without electrical coupling between complex inhibitory neurons. 
Although extensive coupling promotes uniform firing in complex cells, there were no detectable improvements in contrast-invariant orientation selectivity unless there were coincident changes in complex cell firing rates to offset the untuned excitatory component that grows with contrast. Thus, changes in firing rates alone (with or without coupling) could improve contrast-invariant orientation tuning of simple cells, whereas synchronization of complex inhibitory neurons alone could not. Abstract Coral polyps contract when electrically stimulated, and a wave of contraction travels from the site of stimulation at a constant speed. Models of coral nerve networks were optimized to match one of three different experimentally observed behaviors. To search for model parameters that reproduce the experimental observations, we applied genetic algorithms to increasingly more complex models of a coral nerve net. In a first stage of optimization, individual neurons responded with spikes to multiple, but not single, pulses of activation. In a second stage, we used these neurons as the starting point for the optimization of a two-dimensional nerve net. This strategy yielded a network with parameters that reproduced the experimentally observed spread of excitation. Abstract Spike-wave discharges are a distinctive feature of epileptic seizures. So far, they have not been reported in spatially extended neural field models. We study a space-independent version of the Amari neural field model with two competing inhibitory populations. We show that this competition leads to robust spike-wave dynamics if the inhibitory populations operate on different timescales. The spike-wave oscillations present a fold/homoclinic type of bursting. From this result we predict parameters of the extended Amari system where spike-wave oscillations produce a spatially homogeneous pattern. We propose this mechanism as a prototype of macroscopic epileptic spike-wave discharges. 
To our knowledge this is the first example of robust spike-wave patterns in a spatially extended neural field model. Abstract Cortical gamma-frequency (30–80 Hz) oscillations have been suggested to underlie many aspects of cognitive function. In this paper we compare the f-I curves modulated by gamma-frequency-modulated stimuli and Poisson synaptic input at the distal dendrites of a layer V pyramidal neuron model. The results show that gamma-frequency distal input amplifies the sensitivity of the neural response to basal input, and enhances gain modulation of the neuron. Abstract Inward rectifying potassium (KIR) currents in medium spiny (MS) neurons of the nucleus accumbens inactivate significantly in ~40% of the neurons but not in the rest, which may lead to differences in input processing by these two groups. Using a 189-compartment computational model of the MS neuron, we investigate the influence of this property using injected current as well as spatiotemporally distributed synaptic inputs. Our study demonstrates that KIR current inactivation facilitates depolarization, firing frequency and firing onset in these neurons. These effects may be attributed to the higher input resistance of the cell as well as a more depolarized resting/downstate potential induced by the inactivation of this current. In view of the reports that dendritic intracellular calcium levels depend closely on burst strength and spike onset time, our findings suggest that inactivation of KIR currents may offer a means of modulating both excitability and synaptic plasticity in MS neurons. Abstract Epileptic seizures in diabetic hyperglycemia (DH) are not uncommon. This study aimed to determine the acute behavioral, pathological, and electrophysiological effects of status epilepticus (SE) on diabetic animals. 
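The input-resistance mechanism invoked in the KIR abstract above can be illustrated with a toy single-compartment calculation (hypothetical conductance values, not the 189-compartment model): removing part of the KIR conductance raises R_in and so enlarges the depolarization a given synaptic current produces.

```python
def depolarization(i_syn_na, g_leak_ns, g_kir_ns):
    """Steady-state voltage deflection (mV) for a current (nA) into a conductance (nS)."""
    r_in_gohm = 1.0 / (g_leak_ns + g_kir_ns)  # input resistance
    return i_syn_na * r_in_gohm * 1000.0      # nA * GOhm = V, converted to mV

g_leak, g_kir = 2.0, 8.0  # nS; KIR dominates the downstate conductance here
dv_intact = depolarization(0.1, g_leak, g_kir)        # KIR fully available
dv_inact = depolarization(0.1, g_leak, 0.4 * g_kir)   # 60% of KIR inactivated
print(round(dv_intact, 2), round(dv_inact, 2))  # inactivation boosts the EPSP
```

The real model adds a depolarized downstate potential and voltage-dependent KIR gating on top of this, but the direction of the effect, larger and faster-rising EPSPs when KIR inactivates, follows already from Ohm's law.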
Adult male Sprague-Dawley rats were first divided into groups with and without streptozotocin (STZ)-induced diabetes, and then into treatment groups given normal saline (NS) (STZ-only and NS-only) or a lithium-pilocarpine injection to induce status epilepticus (STZ + SE and NS + SE). Seizure susceptibility, severity, and mortality were evaluated. Serial Morris water maze tests and hippocampal histopathology were examined before and 24 h after SE. Tetanic stimulation-induced long-term potentiation (LTP) in hippocampal slices was recorded in a multielectrode dish system. We also used a simulation model to evaluate intracellular adenosine triphosphate (ATP) and neuroexcitability. The STZ + SE group had a significantly higher percentage of severe seizures and SE-related death, and worse learning and memory performance, than the other three groups 24 h after SE. The STZ + SE group, followed by the NS + SE group, showed the most severe neuronal loss and mossy fiber sprouting in the hippocampal CA3 area. In addition, LTP was markedly attenuated in the STZ + SE group, followed by the NS + SE group. In the simulation, increased intracellular ATP concentration promoted action potential firing. This finding that rats with DH had more brain damage after SE than rats without diabetes suggests the importance of intensively treating hyperglycemia and seizures in diabetic patients with epilepsy. Neuroinformatics is a multifaceted field, as broad as the field of neuroscience itself. The various domains of NI may also share common features such as databases, data mining systems, and data modeling tools. NI projects are often coordinated by user groups or research organizations. Large-scale infrastructure supporting NI development is also a vital aspect of the field. Abstract Channelrhodopsin-2 (ChR2) and its variants are a class of light-sensitive proteins that offer the ability to use light stimulation to regulate neural activity with millisecond precision. 
In order to address the limitations in the efficacy of wild-type ChR2 (ChRwt) in achieving this objective, new variants of ChR2 that exhibit fast monoexponential photocurrent decay characteristics have recently been developed and validated. In this paper, we investigate whether the framework of a four-state transition rate model, primarily developed to mimic the bi-exponential photocurrent decay kinetics of ChRwt, as opposed to the lower-complexity three-state model, is warranted to mimic the monoexponential photocurrent decay kinetics of the newly developed fast ChR2 variants ChETA (Gunaydin et al., Nature Neurosci. 13:387–392, 2010) and ChRET/TC (Berndt et al., Proc. Natl. Acad. Sci. 108:7595–7600, 2011). We begin by estimating the parameters of the three-state and four-state models from experimental data on the photocurrent kinetics of ChRwt, ChETA, and ChRET/TC. We then incorporate these models into a fast-spiking interneuron model (Wang and Buzsaki, J. Neurosci. 16:6402–6413, 1996) and a hippocampal pyramidal cell model (Golomb et al., J. Neurophysiol. 96:1912–1926, 2006) and investigate the extent to which the experimentally observed neural response to various optostimulation protocols can be captured by these models. We demonstrate that for all ChR2 variants investigated, the four-state model implementation better captures neural responses consistent with experiments across a wide range of optostimulation protocols. We conclude by analytically investigating the conditions under which the characteristic specific to the three-state model, namely the monoexponential photocurrent decay of the newly developed ChR2 variants, can occur in the framework of the four-state model. Abstract In cerebellar Purkinje cells, the β4 subunit of voltage-dependent Na+ channels has been proposed to serve as an open-channel blocker giving rise to a "resurgent" Na+ current (I(NaR)) upon membrane repolarization. 
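The three-state ChR2 scheme discussed above (closed → open → desensitized → closed) can be sketched as a small ODE system integrated by Euler steps; the rate constants below are illustrative, not the fitted values from the paper. Its defining property, the one at issue for the fast variants, is that after light-off the open fraction decays monoexponentially with the closing rate gd.

```python
import math

def simulate(light_ms=50.0, total_ms=150.0, dt=0.01,
             eps=0.5, gd=0.1, gr=0.004):
    """Three-state ChR2: occupancies c (closed), o (open), d (desensitized)."""
    c, o, d = 1.0, 0.0, 0.0   # occupancies sum to 1
    o_trace = []
    for i in range(int(total_ms / dt)):
        t = i * dt
        ka = eps if t < light_ms else 0.0  # light-driven activation rate
        dc = gr * d - ka * c
        do = ka * c - gd * o
        dd = gd * o - gr * d
        c, o, d = c + dt * dc, o + dt * do, d + dt * dd
        o_trace.append(o)
    return o_trace

o = simulate()
i60, i80 = int(60.0 / 0.01), int(80.0 / 0.01)  # 10 ms and 30 ms after light-off
ratio = o[i80] / o[i60]
print(round(ratio, 3), round(math.exp(-0.1 * 20.0), 3))  # both ≈ 0.135
```

The four-state model adds a second open/desensitized pair with inter-state transitions, which is what produces the bi-exponential decay of ChRwt; the paper's analytical question is when that richer scheme collapses back to the single exponential shown here.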
Notably, the β4 subunit was recently identified as a novel substrate of the β-secretase BACE1, a key enzyme of the amyloidogenic pathway in Alzheimer's disease. Here, we asked whether BACE1-mediated cleavage of the β4 subunit has an impact on I(NaR) and, consequently, on the firing properties of Purkinje cells. In cerebellar tissue of BACE1−/− mice, mRNA levels of Na+ channel α-subunits 1.1, 1.2, and 1.6 and of β-subunits 1–4 remained unchanged, but processing of the β4 peptide was profoundly altered. Patch-clamp recordings from acutely isolated Purkinje cells of BACE1−/− and WT mice did not reveal any differences in steady-state properties or in current densities of transient, persistent, and resurgent Na+ currents. However, I(NaR) was found to decay significantly faster in BACE1-deficient Purkinje cells than in WT cells. In modeling studies, the altered time course of I(NaR) decay could be replicated when we decreased the efficiency of open-channel block. In current-clamp recordings, BACE1−/− Purkinje cells displayed a lower spontaneous firing rate than normal cells. Computer simulations supported the hypothesis that the accelerated decay kinetics of I(NaR) are responsible for the slower firing rate. Our study elucidates a novel function of BACE1 in the regulation of neuronal excitability that serves to tune the firing pattern of Purkinje cells and presumably other neurons endowed with I(NaR). Abstract The role of cortical feedback in the thalamocortical processing loop has been extensively investigated over the last decades. With the exception of several cases, these studies focused on the cortical feedback exerted onto thalamocortical relay (TC) cells of the dorsal lateral geniculate nucleus (LGN). In a previous physiological study, we showed in the cat visual system that cessation of cortical input, despite decreasing the spontaneous activity of TC cells, increased the spontaneous firing of their recurrent inhibitory interneurons located in the perigeniculate nucleus (PGN). 
To identify the mechanisms underlying such functional changes we conducted a modeling study in NEURON on several networks of point neurons with varied model parameters, such as membrane properties, synaptic weights and axonal delays. We considered six network topologies of the retino-geniculo-cortical system. All models were robust against changes of axonal delays except for the delay between the LGN feedforward interneuron and the TC cell. The best representation of the physiological results was obtained with models containing reciprocally connected PGN cells driven by the cortex and with a relatively slow decay of intracellular calcium. This strongly indicates that the thalamic reticular nucleus plays an essential role in the cortical influence over thalamocortical relay cells, while the thalamic feedforward interneurons are not essential in this process. Further, we suggest that the dependence of the activity of PGN cells on the rate of calcium removal can be one of the key factors determining individual cell responses to the elimination of cortical input. Abstract The nucleus accumbens (NAc), a critical structure of the brain reward circuit, is implicated in normal goal-directed behaviour and learning as well as pathological conditions like schizophrenia and addiction. Its major cellular substrates, the medium spiny (MS) neurons, possess a wide variety of dendritic active conductances that may modulate excitatory postsynaptic potentials (EPSPs) and cell excitability. We examine this issue using a biophysically detailed 189-compartment stylized model of the NAc MS neuron, incorporating all the known active conductances. We find that, of all the active channels, inward rectifying K+ (KIR) channels play the primary role in modulating the resting membrane potential (RMP) and EPSPs in the downstate of the neuron. Reduction in the conductance of KIR channels evokes facilitatory effects on EPSPs, accompanied by rises in local input resistance and membrane time constant. 
At depolarized membrane potentials closer to upstate levels, the slowly inactivating A-type potassium channel (KAs) conductance also plays a strong role in determining synaptic potential parameters and cell excitability. We discuss the implications of our results for the regulation of accumbal MS neuron biophysics and synaptic integration by intrinsic factors and extrinsic agents such as dopamine. Abstract The computer-assisted three-dimensional reconstruction of neuronal morphology is becoming an increasingly popular technique to quantify the arborization patterns of dendrites and axons. The resulting digital files are suitable for comprehensive morphometric analyses as well as for building anatomically realistic compartmental models of membrane biophysics and neuronal electrophysiology. The digital tracings acquired in a lab for a specific purpose can often be reused by a different research group to address a completely unrelated scientific question, if the original investigators are willing to share the data. Since reconstructing neuronal morphology is a labor-intensive process, data sharing and reanalysis are particularly advantageous for the neuroscience and biomedical communities. Here we present numerous cases of “success stories” in which digital reconstructions of neuronal morphology were shared and reused, leading to additional, independent discoveries and publications, and thus amplifying the impact of the “source” study for which the data set was first collected. In particular, we overview four main applications of this kind of data: comparative morphometric analyses, statistical estimation of potential synaptic connectivity, morphologically accurate electrophysiological simulations, and computational models of neuronal shape and development. Abstract This chapter describes a novel computational approach to modeling cortex dynamics that integrates gene–protein regulatory networks with a neural network model. 
The interaction of genes and proteins in neurons affects the dynamics of the whole neural network. We adopted an exploratory approach of investigating many randomly generated gene regulatory matrices, out of which we kept those that generated interesting dynamics. This naïve brute-force approach served to explore the potential application of computational neurogenetic models in relation to gene-knockout neurogenetics experiments. The knockout of a hypothetical gene for fast inhibition in our artificial genome led to interesting neural activity. In spite of the fact that the artificial gene/protein network was altered by the knockout of one gene, the dynamics of the spiking neural network (SNN) were most of the time very similar to the results obtained with the complete gene/protein network. However, from time to time the neurons spontaneously and temporarily synchronized their spiking into coherent global oscillations. In our model, fluctuations in the values of neuronal parameters led to the spontaneous development of seizure-like global synchronizations. These very same fluctuations also led to the termination of the seizure-like neural activity and the maintenance of interictal periods of normal activity. Based on our model, we suggest the hypothesis that parameter changes due to gene–protein dynamics should also be included as a serious factor determining transitions in neural dynamics, especially when the cause of a disease is known to be genetic. Abstract The local field potential (LFP) is among the most important experimental measures when probing neural population activity, but a proper understanding of the link between the underlying neural activity and the LFP signal is still missing. Here we investigate this link by mathematical modeling of contributions to the LFP from a single layer-5 pyramidal neuron and a single layer-4 stellate neuron receiving synaptic input. 
An intrinsic dendritic low-pass filtering effect of the LFP signal, previously demonstrated for extracellular signatures of action potentials, is seen to strongly affect the LFP power spectra, even for frequencies as low as 10 Hz for the example pyramidal neuron. Further, the LFP signal is found to depend sensitively on both the recording position and the position of the synaptic input: the LFP power spectra recorded close to the active synapse are typically found to be less low-pass filtered than spectra recorded further away. Some recording positions display striking band-pass characteristics of the LFP. The frequency dependence of the properties of the current dipole moment set up by the synaptic input current is found to qualitatively account for several salient features of the observed LFP. Two approximate schemes for calculating the LFP, the dipole approximation and the two-monopole approximation, are tested and found to be potentially useful for translating results from large-scale neural network models into predictions for results from electroencephalographic (EEG) or electrocorticographic (ECoG) recordings. Channelpedia: an integrative and interactive database for ion channels. Frontiers in neuroinformatics Ion channels are membrane proteins that selectively conduct ions across the cell membrane. The flux of ions through ion channels drives electrical and biochemical processes in cells and plays a critical role in shaping the electrical properties of neurons. During the past three decades, extensive research has been carried out to characterize the molecular, structural, and biophysical properties of ion channels. This research has begun to elucidate the role of ion channels in neuronal function and has subsequently led to the development of computational models of ion channel function. 
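The dipole and two-monopole LFP approximations discussed in the abstract above can be compared directly for a current source/sink pair in a homogeneous medium: the two-monopole expression is exact for point sources, and the dipole formula φ = p·cosθ / (4πσr²) with p = I·d converges to it in the far field. A sketch with illustrative values:

```python
import math

def phi_two_monopole(i_na, d_um, sigma, x, y, z):
    """Exact potential of a +I/-I pair at z = +d/2 and z = -d/2 (the two-monopole scheme)."""
    rp = math.sqrt(x * x + y * y + (z - d_um / 2) ** 2)
    rm = math.sqrt(x * x + y * y + (z + d_um / 2) ** 2)
    return i_na / (4 * math.pi * sigma) * (1 / rp - 1 / rm)

def phi_dipole(i_na, d_um, sigma, x, y, z):
    """Far-field dipole approximation with moment p = I * d along z."""
    r = math.sqrt(x * x + y * y + z * z)
    cos_theta = z / r
    return i_na * d_um * cos_theta / (4 * math.pi * sigma * r ** 2)

sigma = 0.3        # S/m, order of magnitude for cortical tissue
i, d = 1.0, 10.0   # source strength and separation (arbitrary units)
far = (0.0, 0.0, 500.0)  # recording point on the dipole axis, far away
a = phi_two_monopole(i, d, sigma, *far)
b = phi_dipole(i, d, sigma, *far)
print(round(b / a, 4))  # ≈ 1: the two schemes agree in the far field
```

Close to the sources (r comparable to d) the two expressions diverge, which is why the choice of scheme matters for near-electrode LFP predictions but not for EEG-scale distances.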
Although there have been substantial efforts to consolidate these findings into easily accessible and coherent online resources, a single comprehensive resource is still lacking. The success of these initiatives has been hindered by the sheer diversity of approaches and the variety of data formats. Here, we present "Channelpedia" (http://channelpedia.net), which is designed to store information related to ion channels and models and is characterized by an efficient information management framework. Composed of a combination of a database and a wiki-like discussion platform, Channelpedia allows researchers to collaborate and synthesize ion channel information from the literature. Equipped to automatically update references, Channelpedia integrates and highlights recent publications with relevant information in the database. It is web based, freely accessible, and currently contains 187 annotated ion channels with 45 Hodgkin-Huxley models. Computational reconstruction of pacemaking and intrinsic electroresponsiveness in cerebellar Golgi cells. Frontiers in cellular neuroscience Golgi cells have recently been shown to beat regularly in vitro (Forti et al., 2006. J. Physiol. 574, 711-729). Four main currents were shown to be involved: a persistent sodium current (I(Na-p)), an h current (I(h)), an SK-type calcium-dependent potassium current (I(K-AHP)), and a slow M-like potassium current (I(K-slow)). These ionic currents could also take part, together with others, in different aspects of neuronal excitability, such as responses to depolarizing and hyperpolarizing current injection. However, the ionic mechanisms and their interactions remained largely hypothetical. In this work, we have investigated the mechanisms of Golgi cell excitability by developing a computational model. The model predicts that pacemaking is sustained by subthreshold oscillations tightly coupled to spikes. I(Na-p) and I(K-slow) emerged as the critical determinants of oscillations. 
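Channel models of the kind catalogued above, including the Golgi cell currents just listed, are typically written in Hodgkin-Huxley form: a maximal conductance scaled by gating variables that relax toward voltage-dependent steady states. A minimal sketch using the textbook squid delayed-rectifier K⁺ channel (classic HH parameters, used purely as an illustration):

```python
import math

def alpha_n(v):
    # Opening rate of the n gate (1/ms) at membrane potential v (mV).
    return 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))

def beta_n(v):
    # Closing rate of the n gate (1/ms).
    return 0.125 * math.exp(-(v + 65.0) / 80.0)

def n_inf(v):
    # Voltage-dependent steady-state open probability of the gate.
    return alpha_n(v) / (alpha_n(v) + beta_n(v))

def k_current(v, n, g_max=36.0, e_k=-77.0):
    """I_K = g_max * n^4 * (V - E_K), in uA/cm^2 for mS/cm^2 and mV."""
    return g_max * n ** 4 * (v - e_k)

# The gate opens with depolarization ...
assert n_inf(20.0) > n_inf(-65.0)
# ... and relaxing n by Euler steps at a clamped voltage approaches n_inf.
v, n, dt = -30.0, n_inf(-65.0), 0.01
for _ in range(100000):  # 1000 ms, far longer than the gate's time constant
    n += dt * (alpha_n(v) * (1.0 - n) - beta_n(v) * n)
print(round(n - n_inf(-30.0), 6))  # ≈ 0: converged to the new steady state
```

Each of the Golgi cell currents (I(Na-p), I(h), I(K-AHP), I(K-slow), ...) follows the same template with its own rate functions and exponents, which is what makes a uniform database of HH models feasible.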
I(h) also played a role by setting the oscillatory mechanism into the appropriate membrane potential range. I(K-AHP), though taking part in the oscillation, appeared primarily involved in regulating the interspike interval (ISI) following spikes. The combination with other currents, in particular a resurgent sodium current (I(Na-r)) and an A-current (I(K-A)), allowed a precise regulation of response frequency and delay. These results provide a coherent reconstruction of the ionic mechanisms determining Golgi cell intrinsic electroresponsiveness and suggest important implications for cerebellar signal processing, which will be fully developed in a companion paper (Solinas et al., 2008. Front. Neurosci. 2:4). Cortical information flow in Parkinson's disease: a composite network/field model. Frontiers in computational neuroscience The basal ganglia play a crucial role in the execution of movements, as demonstrated by the severe motor deficits that accompany Parkinson's disease (PD). Since motor commands originate in the cortex, an important question is how the basal ganglia influence cortical information flow, and how this influence becomes pathological in PD. To explore this, we developed a composite neuronal network/neural field model. The network model consisted of 4950 spiking neurons, divided into 15 excitatory and inhibitory cell populations in the thalamus and cortex. The field model consisted of the cortex, thalamus, striatum, subthalamic nucleus, and globus pallidus. Both models have been separately validated in previous work. Three field models were used: one with basal ganglia parameters based on data from healthy individuals, one based on data from individuals with PD, and one purely thalamocortical model. Spikes generated by these field models were then used to drive the network model. 
Compared to the network driven by the healthy model, the PD-driven network had lower firing rates, a shift in spectral power toward lower frequencies, and higher probability of bursting; each of these findings is consistent with empirical data on PD. In the healthy model, we found strong Granger causality between cortical layers in the beta and low gamma frequency bands, but this causality was largely absent in the PD model. In particular, the reduction in Granger causality from the main "input" layer of the cortex (layer 4) to the main "output" layer (layer 5) was pronounced. This may account for symptoms of PD that seem to reflect deficits in information flow, such as bradykinesia. In general, these results demonstrate that the brain's large-scale oscillatory environment, represented here by the field model, strongly influences the information processing that occurs within its subnetworks. Hence, it may be preferable to drive spiking network models with physiologically realistic inputs rather than pure white noise. InterMOD: integrated data and tools for the unification of model organism research Scientific Reports Model organisms are widely used for understanding basic biology, and have significantly contributed to the study of human disease. In recent years, genomic analysis has provided extensive evidence of widespread conservation of gene sequence and function amongst eukaryotes, allowing insights from model organisms to help decipher gene function in a wider range of species. The InterMOD consortium is developing an infrastructure based around the InterMine data warehouse system to integrate genomic and functional data from a number of key model organisms, leading the way to improved cross-species research. So far including budding yeast, nematode worm, fruit fly, zebrafish, rat and mouse, the project has set up data warehouses, synchronized data models, and created analysis tools and links between data from different species. 
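The Granger-causality analysis used in the Parkinson's study above rests on a simple idea: x "Granger-causes" y if x's past improves the prediction of y beyond y's own past, quantified as the log-ratio of restricted to full residual variances. A toy sketch on synthetic lag-1 data (illustrative coefficients, not the layer-4/layer-5 signals of the study):

```python
import math
import random

random.seed(1)
n = 5000
x = [random.gauss(0, 1)]
y = [random.gauss(0, 1)]
for t in range(1, n):
    x.append(0.5 * x[-1] + random.gauss(0, 1))
    y.append(0.5 * y[-1] + 0.4 * x[t - 1] + random.gauss(0, 1))  # x drives y

def ols_resid_var(targets, regressors):
    """Residual variance of the least-squares fit target ~ regressors (zero-mean data)."""
    m = len(targets)
    k = len(regressors)
    A = [[sum(r1[t] * r2[t] for t in range(m)) for r2 in regressors] for r1 in regressors]
    b = [sum(r[t] * targets[t] for t in range(m)) for r in regressors]
    if k == 1:
        beta = [b[0] / A[0][0]]
    else:  # k == 2: solve the normal equations by Cramer's rule
        det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
        beta = [(b[0] * A[1][1] - b[1] * A[0][1]) / det,
                (A[0][0] * b[1] - A[1][0] * b[0]) / det]
    resid = [targets[t] - sum(beta[j] * regressors[j][t] for j in range(k))
             for t in range(m)]
    return sum(e * e for e in resid) / m

# Granger causality x -> y: does lagged x help predict y?
gc_x_to_y = math.log(ols_resid_var(y[1:], [y[:-1]]) /
                     ols_resid_var(y[1:], [y[:-1], x[:-1]]))
# And the reverse direction, which should be near zero by construction.
gc_y_to_x = math.log(ols_resid_var(x[1:], [x[:-1]]) /
                     ols_resid_var(x[1:], [x[:-1], y[:-1]]))
print(round(gc_x_to_y, 3), round(gc_y_to_x, 3))  # clearly positive vs ≈ 0
```

Real analyses use higher model orders and band-limited (spectral) variants to resolve beta and gamma bands, but the asymmetry recovered here is the same quantity whose collapse from layer 4 to layer 5 the PD model reports.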
The project unites a number of major model organism databases, improving both the consistency and accessibility of comparative research, to the benefit of the wider scientific community. Progressive effect of beta amyloid peptides accumulation on CA1 pyramidal neurons: a model study suggesting possible treatments. Frontiers in computational neuroscience Several independent studies show that accumulation of β-amyloid (Aβ) peptides, one of the characteristic hallmarks of Alzheimer's Disease (AD), can affect normal neuronal activity in different ways. However, in spite of intense experimental work to explain the possible underlying mechanisms of action, a comprehensive and congruent understanding is still lacking. Part of the problem might be the opposite ways in which Aβ has been experimentally found to affect the normal activity of a neuron; for example, making a neuron more excitable (by reducing the A- or DR-type K(+) currents) or less excitable (by reducing synaptic transmission and Na(+) current). The overall picture is therefore confusing, since the interplay of many mechanisms makes it difficult to link individual experimental findings with the more general problem of understanding the progression of the disease. This is an important issue, especially for the development of new drugs trying to ameliorate the effects of the disease. We addressed these paradoxes through computational models. We first modeled the different stages of AD by progressively modifying the intrinsic membrane and synaptic properties of a realistic model neuron, while accounting for multiple and different experimental findings and by evaluating the contribution of each mechanism to the overall modulation of the cell's excitability. We then tested a number of manipulations of channel and synaptic activation properties that could compensate for the effects of Aβ.
The model predicts possible therapeutic treatments in terms of pharmacological manipulations of channels' kinetic and activation properties. The results also suggest how and which mechanisms can be targeted by a drug to restore the original firing conditions. Comparison of neuronal spike exchange methods on a Blue Gene/P supercomputer. Frontiers in computational neuroscience For neural network simulations on parallel machines, interprocessor spike communication can be a significant portion of the total simulation time. The performance of several spike exchange methods using a Blue Gene/P (BG/P) supercomputer has been tested with 8-128 K cores using randomly connected networks of up to 32 M cells with 1 k connections per cell and 4 M cells with 10 k connections per cell, i.e., on the order of 4·10¹⁰ connections (K is 1024, M is 1024², and k is 1000). The spike exchange methods used are the standard Message Passing Interface (MPI) collective, MPI_Allgather, and several variants of the non-blocking Multisend method, either implemented via non-blocking MPI_Isend or exploiting the possibility of very low overhead direct memory access (DMA) communication available on the BG/P. In all cases, the worst performing method was that using MPI_Isend, due to the high overhead of initiating a spike communication. The two best performing methods had similar performance, with very low overhead for the initiation of spike communication: the persistent Multisend method using the Record-Replay feature of the Deep Computing Messaging Framework (DCMF_Multicast), and a two-phase multisend in which a DCMF_Multicast is used to first send to a subset of phase-one destination cores, which then pass the spike on to their subset of phase-two destination cores. Departure from ideal scaling for the Multisend methods is almost completely due to load imbalance caused by the large variation in the number of cells that fire on each processor in the interval between synchronizations.
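The routing idea behind the two-phase multisend can be illustrated with a small sketch in plain Python. The function name, the square-root fan-out heuristic, and the rank numbers are all invented for illustration; the actual DCMF_Multicast machinery on the BG/P works differently, but the destination-splitting logic is the same: the source sends to a handful of phase-one relays, and each relay forwards to its own disjoint slice of phase-two destinations, so no single core initiates more than about sqrt(n) sends.

```python
# Hypothetical sketch of two-phase multisend destination routing.
# Names and the fan-out heuristic are illustrative, not the DCMF API.
import math

def two_phase_routes(targets, fanout=None):
    """Split a flat destination list into phase-1 relays, each of which
    forwards the spike to a disjoint slice of phase-2 destinations."""
    n = len(targets)
    if fanout is None:
        fanout = max(1, math.isqrt(n))  # ~sqrt(n) relays balances both phases
    routes = {}
    for i in range(0, n, fanout):
        chunk = targets[i:i + fanout]
        relay, rest = chunk[0], chunk[1:]
        routes[relay] = rest            # relay forwards to the rest of its chunk
    return routes

# source core must reach 16 destination cores: 4 sends in phase 1,
# then each relay makes at most 3 sends in phase 2
routes = two_phase_routes(list(range(100, 116)), fanout=4)
```

With 16 destinations and a fan-out of 4, the source initiates only 4 sends instead of 16, which is where the reduced initiation overhead comes from.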
Spike exchange time itself is negligible since transmission overlaps with computation and is handled by a DMA controller. We conclude that ideal performance scaling will ultimately be limited by the imbalance in the number of spikes arriving at each processor between synchronization intervals. Thus, counterintuitively, maximizing load balance requires that the distribution of cells on processors not reflect the neural net architecture; instead, cells should be randomly distributed, so that sets of cells which burst-fire together are on different processors, with their targets on as large a set of processors as possible. Phase precession through acceleration of local theta rhythm: a biophysical model for the interaction between place cells and local inhibitory neurons. Journal of Computational Neuroscience Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on the knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications.
Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. Programming language: Perl/BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI Blast package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist.
However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that often it is futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past 2 decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database.
This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies.
A single cell includes organelles and intracellular solutions, and it is separated from the outer environment of extracellular fluid surrounding it by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charge and concentration cause, respectively, electrical and chemical potentials, driving the transport of materials across the membrane. Here we look at the core concepts of mathematical modeling of the dynamic behavior of single cells, as well as the foundations of numerical simulation. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program on which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command line interface, based on Python, a powerful scripting language, for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e.
a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the information necessary and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method and make available computer scripts which automate the process of citation entry. We use an open citation project PERL module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our PERL scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations.
In order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the spatial scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain (${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front-ends requires advanced bioinformatics skills. Consequently, the manual inspection of database entries and citations is a time-consuming task for which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges.
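One concrete instance of such IR techniques is bag-of-words TF-IDF ranking, which can be sketched in a few lines of plain Python. The corpus and query below are toy examples; production retrieval systems use inverted indices, stemming, and far richer weighting schemes.

```python
# Minimal TF-IDF retrieval sketch (illustrative only): rank a small corpus
# of database-entry descriptions against a free-text query.
import math
from collections import Counter

def tfidf_vectors(docs):
    """Term-frequency * inverse-document-frequency weights per document."""
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter()
    for toks in tokenized:
        df.update(set(toks))          # document frequency counts each doc once
    n = len(docs)
    return [{t: tf[t] * math.log(n / df[t]) for t in tf}
            for tf in (Counter(toks) for toks in tokenized)]

def cosine(a, b):
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

docs = ["calcium channel kinetics",
        "potassium channel gating kinetics",
        "genome annotation pipeline"]
vecs = tfidf_vectors(docs + ["channel kinetics"])   # query weighted with corpus
query, corpus = vecs[-1], vecs[:-1]
ranked = sorted(range(len(corpus)),
                key=lambda i: cosine(query, corpus[i]), reverse=True)
```

The shortest document matching both query terms ranks first; the unrelated document scores exactly zero.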
While originally developed to manage and query less structured data, information retrieval techniques have become increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or not) existing empirical results and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations.
We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp. 741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)].
Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the simulator for neural networks and action potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu . Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and build on them for further research. Its use can help the field of computational neuroscience enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties.
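The paired-pulse protocol itself is simple enough to caricature in a few lines. The sketch below is NOT the published dentate gyrus network model; it is a deliberately minimal toy (all parameters invented) in which a granule-cell-like unit receives two identical excitatory pulses 20 ms apart, and the first pulse also recruits a basket-cell-like inhibitory conductance that is still decaying when the second pulse arrives, so the second response is smaller.

```python
# Toy paired-pulse inhibition: one passive membrane, one delayed inhibitory
# conductance standing in for basket-cell feedback. Arbitrary units.

def run(dt=0.1, t_end=60.0):
    EL, Einh = -70.0, -75.0      # resting and inhibitory reversal potentials (mV)
    tau_m, tau_inh = 10.0, 25.0  # membrane and inhibitory decay constants (ms)
    v, g_inh = EL, 0.0
    pulses = (10.0, 30.0)        # onset times of the paired pulses (ms)
    peaks = [EL, EL]
    t = 0.0
    while t < t_end:
        I = 1.5 if any(tp <= t < tp + 2.0 for tp in pulses) else 0.0
        if abs(t - (pulses[0] + 1.0)) < dt / 2:
            g_inh += 0.5         # "basket cell" fires once, after the first pulse
        v += dt * ((-(v - EL) - g_inh * (v - Einh)) / tau_m + I)
        g_inh -= dt * g_inh / tau_inh
        for k, tp in enumerate(pulses):
            if tp <= t < tp + 10.0:
                peaks[k] = max(peaks[k], v)
        t += dt
    return peaks

p1, p2 = run()
ppi = p1 - p2   # positive: the second response is inhibited
```

Even this caricature reproduces the readout: the residual inhibitory conductance both hyperpolarizes the membrane and increases its effective leak, so the second depolarization peaks lower than the first.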
Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in GABA A reversal potential (E GABA) and by alterations in intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide a high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, the ultimate goal being to elicit/detect the electrical excitation at the single-cell level and to observe the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have been typically correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology.
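The cylinder/truncated-cone representation supports morphometric analysis directly. As a sketch (the segment list is hypothetical; the frustum formulas are standard solid geometry), total length, membrane surface, and volume can be accumulated segment by segment:

```python
# Morphometry on a reconstruction stored as truncated cones (frusta),
# SWC-style: each segment has a length and start/end radii in micrometers.
import math

def segment_stats(segments):
    """segments: list of (length, r_start, r_end) tuples."""
    total_len = sum(L for L, _, _ in segments)
    # lateral surface of a frustum: pi*(r0+r1)*slant, slant = sqrt(L^2+(r0-r1)^2)
    surf = sum(math.pi * (r0 + r1) * math.sqrt(L * L + (r0 - r1) ** 2)
               for L, r0, r1 in segments)
    # frustum volume: pi*L*(r0^2 + r0*r1 + r1^2)/3
    vol = sum(math.pi * L * (r0 * r0 + r0 * r1 + r1 * r1) / 3.0
              for L, r0, r1 in segments)
    return total_len, surf, vol

# sanity check: with equal radii a frustum degenerates to a cylinder,
# so surface = 2*pi*r*L and volume = pi*r^2*L
L, S, V = segment_stats([(10.0, 1.0, 1.0)])
```

These per-segment sums are exactly the kind of morphometric measures (total dendritic length, surface, volume) compared between reconstruction systems in the study above.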
Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic type (pyramidal cell dendrites and olivary axonal projection terminals), and labeling method (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions coming to the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductivities of their dendritic membrane. This work is a continuation of our previous studies that showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand.
This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by their spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings reveal a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in Biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling.
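A minimal sketch of the compartmental formalism, assuming toy parameters (not any particular published cell): each compartment is a leaky RC unit, and neighboring compartments are coupled by an axial conductance, so the whole cell becomes a system of coupled first-order ODEs.

```python
# Two-compartment passive model (soma + dendrite), forward-Euler integration.
# All parameters are illustrative: EL in mV, conductances and capacitance
# in arbitrary consistent units.

def simulate(I_soma=0.2, t_end=100.0, dt=0.01):
    EL, C = -65.0, 1.0
    gL, g_axial = 0.05, 0.1
    v_s = v_d = EL                      # both compartments start at rest
    for _ in range(int(t_end / dt)):
        # leak + axial coupling + injected current (soma only)
        dv_s = (gL * (EL - v_s) + g_axial * (v_d - v_s) + I_soma) / C
        dv_d = (gL * (EL - v_d) + g_axial * (v_s - v_d)) / C
        v_s += dt * dv_s
        v_d += dt * dv_d
    return v_s, v_d

v_s, v_d = simulate()
# at steady state, current injected at the soma depolarizes both compartments,
# the soma more than the dendrite (voltage attenuates along the axial path)
```

The steady state can be checked by hand: with these numbers the soma settles near -62.6 mV and the dendrite near -63.4 mV, the attenuation ratio being g_axial/(gL + g_axial) = 2/3 of the somatic depolarization.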
Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS-US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network.
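The classical tapped-delay TD mechanism that such models take as their baseline can be reproduced in a few lines. The step counts, learning rate, and CS/US positions below are illustrative; the point is only the standard result that, after training, the prediction error vanishes at the (predicted) reward and reappears at the earliest predictive cue.

```python
# Tapped-delay TD(0) sketch of the dopamine prediction-error signal.
# Trial = n_steps discrete time steps; CS at step `cs`, reward at step `us`.

def train_td(n_steps=10, cs=2, us=8, trials=500, alpha=0.1, gamma=1.0):
    V = [0.0] * (n_steps + 1)              # value of each within-trial step
    for _ in range(trials):
        for t in range(n_steps):
            r = 1.0 if t == us else 0.0
            delta = r + gamma * V[t + 1] - V[t]   # TD prediction error
            if t >= cs:                    # steps before the CS are not
                V[t] += alpha * delta      # distinguishable states: no learning
    # replay one trial after learning and record the errors
    deltas = []
    for t in range(n_steps):
        r = 1.0 if t == us else 0.0
        deltas.append(r + gamma * V[t + 1] - V[t])
    return V, deltas

V, deltas = train_td()
# after learning, the error has moved from the reward back to the CS onset
```

This is exactly the delay-line scheme the abstract criticizes as biologically unrealistic for seconds-long intervals: every within-trial time step must be an explicitly represented state.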
The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. Using mathematical models of pyramidal neurons of neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of dendritic morphology on the passive electrical transfer characteristics of the dendrites and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses distributed uniformly along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and function of effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this effectiveness to changes in the uniform membrane conductance) for the reconstructed dendritic arborization as a whole and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals decreasing with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials.
Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, the latter cells displayed a variety of output discharge patterns of different complexities, including stochastic ones. A relatively short apical dendritic subtree in layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of the dendritic arborization ("spatial autographs"). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites, and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promote the predominance of network interaction in shaping the activity of layer 2/3 pyramidal neurons and, thereby, a higher efficiency of intracortical association. Abstract Phase precession is one of the best known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model's functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated.
The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using nonhomogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC's firing rate, and this modulates the PC's firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture that is subject to a clocking rhythm, independently of any involvement in spatial tasks. Successes and Rewards in Sharing Digital Reconstructions of Neuronal Morphology Neuroinformatics Summary This chapter constitutes the mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was part of the Annual Computational Neuroscience Meeting CNS*2007, which took place in July 2007 in Toronto, Canada (http://www.cnsorg.org). The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Selected topics of discussion are presented in this chapter, including a review of some current and potential applications of Computational Intelligence (CI) in electrophysiology; database and electrophysiological data exchange platforms, languages, and formats; and exemplary analysis problems. The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers whose interest will be sparked by its contents.
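Returning to the phase-precession model above: its stated ingredients (leaky integrate-and-fire dynamics, a conductance synapse, and a theta-modulated inhomogeneous Poisson input) can be sketched in a few lines. This is a generic toy, not the published PC/IC circuit, and every parameter value below is an assumption:

```python
import numpy as np

rng = np.random.default_rng(0)
dt, T = 1e-4, 1.0                                   # time step and duration (s)
t = np.arange(0.0, T, dt)
rate = 200.0 * (1 + np.cos(2 * np.pi * 8 * t)) / 2  # 8 Hz theta-modulated rate (Hz)
spikes_in = rng.random(t.size) < rate * dt          # inhomogeneous Poisson (Bernoulli approx.)

E_L, E_e = -70e-3, 0.0                              # leak and excitatory reversal (V)
V_th, V_reset = -54e-3, -70e-3                      # threshold and reset (V)
tau_m, tau_s, g_peak = 20e-3, 5e-3, 0.5             # time constants (s), relative conductance

V, g, n_out = E_L, 0.0, 0
for spk in spikes_in:
    if spk:
        g += g_peak                                  # conductance jump per input spike
    g -= g * dt / tau_s                              # synaptic decay
    V += (-(V - E_L) - g * (V - E_e)) * dt / tau_m   # leaky conductance-based integration
    if V >= V_th:                                    # threshold crossing: emit a spike
        V = V_reset
        n_out += 1
```

Because the input rate is theta-modulated, output spikes cluster around the theta peaks, which is the basic clocking-rhythm interaction the model builds on.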
Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. We are particularly interested in the slower timescale dynamics that determine overall excitability and set the stage for transient episodes of persistent oscillations, working memory, or seizures. In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics on the stability of persistent activity. We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and we produce several experimentally testable predictions. Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that whether the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in the neonatal brain or spinal cord, up states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potential (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from EAP data sampled by an extracellular probe at multiple independent recording locations.
We used a dipole for the model source because there is extensive evidence that it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell-probe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole-moment optimization, in which the dipole location is fixed, and a second, nonlinear, global optimization of the source location. We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis. We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among the various model error contributions to the residual, we address especially the error in probe geometry and the extent to which it biases estimates of the dipole parameters. This dipole characterization method can be applied to any recording technique that is capable of taking multiple independent measurements of the same single units. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics.
We formulate a reduction of this model that permits a detailed bifurcation analysis, and we show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodium-potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. These models reveal several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy. Abstract This paper introduces dyadic brain modeling, the simultaneous computational modeling of the brains of two interacting agents, to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the 'dyad'). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists, and modelers that will help speed the emergence of what may be called comparative neuroprimatology.
Abstract The phase response curve (PRC) reflects the dynamics of the interplay between the diverse intrinsic conductances that lead to spike generation. PRCs measure the spike time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddle-node) dynamics, while type II (subcritical Hopf) dynamics yield a biphasic PRC with both negative and positive lobes. Previous computational work hypothesized that cholinergic modulation of the M-type potassium current can switch a neuron from type II to type I dynamics. We recorded from layer 2/3 pyramidal neurons in cortical slices and found that cholinergic action, consistent with downregulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential specific K-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to downregulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent I_AHP is downregulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified at low ACh levels, while the PRC became type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may shift neuronal responses between the "resonator" and "integrator" modes. Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double-precision values by each subtree at each time step.
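The PRC measurement described above amounts to perturbing the spike cycle at a known phase and recording the shift of the next spike. A toy sketch on an idealized phase oscillator, where the `type1`/`type2` example curves are hypothetical stand-ins for the measured pyramidal-neuron responses, not data from the study:

```python
import numpy as np

def measure_prc(iprc, T=1.0, eps=0.01, n_phases=20):
    """Empirical PRC: normalized advance of the next spike vs. perturbation phase."""
    phases = np.linspace(0.05, 0.95, n_phases)
    shifts = []
    for phi in phases:
        phi_after = phi + eps * iprc(phi)         # small perturbation shifts the phase
        t_free = (1.0 - phi) * T                  # time to next spike, unperturbed
        t_pert = (1.0 - phi_after) * T            # time to next spike, perturbed
        shifts.append((t_free - t_pert) / (eps * T))
    return phases, np.array(shifts)

type1 = lambda p: 1.0 - np.cos(2 * np.pi * p)     # purely positive lobe (type I)
type2 = lambda p: -np.sin(2 * np.pi * p)          # negative then positive lobe (type II)
```

Running `measure_prc(type1)` yields a purely positive curve, while `measure_prc(type2)` yields the biphasic shape the abstract associates with subcritical Hopf dynamics.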
Splitting cells is useful for attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For compute-bound simulations, load balance results in almost ideal runtime scaling. Application of the cell splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be used effectively with whole-cell balancing. Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts. It has recently been reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. In this study, Ca2+-activated K+ currents (I K(Ca)) of cultured human cardiac fibroblasts were characterized. In whole-cell configuration, depolarizing pulses evoked outwardly rectifying I K(Ca) in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance Ca2+-activated K+ (BKCa) channel with a single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BKCa channels. The dynamic Luo-Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; however, they were electrically inert when there was no gap-junctional coupling. The simulation predicts that changes in gap-junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. I K(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes.
This study demonstrates that a BKCa channel is functionally expressed in human cardiac fibroblasts. The activity of these BKCa channels present in human cardiac fibroblasts may contribute to the functional activities of heart cells through the transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of their solutions and the transitions between them as parameters are varied. To address this issue, we extend a novel model reduction method, based on "scales of dominance," to multicompartment models. We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation: tonic spiking, intermediate bursting, and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multicompartment conductance-based model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli.
Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement. Results We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and it is independent of the physical location of glomeruli within the olfactory bulb. Conclusion Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement, and it underlies the concentration-independence of odor quality representations.
Abstract Mathematical neuronal models are normally expressed using differential equations. The Parker-Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker-Sochacki method to the Izhikevich 'simple' model and a Hodgkin-Huxley type neuron, comparing the results with those obtained using the Runge-Kutta and Bulirsch-Stoer methods. Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC-granule cell (GC) circuit with and without gap junctions between GoCs. Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change the oscillation frequency or the mean firing rate of either GoCs or GCs.
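The Parker-Sochacki method builds the Maclaurin series of the solution coefficient by coefficient, so the truncation order can be adapted per step without changing the step size. A minimal sketch for the logistic equation y' = y(1 - y), which is polynomial like the neuron models named above; the logistic example and step/order values are assumptions for illustration, not the paper's benchmarks:

```python
import math

def ps_step(y0, dt, order=12):
    """One Parker-Sochacki step for y' = y - y^2: build series coefficients, then evaluate."""
    a = [y0]
    for n in range(order):
        conv = sum(a[k] * a[n - k] for k in range(n + 1))  # Cauchy product: coeff of y^2
        a.append((a[n] - conv) / (n + 1))                  # next Maclaurin coefficient
    y = 0.0
    for c in reversed(a):                                  # Horner evaluation at t = dt
        y = y * dt + c
    return y

y, dt = 0.1, 0.1
for _ in range(10):                                        # integrate to t = 1
    y = ps_step(y, dt)
exact = 1.0 / (1.0 + (1.0 / 0.1 - 1.0) * math.exp(-1.0))   # closed-form logistic solution
```

Adaptive error control would simply stop extending `a` once the newest coefficient's contribution at `dt` falls below tolerance, rather than using a fixed `order`.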
Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs. The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours or days of supercomputing time. Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. In consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics, and intrinsic noise. We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from the near-perfect matches obtained when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated for by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data.
Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby the tuning of intrinsic neuronal properties can compensate for changes arising from development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models. Recent studies show that both the conductance and the kinetics of these receptors change voltage-dependently in some parts of the brain. Therefore, several models have been introduced to simulate their current. However, on the one hand, kinetic models, which are able to simulate these voltage-dependent phenomena, are computationally expensive for the modeling of large neural networks. On the other hand, classic exponential models, which are computationally less expensive, are not able to simulate the voltage dependency of these receptors accurately. In this study, we have modified these classic models to endow them with voltage-dependent conductance and time constants. The temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of NMDA receptor behavior using only three to four differential equations, significantly fewer than in previous kinetic models. Consequently, our model is both fast and physiologically plausible, and it is therefore a suitable candidate for the modeling of large neural networks. Abstract Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons.
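For context on the NMDA passage above: a common baseline of the "classic exponential" kind is a double-exponential conductance gated by the Jahr-Stevens magnesium block, which already makes the conductance voltage dependent. A sketch with textbook-style parameters (the values are assumptions, not the paper's fitted model):

```python
import numpy as np

def nmda_current(t, V, g_max=1.0, tau_rise=2.0, tau_decay=100.0, E_rev=0.0, mg=1.0):
    """NMDA current (arbitrary units) at time t (ms) after transmitter release."""
    shape = np.exp(-t / tau_decay) - np.exp(-t / tau_rise)  # double-exponential gating
    block = 1.0 / (1.0 + mg * np.exp(-0.062 * V) / 3.57)    # Jahr-Stevens Mg2+ block
    return g_max * shape * block * (V - E_rev)

t = np.arange(0.0, 300.0, 0.1)
i_rest = nmda_current(t, V=-70.0)    # strong Mg2+ block near rest
i_depol = nmda_current(t, V=-20.0)   # block largely relieved by depolarization
```

The study's contribution, as described, goes further by also making the time constants themselves voltage dependent, which this fixed-`tau` sketch does not capture.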
This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal cell firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that the γ-oscillating interneuron network has a clear rhythmic gating effect on the pyramidal neuron's signal transmission. This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and appropriate inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of the coherence and oscillation frequency of network activities. Abstract Background Propagation of simulated action potentials (APs) was previously studied in short single chains and in two-dimensional sheets of myocardial cells [1-3]. The present study was undertaken to examine propagation in a long single chain of cells of various lengths, and with varying numbers of gap-junction (gj) channels, and to compare propagation velocity with cable properties such as the length constant (λ). Methods and Results Simulations were carried out using the PSpice program as previously described. When the electric field (EF) mechanism was dominant (0, 1, and 10 gj channels), the longer the chain length, the faster the overall velocity (θov). There seems to be no simple explanation for this phenomenon. In contrast, when the local-circuit current mechanism was dominant (100 gj channels or more), θov was slightly slowed by lengthening of the chain. Increasing the number of gj channels produced an increase in θov and caused the firing order to become more uniform.
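The length constant λ referred to above characterizes the exponential spatial decay of steady-state voltage away from the injection site, V(x) ≈ V0·exp(-|x|/λ). A sketch of estimating λ by a log-linear fit on a synthetic voltage profile (the profile is generated here for illustration, not taken from the PSpice simulations):

```python
import numpy as np

cells = np.arange(-10, 11)                    # cell index relative to the injected cell
lam_true = 3.0                                # "true" length constant (in cell lengths)
V = 10.0 * np.exp(-np.abs(cells) / lam_true)  # idealized steady-state profile (mV)

right = cells >= 0                            # fit one side of the injection site
slope, _ = np.polyfit(cells[right], np.log(V[right]), 1)
lam_est = -1.0 / slope                        # log-linear fit recovers lambda
```

With few gj channels the profile instead drops almost discontinuously at the injected cell's neighbors, which is exactly the sharp ΔVm discontinuity the abstract describes, and no single λ describes the decay.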
The end-effect was more pronounced at longer chain lengths and at greater numbers of gj channels. When there were no or only a few gj channels (namely 0, 10, or 30), the voltage change (ΔVm) in the two cells (#50 and #52) contiguous to the cell injected with current (#51) was nearly zero, i.e., there was a sharp discontinuity in voltage between the adjacent cells. When there were many gj channels (e.g., 300, 1000, or 3000), there was an exponential decay of voltage on either side of the injected cell, with the length constant (λ) increasing at higher numbers of gj channels. The effect of increasing the number of gj channels on increasing λ was relatively small compared to the larger effect on θov. θov became very non-physiological at 300 gj channels or higher. Conclusion Thus, when there were only 0, 1, or 10 gj channels, θov increased with increasing chain length, whereas at 100 gj channels or higher, θov did not increase with chain length. When there were only 0, 10, or 30 gj channels, there was a very sharp decrease in ΔVm in the two contiguous cells on either side of the injected cell, whereas at 300, 1000, or 3000 gj channels, the voltage decay was exponential along the length of the chain. The effect of increasing the number of gj channels on the spread of current was relatively small compared to the large effect on θov. Abstract This article provides a demonstration of an analytical technique that can be used to investigate the causes of perceptual phenomena. The technique is based on the concept of the ideal observer, an optimal signal classifier that makes decisions that maximize the probability of a correct response. To demonstrate the technique, an analysis was conducted to investigate the role of the auditory periphery in the production of temporal masking effects. The ideal observer classified output from four models of the periphery.
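An ideal observer in this sense is a maximum-likelihood (likelihood-ratio) classifier. A minimal two-class Gaussian sketch, as a toy stand-in for classifying the peripheral-model outputs; the means, variance, and trial counts are assumptions:

```python
import numpy as np

def ideal_observer(x, mu0, mu1, sigma):
    """Maximum-likelihood decision between two equal-variance Gaussian classes."""
    ll0 = -0.5 * ((x - mu0) / sigma) ** 2   # log-likelihood under class 0 (up to a constant)
    ll1 = -0.5 * ((x - mu1) / sigma) ** 2   # log-likelihood under class 1
    return np.where(ll1 > ll0, 1, 0)

rng = np.random.default_rng(1)
mu0, mu1, sigma, n = 0.0, 1.0, 0.5, 10000
x0 = rng.normal(mu0, sigma, n)              # observations from class 0
x1 = rng.normal(mu1, sigma, n)              # observations from class 1
correct = np.sum(ideal_observer(x0, mu0, mu1, sigma) == 0) \
        + np.sum(ideal_observer(x1, mu0, mu1, sigma) == 1)
pc = correct / (2 * n)                      # optimum is Phi(d'/2); d' = 2 here, so ~0.84
```

Because no decision rule can beat this observer under equal priors, any masking effect it exhibits must be inherited from the front-end (here, the Gaussian encoding), which is the logic of the analysis described above.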
Since the ideal observer is the best of all possible observers, if it demonstrates masking effects, then all other observers must as well. If it does not demonstrate masking effects, then nothing about the periphery requires masking to occur, and masking must therefore arise somewhere else. The ideal observer exhibited several forward masking effects but did not exhibit backward masking, implying that the periphery has a causal role in forward but not backward masking. A general discussion of the strengths of the technique and supplementary equations are also included. Abstract Understanding the human brain and its function in health and disease represents one of the greatest scientific challenges of our time. In the post-genomic era, an overwhelming accumulation of new data, at all levels of exploration from DNA to human brain imaging, has been acquired. This accumulation of facts has not given rise to a corresponding increase in the understanding of integrated functions in this vast area of research, which involves a large number of fields extending from genetics to psychology. Neuroinformatics is uniquely placed at the intersection between neuroscience and information technology, and it emerges as an area of critical importance for facilitating future conceptual development in neuroscience by creating databases that transcend different organizational levels and allow for the development of computational models ranging from the subcellular to the global brain level. Abstract This paper studied synaptic and dendritic integration with different spatial distributions of synapses on the dendrites of a biophysically detailed layer 5 pyramidal neuron model. It has been observed that temporally synchronous and spatially clustered synaptic inputs make dendrites perform a highly nonlinear integration.
The effect of the clustering degree of the synaptic distribution on neuronal responsiveness is investigated by changing the number of top apical dendrites to which active synapses are allocated. The neuron shows maximum responsiveness to synaptic inputs with an intermediate clustering degree of spatial distribution, indicating complex interactions among dendrites in the presence of nonlinear synaptic and dendritic integration. Abstract This paper describes a pilot query interface that has been constructed to help us explore a "concept-based" approach for searching the Neuroscience Information Framework (NIF). The query interface is concept-based in the sense that the search terms submitted through the interface are selected from a standardized vocabulary of terms (concepts) that are structured in the form of an ontology. The NIF contains three primary resources: the NIF Resource Registry, the NIF Document Archive, and the NIF Database Mediator. These NIF resources are very different in nature and therefore pose challenges when designing a single interface from which searches can be launched automatically against all three resources simultaneously. The paper first discusses briefly several background issues involving the use of standardized biomedical vocabularies in biomedical information retrieval, and then presents a detailed example that illustrates how the pilot concept-based query interface operates. The paper concludes by discussing certain lessons learned in the development of the current version of the interface. Abstract Simulations of orientation selectivity in visual cortex have shown that layer 4 complex cells lacking orientation tuning are ideal for providing global inhibition that scales with contrast in order to produce simple cells with contrast-invariant orientation tuning (Lauritzen and Miller in J Neurosci 23:10201–10213, 2003).
Inhibitory cortical cells have been shown to be electrically coupled by gap junctions (Fukuda and Kosaka in J Neurosci 120:5–20, 2003). Such coupling promotes, among other effects, spike synchronization and the coordination of postsynaptic IPSPs (Beierlein et al. in Nat Neurosci 3:904–910, 2000; Galarreta and Hestrin in Nat Rev Neurosci 2:425–433, 2001). Consequently, it was expected (Miller in Cereb Cortex 13:73–82, 2003) that electrical coupling would promote nonspecific functional responses consistent with the complex inhibitory cells seen in layer 4, which provide broad inhibition in response to stimuli of all orientations (Miller et al. in Curr Opin Neurobiol 11:488–497, 2001). This was tested using a mechanistic modeling approach. The orientation selectivity model of Lauritzen and Miller (J Neurosci 23:10201–10213, 2003) was reproduced with and without electrical coupling between complex inhibitory neurons. Although extensive coupling promotes uniform firing in complex cells, there were no detectable improvements in contrast-invariant orientation selectivity unless there were coincident changes in complex cell firing rates to offset the untuned excitatory component that grows with contrast. Thus, changes in complex cell firing rates (with or without coupling) could improve the contrast-invariant orientation tuning of simple cells, whereas synchronization of the complex inhibitory neurons alone could not. Abstract Coral polyps contract when electrically stimulated, and a wave of contraction travels from the site of stimulation at a constant speed. Models of coral nerve networks were optimized to match one of three different experimentally observed behaviors. To search for model parameters that reproduce the experimental observations, we applied genetic algorithms to increasingly more complex models of a coral nerve net. In a first stage of optimization, individual neurons responded with spikes to multiple, but not single, pulses of activation.
In a second stage, we used these neurons as the starting point for the optimization of a two-dimensional nerve net. This strategy yielded a network with parameters that reproduced the experimentally observed spread of excitation. Abstract Spike-wave discharges are a distinctive feature of epileptic seizures. So far, they have not been reported in spatially extended neural field models. We study a space-independent version of the Amari neural field model with two competing inhibitory populations. We show that this competition leads to robust spike-wave dynamics if the inhibitory populations operate on different timescales. The spike-wave oscillations present a fold/homoclinic type bursting. From this result we predict parameters of the extended Amari system where spike-wave oscillations produce a spatially homogeneous pattern. We propose this mechanism as a prototype of macroscopic epileptic spike-wave discharges. To our knowledge this is the first example of robust spike-wave patterns in a spatially extended neural field model. Abstract Cortical gamma-frequency (30–80 Hz) oscillations have been suggested to underlie many aspects of cognitive functions. In this paper we compare the f–I curves modulated by gamma-frequency-modulated stimulus and Poisson synaptic input at distal dendrites of a layer V pyramidal neuron model. The results show that gamma-frequency distal input amplifies the sensitivity of neural response to basal input, and enhances gain modulation of the neuron. Abstract Inward-rectifying potassium (K_IR) currents in medium spiny (MS) neurons of nucleus accumbens inactivate significantly in ~40% of the neurons but not in the rest, which may lead to differences in input processing by these two groups. Using a 189-compartment computational model of the MS neuron, we investigate the influence of this property using injected current as well as spatiotemporally distributed synaptic inputs.
Our study demonstrates that K_IR current inactivation facilitates depolarization, firing frequency, and firing onset in these neurons. These effects may be attributed to the higher input resistance of the cell as well as to a more depolarized resting/down-state potential induced by the inactivation of this current. In view of reports that dendritic intracellular calcium levels depend closely on burst strength and spike onset time, our findings suggest that inactivation of K_IR currents may offer a means of modulating both excitability and synaptic plasticity in MS neurons. Abstract Epileptic seizures in diabetic hyperglycemia (DH) are not uncommon. This study aimed to determine the acute behavioral, pathological, and electrophysiological effects of status epilepticus (SE) on diabetic animals. Adult male Sprague-Dawley rats were first divided into groups with and without streptozotocin (STZ)-induced diabetes, and then into treatment groups given normal saline (NS) (STZ-only and NS-only) or a lithium-pilocarpine injection to induce status epilepticus (STZ + SE and NS + SE). Seizure susceptibility, severity, and mortality were evaluated. Serial Morris water maze test and hippocampal histopathology results were examined before and 24 h after SE. Tetanic-stimulation-induced long-term potentiation (LTP) in a hippocampal slice was recorded in a multielectrode dish system. We also used a simulation model to evaluate intracellular adenosine triphosphate (ATP) and neuroexcitability. The STZ + SE group had a significantly higher percentage of severe seizures and SE-related death, and worse learning and memory performance, than the other three groups 24 h after SE. The STZ + SE group, followed by the NS + SE group, showed the most severe neuronal loss and mossy fiber sprouting in the hippocampal CA3 area. In addition, LTP was markedly attenuated in the STZ + SE group, followed by the NS + SE group.
In the simulation, increased intracellular ATP concentration promoted action potential firing. The finding that rats with DH had more brain damage after SE than rats without diabetes suggests the importance of intensively treating hyperglycemia and seizures in diabetic patients with epilepsy. Neuroinformatics is a multifaceted field, as broad as the field of neuroscience itself. The various domains of NI may also share common features such as databases, data mining systems, and data modeling tools. NI projects are often coordinated by user groups or research organizations. Large-scale infrastructure supporting NI development is also a vital aspect of the field. Abstract Channelrhodopsin-2 (ChR2) is a class of light-sensitive proteins that offer the ability to use light stimulation to regulate neural activity with millisecond precision. To address the limitations in the efficacy of wild-type ChR2 (ChRwt) in achieving this objective, new variants of ChR2 that exhibit fast monoexponential photocurrent decay characteristics have recently been developed and validated. In this paper, we investigate whether the framework of the four-state transition rate model, primarily developed to mimic the biexponential photocurrent decay kinetics of ChRwt, is warranted, as opposed to the lower-complexity three-state model, to mimic the monoexponential photocurrent decay kinetics of the newly developed fast ChR2 variants ChETA (Gunaydin et al., Nature Neurosci. 13:387–392, 2010) and ChRET/TC (Berndt et al., Proc. Natl. Acad. Sci. 108:7595–7600, 2011). We begin by estimating the parameters of the three-state and four-state models from experimental data on the photocurrent kinetics of ChRwt, ChETA, and ChRET/TC. We then incorporate these models into a fast-spiking interneuron model (Wang and Buzsaki, J. Neurosci. 16:6402–6413, 1996) and a hippocampal pyramidal cell model (Golomb et al., J. Neurophysiol.
96:1912–1926, 2006) and investigate the extent to which the experimentally observed neural responses to various optostimulation protocols can be captured by these models. We demonstrate that for all ChR2 variants investigated, the four-state model implementation is better able to capture neural responses consistent with experiments across a wide range of optostimulation protocols. We conclude by analytically investigating the conditions under which the characteristic specific to the three-state model, namely the monoexponential photocurrent decay of the newly developed ChR2 variants, can occur in the framework of the four-state model. Abstract In cerebellar Purkinje cells, the β4-subunit of voltage-dependent Na+ channels has been proposed to serve as an open-channel blocker giving rise to a “resurgent” Na+ current (I_NaR) upon membrane repolarization. Notably, the β4-subunit was recently identified as a novel substrate of the β-secretase BACE1, a key enzyme of the amyloidogenic pathway in Alzheimer's disease. Here, we asked whether BACE1-mediated cleavage of the β4-subunit has an impact on I_NaR and, consequently, on the firing properties of Purkinje cells. In cerebellar tissue of BACE1−/− mice, mRNA levels of Na+ channel α-subunits 1.1, 1.2, and 1.6 and of β-subunits 1–4 remained unchanged, but processing of the β4 peptide was profoundly altered. Patch-clamp recordings from acutely isolated Purkinje cells of BACE1−/− and WT mice did not reveal any differences in steady-state properties or in current densities of transient, persistent, and resurgent Na+ currents. However, I_NaR was found to decay significantly faster in BACE1-deficient Purkinje cells than in WT cells. In modeling studies, the altered time course of I_NaR decay could be replicated when we decreased the efficiency of open-channel block. In current-clamp recordings, BACE1−/− Purkinje cells displayed a lower spontaneous firing rate than normal cells.
Computer simulations supported the hypothesis that the accelerated decay kinetics of I_NaR are responsible for the slower firing rate. Our study elucidates a novel function of BACE1 in the regulation of neuronal excitability that serves to tune the firing pattern of Purkinje cells and presumably other neurons endowed with I_NaR. Abstract The role of cortical feedback in the thalamocortical processing loop has been extensively investigated over the last decades. With a few exceptions, these studies focused on the cortical feedback exerted onto thalamocortical relay (TC) cells of the dorsal lateral geniculate nucleus (LGN). In a previous physiological study, we showed in the cat visual system that cessation of cortical input, despite decreasing the spontaneous activity of TC cells, increased the spontaneous firing of their recurrent inhibitory interneurons located in the perigeniculate nucleus (PGN). To identify the mechanisms underlying such functional changes, we conducted a modeling study in NEURON on several networks of point neurons with varied model parameters, such as membrane properties, synaptic weights, and axonal delays. We considered six network topologies of the retino-geniculo-cortical system. All models were robust against changes of axonal delays except for the delay between the LGN feedforward interneuron and the TC cell. The best representation of the physiological results was obtained with models containing reciprocally connected PGN cells driven by the cortex and with relatively slow decay of intracellular calcium. This strongly indicates that the thalamic reticular nucleus plays an essential role in the cortical influence over thalamocortical relay cells, while the thalamic feedforward interneurons are not essential in this process. Further, we suggest that the dependence of the activity of PGN cells on the rate of calcium removal can be one of the key factors determining an individual cell's response to elimination of cortical input.
Abstract The nucleus accumbens (NAc), a critical structure of the brain reward circuit, is implicated in normal goal-directed behaviour and learning as well as in pathological conditions like schizophrenia and addiction. Its major cellular substrates, the medium spiny (MS) neurons, possess a wide variety of dendritic active conductances that may modulate excitatory postsynaptic potentials (EPSPs) and cell excitability. We examine this issue using a biophysically detailed 189-compartment stylized model of the NAc MS neuron, incorporating all the known active conductances. We find that, of all the active channels, inward-rectifying K+ (K_IR) channels play the primary role in modulating the resting membrane potential (RMP) and EPSPs in the down-state of the neuron. Reduction in the conductance of K_IR channels evokes facilitatory effects on EPSPs, accompanied by rises in local input resistance and membrane time constant. At depolarized membrane potentials closer to up-state levels, the slowly inactivating A-type potassium channel (K_As) conductance also plays a strong role in determining synaptic potential parameters and cell excitability. We discuss the implications of our results for the regulation of accumbal MS neuron biophysics and synaptic integration by intrinsic factors and extrinsic agents such as dopamine. Abstract The computer-assisted three-dimensional reconstruction of neuronal morphology is becoming an increasingly popular technique for quantifying the arborization patterns of dendrites and axons. The resulting digital files are suitable for comprehensive morphometric analyses as well as for building anatomically realistic compartmental models of membrane biophysics and neuronal electrophysiology. The digital tracings acquired in a lab for a specific purpose can often be reused by a different research group to address a completely unrelated scientific question, if the original investigators are willing to share the data.
Since reconstructing neuronal morphology is a labor-intensive process, data sharing and reanalysis are particularly advantageous for the neuroscience and biomedical communities. Here we present numerous cases of “success stories” in which digital reconstructions of neuronal morphology were shared and reused, leading to additional, independent discoveries and publications, and thus amplifying the impact of the “source” study for which the data set was first collected. In particular, we overview four main applications of this kind of data: comparative morphometric analyses, statistical estimation of potential synaptic connectivity, morphologically accurate electrophysiological simulations, and computational models of neuronal shape and development. Dominant ionic mechanisms explored in spiking and bursting using local low-dimensional reductions of a biophysically realistic model neuron Journal of Computational Neuroscience Summary This chapter constitutes mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was part of the Annual Computational Neuroscience Meeting CNS*2007, which took place in July 2007 in Toronto, Canada (http://www.cnsorg.org). The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Selected discussion topics, including a review of some current and potential applications of Computational Intelligence (CI) in electrophysiology, database and electrophysiological data exchange platforms, languages, and formats, as well as exemplary analysis problems, are presented in this chapter. The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers whose interest will be sparked by its contents.
Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. We are particularly interested in the slower-timescale dynamics that determine overall excitability, and set the stage for transient episodes of persistent oscillations, working memory, or seizures. In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley-type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics on the stability of persistent activity. We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions. Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in the neonatal brain or spinal cord, up-states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potential (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from EAP data sampled by an extracellular probe at multiple independent recording locations.
We used a dipole for the model source because there is extensive evidence that it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell-probe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole moment optimization, in which the dipole location is fixed, and a second, nonlinear, global optimization of the source location. We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to typical prior applications of dipole models to EKG and EEG source analysis. We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among the various model error contributions to the residual, we especially address the error in probe geometry and the extent to which it biases estimates of the dipole parameters. This dipole characterization method can be applied to any recording technique that is capable of taking multiple independent measurements of the same single units. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics.
We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodium-potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. These models reveal several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy. Abstract This paper introduces dyadic brain modeling – the simultaneous, computational modeling of the brains of two interacting agents – to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on the gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the ‘dyad’). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists and modelers that will help speed the emergence of what may be called comparative neuroprimatology.
Abstract The phase response curve (PRC) reflects the dynamics of the interplay between the diverse intrinsic conductances that lead to spike generation. PRCs measure the spike time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddle-node) dynamics, while type II (subcritical Hopf) dynamics yield a biphasic PRC with both negative and positive lobes. Previous computational work hypothesized that cholinergic modulation of the M-type potassium current can switch a neuron from type II to type I dynamics. We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with downregulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential K-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to downregulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent I_AHP is downregulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified at low ACh levels, while the PRC becomes type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may switch neuronal responses between “resonator” and “integrator” modes. Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double-precision values by each subtree at each time step.
Splitting cells is useful for attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For compute-bound simulations, load balance results in almost ideal runtime scaling. Application of the cell-splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing. Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts. It has recently been reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. Ca2+-activated K+ currents (I_K(Ca)) of cultured human cardiac fibroblasts were characterized in this study. In whole-cell configuration, depolarizing pulses evoked I_K(Ca) with outward rectification in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance, Ca2+-activated K+ (BK_Ca) channel with a single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BK_Ca channels. The dynamic Luo-Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; however, they were electrically inert with no gap-junctional coupling. The simulation predicts that changes in gap junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. I_K(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes.
This study demonstrates that a BK_Ca channel is functionally expressed in human cardiac fibroblasts. The activity of the BK_Ca channels present in human cardiac fibroblasts may contribute to the functional activities of heart cells through the transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of their solutions and the transitions between them as parameters are varied. To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multicompartment models. We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation: tonic spiking, intermediate bursting, and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multicompartment conductance-based model in order to analyze the interactions between different modes of activity.
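The core idea of the "scales of dominance" reduction in this last abstract, dividing a trajectory into intervals where only a subset of the variables matters and integrating a locally reduced model on each interval, can be illustrated with a toy sketch. This is not the authors' 16-dimensional PD neuron model: the sketch uses a FitzHugh-Nagumo oscillator, and the parameter values and dominance threshold are illustrative assumptions.

```python
import numpy as np

def hybrid_reduced_oscillator(T=200.0, dt=0.01, eps=0.08, a=0.7, b=0.8,
                              I=0.5, thresh=0.2):
    """Toy 'scales of dominance'-style hybrid reduction (illustrative only).

    Full model: FitzHugh-Nagumo, with fast variable v and slow variable w.
    Whenever |dw/dt| is small relative to |dv/dt| (the fast variable
    dominates), w is frozen and only the locally reduced one-dimensional
    model for v is integrated; otherwise both equations are integrated.
    """
    n = int(T / dt)
    v, w = -1.0, 1.0
    vs = np.empty(n)
    reduced_steps = 0
    for i in range(n):
        dv = v - v**3 / 3.0 - w + I          # fast subsystem
        dw = eps * (v + a - b * w)           # slow subsystem
        if abs(dw) < thresh * abs(dv):       # fast dynamics dominate here
            dw = 0.0                         # use the reduced 1-D model
            reduced_steps += 1
        v += dt * dv
        w += dt * dw
        vs[i] = v
    return vs, reduced_steps / n

vs, frac = hybrid_reduced_oscillator()
print(f"fraction of steps integrated with the reduced model: {frac:.2f}")
```

The oscillation survives the piecewise freezing because, as in the method described above, the reduced model is only used while the neglected variable is genuinely slow; near the nullclines the full system is restored.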
The Dangers of Plug-and-Play Simulation Using Shared Models Neuroinformatics Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means of facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. Comparing the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database.
Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php. Operating system(s): Platform independent. Programming language: Perl/BioPerl (program); mySQL, Perl DBI and DBD modules (database); php, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI Blast package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have had problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results.
We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused.
Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages, or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell includes organelles and intracellular solutions, and it is separated from the outer environment of extracellular liquid surrounding the cell by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charges and concentrations cause, respectively, electrical and chemical potentials, generating transport of materials across the membrane. Here we look at the cores of mathematical modeling associated with the dynamic behaviors of single cells as well as the bases of numerical simulations. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB.
Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program on which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command-line interface, based on Python, a powerful scripting language, for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format, and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB), and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp. Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards.
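Declarative model description languages of the kind discussed here (ISML, CellML, NeuroML) encode model structure and equations in XML so that any compliant tool can read them. The sketch below shows the general idea using only the Python standard library; the element and attribute names are invented for illustration and are not the actual ISML schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical XML model description in the spirit of ISML/CellML.
# Element and attribute names here are illustrative only.
MODEL_XML = """\
<model name="leak_membrane">
  <module name="membrane">
    <variable name="V" initial="-70.0" unit="mV"/>
    <parameter name="g_leak" value="0.3"/>
    <parameter name="E_leak" value="-54.4"/>
    <equation target="V">-g_leak * (V - E_leak)</equation>
  </module>
</model>
"""

def load_model(xml_text):
    """Parse the model description into a plain dictionary."""
    root = ET.fromstring(xml_text)
    module = root.find("module")
    return {
        "model": root.get("name"),
        "module": module.get("name"),
        "variables": {v.get("name"): float(v.get("initial"))
                      for v in module.findall("variable")},
        "parameters": {p.get("name"): float(p.get("value"))
                       for p in module.findall("parameter")},
        "equations": {e.get("target"): e.text
                      for e in module.findall("equation")},
    }

model = load_model(MODEL_XML)
print(model["model"], model["variables"]["V"])
```

Because the description is data rather than code, the same file can be rendered graphically, exported to C++ or CellML, or handed to a simulator, which is the point of the platform design described above.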
Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method, and make available computer scripts, which automate the process of citation entry. We use an open citation project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. This is because, in order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain ( ${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences.
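The citation-automation workflow described above first parses a free-text citation and then retrieves the matching PubMed record. The published scripts are in Perl; the sketch below is only a Python stand-in for the parsing step, handling a single assumed citation style (a real pipeline would then query NCBI with the extracted fields to validate the reference).

```python
import re

# Illustrative stand-in for the citation-parsing step; the published
# workflow uses a Perl PARSER module plus PubMed lookups, whereas this
# regex handles only one simple "Authors. Title. Journal Year;Vol:Pages"
# style and the example citation below is invented.
CITATION_RE = re.compile(
    r"(?P<authors>[^.]+)\.\s+"      # authors, up to the first period
    r"(?P<title>[^.]+)\.\s+"        # article title
    r"(?P<journal>[^.]+?)\s+"       # journal name
    r"(?P<year>\d{4});"             # year
    r"(?P<volume>\d+):"             # volume
    r"(?P<pages>[\d-]+)"            # page range
)

def parse_citation(text):
    """Return the citation fields as a dict, or None if no match."""
    m = CITATION_RE.search(text)
    return m.groupdict() if m else None

cite = parse_citation(
    "Smith J, Doe A. A model database. J Comput Neurosci 2004;17:7-11"
)
print(cite["year"], cite["journal"])
```

The extracted journal, year, volume, and pages are exactly the fields a PubMed query needs to locate and validate the record, which is why parsing is the natural first stage of the automation.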
The biomedical knowledge is stored in a network of thousands of databases, repositories, and ontologies. These data repositories differ substantially in the granularity of data, storage formats, database systems, supported data models, and interfaces. Making full use of the available data resources therefore requires substantial bioinformatic skill to cope with the many heterogeneous query methods and front-ends. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing these challenges. While originally developed to manage and query less structured data, information retrieval techniques have become increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments, and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks.
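The core IR machinery referred to in the chapter summary above can be illustrated with a toy TF-IDF ranker; the corpus and weighting choices (smoothed idf, whitespace tokenization, no stemming or stop words) are deliberately minimal and purely illustrative.

```python
import math
from collections import Counter

# Toy TF-IDF document ranking, the kind of IR primitive surveyed above.
# Corpus and query are invented examples.
docs = {
    "d1": "ion channel database for neuron models",
    "d2": "gene expression database for plants",
    "d3": "neuron morphology reconstruction tools",
}

def tokenize(text):
    return text.lower().split()

N = len(docs)
# Document frequency: number of documents containing each term.
df = Counter(t for text in docs.values() for t in set(tokenize(text)))

def score(query, doc_text):
    """Sum of tf * idf over query terms, with smoothed idf."""
    tf = Counter(tokenize(doc_text))
    return sum(tf[t] * math.log((N + 1) / (df[t] + 1))
               for t in tokenize(query))

query = "neuron database"
ranked = sorted(docs, key=lambda d: score(query, docs[d]), reverse=True)
print(ranked[0])
```

Document d1 matches both query terms and ranks first; terms that occur in every document get a near-zero idf weight, which is the mechanism that lets IR systems down-weight uninformative vocabulary.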
These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or not) existing empirical results, and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models, and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to representing and integrating complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp.
741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu. Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and build on them for further research. Its use can aid the field of computational neuroscience to enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition.
PPI reflects GABA-A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA-A reversal potential (E-GABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide a high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal to elicit/detect the electrical excitation at the single-cell level and to observe the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have been typically correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy.
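Digital reconstructions of this kind are commonly exchanged as point lists in the de facto SWC format, where each row holds (id, type, x, y, z, radius, parent). A minimal sketch of one standard morphometric quantity, total cable length, computed from such a file (the three-point neurite below is a toy example):

```python
import math

# Morphometry sketch on the de facto SWC format for neuronal
# reconstructions: each row is (id, type, x, y, z, radius, parent),
# with parent = -1 marking the root. The data below is a toy neurite.
SWC = """\
1 1 0 0 0 5 -1
2 3 10 0 0 1 1
3 3 10 10 0 1 2
"""

def total_cable_length(swc_text):
    """Sum of Euclidean distances from each point to its parent."""
    pts = {}
    for line in swc_text.strip().splitlines():
        i, _t, x, y, z, _r, parent = line.split()
        pts[int(i)] = ((float(x), float(y), float(z)), int(parent))
    length = 0.0
    for xyz, parent in pts.values():
        if parent != -1:
            length += math.dist(xyz, pts[parent][0])
    return length

print(total_cable_length(SWC))  # 10 + 10 = 20.0
```

Other morphometrics used to compare reconstruction systems (surface area, branch counts, path distances) follow the same pattern of walking the parent pointers of the tree.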
In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neurite type (pyramidal cell dendrites and olivary axonal projection terminals), and labeling method (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions coming to the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductances of their dendritic membrane.
This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by their spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in Biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons.
Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS–US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive, biologically plausible framework, without recourse to delay lines or other special-purpose circuits.
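The temporal-difference rule at the heart of such dopamine models can be sketched in its simplest tabular TD(0) form. The published model replaces the lookup table with an LSTM network, so this shows only the underlying learning rule, with the prediction error delta playing the role of the phasic dopamine signal.

```python
# Tabular TD(0) value learning, the core rule the dopamine models above
# build on; the task (a two-state chain ending in reward) is a toy example.
def td0(episodes, n_states, alpha=0.1, gamma=0.9):
    V = [0.0] * n_states
    for episode in episodes:
        for s, r, s_next in episode:
            target = r + (gamma * V[s_next] if s_next is not None else 0.0)
            delta = target - V[s]      # prediction error ("dopamine" signal)
            V[s] += alpha * delta
    return V

# State 0 leads to state 1, which yields reward 1 at termination.
episode = [(0, 0.0, 1), (1, 1.0, None)]
V = td0([episode] * 200, n_states=2)
print(round(V[1], 2), round(V[0], 2))
```

With repeated training, V[1] converges to the reward (1.0) and V[0] to its discounted value (0.9), illustrating how the prediction error, large at first and vanishing once the reward is predicted, mirrors the reported shift of dopamine responses from the US to the CS.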
Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. Using mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses distributed uniformly along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and function of effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization in general and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron upon different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns.
The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals that decreased with rising intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of patterns of output discharges of different complexities, including stochastic ones. A relatively short apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites and a reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in the processes of formation of the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the most well-known examples within the temporal coding hypothesis.
Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using non-homogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of the involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties.
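The integrate-and-fire-with-conductance-synapse formalism mentioned above for the PC/IC model can be sketched as follows; all parameter values and input spike times are illustrative, not taken from any of the papers summarized here.

```python
# Leaky integrate-and-fire neuron with one exponentially decaying
# excitatory conductance, the formalism used for the PC/IC model above.
# All parameter values below are illustrative assumptions.
def simulate(spike_times_in, t_end=200.0, dt=0.1):
    E_L, E_ex = -70.0, 0.0        # leak / excitatory reversal (mV)
    tau_m, tau_s = 20.0, 5.0      # membrane / synaptic time constants (ms)
    v_th, v_reset = -54.0, -70.0  # threshold / reset (mV)
    w = 1.0                       # conductance jump per input spike
    v, g = E_L, 0.0               # (in units of the leak conductance)
    out_spikes, inputs = [], set(spike_times_in)
    t = 0.0
    while t < t_end:
        if round(t, 1) in inputs:          # deliver presynaptic spike
            g += w
        v += dt * (-(v - E_L) - g * (v - E_ex)) / tau_m
        g -= dt * g / tau_s                # synaptic conductance decay
        if v >= v_th:                      # threshold crossing: spike
            out_spikes.append(t)
            v = v_reset
        t += dt
    return out_spikes

spikes = simulate([10.0, 11.0, 12.0])
print(len(spikes))
```

Because the synapse is a conductance rather than an injected current, its effect scales with the driving force (v − E_ex), the same property that lets inhibitory conductances in the PC/IC pair shape spike timing relative to theta.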
The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database of neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in Computational Neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org. In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed data of morphologies when generating those cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators such as the NEURON Simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new and available data can be included in NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today.
I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently of the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data requires both standard data formats and public repositories. Standardization and a public storage system are also important for modeling, due to the possibility of sharing models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks like numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given.
Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Issues in the Design of a Pilot Concept-Based Query Interface for the Neuroinformatics Information Framework Neuroinformatics Summary This chapter constitutes mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was a part of the Annual Computational Neuroscience Meeting CNS*2007, which took place in July 2007 in Toronto, Canada (http://www.cnsorg.org). The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data.
Selected topics from the discussions, including a review of some current and potential applications of Computational Intelligence (CI) in electrophysiology, database and electrophysiological data exchange platforms, languages, and formats, as well as exemplary analysis problems, are presented in this chapter. The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers, whose interest will be sparked by its contents. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. We are particularly interested in the slower timescale dynamics that determine overall excitability, and set the stage for transient episodes of persistent oscillations, working memory, or seizures. In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley-type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics on the stability of persistent activity. We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions. Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in the neonatal brain or spinal cord, up states, seizures, and cortical oscillations.
Abstract The spatial variation of the extracellular action potential (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from the EAP data sampled by an extracellular probe at multiple independent recording locations. We used a dipole for the model source because there is extensive evidence that it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell-probe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole moment optimization where the dipole location is fixed, and a second, nonlinear, global optimization of the source location. We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis. We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among the various model error contributions to the residual, we especially address the error in probe geometry and the extent to which it biases estimates of dipole parameters.
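The linear stage of the dipole characterization above (optimizing the moment while the location is held fixed) reduces to regularized least squares against the probe's lead fields. A sketch with a synthetic lead-field matrix, in pure Python; the numbers are invented, and a real application would obtain the lead fields from a finite element model of the probe and tissue.

```python
# Linear stage of the dipole fit: with the source location fixed, the
# measured potentials v relate to the 3-vector dipole moment m through
# the lead-field matrix A (v = A m), and Tikhonov regularization solves
# (A^T A + lam*I) m = A^T v.  All numbers below are synthetic.
def solve(M, b):
    """Gaussian elimination with partial pivoting for a small system."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(M)]  # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_moment(A, v, lam=1e-6):
    """Tikhonov-regularized least squares for the dipole moment."""
    n = 3
    AtA = [[sum(A[k][i] * A[k][j] for k in range(len(A)))
            + (lam if i == j else 0.0) for j in range(n)] for i in range(n)]
    Atv = [sum(A[k][i] * v[k] for k in range(len(A))) for i in range(n)]
    return solve(AtA, Atv)

# Four recording sites, true moment (1, -2, 0.5), noise-free potentials.
A = [[1.0, 0.2, 0.0], [0.1, 1.0, 0.3], [0.0, 0.4, 1.0], [0.5, 0.5, 0.5]]
m_true = [1.0, -2.0, 0.5]
v = [sum(a * m for a, m in zip(row, m_true)) for row in A]
m_hat = fit_moment(A, v)
print([round(x, 3) for x in m_hat])
```

The regularization weight lam trades data fit against dipole size, which is the role Tikhonov regularization plays in the joint location/moment optimization described in the abstract.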
This dipole characterization method can be applied to any recording technique that has the capability of taking multiple independent measurements of the same single units. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics. We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodium-potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. These models reveal several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy. Abstract This paper introduces dyadic brain modeling, the simultaneous computational modeling of the brains of two interacting agents, to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the ‘dyad’). 
The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists, and modelers that will help speed the emergence of what may be called comparative neuroprimatology. Abstract The phase response curve (PRC) reflects the dynamics of the interplay between diverse intrinsic conductances that lead to spike generation. PRCs measure the spike time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddle-node) dynamics, while type II (subcritical Hopf) dynamics yield a biphasic PRC with both negative and positive lobes. Previous computational work hypothesized that cholinergic modulation of the M-type potassium current can switch a neuron with type II dynamics to type I dynamics. We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with downregulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential specific K-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to downregulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent IAHP is downregulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified at low ACh levels, while the PRC became type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model. 
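A PRC measurement of the kind described above reduces, per trial, to two numbers: the phase of the perturbation within the spike cycle, and the normalized shift of the next spike. A minimal sketch, assuming an ideal measurement of the unperturbed period; the function and argument names are illustrative only, not the authors' protocol.

```python
def prc_point(T0, t_pert, t_next_spike):
    """One PRC sample.
    T0           : unperturbed inter-spike period
    t_pert       : time of the perturbation after the reference spike
    t_next_spike : time of the next spike after that same reference spike
    Returns (phase, normalized shift); positive shift = spike advanced.
    A purely positive curve over all phases is the type I signature;
    a negative lobe at early phases indicates type II."""
    phase = t_pert / T0
    shift = (T0 - t_next_spike) / T0
    return phase, shift
```

For example, with a 100 ms period, a perturbation at 60 ms that advances the next spike to 95 ms yields the sample (0.6, 0.05), a positive (type I-like) point.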
This work suggests that cholinergic modulation of slow potassium currents may switch neuronal responding from “resonator” to “integrator.” Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double-precision values by each subtree at each time step. Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For compute-bound simulations, load balance results in almost ideal runtime scaling. Application of the cell splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing. Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts. It has been recently reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. Ca2+-activated K+ currents (IK(Ca)) of cultured human cardiac fibroblasts were characterized in this study. In whole-cell configuration, depolarizing pulses evoked outwardly rectifying IK(Ca) in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance, Ca2+-activated K+ (BKCa) channel with a single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BKCa channels. The dynamic Luo–Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. 
In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; however, they were electrically inert with no gap-junctional coupling. The simulation predicts that changes in gap-junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. IK(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes. This study demonstrates that a BKCa channel is functionally expressed in human cardiac fibroblasts. The activity of these BKCa channels present in human cardiac fibroblasts may contribute to the functional activities of heart cells through the transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of their solutions and the transitions between them as parameters are varied. To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multicompartment models. We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation: tonic spiking, intermediate bursting, and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. 
The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multicompartment conductance-based model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli. Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement. Results We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. 
Conclusion Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement and underlies the concentration-independence of odor quality representations. Abstract Mathematical neuronal models are normally expressed using differential equations. The Parker–Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker–Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin–Huxley type neuron, comparing the results with those obtained using the Runge–Kutta and Bulirsch–Stoer methods. Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC–granule cell (GC) circuit with and without gap junctions between GoCs. 
Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change the oscillation frequency or the mean firing rate of either GoCs or GCs. Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs. The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours or days of supercomputing time. Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. In consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics, and intrinsic noise. 
We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data. Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes from development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models. Recent studies show that both the conductance and kinetics of these receptors change voltage-dependently in some parts of the brain. Therefore, several models have been introduced to simulate their current. However, on the one hand, kinetic models, which are able to simulate these voltage-dependent phenomena, are computationally expensive for the modeling of large neural networks. On the other hand, classic exponential models, which are computationally less expensive, are not able to simulate the voltage-dependency of these receptors accurately. In this study, we have modified these classic models to endow them with voltage-dependent conductance and time constants. The temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of NMDA receptor behavior using only three to four differential equations, which is significantly fewer than in the previous kinetic models. Consequently, our model is both fast and physiologically plausible, and is therefore a suitable candidate for the modeling of large neural networks. 
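For context on the voltage dependence that such NMDA models must capture, the classic Jahr–Stevens description of Mg2+ block is a common starting point. The sketch below uses that standard form as an assumption; it is not the modified exponential model of the abstract above, and all names and default parameters are illustrative.

```python
import math

def mg_block(v_mV, mg_mM=1.0):
    """Fraction of NMDA channels unblocked by Mg2+ at membrane potential
    v_mV, in the classic Jahr–Stevens form (an assumption here, not the
    paper's parameterization): B(V) = 1 / (1 + [Mg]/3.57 * exp(-0.062 V))."""
    return 1.0 / (1.0 + (mg_mM / 3.57) * math.exp(-0.062 * v_mV))

def nmda_current(g_max, v_mV, e_rev=0.0, s=1.0, mg_mM=1.0):
    """I = g_max * s * B(V) * (V - E_rev), where s would be the synaptic
    gating variable supplied by the (bi)exponential kinetics."""
    return g_max * s * mg_block(v_mV, mg_mM) * (v_mV - e_rev)
```

The block fraction rises steeply with depolarization, which is what the exponential-model extensions discussed above aim to reproduce without the cost of a full kinetic scheme.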
Abstract Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons. This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal cell firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that there is a clear rhythmic gating effect of the γ-oscillating interneuron networks on the pyramidal neuron’s signal transmission. This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and appropriate inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of the coherence and oscillation frequency of network activities. Abstract Background Propagation of simulated action potentials (APs) was previously studied in short single chains and in two-dimensional sheets of myocardial cells. The present study was undertaken to examine propagation in a long single chain of cells of various lengths, and with varying numbers of gap-junction (gj) channels, and to compare propagation velocity with cable properties such as the length constant (λ). Methods and Results Simulations were carried out using the PSpice program as previously described. When the electric field (EF) mechanism was dominant (0, 1, and 10 gj channels), the longer the chain length, the faster the overall velocity (θov). 
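The length constant λ referenced above characterizes the exponential voltage decay V(x) = V0·exp(-x/λ) along the chain, so λ can be recovered from a simulated voltage profile by a log-linear fit. A minimal sketch with synthetic data; it does not reproduce the PSpice cell model, and the distances here are in arbitrary cell-length units.

```python
import math

def estimate_lambda(xs, vs):
    """Estimate the cable length constant from positions xs and voltages vs
    assumed to follow V(x) = V0 * exp(-x / lam): least-squares fit of
    log V against x, then lam = -1 / slope."""
    n = len(xs)
    logs = [math.log(v) for v in vs]
    mx = sum(xs) / n
    my = sum(logs) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, logs)) / \
            sum((x - mx) ** 2 for x in xs)
    return -1.0 / slope

# Synthetic profile with lam = 2.5 cell lengths:
xs = [1, 2, 3, 4, 5]
vs = [10.0 * math.exp(-x / 2.5) for x in xs]
```

On a real simulated profile the quality of this fit would itself indicate whether the decay is exponential (many gj channels) or sharply discontinuous (few gj channels), the distinction drawn in the study.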
There seems to be no simple explanation for this phenomenon. In contrast, when the local-circuit current mechanism was dominant (100 gj channels or more), θov was slightly slowed by lengthening of the chain. Increasing the number of gj channels produced an increase in θov and caused the firing order to become more uniform. The end-effect was more pronounced at longer chain lengths and at greater numbers of gj channels. When there were no or only few gj channels (namely 0, 10, or 30), the voltage change (ΔVm) in the two cells (#50 and #52) contiguous to the cell injected with current (#51) was nearly zero, i.e., there was a sharp discontinuity in voltage between the adjacent cells. When there were many gj channels (e.g., 300, 1000, 3000), there was an exponential decay of voltage on either side of the injected cell, with the length constant (λ) increasing at higher numbers of gj channels. The effect of increasing the number of gj channels on increasing λ was relatively small compared to the larger effect on θov. θov became very non-physiological at 300 gj channels or higher. Conclusion Thus, when there were only 0, 1, or 10 gj channels, θov increased with increase in chain length, whereas at 100 gj channels or higher, θov did not increase with chain length. When there were only 0, 10, or 30 gj channels, there was a very sharp decrease in ΔVm in the two contiguous cells on either side of the injected cell, whereas at 300, 1000, or 3000 gj channels, the voltage decay was exponential along the length of the chain. The effect of increasing the number of gj channels on the spread of current was relatively small compared to the large effect on θov. Abstract This article provides a demonstration of an analytical technique that can be used to investigate the causes of perceptual phenomena. The technique is based on the concept of the ideal observer, an optimal signal classifier that makes decisions that maximize the probability of a correct response. 
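The ideal-observer decision rule itself is simple to state: choose the signal class with the highest likelihood given the observation. The sketch below shows the rule for two equal-prior Gaussian classes; the peripheral auditory models whose outputs were classified in the study are not modeled here, and all names and parameters are illustrative.

```python
def ideal_observer(x, mu0, mu1, sigma):
    """Minimal ideal observer for two equal-prior Gaussian signal classes
    with means mu0 and mu1 and common standard deviation sigma: return the
    class index with the higher likelihood of the observation x, which
    maximizes the probability of a correct response."""
    def loglik(mu):
        # log of the Gaussian density, dropping terms common to both classes
        return -((x - mu) ** 2) / (2 * sigma ** 2)
    return 0 if loglik(mu0) >= loglik(mu1) else 1
```

Because no observer can exceed this classifier's accuracy, any masking effect it exhibits must be inherited by every other observer, which is the logic the abstract applies to the auditory periphery.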
To demonstrate the technique, an analysis was conducted to investigate the role of the auditory periphery in the production of temporal masking effects. The ideal observer classified output from four models of the periphery. Since the ideal observer is the best of all possible observers, if it demonstrates masking effects, then all other observers must as well. If it does not demonstrate masking effects, then nothing about the periphery requires masking to occur, and therefore masking must arise somewhere else. The ideal observer exhibited several forward masking effects but did not exhibit backward masking, implying that the periphery has a causal role in forward but not backward masking. A general discussion of the strengths of the technique and supplementary equations are also included. Abstract Understanding the human brain and its function in health and disease represents one of the greatest scientific challenges of our time. In the post-genomic era, an overwhelming accumulation of new data, at all levels of exploration from DNA to human brain imaging, has been acquired. This accumulation of facts has not given rise to a corresponding increase in the understanding of integrated functions in this vast area of research, which involves a large number of fields extending from genetics to psychology. Neuroinformatics (NI) is uniquely placed at the intersection between neuroscience and information technology, and emerges as an area of critical importance for facilitating future conceptual development in neuroscience by creating databases which transcend different organizational levels and allow for the development of computational models from the subcellular to the global brain level. Abstract This paper studied synaptic and dendritic integration with different spatial distributions of synapses on the dendrites of a biophysically detailed layer 5 pyramidal neuron model. 
It has been observed that temporally synchronous and spatially clustered synaptic inputs make dendrites perform a highly nonlinear integration. The effect of the clustering degree of the synaptic distribution on neuronal responsiveness is investigated by changing the number of top apical dendrites where active synapses are allocated. The neuron shows maximum responsiveness to synaptic inputs with an intermediate clustering degree of spatial distribution, indicating complex interactions among dendrites in the presence of nonlinear synaptic and dendritic integration. Abstract This paper describes a pilot query interface that has been constructed to help us explore a “concept-based” approach for searching the Neuroscience Information Framework (NIF). The query interface is concept-based in the sense that the search terms submitted through the interface are selected from a standardized vocabulary of terms (concepts) that are structured in the form of an ontology. The NIF contains three primary resources: the NIF Resource Registry, the NIF Document Archive, and the NIF Database Mediator. These NIF resources are very different in nature and therefore pose challenges when designing a single interface from which searches can be automatically launched against all three resources simultaneously. The paper first briefly discusses several background issues involving the use of standardized biomedical vocabularies in biomedical information retrieval, and then presents a detailed example that illustrates how the pilot concept-based query interface operates. The paper concludes by discussing certain lessons learned in the development of the current version of the interface. 
Tracking the Source of Quantitative Knowledge in Neuroscience: A Neuroinformatics Role for Computational Models. Neuroinformatics Summary This chapter constitutes mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was a part of the Annual Computational Neuroscience Meeting CNS*2007, which took place in July 2007 in Toronto, Canada (http://www.cnsorg.org). The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. 
We show that these networks reproduce seizurelike activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions. Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in neonatal brain or spinal cord, up states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potentials (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a threecomponent realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from the EAP data sampled by an extracellular probe at multiple independent recording locations. We used a dipole for the model source because there is extensive evidence it accurately captures the spatial rolloff of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cellprobe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole moment optimization where the dipole location is fixed, and a second, nonlinear, global optimization of the source location. We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. 
The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis. We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among various model error contributions to the residual, we address especially the error in probe geometry, and the extent to which it biases estimates of dipole parameters. This dipole characterization method can be applied to any recording technique that has the capabilities of taking multiple independent measurements of the same single units. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductancebased neuron together with intra and extracellular ion concentration dynamics. We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodiumpotassium pumps, glia, and diffusion can produce very slow and largeamplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. These models reveal several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy. 
Abstract This paper introduces dyadic brain modeling – the simultaneous, computational modeling of the brains of two interacting agents – to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization , a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the ‘dyad’). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists and modelers that will help speed the emergence of what may be called comparative neuroprimatology . Abstract The phase response curve (PRC) reflects the dynamics of the interplay between diverse intrinsic conductances that lead to spike generation. PRCs measure the spike time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddlenode) dynamics while type II (subcritical Hopf dynamics) yield a biphasic PRC with both negative and positive lobes. Previous computational work hypothesized that cholinergic modulation of Mtype potassium current can switch a neuron with type II dynamics to type I dynamics. 
We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with downregulation of slow voltagedependent potassium currents such as the Mcurrent, indeed changed the PRC from type II to type I. We then explored the potential specific Kcurrentdependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spikefrequency adaptation due to downregulation of the Mcurrent is associated with the switch in PRC type. Interestingly spikedependent IAHP is downregulated at lower Ach concentrations than the Mcurrent. Our simulations showed that type II nature of the PRC is amplified by low Ach level, while the PRC became type I at high Ach concentrations. We further explored the spatial aspects of Ach modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may shape neuronal responding between “resonator” to “integrator.” Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double precision values by each subtree at each time step. Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For computebound simulations load balance results in almost ideal runtime scaling. Application of the cell splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with wholecell balancing. Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts. 
It has recently been reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. Ca2+-activated K+ currents (I K(Ca)) of cultured human cardiac fibroblasts were characterized in this study. In whole-cell configuration, depolarizing pulses evoked I K(Ca) with outward rectification in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance, Ca2+-activated K+ (BK Ca) channel with a single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BK Ca channels. The dynamic Luo–Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; without gap-junctional coupling, however, they were electrically inert. The simulation predicts that changes in gap-junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. I K(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes. This study demonstrates that a BK Ca channel is functionally expressed in human cardiac fibroblasts. The activity of these BK Ca channels may contribute to the functional activities of heart cells through transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of their solutions and the transitions between them as parameters are varied. To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multicompartment models.
We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation—tonic spiking, intermediate bursting, and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multicompartment conductance-based model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli. Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons, modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement.
Results We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. Conclusion Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement, and underlies the concentration independence of odor quality representations. Abstract Mathematical neuronal models are normally expressed using differential equations. The Parker–Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker–Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin–Huxley-type neuron, comparing the results with those obtained using the Runge–Kutta and Bulirsch–Stoer methods.
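The core of the Parker–Sochacki method is to build the Maclaurin series coefficients of the solution iteratively from the polynomial right-hand side, then evaluate the truncated series at the step size; raising the truncation order raises accuracy without shrinking the step. A minimal sketch for the scalar polynomial ODE dy/dt = y² (not the neuronal models of the paper) follows; the exact solution y0/(1 − y0·t) makes the error easy to check.

```python
def parker_sochacki_step(y0, dt, order=12):
    """One Parker-Sochacki step for dy/dt = y**2.
    The Maclaurin coefficients of the solution satisfy the convolution
    recurrence c[n+1] = (1/(n+1)) * sum_k c[k]*c[n-k]."""
    c = [y0]
    for n in range(order):
        c.append(sum(c[k] * c[n - k] for k in range(n + 1)) / (n + 1))
    # Evaluate the truncated series at dt (Horner's scheme).
    y = 0.0
    for coeff in reversed(c):
        y = y * dt + coeff
    return y

y = parker_sochacki_step(1.0, 0.1)
exact = 1.0 / (1.0 - 0.1)   # analytic solution y0/(1 - y0*t)
print(abs(y - exact) < 1e-12)
```

The adaptive-order idea mentioned in the abstract corresponds to stopping the coefficient recurrence once the next term's contribution at dt falls below a tolerance, rather than using a fixed `order`.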
Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC–granule cell (GC) circuit with and without gap junctions between GoCs. Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change the oscillation frequency or the mean firing rate of either GoCs or GCs. Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs. The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley-type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours or days of supercomputing time.
Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. In consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics, and intrinsic noise. We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data. Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes from development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models. Recent studies show that both the conductance and the kinetics of these receptors change in a voltage-dependent manner in some parts of the brain. Therefore, several models have been introduced to simulate their current. However, on the one hand, kinetic models—which are able to simulate these voltage-dependent phenomena—are computationally expensive for the modeling of large neural networks. On the other hand, classic exponential models, which are computationally less expensive, are not able to accurately simulate the voltage dependency of these receptors.
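For reference, the classic exponential NMDA description mentioned above pairs a dual-exponential conductance time course with the standard Jahr–Stevens magnesium-block factor, whose only voltage dependence is the block itself; the conductance amplitude and time constants are otherwise voltage-independent. A minimal sketch (units ms, mV, mM; parameter values illustrative, not the paper's):

```python
import math

def nmda_current(t, v, g_max=1.0, tau_rise=2.0, tau_decay=100.0,
                 mg=1.0, e_rev=0.0):
    """Dual-exponential NMDA conductance gated by the Jahr-Stevens
    voltage-dependent Mg2+ block; returns I = g(t) * B(v) * (v - E)."""
    gate = math.exp(-t / tau_decay) - math.exp(-t / tau_rise)
    block = 1.0 / (1.0 + mg / 3.57 * math.exp(-0.062 * v))
    return g_max * gate * block * (v - e_rev)

# Depolarization relieves the Mg2+ block, so the inward current grows
# in magnitude between -70 mV and -20 mV at the same time point:
print(nmda_current(20.0, -70.0), nmda_current(20.0, -20.0))
```

The extension described in the abstract would additionally make `g_max`, `tau_rise`, and `tau_decay` functions of voltage (and temperature), which this classic form cannot capture.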
In this study, we have modified these classic models to endow them with voltage-dependent conductance and time constants. Temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of NMDA receptor behavior using only three to four differential equations, significantly fewer than in previous kinetic models. Consequently, our model is both fast and physiologically plausible, and is therefore a suitable candidate for the modeling of large neural networks. Abstract Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons. This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal cell firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that there is a clear rhythmic gating effect of the γ-oscillating interneuron network on the pyramidal neuron’s signal transmission. This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and appropriate inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of the coherence and oscillation frequency of network activities.
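The rhythmic gating effect described in the gamma abstract above can be illustrated with a much simpler stand-in than the paper's model: a leaky integrate-and-fire cell under constant excitatory drive and a 40 Hz sinusoidal shunting-inhibitory conductance fires preferentially in the low-inhibition phase of each cycle. All parameters below are illustrative assumptions.

```python
import math
import cmath

def simulate(t_max=500.0, dt=0.05, tau=20.0, v_th=1.0,
             i_exc=0.12, g_gamma=0.15, f=40.0):
    """Forward-Euler LIF with 40 Hz sinusoidal shunting inhibition
    (times in ms); returns the spike times."""
    v, t, spikes = 0.0, 0.0, []
    while t < t_max:
        g_inh = g_gamma * 0.5 * (1.0 + math.sin(2.0 * math.pi * f * t / 1000.0))
        v += dt * (-v / tau + i_exc - g_inh * v)   # shunting inhibition
        if v >= v_th:
            spikes.append(t)
            v = 0.0
        t += dt
    return spikes

spikes = simulate()
# Phase of each spike within the 40 Hz cycle, in [0, 1), and the vector
# strength R: R near 1 means spikes are tightly gated to one gamma phase.
phases = [(s * 40.0 / 1000.0) % 1.0 for s in spikes]
R = abs(sum(cmath.exp(2j * math.pi * p) for p in phases)) / len(phases)
print(len(spikes), round(R, 2))
```

With these parameters the cell cannot reach threshold while inhibition is near its peak, so spikes cluster in the trough of each cycle, which is the gating effect in miniature.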
Abstract Background Propagation of simulated action potentials (APs) was previously studied in short single chains and in two-dimensional sheets of myocardial cells [1–3]. The present study was undertaken to examine propagation in a long single chain of cells of various lengths, and with varying numbers of gap-junction (gj) channels, and to compare propagation velocity with cable properties such as the length constant (λ). Methods and Results Simulations were carried out using the PSpice program as previously described. When the electric field (EF) mechanism was dominant (0, 1, and 10 gj channels), the longer the chain length, the faster the overall velocity (θ ov). There seems to be no simple explanation for this phenomenon. In contrast, when the local-circuit current mechanism was dominant (100 gj channels or more), θ ov was slightly slowed with lengthening of the chain. Increasing the number of gj channels produced an increase in θ ov and caused the firing order to become more uniform. The end effect was more pronounced at longer chain lengths and at greater numbers of gj channels. When there were no or only few gj channels (namely 0, 10, or 30), the voltage change (ΔV m) in the two cells (#50 and #52) contiguous to the cell injected with current (#51) was nearly zero, i.e., there was a sharp discontinuity in voltage between the adjacent cells. When there were many gj channels (e.g., 300, 1000, or 3000), there was an exponential decay of voltage on either side of the injected cell, with the length constant (λ) increasing at higher numbers of gj channels. The effect of increasing the number of gj channels on increasing λ was relatively small compared to the larger effect on θ ov. θ ov became very non-physiological at 300 gj channels or higher. Conclusion Thus, when there were only 0, 1, or 10 gj channels, θ ov increased with increasing chain length, whereas at 100 gj channels or higher, θ ov did not increase with chain length.
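The exponential voltage decay and length constant λ discussed above can be recovered from a simulated steady-state voltage profile by a log-linear fit: if V(n) = V0·exp(−n/λ) along the chain, then ln V is linear in cell index with slope −1/λ. A minimal sketch on synthetic data (not the PSpice output of the study):

```python
import math

def length_constant(profile):
    """Least-squares slope of ln(V) vs. cell index; returns lambda
    in units of cell lengths (profile values must be positive)."""
    xs = list(range(len(profile)))
    ys = [math.log(v) for v in profile]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -1.0 / slope

# Synthetic profile with a known length constant of 3 cell lengths.
profile = [10.0 * math.exp(-n / 3.0) for n in range(8)]
print(length_constant(profile))
```

In the few-channel regime described above, the profile is not exponential (ΔV m collapses at the first neighbor), so such a fit is only meaningful at high gj-channel counts.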
When there were only 0, 10, or 30 gj channels, there was a very sharp decrease in ΔV m in the two contiguous cells on either side of the injected cell, whereas at 300, 1000, or 3000 gj channels, the voltage decay was exponential along the length of the chain. The effect of increasing the number of gj channels on the spread of current was relatively small compared to the large effect on θ ov. Abstract This article provides a demonstration of an analytical technique that can be used to investigate the causes of perceptual phenomena. The technique is based on the concept of the ideal observer, an optimal signal classifier that makes decisions that maximize the probability of a correct response. To demonstrate the technique, an analysis was conducted to investigate the role of the auditory periphery in the production of temporal masking effects. The ideal observer classified output from four models of the periphery. Since the ideal observer is the best of all possible observers, if it demonstrates masking effects, then all other observers must as well. If it does not demonstrate masking effects, then nothing about the periphery requires masking to occur, and therefore masking would arise somewhere else. The ideal observer exhibited several forward masking effects but did not exhibit backward masking, implying that the periphery has a causal role in forward but not backward masking. A general discussion of the strengths of the technique and supplementary equations are also included. The impact of internodal segmentation in biophysical nerve fiber models. Journal of Computational Neuroscience. Summary This chapter constitutes mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was part of the Annual Computational Neuroscience Meeting CNS*2007, which took place in July 2007 in Toronto, Canada (http://www.cnsorg.org).
The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Selected topics discussed there, including a review of some current and potential applications of Computational Intelligence (CI) in electrophysiology, database and electrophysiological data exchange platforms, languages, and formats, as well as exemplary analysis problems, are presented in this chapter. The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers, whose interest may be sparked by its contents. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. We are particularly interested in the slower-timescale dynamics that determine overall excitability and set the stage for transient episodes of persistent oscillations, working memory, or seizures. In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley-type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics in the stability of persistent activity. We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions.
Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in the neonatal brain or spinal cord, up states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potential (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from EAP data sampled by an extracellular probe at multiple independent recording locations. We used a dipole for the model source because there is extensive evidence that it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell–probe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole moment optimization, where the dipole location is fixed, and a second, nonlinear, global optimization of the source location. We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to typical prior applications of dipole models to EKG and EEG source analysis.
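The separable linear step described above can be sketched as follows: with the candidate dipole location fixed, the moment that best explains the measured potentials is a Tikhonov-regularized linear least-squares fit against the probe's lead-field matrix. The lead field below is synthetic random data; a real one would come from a finite-element probe + brain model as in the abstract.

```python
import random

def fit_moment(leadfield, potentials, alpha=1e-9):
    """Solve the regularized normal equations (L^T L + alpha*I) m = L^T v
    for the 3-component dipole moment m, by Gaussian elimination."""
    A = [[sum(row[i] * row[j] for row in leadfield) + (alpha if i == j else 0.0)
          for j in range(3)] for i in range(3)]
    b = [sum(row[i] * v for row, v in zip(leadfield, potentials))
         for i in range(3)]
    # Gaussian elimination with partial pivoting on the 3x3 system.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    m = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        m[r] = (b[r] - sum(A[r][c] * m[c] for c in range(r + 1, 3))) / A[r][r]
    return m

random.seed(0)
L = [[random.gauss(0.0, 1.0) for _ in range(3)] for _ in range(16)]  # 16 sites
m_true = [1.0, -2.0, 0.5]
v = [sum(li * mi for li, mi in zip(row, m_true)) for row in L]       # noiseless
m_hat = fit_moment(L, v)
print([round(m, 3) for m in m_hat])
```

The outer, nonlinear stage of the method would repeat this fit over a grid of candidate source locations and keep the location whose regularized residual is smallest.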
We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among the various model error contributions to the residual, we especially address the error in probe geometry, and the extent to which it biases estimates of the dipole parameters. This dipole characterization method can be applied to any recording technique that is capable of taking multiple independent measurements of the same single units. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics. We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodium–potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. These models reveal several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy.
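The pump term in the competition described above can be illustrated with a sigmoidal Na+/K+ pump rate of the kind commonly used in such ion-concentration models; the functional form and all constants below are a hedged illustration (concentrations in mM), not necessarily the paper's own parameterization. The key qualitative property is that the pump strengthens with intracellular Na+ loading and with elevated extracellular K+.

```python
import math

def pump_rate(na_i, k_o, rho=1.25):
    """Sigmoidal Na+/K+ pump rate: increases with intracellular Na+
    (na_i) and extracellular K+ (k_o); rho sets the maximal rate.
    Half-activation points (25 mM Na+, 5.5 mM K+) are illustrative."""
    return rho / ((1.0 + math.exp((25.0 - na_i) / 3.0))
                  * (1.0 + math.exp(5.5 - k_o)))

baseline = pump_rate(18.0, 4.0)   # resting-like concentrations
loaded = pump_rate(30.0, 8.0)     # Na-loaded cell in elevated bath K+
print(loaded > baseline)
```

In the full model, this rate feeds back on both ion concentrations, and its balance against glial buffering and diffusion sets the slow oscillation timescale discussed in the abstract.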
Abstract This paper introduces dyadic brain modeling – the simultaneous, computational modeling of the brains of two interacting agents – to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization , a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the ‘dyad’). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists and modelers that will help speed the emergence of what may be called comparative neuroprimatology . Abstract The phase response curve (PRC) reflects the dynamics of the interplay between diverse intrinsic conductances that lead to spike generation. PRCs measure the spike time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddlenode) dynamics while type II (subcritical Hopf dynamics) yield a biphasic PRC with both negative and positive lobes. Previous computational work hypothesized that cholinergic modulation of Mtype potassium current can switch a neuron with type II dynamics to type I dynamics. 
We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with downregulation of slow voltagedependent potassium currents such as the Mcurrent, indeed changed the PRC from type II to type I. We then explored the potential specific Kcurrentdependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spikefrequency adaptation due to downregulation of the Mcurrent is associated with the switch in PRC type. Interestingly spikedependent IAHP is downregulated at lower Ach concentrations than the Mcurrent. Our simulations showed that type II nature of the PRC is amplified by low Ach level, while the PRC became type I at high Ach concentrations. We further explored the spatial aspects of Ach modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may shape neuronal responding between “resonator” to “integrator.” Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double precision values by each subtree at each time step. Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For computebound simulations load balance results in almost ideal runtime scaling. Application of the cell splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with wholecell balancing. Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts. 
It has been recently reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. Ca 2+ activated K + currents ( I K[Ca] ) of cultured human cardiac fibroblasts were characterized in this study. In wholecell configuration, depolarizing pulses evoked I K(Ca) in an outward rectification in these cells, the amplitude of which was suppressed by paxilline (1 μ M ) or iberiotoxin (200 n M ). A largeconductance, Ca 2+ activated K + (BK Ca ) channel with singlechannel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of αsubunit of BK Ca channels. The dynamic LuoRudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. In the simulation, electrically coupled cardiac fibroblasts also exhibited action potential; however, they were electrically inert with no gapjunctional coupling. The simulation predicts that changes in gap junction coupling conductance can influence the configuration of cardiac action potential and cardiomyocyte excitability. I k(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes. This study demonstrates that a BK Ca channel is functionally expressed in human cardiac fibroblasts. The activity of these BK Ca channels present in human cardiac fibroblasts may contribute to the functional activities of heart cells through transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of its solutions and the transitions between them as parameters are varied. To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multicompartment models. 
We use this method to systematically reduce the dimension of a twocompartment conductancebased model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation—tonic spiking, intermediate bursting and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16 dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. The hybrid model highlights lowdimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid lowdimensional models from any large multicompartment conductancebased model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli. Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed highdimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons modeled after those in the retina, the twodimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement. 
Results We describe a novel neural circuit mechanism, nontopographical contrast enhancement (NTCE), which enables contrast enhancement among highdimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require a builtin foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. Conclusion Nontopographical contrast enhancement demonstrates how intrinsically highdimensional sensory data can be represented and processed within a physically twodimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, highdimensional similarity space defined by the olfactory receptor complement and underlies the concentrationindependence of odor quality representations. Abstract Mathematical neuronal models are normally expressed using differential equations. The ParkerSochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration timestep. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the ParkerSochacki method to the Izhikevich ‘simple’ model and a HodgkinHuxley type neuron, comparing the results with those obtained using the RungeKutta and BulirschStoer methods. 
Benchmark simulations demonstrate an improved speed/accuracy tradeoff for the method relative to these established techniques. Abstract Background Previous onedimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo . However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gapjunction coupling affects GoC and granularlayer oscillations. To investigate this question, we developed a novel twodimensional computational model of the GoCgranule cell (GC) circuit with and without gap junctions between GoCs. Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change oscillation frequency or the mean firing rate of either GoCs or GCs. Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs. The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours/days of supercomputing time. 
Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. As a consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics, and intrinsic noise. We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated for by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data. Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes arising from development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models. Recent studies show that both the conductance and the kinetics of these receptors change voltage-dependently in some parts of the brain. Therefore, several models have been introduced to simulate their current. On the one hand, kinetic models, which are able to simulate these voltage-dependent phenomena, are computationally expensive for the modeling of large neural networks. On the other hand, classic exponential models, which are computationally less expensive, cannot accurately simulate the voltage dependency of these receptors. 
In this study, we have modified these classic models to endow them with voltage-dependent conductance and time constants. Temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of NMDA receptor behavior using only three to four differential equations, significantly fewer than in previous kinetic models. Consequently, our model is both fast and physiologically plausible, and is therefore a suitable candidate for the modeling of large neural networks. Abstract Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons. This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal-cell firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that the γ-oscillating interneuron network exerts a clear rhythmic gating effect on the pyramidal neuron’s signal transmission. This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and appropriate inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of coherence and oscillation frequency of network activities. 
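A common way to give an exponential-style synapse model the NMDA receptor's voltage dependence is to scale its conductance by the standard Jahr–Stevens Mg2+-block factor. The sketch below uses that well-known factor with illustrative time constants and amplitudes; it is not the modified model proposed in the abstract above:

```python
import math

def nmda_current(t, V, g_max=0.5, E=0.0, mg=1.0,
                 tau_rise=2.0, tau_decay=100.0):
    """Two-exponential NMDA conductance scaled by the Jahr-Stevens
       voltage-dependent Mg2+ block (times in ms, V in mV, [Mg2+] in mM).
       Parameter values are illustrative defaults, not fitted ones."""
    s = math.exp(-t / tau_decay) - math.exp(-t / tau_rise)   # gating waveform
    block = 1.0 / (1.0 + mg / 3.57 * math.exp(-0.062 * V))   # Jahr-Stevens factor
    return g_max * s * block * (V - E)                       # ohmic current
```

Because the block relieves with depolarization, the effective conductance g_max·s·block grows monotonically as V rises toward 0 mV, which is the voltage dependence the exponential models above are extended to capture.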
Abstract Implementation of double cable models to simulate the behavior of myelinated peripheral nerve fibers requires defining a segmentation of the internode between successive nodes of Ranvier. The number of internodal segments is a model parameter that is not well agreed on, with values in the literature ranging from 1 to more than 500. Moreover, many studies lack a sensitivity analysis or a rationale for the implementation used. In a model of a myelinated nerve fiber developed in our group, the segmentation scheme (i.e., the number of segments and their individual morphology) strongly influenced model outcomes such as action potential shape and velocity, stimulation threshold, and absolute refractory period. In the present study these influences were investigated systematically in homogeneous neurons with different diameters. Uniformly segmented internodes were found to require several hundred segments (and the associated computational power) to reach model outcomes differing by less than 1% from the asymptotic value. In fact, in the majority of segmentation schemes the main determinant is not the number of segments, but the length λ of the internodal segments directly adjacent to the nodes of Ranvier. If λ is larger than approximately 10 μm, model outcomes for the tested fibers are almost independent of the total number of segments. Furthermore, λ can be optimized to enable models using just three segments per internode to reach physiologically relevant model outcomes with limited computational resources. However, to study anatomical or physiological details of the internode itself, an appropriately detailed segmentation scheme is crucial. 
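The λ-based segmentation idea above (the length of the segments adjacent to the nodes of Ranvier mattering more than the total segment count) can be captured in a small helper. The function below is a hypothetical illustration of such a scheme, not the authors' implementation:

```python
def internode_segments(L, n, lam=10.0):
    """Segment lengths (in μm) for one internode of total length L:
       two paranodal segments of length lam adjacent to the nodes of
       Ranvier, with the remaining L - 2*lam split uniformly over the
       n - 2 interior segments. n = 3 gives the minimal
       [lam, L - 2*lam, lam] scheme discussed above.
       (Illustrative helper, not the authors' code.)"""
    if n < 3 or L <= 2.0 * lam:
        raise ValueError("need n >= 3 and L > 2*lam")
    interior = (L - 2.0 * lam) / (n - 2)
    return [lam] + [interior] * (n - 2) + [lam]
```

Holding lam fixed at ~10 μm while varying n is one way to reproduce the abstract's observation that outcomes become almost independent of the total number of segments.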
Determining the contributions of divisive and subtractive feedback in the Hodgkin-Huxley model. Journal of Computational Neuroscience Summary This chapter constitutes mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was a part of the Annual Computational Neuroscience Meeting CNS*2007, which took place in July 2007 in Toronto, Canada (http://www.cnsorg.org). The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Selected discussed topics, including a review of some current and potential applications of Computational Intelligence (CI) in electrophysiology, database and electrophysiological data exchange platforms, languages, and formats, as well as exemplary analysis problems, are presented in this chapter. The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers, whose interest will be sparked by its contents. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. We are particularly interested in the slower timescale dynamics that determine overall excitability and set the stage for transient episodes of persistent oscillations, working memory, or seizures. In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics in the stability of persistent activity. 
We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions. Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in neonatal brain or spinal cord, up states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potentials (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from the EAP data sampled by an extracellular probe at multiple independent recording locations. We used a dipole for the model source because there is extensive evidence that it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell–probe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole moment optimization where the dipole location is fixed, and a second, nonlinear, global optimization of the source location. We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. 
The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis. We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among various model error contributions to the residual, we address especially the error in probe geometry, and the extent to which it biases estimates of dipole parameters. This dipole characterization method can be applied to any recording technique that has the capability of taking multiple independent measurements of the same single units. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics. We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodium–potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations, similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. These models reveal several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy. 
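The linear stage of the dipole characterization described above, fitting the dipole moment at a fixed candidate location from the probe's lead fields with Tikhonov regularization, reduces to a 3×3 linear solve of the regularized normal equations. A self-contained sketch; the lead-field matrix here is synthetic, standing in for one computed by the finite element method, and the outer nonlinear search over candidate locations is omitted:

```python
def fit_dipole_moment(Lf, d, alpha=1e-3):
    """Tikhonov-regularized least squares for a dipole moment m (3-vector)
       at a fixed candidate location: minimize ||Lf m - d||^2 + alpha ||m||^2.
       Lf is an n x 3 lead-field matrix (one row per recording site), d the
       measured EAP amplitudes. Solves the 3x3 normal equations
       (Lf^T Lf + alpha I) m = Lf^T d by Cramer's rule."""
    A = [[alpha * (i == j) + sum(row[i] * row[j] for row in Lf)
          for j in range(3)] for i in range(3)]
    b = [sum(row[i] * di for row, di in zip(Lf, d)) for i in range(3)]
    det = lambda M: (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
                     - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
                     + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))
    D = det(A)
    m = []
    for i in range(3):  # replace column i with b for Cramer's rule
        Ai = [row[:] for row in A]
        for r in range(3):
            Ai[r][i] = b[r]
        m.append(det(Ai) / D)
    return m
```

The penalty term alpha·||m||² is what "jointly minimizes model error and dipole size" in the abstract; larger alpha shrinks the fitted moment at the cost of residual error.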
Abstract This paper introduces dyadic brain modeling – the simultaneous, computational modeling of the brains of two interacting agents – to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the ‘dyad’). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists and modelers that will help speed the emergence of what may be called comparative neuroprimatology. Abstract The phase response curve (PRC) reflects the dynamics of the interplay between diverse intrinsic conductances that lead to spike generation. PRCs measure the spike time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddle-node) dynamics, while type II (subcritical Hopf) dynamics yield a biphasic PRC with both negative and positive lobes. Previous computational work hypothesized that cholinergic modulation of the M-type potassium current can switch a neuron with type II dynamics to type I dynamics. 
We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with downregulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential specific K+-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to downregulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent IAHP is downregulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified at low ACh levels, while the PRC became type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may shift neuronal responses between “resonator” and “integrator” modes. Abstract The Hodgkin–Huxley (HH) model is the basis for numerous neural models. There are two negative feedback processes in the HH model that regulate rhythmic spiking. The first is an outward current with an activation variable n that has an opposite influence to the excitatory inward current and therefore provides subtractive negative feedback. The other is the inactivation of an inward current with an inactivation variable h that reduces the amount of positive feedback and therefore provides divisive feedback. Rhythmic spiking can be obtained with either negative feedback process, so we ask what is gained by having two feedback processes. We also ask how the different negative feedback processes contribute to spiking. We show that having two negative feedback processes makes the HH model more robust to changes in applied currents and conductance densities than models that possess only one negative feedback variable. 
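The PRC typing criterion stated above (purely positive curve = type I; both negative and positive lobes = type II) is easy to express in code. A toy classifier for sampled PRCs, purely illustrative and not a published analysis tool:

```python
def prc_type(prc, tol=1e-9):
    """Classify a sampled phase response curve: 'type I' if it is
       (essentially) purely positive, 'type II' if it has both negative
       and positive lobes. tol absorbs numerical noise around zero."""
    has_pos = any(x > tol for x in prc)
    has_neg = any(x < -tol for x in prc)
    if has_pos and has_neg:
        return "type II"
    if has_pos:
        return "type I"
    return "undefined"
```

In practice a tolerance tied to the measurement noise (rather than a fixed tol) would be needed before drawing conclusions about bifurcation type from experimental PRCs.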
We also show that the contributions made by the subtractive and divisive feedback variables are not static, but depend on time scales and conductance values. In particular, they contribute differently to the dynamics in Type I versus Type II neurons. A Hamilton-Jacobi-Bellman approach for termination of seizure-like bursting. Journal of Computational Neuroscience
Abstract We use Hamilton–Jacobi–Bellman methods to find minimum-time and energy-optimal control strategies to terminate seizure-like bursting behavior in a conductance-based neural model. Averaging is used to eliminate fast variables from the model, and a target set is defined through bifurcation analysis of the slow variables of the model. This method is illustrated for a single neuron model and for a network model to demonstrate its efficacy in terminating bursting once it begins. This work represents a numerical proof-of-concept that a new class of control strategies can be employed to mitigate bursting, and could ultimately be adapted to treat medically intractable epilepsy in patient-specific models. The Breast Cancer Gene Database: a collaborative information resource. Oncogene The Breast Cancer Gene Database (BCGD) is a compendium of molecular genetic data relating to genes involved in breast cancer, freely available via the World Wide Web. The data in BCGD are extracted from the published biomedical research literature and stored as a collection of ‘Facts’, which in turn are collected into topical categories organized by gene. This organization facilitates quick searches and rapid retrieval of specific data such as gene characteristics, functions and role in oncogenesis, and is an important factor allowing for continuous updates. BCGD can be searched either by gene name or by keyword. 
Data are deposited in and retrieved from the database through a set of interactive Web forms, making it platform-independent and universally accessible, and facilitating worldwide collaborative authoring of the database. Data in BCGD are linked to other on-line resources such as Entrez, GeneCards and On-Line Mendelian Inheritance in Man. BCGD is located at http://mbcr.bcm.tmc.edu/ermb/bcgd/bcgd.html. A working memory model for serial order that stores information in the intrinsic excitability properties of neurons. Journal of Computational Neuroscience
PRCs measure the spike time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddle-node) dynamics, while type II (subcritical Hopf) dynamics yield a biphasic PRC with both negative and positive lobes. Previous computational work hypothesized that cholinergic modulation of the M-type potassium current can switch a neuron from type II to type I dynamics. We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with downregulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential specific K-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to downregulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent I_AHP is downregulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified at low ACh levels, while the PRC became type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may shift neuronal responses between “resonator” and “integrator” modes. Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double precision values by each subtree at each time step. Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors.
For compute-bound simulations, load balance results in almost ideal runtime scaling. Application of the cell splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing. Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts. It has been recently reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. Ca2+-activated K+ currents (I_K(Ca)) of cultured human cardiac fibroblasts were characterized in this study. In whole-cell configuration, depolarizing pulses evoked I_K(Ca) with outward rectification in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance, Ca2+-activated K+ (BK_Ca) channel with a single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BK_Ca channels. The dynamic Luo-Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; without gap-junctional coupling, however, they were electrically inert. The simulation predicts that changes in gap-junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. I_K(Ca) could be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes. This study demonstrates that a BK_Ca channel is functionally expressed in human cardiac fibroblasts.
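The kind of myocyte-fibroblast coupling probed in that simulation can be caricatured with two passive cells joined by a gap-junction conductance. This is a minimal sketch with assumed parameters, not the Luo-Rudy dynamic model:

```python
def coupled_voltages(g_gap, v_m0=-80.0, v_f0=-30.0,
                     c_m=100.0, c_f=20.0, g_m=1.0, g_f=0.2,
                     dt=0.01, t_end=500.0):
    """Two passive cells (myocyte, fibroblast) coupled by a gap junction.

    Each cell leaks toward its own resting potential; the gap-junction
    current g_gap*(v_f - v_m) couples them.  Units are arbitrary and
    all values are illustrative assumptions.
    """
    e_m, e_f = -80.0, -30.0   # assumed resting potentials (mV)
    v_m, v_f = v_m0, v_f0
    for _ in range(int(t_end / dt)):
        i_gap = g_gap * (v_f - v_m)            # flows fibroblast -> myocyte
        v_m += dt * (-g_m * (v_m - e_m) + i_gap) / c_m
        v_f += dt * (-g_f * (v_f - e_f) - i_gap) / c_f
    return v_m, v_f
```

With `g_gap = 0` each cell sits at its own resting potential; increasing `g_gap` pulls the fibroblast toward the myocyte's more hyperpolarized potential, illustrating how the coupling conductance shapes both membrane voltages.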
The activity of these BK_Ca channels present in human cardiac fibroblasts may contribute to the functional activities of heart cells through transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of their solutions and the transitions between them as parameters are varied. To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multicompartment models. We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation—tonic spiking, intermediate bursting and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multicompartment conductance-based model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli.
Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement. Results We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. Conclusion Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement, and underlies the concentration-independence of odor quality representations.
Abstract Mathematical neuronal models are normally expressed using differential equations. The Parker-Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker-Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin-Huxley type neuron, comparing the results with those obtained using the Runge-Kutta and Bulirsch-Stoer methods. Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC-granule cell (GC) circuit with and without gap junctions between GoCs. Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change oscillation frequency or the mean firing rate of either GoCs or GCs.
Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs. The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours or days of supercomputing time. Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. As a consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics and intrinsic noise. We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data.
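The efficiency of such current-based fitting rests on a simple observation: a Hodgkin-Huxley membrane current is linear in the maximal conductances once the gating variables are known, so those conductances can be estimated by ordinary least squares rather than a global search. A sketch on synthetic data (all traces and values are made-up assumptions):

```python
import numpy as np

def fit_max_conductances(v, gates, currents, reversals):
    """Least-squares estimate of maximal conductances.

    The membrane current I(t) = sum_k g_k * gate_k(t) * (v(t) - E_k)
    is linear in the unknown g_k, so given traces of the voltage, the
    gating products and the total current, the g_k follow from
    ordinary least squares instead of a global search.
    """
    # design matrix: one column per channel, one row per time point
    a = np.stack([gate * (v - e) for gate, e in zip(gates, reversals)], axis=1)
    g_hat, *_ = np.linalg.lstsq(a, currents, rcond=None)
    return g_hat

# synthetic demonstration (all traces and values are assumptions)
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 2000)
v = -65.0 + 30.0 * np.sin(2 * np.pi * 5 * t)        # imposed voltage trace
gates = [np.sin(2 * np.pi * 3 * t) ** 2,            # stand-in gating products
         np.cos(2 * np.pi * 7 * t) ** 4]
reversals = [50.0, -90.0]                           # assumed E_Na, E_K (mV)
g_true = np.array([120.0, 36.0])                    # "true" conductances
i_total = sum(g * gate * (v - e)
              for g, gate, e in zip(g_true, gates, reversals))
i_noisy = i_total + rng.normal(0.0, 1.0, size=t.shape)  # measurement noise
g_est = fit_max_conductances(v, gates, i_noisy, reversals)
```

Because the estimate varies smoothly with perturbations of the traces, moderate errors in the assumed kinetics degrade the fit gracefully rather than catastrophically, in line with the sensitivity results above.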
Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes from development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models. Recent studies show that both the conductance and the kinetics of these receptors change voltage-dependently in some parts of the brain. Therefore, several models have been introduced to simulate their current. On the one hand, kinetic models—which are able to simulate these voltage-dependent phenomena—are computationally expensive for the modeling of large neural networks. On the other hand, classic exponential models, which are computationally less expensive, are not able to simulate the voltage dependency of these receptors accurately. In this study, we have modified these classic models to endow them with voltage-dependent conductance and time constants. Temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of NMDA receptor behavior using only three to four differential equations, significantly fewer than in previous kinetic models. Consequently, our model is both fast and physiologically plausible, and is therefore a suitable candidate for the modeling of large neural networks. Abstract Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons.
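A toy version of such a setup, an integrate-and-fire cell receiving tonic drive plus a 40 Hz synchronized inhibitory conductance, can be sketched as follows; the parameters are assumptions for illustration, not the published model:

```python
import math

def lif_spike_times(i_drive=20.0, g_inh_peak=0.0, f_gamma=40.0,
                    dt=0.1, t_end=1000.0):
    """Leaky integrate-and-fire cell with gamma-modulated inhibition.

    dv/dt = (-(v - e_l) - g_inh(t)*(v - e_inh) + i_drive) / tau, with
    spike-and-reset at threshold.  g_inh(t) is a rectified sinusoid at
    f_gamma (Hz), mimicking synchronized IPSPs from a fast-spiking
    interneuron network.  All parameters are illustrative assumptions.
    """
    e_l, e_inh = 0.0, -15.0                # leak and inhibitory reversal (mV)
    v_th, v_reset, tau = 15.0, 0.0, 20.0   # threshold, reset (mV), tau (ms)
    v, spikes = 0.0, []
    for n in range(int(t_end / dt)):
        t = n * dt                          # time in ms
        g_inh = g_inh_peak * max(0.0, math.sin(2.0 * math.pi * f_gamma * t / 1000.0))
        v += dt * (-(v - e_l) - g_inh * (v - e_inh) + i_drive) / tau
        if v >= v_th:
            spikes.append(t)
            v = v_reset
    return spikes
```

Raising `g_inh_peak` confines spiking to the low-inhibition phase of each gamma cycle and lowers the overall firing rate, a crude version of rhythmic gating by synchronized inhibition.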
This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal cell firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that there is an obvious rhythmic gating effect of the γ-oscillating interneuron network on the pyramidal neuron’s signal transmission. This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of coherence and oscillation frequency of network activities. Abstract Background Propagation of simulated action potentials (APs) was previously studied in short single chains and in two-dimensional sheets of myocardial cells [1-3]. The present study was undertaken to examine propagation in a long single chain of cells of various lengths, and with varying numbers of gap-junction (gj) channels, and to compare propagation velocity with cable properties such as the length constant (λ). Methods and Results Simulations were carried out using the PSpice program as previously described. When the electric field (EF) mechanism was dominant (0, 1, and 10 gj channels), the longer the chain length, the faster the overall velocity (θ_ov). There seems to be no simple explanation for this phenomenon. In contrast, when the local-circuit current mechanism was dominant (100 gj channels or more), θ_ov was slightly slowed by lengthening of the chain. Increasing the number of gj channels produced an increase in θ_ov and caused the firing order to become more uniform.
The end effect was more pronounced at longer chain lengths and at greater numbers of gj channels. When there were no or only few gj channels (namely 0, 10, or 30), the voltage change (ΔV_m) in the two cells (#50 & #52) contiguous to the cell injected with current (#51) was nearly zero, i.e., there was a sharp discontinuity in voltage between the adjacent cells. When there were many gj channels (e.g., 300, 1000, 3000), there was an exponential decay of voltage on either side of the injected cell, with the length constant (λ) increasing at higher numbers of gj channels. The effect of increasing the number of gj channels on increasing λ was relatively small compared to the larger effect on θ_ov. θ_ov became very non-physiological at 300 gj channels or higher. Conclusion Thus, when there were only 0, 1, or 10 gj channels, θ_ov increased with increase in chain length, whereas at 100 gj channels or higher, θ_ov did not increase with chain length. When there were only 0, 10, or 30 gj channels, there was a very sharp decrease in ΔV_m in the two contiguous cells on either side of the injected cell, whereas at 300, 1000, or 3000 gj channels, the voltage decay was exponential along the length of the chain. The effect of increasing the number of gj channels on the spread of current was relatively small compared to the large effect on θ_ov. Abstract This article provides a demonstration of an analytical technique that can be used to investigate the causes of perceptual phenomena. The technique is based on the concept of the ideal observer, an optimal signal classifier that makes decisions that maximize the probability of a correct response. To demonstrate the technique, an analysis was conducted to investigate the role of the auditory periphery in the production of temporal masking effects. The ideal observer classified output from four models of the periphery.
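For two stimulus classes corrupted by Gaussian noise of equal variance, the ideal observer reduces to a likelihood comparison, i.e. a midpoint threshold, and its percent correct follows from d′ in closed form. A sketch with assumed statistics:

```python
import math

def ideal_observer_decision(x, mu0, mu1, sigma):
    """Maximum-likelihood choice between two equal-prior Gaussian classes.

    Returns 1 if the observation x is more likely under N(mu1, sigma)
    than under N(mu0, sigma), else 0.  With equal variances this is
    simply a midpoint-threshold rule.
    """
    ll0 = -((x - mu0) ** 2) / (2.0 * sigma ** 2)
    ll1 = -((x - mu1) ** 2) / (2.0 * sigma ** 2)
    return 1 if ll1 > ll0 else 0

def ideal_percent_correct(mu0=0.0, mu1=1.0, sigma=0.5):
    """Closed-form P(correct) of the ideal observer for equal priors.

    P(correct) = Phi(d'/2), where d' = (mu1 - mu0)/sigma and Phi is
    the standard normal CDF.  The example statistics are assumptions.
    """
    d_prime = (mu1 - mu0) / sigma
    return 0.5 * (1.0 + math.erf(d_prime / (2.0 * math.sqrt(2.0))))
```

Because no decision rule can outperform this observer, any masking effect it exhibits must be inherited by every other observer, which is the logic the analysis exploits.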
Since the ideal observer is the best of all possible observers, if it demonstrates masking effects, then all other observers must as well. If it does not demonstrate masking effects, then nothing about the periphery requires masking to occur, and therefore masking must arise elsewhere. The ideal observer exhibited several forward masking effects but did not exhibit backward masking, implying that the periphery has a causal role in forward but not backward masking. A general discussion of the strengths of the technique and supplementary equations are also included. Abstract Understanding the human brain and its function in health and disease represents one of the greatest scientific challenges of our time. In the post-genomic era, an overwhelming accumulation of new data, at all levels of exploration from DNA to human brain imaging, has been acquired. This accumulation of facts has not given rise to a corresponding increase in the understanding of integrated functions in this vast area of research involving a large number of fields extending from genetics to psychology. Neuroinformatics (NI) is uniquely placed at the intersection between neuroscience and information technology, and emerges as an area of critical importance for facilitating future conceptual development in neuroscience by creating databases which transcend different organizational levels and allow for the development of different computational models from the subcellular to the global brain level. Abstract This paper studied synaptic and dendritic integration with different spatial distributions of synapses on the dendrites of a biophysically detailed layer 5 pyramidal neuron model. It has been observed that temporally synchronous and spatially clustered synaptic inputs make dendrites perform a highly nonlinear integration.
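One common caricature of such clustering-dependent integration is a two-layer model in which each branch applies a sigmoidal, dendritic-spike-like nonlinearity to its local input before the soma sums the branch outputs. The thresholds and synapse counts below are assumptions; note that this toy favors full clustering, whereas the intermediate optimum reported for the detailed model requires additional saturation and driving-force effects:

```python
import math

def branch_output(x, threshold=4.0, slope=1.0):
    """Sigmoidal branch nonlinearity mimicking a local dendritic spike."""
    return 1.0 / (1.0 + math.exp(-(x - threshold) / slope))

def somatic_response(synapses_per_branch):
    """Soma linearly sums the sigmoidal outputs of its branches."""
    return sum(branch_output(x) for x in synapses_per_branch)

# eight active synapses on a four-branch dendrite (assumed toy numbers)
clustered = somatic_response([8, 0, 0, 0])    # all on one branch
distributed = somatic_response([2, 2, 2, 2])  # spread evenly
```

Here the clustered arrangement drives one branch well past its sigmoid threshold and so produces the larger somatic response, the signature of nonlinear dendritic integration.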
The effect of the clustering degree of the synaptic distribution on neuronal responsiveness is investigated by changing the number of top apical dendrites to which active synapses are allocated. The neuron shows maximum responsiveness to synaptic inputs with an intermediate clustering degree of spatial distribution, indicating complex interactions among dendrites in the presence of nonlinear synaptic and dendritic integration. Abstract This paper describes a pilot query interface that has been constructed to help us explore a “concept-based” approach for searching the Neuroscience Information Framework (NIF). The query interface is concept-based in the sense that the search terms submitted through the interface are selected from a standardized vocabulary of terms (concepts) that are structured in the form of an ontology. The NIF contains three primary resources: the NIF Resource Registry, the NIF Document Archive, and the NIF Database Mediator. These NIF resources are very different in their nature and therefore pose challenges when designing a single interface from which searches can be automatically launched against all three resources simultaneously. The paper first discusses briefly several background issues involving the use of standardized biomedical vocabularies in biomedical information retrieval, and then presents a detailed example that illustrates how the pilot concept-based query interface operates. The paper concludes by discussing certain lessons learned in the development of the current version of the interface. Abstract Simulations of orientation selectivity in visual cortex have shown that layer 4 complex cells lacking orientation tuning are ideal for providing global inhibition that scales with contrast in order to produce simple cells with contrast-invariant orientation tuning (Lauritzen and Miller in J Neurosci 23:10201–10213, 2003).
Inhibitory cortical cells have been shown to be electrically coupled by gap junctions (Fukuda and Kosaka in J Neurosci 120:5–20, 2003). Such coupling promotes, among other effects, spike synchronization and coordination of postsynaptic IPSPs (Beierlein et al. in Nat Neurosci 3:904–910, 2000; Galarreta and Hestrin in Nat Rev Neurosci 2:425–433, 2001). Consequently, it was expected (Miller in Cereb Cortex 13:73–82, 2003) that electrical coupling would promote nonspecific functional responses consistent with the complex inhibitory cells seen in layer 4, which provide broad inhibition in response to stimuli of all orientations (Miller et al. in Curr Opin Neurobiol 11:488–497, 2001). This was tested using a mechanistic modeling approach. The orientation selectivity model of Lauritzen and Miller (J Neurosci 23:10201–10213, 2003) was reproduced with and without electrical coupling between complex inhibitory neurons. Although extensive coupling promotes uniform firing in complex cells, there were no detectable improvements in contrast-invariant orientation selectivity unless there were coincident changes in complex cell firing rates to offset the untuned excitatory component that grows with contrast. Thus, changes in complex cell firing rates (with or without coupling) could improve contrast-invariant orientation tuning of simple cells, but synchronization of complex inhibitory neurons alone could not. Abstract Coral polyps contract when electrically stimulated, and a wave of contraction travels from the site of stimulation at a constant speed. Models of coral nerve networks were optimized to match one of three different experimentally observed behaviors. To search for model parameters that reproduce the experimental observations, we applied genetic algorithms to increasingly more complex models of a coral nerve net. In a first stage of optimization, individual neurons responded with spikes to multiple, but not single, pulses of activation.
In a second stage, we used these neurons as the starting point for the optimization of a two-dimensional nerve net. This strategy yielded a network with parameters that reproduced the experimentally observed spread of excitation. Abstract Spike-wave discharges are a distinctive feature of epileptic seizures. So far, they have not been reported in spatially extended neural field models. We study a space-independent version of the Amari neural field model with two competing inhibitory populations. We show that this competition leads to robust spike-wave dynamics if the inhibitory populations operate on different timescales. The spike-wave oscillations present a fold/homoclinic-type bursting. From this result we predict parameters of the extended Amari system where spike-wave oscillations produce a spatially homogeneous pattern. We propose this mechanism as a prototype of macroscopic epileptic spike-wave discharges. To our knowledge this is the first example of robust spike-wave patterns in a spatially extended neural field model. Abstract Cortical gamma frequency (30–80 Hz) oscillations have been suggested to underlie many aspects of cognitive function. In this paper we compare the f-I curves obtained under gamma-frequency-modulated stimulation and Poisson synaptic input at the distal dendrites of a layer V pyramidal neuron model. The results show that gamma-frequency distal input amplifies the sensitivity of the neural response to basal input, and enhances gain modulation of the neuron. Abstract Inward-rectifying potassium (K_IR) currents in medium spiny (MS) neurons of the nucleus accumbens inactivate significantly in ~40% of the neurons but not in the rest, which may lead to differences in input processing by these two groups. Using a 189-compartment computational model of the MS neuron, we investigate the influence of this property using injected current as well as spatiotemporally distributed synaptic inputs.
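An inward-rectifying K+ conductance of the kind at issue can be written as a Boltzmann activation that opens with hyperpolarization, scaled by a factor standing in for the inactivation that distinguishes the two neuron groups. The parameter values below are assumptions, not the model's:

```python
import math

def i_kir(v, g_max=1.0, e_k=-90.0, v_half=-82.0, k=13.0, inact=1.0):
    """Toy inward-rectifier K+ current.

    The Boltzmann activation term opens with hyperpolarization;
    `inact` scales the conductance (1.0 for the non-inactivating
    group, < 1.0 for neurons in which K_IR partially inactivates).
    All parameter values are assumptions for illustration.
    """
    activation = 1.0 / (1.0 + math.exp((v - v_half) / k))
    return g_max * inact * activation * (v - e_k)
```

Reducing `inact` lowers the outward K_IR current near rest, which raises input resistance and eases depolarization, consistent with the facilitatory effects of inactivation reported for these neurons.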
Our study demonstrates that K_IR current inactivation facilitates depolarization, firing frequency and firing onset in these neurons. These effects may be attributed to the higher input resistance of the cell as well as a more depolarized resting/downstate potential induced by the inactivation of this current. In view of reports that dendritic intracellular calcium levels depend closely on burst strength and spike onset time, our findings suggest that inactivation of K_IR currents may offer a means of modulating both excitability and synaptic plasticity in MS neurons. Abstract Epileptic seizures in diabetic hyperglycemia (DH) are not uncommon. This study aimed to determine the acute behavioral, pathological, and electrophysiological effects of status epilepticus (SE) on diabetic animals. Adult male Sprague-Dawley rats were first divided into groups with and without streptozotocin (STZ)-induced diabetes, and then into treatment groups given normal saline (NS) (STZ-only and NS-only) or a lithium-pilocarpine injection to induce status epilepticus (STZ + SE and NS + SE). Seizure susceptibility, severity, and mortality were evaluated. Serial Morris water maze tests and hippocampal histopathology were examined before and 24 h after SE. Tetanic stimulation-induced long-term potentiation (LTP) in hippocampal slices was recorded in a multielectrode dish system. We also used a simulation model to evaluate intracellular adenosine triphosphate (ATP) and neuroexcitability. The STZ + SE group had a significantly higher percentage of severe seizures and SE-related death, and worse learning and memory performance, than the other three groups 24 h after SE. The STZ + SE group, followed by the NS + SE group, showed the most severe neuronal loss and mossy fiber sprouting in the hippocampal CA3 area. In addition, LTP was markedly attenuated in the STZ + SE group, followed by the NS + SE group.
In the simulation, increased intracellular ATP concentration promoted action potential firing. This finding that rats with DH had more brain damage after SE than rats without diabetes suggests the importance of intensively treating hyperglycemia and seizures in diabetic patients with epilepsy. Neuroinformatics is a multifaceted field, as broad as the field of neuroscience itself. The various domains of NI may also share common features such as databases, data mining systems, and data modeling tools. NI projects are often coordinated by user groups or research organizations. Large-scale infrastructure supporting NI development is also a vital aspect of the field. Abstract Channelrhodopsin-2 (ChR2) is a light-sensitive protein that offers the ability to use light stimulation to regulate neural activity with millisecond precision. In order to address the limitations in the efficacy of the wild-type ChR2 (ChRwt) in achieving this objective, new variants of ChR2 that exhibit fast monoexponential photocurrent decay characteristics have recently been developed and validated. In this paper, we investigate whether the framework of the 4-state transition rate model, primarily developed to mimic the biexponential photocurrent decay kinetics of ChRwt, as opposed to the lower-complexity 3-state model, is warranted to mimic the monoexponential photocurrent decay kinetics of the newly developed fast ChR2 variants: ChETA (Gunaydin et al., Nature Neurosci. 13:387–392, 2010) and ChRET/TC (Berndt et al., Proc. Natl. Acad. Sci. 108:7595–7600, 2011). We begin by estimating the parameters of the 3-state and 4-state models from experimental data on the photocurrent kinetics of ChRwt, ChETA, and ChRET/TC. We then incorporate these models into a fast-spiking interneuron model (Wang and Buzsaki, J. Neurosci. 16:6402–6413, 1996) and a hippocampal pyramidal cell model (Golomb et al., J. Neurophysiol.
96:1912–1926, 2006) and investigate the extent to which the experimentally observed neural response to various optostimulation protocols can be captured by these models. We demonstrate that for all ChR2 variants investigated, the 4-state model implementation is better able to capture neural responses consistent with experiments across a wide range of optostimulation protocols. We conclude by analytically investigating the conditions under which the characteristic specific to the 3-state model, namely the monoexponential photocurrent decay of the newly developed ChR2 variants, can occur in the framework of the 4-state model. Abstract In cerebellar Purkinje cells, the β4-subunit of voltage-dependent Na+ channels has been proposed to serve as an open-channel blocker giving rise to a “resurgent” Na+ current (I_NaR) upon membrane repolarization. Notably, the β4-subunit was recently identified as a novel substrate of the β-secretase BACE1, a key enzyme of the amyloidogenic pathway in Alzheimer's disease. Here, we asked whether BACE1-mediated cleavage of the β4-subunit has an impact on I_NaR and, consequently, on the firing properties of Purkinje cells. In cerebellar tissue of BACE1−/− mice, mRNA levels of Na+ channel α-subunits 1.1, 1.2, and 1.6 and of β-subunits 1–4 remained unchanged, but processing of the β4 peptide was profoundly altered. Patch-clamp recordings from acutely isolated Purkinje cells of BACE1−/− and WT mice did not reveal any differences in the steady-state properties and current densities of transient, persistent, and resurgent Na+ currents. However, I_NaR was found to decay significantly faster in BACE1-deficient Purkinje cells than in WT cells. In modeling studies, the altered time course of I_NaR decay could be replicated when we decreased the efficiency of open-channel block. In current-clamp recordings, BACE1−/− Purkinje cells displayed a lower spontaneous firing rate than normal cells.
Computer simulations supported the hypothesis that the accelerated decay kinetics of I_NaR are responsible for the slower firing rate. Our study elucidates a novel function of BACE1 in the regulation of neuronal excitability that serves to tune the firing pattern of Purkinje cells and presumably other neurons endowed with I_NaR. Abstract The role of cortical feedback in the thalamocortical processing loop has been extensively investigated over the last decades. With the exception of several cases, these studies focused on the cortical feedback exerted onto thalamocortical relay (TC) cells of the dorsal lateral geniculate nucleus (LGN). In a previous physiological study, we showed in the cat visual system that cessation of cortical input, despite decreasing the spontaneous activity of TC cells, increased the spontaneous firing of their recurrent inhibitory interneurons located in the perigeniculate nucleus (PGN). To identify the mechanisms underlying such functional changes, we conducted a modeling study in NEURON on several networks of point neurons with varied model parameters, such as membrane properties, synaptic weights and axonal delays. We considered six network topologies of the retino-geniculo-cortical system. All models were robust against changes of axonal delays except for the delay between the LGN feedforward interneuron and the TC cell. The best representation of physiological results was obtained with models containing reciprocally connected PGN cells driven by the cortex and with relatively slow decay of intracellular calcium. This strongly indicates that the thalamic reticular nucleus plays an essential role in the cortical influence over thalamocortical relay cells, while the thalamic feedforward interneurons are not essential in this process. Further, we suggest that the dependence of the activity of PGN cells on the rate of calcium removal can be one of the key factors determining individual cell responses to the elimination of cortical input.
Abstract The nucleus accumbens (NAc), a critical structure of the brain reward circuit, is implicated in normal goal-directed behaviour and learning as well as in pathological conditions like schizophrenia and addiction. Its major cellular substrates, the medium spiny (MS) neurons, possess a wide variety of dendritic active conductances that may modulate excitatory postsynaptic potentials (EPSPs) and cell excitability. We examine this issue using a biophysically detailed 189-compartment stylized model of the NAc MS neuron, incorporating all the known active conductances. We find that, of all the active channels, inwardly rectifying K+ (K_IR) channels play the primary role in modulating the resting membrane potential (RMP) and EPSPs in the down-state of the neuron. Reduction in the conductance of K_IR channels evokes facilitatory effects on EPSPs, accompanied by rises in local input resistance and membrane time constant. At depolarized membrane potentials closer to up-state levels, the slowly inactivating A-type potassium (K_As) conductance also plays a strong role in determining synaptic potential parameters and cell excitability. We discuss the implications of our results for the regulation of accumbal MS neuron biophysics and synaptic integration by intrinsic factors and extrinsic agents such as dopamine. Abstract The computer-assisted three-dimensional reconstruction of neuronal morphology is becoming an increasingly popular technique to quantify the arborization patterns of dendrites and axons. The resulting digital files are suitable for comprehensive morphometric analyses as well as for building anatomically realistic compartmental models of membrane biophysics and neuronal electrophysiology. The digital tracings acquired in one lab for a specific purpose can often be reused by a different research group to address a completely unrelated scientific question, if the original investigators are willing to share the data. 
Since reconstructing neuronal morphology is a labor-intensive process, data sharing and reanalysis are particularly advantageous for the neuroscience and biomedical communities. Here we present numerous cases of “success stories” in which digital reconstructions of neuronal morphology were shared and reused, leading to additional, independent discoveries and publications, and thus amplifying the impact of the “source” study for which the data set was first collected. In particular, we overview four main applications of this kind of data: comparative morphometric analyses, statistical estimation of potential synaptic connectivity, morphologically accurate electrophysiological simulations, and computational models of neuronal shape and development. Abstract The chapter describes a novel computational approach to modeling cortex dynamics that integrates gene–protein regulatory networks with a neural network model. Interaction of genes and proteins in neurons affects the dynamics of the whole neural network. We adopted an exploratory approach of investigating many randomly generated gene regulatory matrices, out of which we kept those that generated interesting dynamics. This naïve brute-force approach served to explore the potential application of computational neurogenetic models in relation to gene-knockout neurogenetics experiments. The knockout of a hypothetical gene for fast inhibition in our artificial genome led to interesting neural activity. In spite of the fact that the artificial gene/protein network was altered by one gene knockout, the dynamics of the spiking neural network (SNN) in terms of spiking activity was most of the time very similar to the result obtained with the complete gene/protein network. However, from time to time the neurons spontaneously and temporarily synchronized their spiking into coherent global oscillations. 
In our model, the fluctuations in the values of neuronal parameters lead to spontaneous development of seizure-like global synchronizations. These very same fluctuations also lead to termination of the seizure-like neural activity and maintenance of the normal interictal periods of activity. Based on our model, we suggest the hypothesis that parameter changes due to gene–protein dynamics should also be included as a serious factor determining transitions in neural dynamics, especially when the cause of disease is known to be genetic. Abstract The local field potential (LFP) is among the most important experimental measures when probing neural population activity, but a proper understanding of the link between the underlying neural activity and the LFP signal is still missing. Here we investigate this link by mathematical modeling of contributions to the LFP from a single layer-5 pyramidal neuron and a single layer-4 stellate neuron receiving synaptic input. An intrinsic dendritic low-pass filtering effect of the LFP signal, previously demonstrated for extracellular signatures of action potentials, is seen to strongly affect the LFP power spectra, even for frequencies as low as 10 Hz for the example pyramidal neuron. Further, the LFP signal is found to depend sensitively on both the recording position and the position of the synaptic input: the LFP power spectra recorded close to the active synapse are typically found to be less low-pass filtered than spectra recorded further away. Some recording positions display striking band-pass characteristics of the LFP. The frequency dependence of the properties of the current dipole moment set up by the synaptic input current is found to qualitatively account for several salient features of the observed LFP. 
Two approximate schemes for calculating the LFP, the dipole approximation and the two-monopole approximation, are tested and found to be potentially useful for translating results from large-scale neural network models into predictions for results from electroencephalographic (EEG) or electrocorticographic (ECoG) recordings. Abstract Dopaminergic (DA) neurons of the mammalian midbrain exhibit unusually low firing frequencies in vitro. Furthermore, injection of depolarizing current induces depolarization block before high frequencies are achieved. The maximum steady and transient rates are about 10 and 20 Hz, respectively, despite the ability of these neurons to generate bursts at higher frequencies in vivo. We use a three-compartment model calibrated to reproduce DA neuron responses to several pharmacological manipulations to uncover mechanisms of frequency limitation. The model exhibits a slow oscillatory potential (SOP) dependent on the interplay between the L-type Ca2+ current and the small-conductance K+ (SK) current that is unmasked by fast Na+ current block. Contrary to previous theoretical work, the SOP does not pace the steady spiking frequency in our model. The main currents that determine the spontaneous firing frequency are the subthreshold L-type Ca2+ and A-type K+ currents. The model identifies the channel densities of the fast Na+ and delayed-rectifier K+ currents as critical parameters limiting the maximal steady frequency evoked by a depolarizing pulse. We hypothesize that the low maximal steady frequencies result from a low safety factor for action potential generation. In the model, the rate of Ca2+ accumulation in the distal dendrites controls the transient initial frequency in response to a depolarizing pulse. 
Similar results are obtained when the same model parameters are used in a multicompartmental model with a realistic reconstructed morphology, indicating that the salient contributions of the dendritic architecture have been captured by the simpler model. Abstract Background As interest in adopting the Semantic Web in the biomedical domain continues to grow, Semantic Web technology has been evolving and maturing. A variety of technological approaches, including triplestore technologies, SPARQL endpoints, Linked Data, and the Vocabulary of Interlinked Datasets, have emerged in recent years. In addition to data warehouse construction, these technological approaches can be used to support dynamic query federation. As a community effort, the BioRDF task force, within the Semantic Web for Health Care and Life Sciences Interest Group, is exploring how these emerging approaches can be utilized to execute distributed queries across different neuroscience data sources. Methods and results We have created two health care and life science knowledge bases. We have explored a variety of Semantic Web approaches to describe, map, and dynamically query multiple datasets. We have demonstrated several federation approaches that integrate diverse types of information about neurons and receptors that play an important role in basic, clinical, and translational neuroscience research. In particular, we have created a prototype receptor explorer which uses OWL mappings to provide an integrated list of receptors and executes individual queries against different SPARQL endpoints. We have also employed the AIDA Toolkit, which is directed at groups of knowledge workers who cooperatively search, annotate, interpret, and enrich large collections of heterogeneous documents from diverse locations. We have explored a tool called "FeDeRate", which enables a global SPARQL query to be decomposed into subqueries against remote databases offering either SPARQL or SQL query interfaces. 
Finally, we have explored how to use the Vocabulary of Interlinked Datasets (voiD) to create metadata describing datasets exposed as Linked Data URIs or SPARQL endpoints. Conclusion We have demonstrated the use of a set of novel and state-of-the-art Semantic Web technologies in support of a neuroscience query federation scenario. We have identified both the strengths and weaknesses of these technologies. While the Semantic Web offers a global data model including the use of Uniform Resource Identifiers (URIs), the proliferation of semantically equivalent URIs hinders large-scale data integration. Our work helps direct research and tool development, which will be of benefit to this community. Abstract Injury to neural tissue renders voltage-gated Na+ (Nav) channels leaky. Even mild axonal trauma initiates Na+ loading, leading to secondary Ca2+ loading and white matter degeneration. The nodal isoform is Nav1.6, and for Nav1.6-expressing HEK cells, traumatic whole-cell stretch causes an immediate tetrodotoxin-sensitive Na+ leak. In stretch-damaged oocyte patches, Nav1.6 current undergoes damage-intensity-dependent hyperpolarizing (left) shifts, but whether left-shift underlies injured-axon Nav leak is uncertain. Nav1.6 inactivation (availability) is kinetically limited by (coupled to) Nav activation, yielding coupled left-shift (CLS) of the two processes: CLS should move the steady-state Nav1.6 "window conductance" closer to typical firing thresholds. Here we simulated excitability and ion homeostasis in free-running nodes of Ranvier to assess whether hallmark injured-axon behaviors (Na+ loading, ectopic excitation, propagation block) would occur with Nav-CLS. Intact/traumatized axolemma ratios were varied, and for some simulations Na/K pumps were included, with varied inside/outside volumes. We simulated saltatory propagation with one mid-axon node variously traumatized. 
While dissipating the [Na+] gradient and hyperactivating the Na/K pump, Nav-CLS generated neuropathic pain-like ectopic bursts. Depending on CLS magnitude, the fraction of Nav channels affected, and pump intensity, tonic or burst firing or nodal inexcitability occurred, with [Na+] and [K+] fluctuating. Severe CLS-induced inexcitability did not preclude Na+ loading; in fact, the steady-state Na+ leaks elicited large pump currents. At a mid-axon node, mild CLS perturbed normal anterograde propagation, and severe CLS blocked saltatory propagation. These results suggest that in damaged excitable cells, Nav-CLS could initiate cellular deterioration with attendant hyper- or hypoexcitability. Healthy-cell versions of Nav-CLS, however, could contribute to physiological rhythmic firing. Abstract Lateral inhibition of cells surrounding an excited area is a key property of sensory systems, sharpening the preferential tuning of individual cells in the presence of closely related input signals. In the olfactory pathway, a dendrodendritic synaptic microcircuit between mitral and granule cells in the olfactory bulb has been proposed to mediate this type of interaction through granule cell inhibition of surrounding mitral cells. However, it is becoming evident that odor inputs result in broad activation of the olfactory bulb, with interactions that go beyond neighboring cells. Using a realistic modeling approach, we show how backpropagating action potentials in the long lateral dendrites of mitral cells, together with granule cell actions on mitral cells within narrow columns forming glomerular units, can provide a mechanism to activate strong local inhibition between arbitrarily distant mitral cells. The simulations predict a new role for the dendrodendritic synapses in the multicolumnar organization of the granule cells. This new paradigm gives insight into the functional significance of the patterns of connectivity revealed by recent viral tracing studies. 
Together they suggest a functional wiring of the olfactory bulb that could greatly expand the computational roles of the mitral–granule cell network. Abstract Spinal motor neurons have voltage-gated ion channels localized in their dendrites that generate plateau potentials. The physical separation of ion channels for spiking from plateau-generating channels can result in nonlinear bistable firing patterns. The physical separation and geometry of the dendrites result in asymmetric coupling between dendrites and soma that has not been addressed in reduced models of nonlinear phenomena in motor neurons. We measured the voltage attenuation properties of six anatomically reconstructed and type-identified cat spinal motor neurons to characterize the asymmetric coupling between the dendrites and soma. We showed that the voltage attenuation at any distance from the soma was direction-dependent and could be described as a function of the input resistance at the soma. An analytical solution for the lumped cable parameters in a two-compartment model was derived based on this finding. This is the first two-compartment modeling approach that directly derives lumped cable parameters from the geometrical and passive electrical properties of anatomically reconstructed neurons. Abstract Models of temporary information storage in neuronal populations are dominated by mechanisms directly dependent on synaptic plasticity. There are nevertheless other mechanisms available that are well suited for creating short-term memories. Here we present a model of working memory which relies on the modulation of the intrinsic excitability properties of neurons, instead of synaptic plasticity, to retain novel information for periods of seconds to minutes. We show that it is possible to effectively use this mechanism to store the serial order in a sequence of patterns of activity. 
For this we introduce a functional class of neurons, named gate interneurons, which can store information in their membrane dynamics and can literally act as gates routing the flow of activations in the principal neuron population. The presented model exhibits properties which are in close agreement with experimental results on working memory. Namely, the recall process plays an important role in stabilizing and prolonging the memory trace. This means that the stored information is correctly maintained as long as it is being used. Moreover, the working memory model is adequate for storing completely new information, in time windows compatible with the notion of "one-shot" learning (hundreds of milliseconds). Mode of Action of Hadacidin in the Growing Bacterial Cell Nature Hadacidin shows antibacterial activity against E. coli, Bacillus megaterium, B. subtilis, Xanthomonas vesicatoria and Salmonella schottmuelleri. Since the elimination by mutation of adenylosuccinate synthetase in bacteria results in the accumulation and excretion of hypoxanthine [10] (formed from degradation of inosinic acid), it was reasoned that if this enzyme were the primary site of action of hadacidin in bacteria, the antimetabolite should mimic adenine auxotrophy and cause excretion of hypoxanthine when added to a wild-type culture. Protein delivery by subviral particles of human cytomegalovirus Gene Therapy Direct protein delivery is an emerging technology in vaccine development and gene therapy. We could previously show that subviral dense bodies (DB) of human cytomegalovirus (HCMV), a beta-herpesvirus, transport viral proteins into target cells by membrane fusion. Thus these non-infectious particles provide a candidate delivery system for the prophylactic and therapeutic application of proteins. Here we provide proof of principle that DB can be modified genetically. 
A 55 kDa fusion protein consisting of the green fluorescent protein and the neomycin phosphotransferase could be packaged in and delivered into cells by recombinant DB in a functional fashion. Furthermore, transfer of protein into fibroblasts and dendritic cells by DB was efficient, leading to exogenous loading of the MHC class I antigen presentation pathway. Thus, DB may be a promising basis for the development of novel vaccine strategies and therapeutics based on recombinant polypeptides. Disrupting the PIKE-A/Akt interaction inhibits glioblastoma cell survival, migration, invasion and colony formation Oncogene The cyclin-dependent kinase 4 (CDK4) amplicon is frequently amplified in numerous human cancers, including gliomas. PIKE-A, a proto-oncogene that is one of the important components of the CDK4 amplicon, binds to and enhances the kinase activity of Akt, thereby promoting cancer progression. To define the roles of the PIKE-A/Akt interaction in glioblastoma multiforme (GBM) progression, we used biochemical protein/protein interaction (PPI) assays and live-cell fluorescence-based protein complementation assays to search for small peptide antagonists from these proteins that were able to block their interaction. Here, we show that disruption of the interaction between PIKE-A and Akt by the small peptides significantly reduces glioblastoma cell proliferation, colony formation, migration and invasion. Disruption of the PIKE-A/Akt association potently suppressed GBM cell proliferation and sensitized the cells to two clinical drugs that are currently used to treat GBM. Interestingly, GBM cells containing the CDK4 amplicon were more responsive to inhibition of the PIKE-A/Akt interaction than GBM cells lacking this amplicon. Taken together, our findings provide proof of principle that blocking a PPI that is essential for cancer progression provides a valuable strategy for therapeutic discovery. 
Transcriptional regulatory code of a eukaryotic genome Nature DNA-binding transcriptional regulators interpret the genome's regulatory code by binding to specific sequences to induce or repress gene expression. Comparative genomics has recently been used to identify potential cis-regulatory sequences within the yeast genome on the basis of phylogenetic conservation, but this information alone does not reveal if or when transcriptional regulators occupy these binding sites. We have constructed an initial map of yeast's transcriptional regulatory code by identifying the sequence elements that are bound by regulators under various conditions and that are conserved among Saccharomyces species. The organization of regulatory elements in promoters and the environment-dependent use of these elements by regulators are discussed. We find that environment-specific use of regulatory elements predicts mechanistic models for the function of a large population of yeast's transcriptional regulators. Identification of lysine succinylation as a new post-translational modification Nature Chemical Biology Of the 20 ribosomally coded amino acid residues, lysine is the most frequently post-translationally modified, which has important functional and regulatory consequences. Here we report the identification and verification of a previously unreported form of protein post-translational modification (PTM): lysine succinylation. The succinyllysine residue was initially identified by mass spectrometry and protein sequence alignment. The identified succinyllysine peptides derived from in vivo proteins were verified by western blot analysis, in vivo labeling with isotopic succinate, MS/MS and HPLC coelution of their synthetic counterparts. We further show that lysine succinylation is evolutionarily conserved and that this PTM responds to different physiological conditions. Our study also implies that succinyl-CoA might be a cofactor for lysine succinylation. 
Given the apparent high abundance of lysine succinylation and the significant structural changes induced by this PTM, it is expected that lysine succinylation has important cellular functions. Plant NBS-LRR proteins in pathogen sensing and host defense Nature Immunology Plant proteins belonging to the nucleotide-binding site–leucine-rich repeat (NBS-LRR) family are used for pathogen detection. Like the mammalian Nod-LRR protein 'sensors' that detect intracellular conserved pathogen-associated molecular patterns, plant NBS-LRR proteins detect pathogen-associated proteins, most often the effector molecules of pathogens responsible for virulence. Many virulence proteins are detected indirectly by plant NBS-LRR proteins from modifications the virulence proteins inflict on host target proteins. However, some NBS-LRR proteins directly bind pathogen proteins. Association with either a modified host protein or a pathogen protein leads to conformational changes in the amino-terminal and LRR domains of plant NBS-LRR proteins. Such conformational alterations are thought to promote the exchange of ADP for ATP by the NBS domain, which activates 'downstream' signaling, by an unknown mechanism, leading to pathogen resistance. Right Ventricular Injury in Young Swine: Effects of Catecholamines on Right Ventricular Function and Pulmonary Vascular Mechanics Pediatric Research Acute right ventricular (RV) injury is commonly encountered in infants and children after cardiac surgery. Empiric medical therapy for these patients results from a paucity of data on which to base medical management and the absence of animal models that allow rigorous laboratory testing. Specifically, exogenous catecholamines have unclear effects on the injured right ventricle and pulmonary vasculature in the young. Ten anesthetized piglets (9–12 kg) were instrumented with epicardial transducers, micromanometers, and a pulmonary artery flow probe. RV injury was induced with a cryoablation probe. 
Dopamine at 10 μg/kg/min, dobutamine at 10 μg/kg/min, and epinephrine (EP) at 0.1 μg/kg/min were infused in a random order. RV contractility was evaluated using preload recruitable stroke work. Diastolic function was described by the end-diastolic pressure-volume relation, peak negative derivative of the pressure waveform, and peak filling rate. In addition to routine hemodynamic measurements, Fourier transformation of the pressure and flow waveforms allowed calculation of input resistance, characteristic impedance, RV total hydraulic power, and transpulmonary vascular efficiency. Cryoablation led to a stable reproducible injury, decreased preload recruitable stroke work, and impaired diastolic function as measured by all three indices. Infusion of each catecholamine improved preload recruitable stroke work and peak negative derivative of the pressure waveform. Dobutamine and EP both decreased indices of pulmonary vascular impedance, whereas EP was the only inotrope that significantly improved transpulmonary vascular efficiency. Although all three inotropes improved systolic and diastolic RV function, only EP decreased input resistance, decreased pulmonary vascular resistance, and increased transpulmonary vascular efficiency. On-chip plasmon-induced transparency based on plasmonic coupled nanocavities Scientific Reports On-chip plasmon-induced transparency offers the possibility of realization of ultrahigh-speed information processing chips. Unfortunately, little experimental progress has been made to date because it is difficult to obtain on-chip plasmon-induced transparency using only a single meta-molecule in plasmonic circuits. Here, we report a simple and efficient strategy to realize on-chip plasmon-induced transparency in a nanoscale U-shaped plasmonic waveguide side-coupled nanocavity pair. High tunability in the transparency window is achieved by covering the pair with different organic polymer layers. 
It is possible to realize ultrafast all-optical tunability based on pump light-induced refractive index change of a graphene cover layer. Compared with previous reports, the overall feature size of the plasmonic nanostructure is reduced by more than three orders of magnitude, while ultrahigh tunability of the transparency window is maintained. This work also provides a superior platform for the study of the various physical effects and phenomena of nonlinear optics and quantum optics. Recycled dehydrated lithosphere observed in plume-influenced mid-ocean-ridge basalt Nature A substantial uncertainty in the Earth's global geochemical water cycle is the amount of water that enters the deep mantle through the subduction and recycling of hydrated oceanic lithosphere. Here we address the question of recycling of water into the deep mantle by characterizing the volatile contents of different mantle components as sampled by ocean island basalts and mid-ocean-ridge basalts. Although all mantle plume (ocean island) basalts seem to contain more water than mid-ocean-ridge basalts, we demonstrate that basalts associated with mantle plume components containing subducted lithosphere—‘enriched-mantle’ or ‘EM-type’ basalts—contain less water than those associated with a common mantle source. We interpret this depletion as indicating that water is extracted from the lithosphere during the subduction process, with greater than 92 per cent efficiency. A small-molecule AdipoR agonist for type 2 diabetes and short life in obesity Nature Adiponectin secreted from adipocytes binds to adiponectin receptors AdipoR1 and AdipoR2, and exerts antidiabetic effects via activation of AMPK and PPAR-α pathways, respectively. Levels of adiponectin in plasma are reduced in obesity, which causes insulin resistance and type 2 diabetes. Thus, orally active small molecules that bind to and activate AdipoR1 and AdipoR2 could ameliorate obesity-related diseases such as type 2 diabetes. 
Here we report the identification of orally active synthetic small-molecule AdipoR agonists. One of these compounds, AdipoR agonist (AdipoRon), bound to both AdipoR1 and AdipoR2 in vitro. AdipoRon showed very similar effects to adiponectin in muscle and liver, such as activation of AMPK and PPAR-α pathways, and ameliorated insulin resistance and glucose intolerance in mice fed a high-fat diet, effects that were completely obliterated in AdipoR1 and AdipoR2 double-knockout mice. Moreover, AdipoRon ameliorated the diabetes of genetically obese db/db mice, and prolonged the shortened lifespan of db/db mice on a high-fat diet. Thus, orally active AdipoR agonists such as AdipoRon are a promising therapeutic approach for the treatment of obesity-related diseases such as type 2 diabetes. Chelation of Ferrous Sulphate Solutions by Desferrioxamine B Nature The complex formed when ferrous ions are reacted with desferrioxamine B appears to be the same as that formed when ferric ions are reacted. The evidence is as follows: (1) Ferrous ions react with desferrioxamine B in a 1:1 molar ratio, as estimated by the continuous variation method of Job [6] as modified by Vosburgh [7]. Crosscurrents in HIV-1 evolution Nature Immunology Substitutions in CD8+ T cell epitopes in viral proteins can alter the complex interaction between viruses and host immunity. Reduced viral fitness and recognition of altered and subdominant epitopes are possible outcomes. Zc3h12a is an RNase essential for controlling immune responses by regulating mRNA decay Nature Toll-like receptors (TLRs) recognize microbial components, and evoke inflammation and immune responses. TLR stimulation activates complex gene expression networks that regulate the magnitude and duration of the immune reaction. Here we identify the TLR-inducible gene Zc3h12a as an immune response modifier that has an essential role in preventing immune disorders. Zc3h12a-deficient mice suffered from severe anaemia, and most died within 12 weeks. 
Zc3h12a-/- mice also showed augmented serum immunoglobulin levels and autoantibody production, together with a greatly increased number of plasma cells, as well as infiltration of plasma cells to the lung. Most Zc3h12a-/- splenic T cells showed effector/memory characteristics and produced interferon-γ in response to T-cell receptor stimulation. Macrophages from Zc3h12a-/- mice showed highly increased production of interleukin (IL)-6 and IL-12p40 (also known as IL12b), but not TNF, in response to TLR ligands. Although the activation of TLR signalling pathways was normal, Il6 messenger RNA decay was severely impaired in Zc3h12a-/- macrophages. Overexpression of Zc3h12a accelerated Il6 mRNA degradation via its 3′-untranslated region (UTR), and destabilized RNAs with 3′-UTRs for genes including Il6, Il12p40 and the calcitonin receptor gene Calcr. Zc3h12a contains a putative amino-terminal nuclease domain, and the expressed protein had RNase activity, consistent with a role in the decay of Il6 mRNA. Together, these results indicate that Zc3h12a is an essential RNase that prevents immune disorders by directly controlling the stability of a set of inflammatory genes. Enzymes involved in Glutamate Metabolism in Legume Root Nodules Nature The plants, which were grown in quartz sand in a greenhouse, were supplied with a nutrient solution free of nitrogen [3] and with suitable strains of nodule bacteria. When the plants were about 6 weeks old, the nodules were removed and stored at −10 °C until required. Computational Modeling of Channelrhodopsin-2 Photocurrent Characteristics in Relation to Neural Signaling Bulletin of Mathematical Biology Summary This chapter constitutes mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was a part of the Annual Computational Neuroscience Meeting CNS*2007 that took place in July 2007 in Toronto, Canada (http://www.cnsorg.org). 
The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Selected topics discussed, including a review of some current and potential applications of Computational Intelligence (CI) in electrophysiology; database and electrophysiological data exchange platforms, languages, and formats; and exemplary analysis problems, are presented in this chapter. The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers, whose interest will be sparked by its contents. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. We are particularly interested in the slower-timescale dynamics that determine overall excitability, and set the stage for transient episodes of persistent oscillations, working memory, or seizures. In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley-type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics in the stability of persistent activity. We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions. 
Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in neonatal brain or spinal cord, up states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potentials (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from the EAP data sampled by an extracellular probe at multiple independent recording locations. We used a dipole for the model source because there is extensive evidence that it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell-probe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole-moment optimization where the dipole location is fixed, and a second, nonlinear, global optimization of the source location. We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis. 
We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among the various model error contributions to the residual, we address especially the error in probe geometry, and the extent to which it biases estimates of dipole parameters. This dipole characterization method can be applied to any recording technique that has the capability of taking multiple independent measurements of the same single units. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics. We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodium-potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. These models reveal several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy. 
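The excitability changes studied in the companion papers above follow directly from the Nernst relation, which ties each reversal potential to the instantaneous ion concentrations. The sketch below is a generic illustration of that link, not the papers' model: it computes the potassium reversal potential for normal versus seizure-like extracellular K+, and relaxes [K]o back toward the bath concentration with a single lumped buffering rate standing in for the glial-uptake and diffusion terms (the rate constant is an assumed value for demonstration).

```python
import math

def nernst_potential(c_out, c_in, z=1, temp_c=37.0):
    """Nernst reversal potential in mV for an ion with valence z."""
    R, F = 8.314, 96485.0
    t_kelvin = temp_c + 273.15
    return 1000.0 * (R * t_kelvin) / (z * F) * math.log(c_out / c_in)

def relax_ko(ko0, k_bath=4.0, rate=0.2, dt=0.01, t_end=50.0):
    """Forward-Euler relaxation of [K]o (mM) toward the bath concentration.

    `rate` (1/s) is an assumed lumped constant standing in for the
    combined glial-buffering and diffusion terms of the reduced model.
    """
    ko = ko0
    for _ in range(int(t_end / dt)):
        ko += dt * (-rate * (ko - k_bath))
    return ko

# Normal vs seizure-like extracellular K+ (mM); intracellular fixed at 140 mM.
e_k_rest = nernst_potential(4.0, 140.0)   # ~ -95 mV near rest
e_k_high = nernst_potential(12.0, 140.0)  # ~ -66 mV: markedly depolarized
ko_final = relax_ko(12.0)                 # buffering returns [K]o toward 4 mM
```

The depolarized E_K at elevated [K]o is the basic reason glial failure to buffer potassium pushes the network toward seizure-like activity in the model.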
Abstract This paper introduces dyadic brain modeling – the simultaneous, computational modeling of the brains of two interacting agents – to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the ‘dyad’). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists and modelers that will help speed the emergence of what may be called comparative neuroprimatology. Abstract The phase response curve (PRC) reflects the dynamics of the interplay between diverse intrinsic conductances that lead to spike generation. PRCs measure the spike-time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddle-node) dynamics, while type II (subcritical Hopf) dynamics yield a biphasic PRC with both negative and positive lobes. Previous computational work hypothesized that cholinergic modulation of the M-type potassium current can switch a neuron from type II to type I dynamics. 
We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with downregulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential specific K-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to downregulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent I_AHP is downregulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified by low ACh levels, while the PRC became type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may shift neuronal responses between “resonator” and “integrator” modes. Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double-precision values by each subtree at each time step. Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For compute-bound simulations, load balance results in almost ideal runtime scaling. Application of the cell-splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing. Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts. 
It has recently been reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. Ca2+-activated K+ currents (I K(Ca)) of cultured human cardiac fibroblasts were characterized in this study. In the whole-cell configuration, depolarizing pulses evoked an outwardly rectifying I K(Ca) in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance Ca2+-activated K+ (BKCa) channel with a single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BKCa channels. The dynamic Luo-Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; without gap-junctional coupling, however, they were electrically inert. The simulation predicts that changes in gap-junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. I K(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes. This study demonstrates that a BKCa channel is functionally expressed in human cardiac fibroblasts. The activity of these BKCa channels in human cardiac fibroblasts may contribute to the functional activities of heart cells through transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of their solutions and the transitions between them as parameters are varied. To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multicompartment models. 
We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation: tonic spiking, intermediate bursting, and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multicompartment conductance-based model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli. Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement. 
Results We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. Conclusion Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement, and underlies the concentration-independence of odor quality representations. Abstract Mathematical neuronal models are normally expressed using differential equations. The Parker-Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker-Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin-Huxley type neuron, comparing the results with those obtained using the Runge-Kutta and Bulirsch-Stoer methods. 
Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC-granule cell (GC) circuit with and without gap junctions between GoCs. Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change the oscillation frequency or the mean firing rate of either GoCs or GCs. Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs. The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours or days of supercomputing time. 
Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. In consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics, and intrinsic noise. We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data. Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes from development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models. Recent studies show that both the conductance and the kinetics of these receptors change voltage-dependently in some parts of the brain. Therefore, several models have been introduced to simulate their current. However, on the one hand, kinetic models, which are able to simulate these voltage-dependent phenomena, are computationally expensive for the modeling of large neural networks. On the other hand, classic exponential models, which are computationally less expensive, cannot accurately simulate the voltage dependency of these receptors. 
In this study, we have modified these classic models to endow them with voltage-dependent conductances and time constants. Temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of NMDA receptor behavior using only three to four differential equations, significantly fewer than in previous kinetic models. Consequently, our model is both fast and physiologically plausible, and is therefore a suitable candidate for the modeling of large neural networks. Abstract Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons. This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal cell firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that there is an obvious rhythmic gating effect of the γ-oscillating interneuron networks on the pyramidal neuron’s signal transmission. This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of coherence and oscillation frequency of network activities. 
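The voltage dependence that the NMDA-receptor abstract above seeks to capture is commonly written as a multiplicative magnesium-block factor on an otherwise exponential synaptic conductance. The sketch below uses the standard Jahr-Stevens block expression together with a dual-exponential time course; it is a generic illustration, not the modified model of the abstract, and the maximal conductance and time constants are illustrative values, not fitted parameters.

```python
import math

def mg_block(v_mv, mg_mm=1.0):
    """Jahr-Stevens voltage-dependent Mg2+ block factor, between 0 and 1."""
    return 1.0 / (1.0 + (mg_mm / 3.57) * math.exp(-0.062 * v_mv))

def nmda_current(t_ms, v_mv, g_max_ns=1.0, e_rev_mv=0.0,
                 tau_rise=2.0, tau_decay=100.0):
    """Dual-exponential NMDA conductance scaled by the Mg2+ block.

    g_max_ns and the time constants are assumed illustrative values.
    Returns current in nS*mV units; negative means inward.
    """
    shape = math.exp(-t_ms / tau_decay) - math.exp(-t_ms / tau_rise)
    return g_max_ns * shape * mg_block(v_mv) * (v_mv - e_rev_mv)

block_rest = mg_block(-70.0)   # strong block near rest (~0.04)
block_depol = mg_block(0.0)    # block largely relieved when depolarized (~0.78)
i_rest = nmda_current(10.0, -70.0)  # small inward current despite the driving force
```

Adding this single voltage-dependent factor is what makes the classic exponential synapse qualitatively NMDA-like while keeping the model to a couple of state variables.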
Abstract Background Propagation of simulated action potentials (APs) was previously studied in short single chains and in two-dimensional sheets of myocardial cells. The present study was undertaken to examine propagation in a long single chain of cells of various lengths, and with varying numbers of gap-junction (gj) channels, and to compare propagation velocity with cable properties such as the length constant (λ). Methods and Results Simulations were carried out using the PSpice program as previously described. When the electric field (EF) mechanism was dominant (0, 1, and 10 gj channels), the longer the chain length, the faster the overall velocity (θov). There seems to be no simple explanation for this phenomenon. In contrast, when the local-circuit current mechanism was dominant (100 gj channels or more), θov was slightly slowed by lengthening of the chain. Increasing the number of gj channels produced an increase in θov and caused the firing order to become more uniform. The end-effect was more pronounced at longer chain lengths and at greater numbers of gj channels. When there were no or only a few gj channels (namely 0, 10, or 30), the voltage change (ΔVm) in the two cells (#50 and #52) contiguous to the cell injected with current (#51) was nearly zero, i.e., there was a sharp discontinuity in voltage between the adjacent cells. When there were many gj channels (e.g., 300, 1000, 3000), there was an exponential decay of voltage on either side of the injected cell, with the length constant (λ) increasing at higher numbers of gj channels. The effect of increasing the number of gj channels on increasing λ was relatively small compared to the larger effect on θov. θov became very non-physiological at 300 gj channels or higher. Conclusion Thus, when there were only 0, 1, or 10 gj channels, θov increased with increases in chain length, whereas at 100 gj channels or higher, θov did not increase with chain length. 
When there were only 0, 10, or 30 gj channels, there was a very sharp decrease in ΔVm in the two contiguous cells on either side of the injected cell, whereas at 300, 1000, or 3000 gj channels, the voltage decay was exponential along the length of the chain. The effect of increasing the number of gj channels on the spread of current was relatively small compared to the large effect on θov. Abstract This article provides a demonstration of an analytical technique that can be used to investigate the causes of perceptual phenomena. The technique is based on the concept of the ideal observer, an optimal signal classifier that makes decisions that maximize the probability of a correct response. To demonstrate the technique, an analysis was conducted to investigate the role of the auditory periphery in the production of temporal masking effects. The ideal observer classified output from four models of the periphery. Since the ideal observer is the best of all possible observers, if it demonstrates masking effects, then all other observers must as well. If it does not demonstrate masking effects, then nothing about the periphery requires masking to occur, and therefore masking would arise somewhere else. The ideal observer exhibited several forward masking effects but did not exhibit backward masking, implying that the periphery has a causal role in forward but not backward masking. A general discussion of the strengths of the technique and supplementary equations are also included. Abstract Understanding the human brain and its function in health and disease represents one of the greatest scientific challenges of our time. In the post-genomic era, an overwhelming accumulation of new data, at all levels of exploration from DNA to human brain imaging, has been acquired. 
This accumulation of facts has not given rise to a corresponding increase in the understanding of integrated functions in this vast area of research, which involves a large number of fields extending from genetics to psychology. Neuroinformatics (NI) is uniquely placed at the intersection between neuroscience and information technology, and emerges as an area of critical importance to facilitate future conceptual development in neuroscience by creating databases that transcend different organizational levels and allow for the development of computational models ranging from the subcellular to the global brain level. Abstract This paper studied synaptic and dendritic integration with different spatial distributions of synapses on the dendrites of a biophysically detailed layer 5 pyramidal neuron model. It has been observed that temporally synchronous and spatially clustered synaptic inputs make dendrites perform a highly nonlinear integration. The effect of the clustering degree of the synaptic distribution on neuronal responsiveness is investigated by changing the number of top apical dendrites to which active synapses are allocated. The neuron shows maximum responsiveness to synaptic inputs with an intermediate clustering degree of spatial distribution, indicating complex interactions among dendrites in the presence of nonlinear synaptic and dendritic integration. Abstract This paper describes a pilot query interface that has been constructed to help us explore a “concept-based” approach for searching the Neuroscience Information Framework (NIF). The query interface is concept-based in the sense that the search terms submitted through the interface are selected from a standardized vocabulary of terms (concepts) that are structured in the form of an ontology. The NIF contains three primary resources: the NIF Resource Registry, the NIF Document Archive, and the NIF Database Mediator. 
These NIF resources are very different in their nature and therefore pose challenges when designing a single interface from which searches can be automatically launched against all three resources simultaneously. The paper first discusses briefly several background issues involving the use of standardized biomedical vocabularies in biomedical information retrieval, and then presents a detailed example that illustrates how the pilot concept-based query interface operates. The paper concludes by discussing certain lessons learned in the development of the current version of the interface. Abstract Simulations of orientation selectivity in visual cortex have shown that layer 4 complex cells lacking orientation tuning are ideal for providing global inhibition that scales with contrast in order to produce simple cells with contrast-invariant orientation tuning (Lauritzen and Miller in J Neurosci 23:10201–10213, 2003). Inhibitory cortical cells have been shown to be electrically coupled by gap junctions (Fukuda and Kosaka in J Neurosci 120:5–20, 2003). Such coupling promotes, among other effects, spike synchronization and coordination of postsynaptic IPSPs (Beierlein et al. in Nat Neurosci 3:904–910, 2000; Galarreta and Hestrin in Nat Rev Neurosci 2:425–433, 2001). Consequently, it was expected (Miller in Cereb Cortex 13:73–82, 2003) that electrical coupling would promote nonspecific functional responses consistent with the complex inhibitory cells seen in layer 4, which provide broad inhibition in response to stimuli of all orientations (Miller et al. in Curr Opin Neurobiol 11:488–497, 2001). This was tested using a mechanistic modeling approach. The orientation selectivity model of Lauritzen and Miller (J Neurosci 23:10201–10213, 2003) was reproduced with and without electrical coupling between complex inhibitory neurons. 
Although extensive coupling promotes uniform firing in complex cells, there were no detectable improvements in contrast-invariant orientation selectivity unless there were coincident changes in complex cell firing rates to offset the untuned excitatory component that grows with contrast. Thus, changes in firing rates (with or without coupling) could improve the contrast-invariant orientation tuning of simple cells, whereas synchronization of complex inhibitory neurons alone could not. Abstract Coral polyps contract when electrically stimulated, and a wave of contraction travels from the site of stimulation at a constant speed. Models of coral nerve networks were optimized to match one of three different experimentally observed behaviors. To search for model parameters that reproduce the experimental observations, we applied genetic algorithms to increasingly more complex models of a coral nerve net. In a first stage of optimization, individual neurons responded with spikes to multiple, but not single, pulses of activation. In a second stage, we used these neurons as the starting point for the optimization of a two-dimensional nerve net. This strategy yielded a network with parameters that reproduced the experimentally observed spread of excitation. Abstract Spike-wave discharges are a distinctive feature of epileptic seizures. So far, they have not been reported in spatially extended neural field models. We study a space-independent version of the Amari neural field model with two competing inhibitory populations. We show that this competition leads to robust spike-wave dynamics if the inhibitory populations operate on different timescales. The spike-wave oscillations present a fold/homoclinic type bursting. From this result we predict parameters of the extended Amari system where spike-wave oscillations produce a spatially homogeneous pattern. We propose this mechanism as a prototype of macroscopic epileptic spike-wave discharges. 
To our knowledge, this is the first example of robust spike-wave patterns in a spatially extended neural field model. Abstract Cortical gamma frequency (30–80 Hz) oscillations have been suggested to underlie many aspects of cognitive function. In this paper we compare the f-I curves modulated by a gamma-frequency-modulated stimulus and by Poisson synaptic input at the distal dendrites of a layer V pyramidal neuron model. The results show that gamma-frequency distal input amplifies the sensitivity of the neural response to basal input, and enhances gain modulation of the neuron. Abstract Inward rectifying potassium (KIR) currents in medium spiny (MS) neurons of the nucleus accumbens inactivate significantly in ~40% of the neurons but not in the rest, which may lead to differences in input processing by these two groups. Using a 189-compartment computational model of the MS neuron, we investigate the influence of this property using injected current as well as spatiotemporally distributed synaptic inputs. Our study demonstrates that KIR current inactivation facilitates depolarization, firing frequency, and firing onset in these neurons. These effects may be attributed to the higher input resistance of the cell as well as the more depolarized resting/down-state potential induced by the inactivation of this current. In view of reports that dendritic intracellular calcium levels depend closely on burst strength and spike onset time, our findings suggest that inactivation of KIR currents may offer a means of modulating both excitability and synaptic plasticity in MS neurons. Abstract Epileptic seizures in diabetic hyperglycemia (DH) are not uncommon. This study aimed to determine the acute behavioral, pathological, and electrophysiological effects of status epilepticus (SE) on diabetic animals. 
Adult male Sprague-Dawley rats were first divided into groups with and without streptozotocin (STZ)-induced diabetes, and then into treatment groups given normal saline (NS) (STZ-only and NS-only) or a lithium-pilocarpine injection to induce status epilepticus (STZ + SE and NS + SE). Seizure susceptibility, severity, and mortality were evaluated. Serial Morris water maze test and hippocampal histopathology results were examined before and 24 h after SE. Tetanic stimulation-induced long-term potentiation (LTP) in a hippocampal slice was recorded in a multi-electrode dish system. We also used a simulation model to evaluate intracellular adenosine triphosphate (ATP) and neuroexcitability. The STZ + SE group had a significantly higher percentage of severe seizures and SE-related death, and worse learning and memory performance, than the other three groups 24 h after SE. The STZ + SE group, followed by the NS + SE group, showed the most severe neuronal loss and mossy fiber sprouting in the hippocampal CA3 area. In addition, LTP was markedly attenuated in the STZ + SE group, followed by the NS + SE group. In the simulation, increased intracellular ATP concentration promoted action potential firing. The finding that rats with DH had more brain damage after SE than rats without diabetes suggests the importance of intensively treating hyperglycemia and seizures in diabetic patients with epilepsy. Neuroinformatics is a multifaceted field. It is as broad as the field of neuroscience. The various domains of NI may also share common features such as databases, data mining systems, and data modeling tools. NI projects are often coordinated by user groups or research organizations. Large-scale infrastructure supporting NI development is also a vital aspect of the field. Abstract Channelrhodopsin-2 (ChR2) is a class of light-sensitive proteins that offer the ability to use light stimulation to regulate neural activity with millisecond precision. 
In order to address the limitations in the efficacy of wild-type ChR2 (ChRwt) in achieving this objective, new variants of ChR2 that exhibit fast mono-exponential photocurrent decay characteristics have recently been developed and validated. In this paper, we investigate whether the framework of a four-state transition rate model, primarily developed to mimic the bi-exponential photocurrent decay kinetics of ChRwt, is warranted, as opposed to the lower-complexity three-state model, to mimic the mono-exponential photocurrent decay kinetics of the newly developed fast ChR2 variants: ChETA (Gunaydin et al., Nature Neurosci. 13:387–392, 2010) and ChRET/TC (Berndt et al., Proc. Natl. Acad. Sci. 108:7595–7600, 2011). We begin by estimating the parameters of the three-state and four-state models from experimental data on the photocurrent kinetics of ChRwt, ChETA, and ChRET/TC. We then incorporate these models into a fast-spiking interneuron model (Wang and Buzsaki, J. Neurosci. 16:6402–6413, 1996) and a hippocampal pyramidal cell model (Golomb et al., J. Neurophysiol. 96:1912–1926, 2006) and investigate the extent to which the experimentally observed neural responses to various optostimulation protocols can be captured by these models. We demonstrate that for all ChR2 variants investigated, the four-state model implementation better captures neural responses consistent with experiments across a wide range of optostimulation protocols. We conclude by analytically investigating the conditions under which the characteristic specific to the three-state model, namely the mono-exponential photocurrent decay of the newly developed variants of ChR2, can occur in the framework of the four-state model. Ectopic expression of glucagon-like peptide 1 for gene therapy of type II diabetes Gene Therapy Glucagon-like peptide 1 (GLP-1) is a promising candidate for the treatment of type II diabetes. However, the short in vivo half-life of GLP-1 has made peptide-based treatments challenging.
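As an illustration of the ChR2 kinetics discussed above: a minimal three-state scheme (closed → open → desensitized → closed) produces exactly the mono-exponential photocurrent decay at issue, because after light-off the open state empties at a single rate. The sketch below is a generic Euler integration with illustrative rate constants; it is not fitted to any ChR2 variant.

```python
import math

def simulate_chr2_3state(t_on=5.0, t_total=20.0, dt=0.001,
                         ka=0.5, kd=0.2, kr=0.05):
    """Euler integration of a generic three-state ChR2 scheme.
    States: C (closed), O (open, conducting), D (desensitized).
    Light drives C->O at rate ka; O->D at kd; D->C recovers at kr.
    Rate constants are illustrative, not fitted values.
    Returns a list of (time, open_fraction) samples; the photocurrent
    is proportional to the open fraction O."""
    C, O, D = 1.0, 0.0, 0.0
    trace = []
    for i in range(int(t_total / dt)):
        t = i * dt
        light = ka if t < t_on else 0.0   # light off after t_on
        dC = kr * D - light * C
        dO = light * C - kd * O
        dD = kd * O - kr * D
        C += dC * dt
        O += dO * dt
        D += dD * dt
        trace.append((t, O))
    return trace
```

After light-off the only remaining flux out of O is O → D at rate kd, so the open fraction (and hence the photocurrent) decays as a single exponential exp(-kd·t), which is the defining property of the three-state model discussed in the abstract.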
Gene therapy aimed at achieving continuous GLP-1 expression presents one way to circumvent the rapid turnover of GLP-1. We have created a GLP-1 minigene that can direct the secretion of active GLP-1 (amino acids 7–37). Plasmid and adenoviral expression vectors encoding the 31-amino-acid peptide linked to leader sequences required for secretion of GLP-1 yielded sustained levels of active GLP-1 that were significantly greater than endogenous levels. Systemic administration of expression vectors to animals using two diabetic rodent models, db/db mice and Zucker Diabetic Fatty (ZDF) rats, yielded elevated GLP-1 levels that lowered both the fasting and random-fed hyperglycemia present in these animals. Because the insulinotropic actions of GLP-1 are glucose dependent, no evidence of hypoglycemia was observed. Improved glucose homeostasis was demonstrated by improvements in %HbA1c (glycated hemoglobin) and in glucose tolerance tests. GLP-1-treated animals had higher circulating insulin levels and increased insulin immunostaining of pancreatic sections. GLP-1-treated ZDF rats showed diminished food intake and, in the first few weeks following vector administration, a diminished weight gain. These results demonstrate the feasibility of gene therapy for type II diabetes using GLP-1 expression vectors. Diabetes impairs hippocampal function through glucocorticoid-mediated effects on new and mature neurons Nature Neuroscience Many organ systems are adversely affected by diabetes, including the brain, which undergoes changes that may increase the risk of cognitive decline. Although diabetes influences the hypothalamic-pituitary-adrenal axis, the role of this neuroendocrine system in diabetes-induced cognitive dysfunction remains unexplored. 
Here we demonstrate that, in both insulin-deficient rats and insulin-resistant mice, diabetes impairs hippocampus-dependent memory, perforant path synaptic plasticity and adult neurogenesis, and that the adrenal steroid corticosterone contributes to these adverse effects. Rats treated with streptozocin have reduced insulin and show hyperglycemia, increased corticosterone, and impairments in hippocampal neurogenesis, synaptic plasticity and learning. Similar deficits are observed in db/db mice, which are characterized by insulin resistance, elevated corticosterone and obesity. Changes in hippocampal plasticity and function in both models are reversed when normal physiological levels of corticosterone are maintained, suggesting that cognitive impairment in diabetes may result from glucocorticoid-mediated deficits in neurogenesis and synaptic plasticity. The International Neuroinformatics Coordinating Facility: Evaluating the First Years Neuroinformatics Summary This chapter constitutes mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was part of the Annual Computational Neuroscience Meeting CNS*2007, which took place in July 2007 in Toronto, Canada (http://www.cnsorg.org). The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Selected topics from the discussions are presented in this chapter, including a review of current and potential applications of Computational Intelligence (CI) in electrophysiology; database and electrophysiological data exchange platforms, languages, and formats; and exemplary analysis problems. The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers whose interest may be sparked by its contents.
Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. We are particularly interested in the slower-timescale dynamics that determine overall excitability and set the stage for transient episodes of persistent oscillations, working memory, or seizures. In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley-type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics on the stability of persistent activity. We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions. Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in the neonatal brain or spinal cord, up states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potential (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from EAP data sampled by an extracellular probe at multiple independent recording locations.
We used a dipole for the model source because there is extensive evidence that it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell-probe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole moment optimization in which the dipole location is fixed, and a second, nonlinear, global optimization of the source location. We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis. We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among the various model error contributions to the residual, we address especially the error in probe geometry, and the extent to which it biases estimates of dipole parameters. This dipole characterization method can be applied to any recording technique capable of taking multiple independent measurements of the same single units. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics.
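The linear step of the dipole characterization above (moment optimization at a fixed location) reduces to ordinary least squares against the probe's lead fields. A minimal pure-Python sketch, assuming an infinite homogeneous medium for the forward model (the work described uses finite-element lead fields for a realistic probe + brain geometry); site coordinates and the conductivity value are purely illustrative:

```python
import math

def dipole_potential(p, r0, r, sigma=0.3):
    """Potential of a current dipole with moment p at r0, measured at r,
    in an infinite homogeneous medium of conductivity sigma (S/m):
    V = p.(r - r0) / (4 pi sigma |r - r0|^3)."""
    d = [r[i] - r0[i] for i in range(3)]
    dist = math.sqrt(sum(x * x for x in d))
    return sum(p[i] * d[i] for i in range(3)) / (4 * math.pi * sigma * dist ** 3)

def solve3(A, b):
    """Gauss-Jordan elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [M[r][k] - f * M[col][k] for k in range(4)]
    return [M[i][3] / M[i][i] for i in range(3)]

def fit_moment(r0, sites, v, sigma=0.3):
    """Least-squares dipole moment for a fixed location r0:
    build the lead field (potential per unit moment along x, y, z at
    each site) and solve the 3x3 normal equations."""
    L = [[dipole_potential([1.0 if j == k else 0.0 for k in range(3)],
                           r0, s, sigma) for j in range(3)] for s in sites]
    A = [[sum(L[m][i] * L[m][j] for m in range(len(sites)))
          for j in range(3)] for i in range(3)]
    b = [sum(L[m][i] * v[m] for m in range(len(sites))) for i in range(3)]
    return solve3(A, b)
```

Because the potential is linear in the moment, a noise-free moment is recovered exactly from three or more non-degenerate recording sites; the nonlinear outer search over r0 described in the abstract would wrap this linear solve.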
We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodium-potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. These models reveal several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy. Abstract This paper introduces dyadic brain modeling – the simultaneous, computational modeling of the brains of two interacting agents – to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the ‘dyad’). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists and modelers that will help speed the emergence of what may be called comparative neuroprimatology.
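One concrete piece of the ion-concentration dynamics in the companion papers above: as extracellular potassium accumulates, the K+ reversal potential given by the Nernst equation depolarizes, which is one route by which failed buffering raises excitability. A small sketch, using typical mammalian concentrations as illustrative values:

```python
import math

def nernst_potential(conc_out_mm, conc_in_mm, z=1, temp_c=37.0):
    """Nernst reversal potential in mV: E = (RT / zF) * ln([out]/[in]).
    Concentrations in mM (only their ratio matters)."""
    R, F = 8.314, 96485.0        # J/(mol*K), C/mol
    T = temp_c + 273.15
    return 1000.0 * R * T / (z * F) * math.log(conc_out_mm / conc_in_mm)
```

With [K+]o rising from a baseline of about 4 mM toward seizure-like levels of about 12 mM (and [K+]i near 140 mM), E_K moves from roughly -95 mV to roughly -66 mV, a substantial depolarizing shift of the kind the reduced model tracks on its slow timescale.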
Abstract The phase response curve (PRC) reflects the dynamics of the interplay between diverse intrinsic conductances that lead to spike generation. PRCs measure the spike time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddle-node) dynamics, while type II (subcritical Hopf) dynamics yield a biphasic PRC with both negative and positive lobes. Previous computational work hypothesized that cholinergic modulation of the M-type potassium current can switch a neuron with type II dynamics to type I dynamics. We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with downregulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential specific K-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to downregulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent IAHP is downregulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified at low ACh levels, while the PRC becomes type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may shift neuronal responses between “resonator” and “integrator” modes. Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double precision values by each subtree at each time step.
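The type I/type II distinction drawn in the PRC abstract above is simple to operationalize: a PRC that is non-negative everywhere is type I, while one with both negative and positive lobes is type II. A toy classifier, exercised on the canonical 1 − cos θ (type I) and sin θ (type II) shapes; these model PRCs are illustrative stand-ins, not the recorded curves:

```python
import math

def classify_prc(prc_values, tol=1e-9):
    """Classify a sampled phase response curve.
    Type I: purely non-negative (phase advance only).
    Type II: biphasic, with both negative and positive lobes."""
    has_pos = any(v > tol for v in prc_values)
    has_neg = any(v < -tol for v in prc_values)
    if has_pos and has_neg:
        return "type II"
    return "type I" if has_pos else "flat"

# Canonical PRC shapes sampled over one spike cycle.
phases = [2 * math.pi * k / 100 for k in range(100)]
prc_type1 = [1 - math.cos(t) for t in phases]   # non-negative everywhere
prc_type2 = [math.sin(t) for t in phases]       # negative and positive lobes
```

On these samples the classifier returns "type I" for the 1 − cos θ curve and "type II" for the sin θ curve, mirroring the saddle-node versus subcritical-Hopf signatures described above.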
Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For compute-bound simulations, load balance results in almost ideal runtime scaling. Application of the cell splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing. Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts. It has recently been reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. Ca2+-activated K+ currents (IK(Ca)) of cultured human cardiac fibroblasts were characterized in this study. In whole-cell configuration, depolarizing pulses evoked outwardly rectifying IK(Ca) in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance Ca2+-activated K+ (BKCa) channel with a single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BKCa channels. The dynamic Luo–Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; however, they were electrically inert with no gap-junctional coupling. The simulation predicts that changes in gap junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. IK(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes.
This study demonstrates that a BKCa channel is functionally expressed in human cardiac fibroblasts. The activity of these BKCa channels in human cardiac fibroblasts may contribute to the functional activities of heart cells through transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of their solutions and the transitions between them as parameters are varied. To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multicompartment models. We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation: tonic spiking, intermediate bursting and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multicompartment conductance-based model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli.
Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement. Results We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. Conclusion Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement, and underlies the concentration-independence of odor quality representations.
Abstract Mathematical neuronal models are normally expressed using differential equations. The Parker–Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker–Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin–Huxley-type neuron, comparing the results with those obtained using the Runge–Kutta and Bulirsch–Stoer methods. Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC–granule cell (GC) circuit with and without gap junctions between GoCs. Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change the oscillation frequency or the mean firing rate of either GoCs or GCs.
Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs. This robustness effect of gap junctions on synaptically driven oscillations may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley-type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours or days of supercomputing time. Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. In consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics, and intrinsic noise. We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated for by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data.
Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes from development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models. Recent studies show that both the conductance and the kinetics of these receptors change voltage-dependently in some parts of the brain. Therefore, several models have been introduced to simulate their current. However, on the one hand, kinetic models, which are able to simulate these voltage-dependent phenomena, are computationally expensive for the modeling of large neural networks. On the other hand, classic exponential models, which are computationally less expensive, are not able to simulate the voltage-dependency of these receptors accurately. In this study, we have modified these classic models to endow them with voltage-dependent conductance and time constants. Temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of NMDA receptor behavior using only three to four differential equations, significantly fewer than in previous kinetic models. Consequently, our model is both fast and physiologically plausible, and is therefore a suitable candidate for the modeling of large neural networks. Abstract Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons.
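The voltage dependence that the NMDA-receptor abstract above addresses is classically captured by the magnesium-block sigmoid of Jahr and Stevens, which scales the conductance by a voltage-dependent unblock fraction. A minimal sketch using the commonly quoted constants (the modified exponential models in the abstract go further, making the time constants voltage-dependent too; the parameter values here are illustrative):

```python
import math

def nmda_mg_unblock(v_mv, mg_mm=1.0):
    """Fraction of the NMDA conductance not blocked by Mg2+ at membrane
    potential v_mv, via the Jahr & Stevens-style sigmoid with the
    commonly quoted constants 0.062 /mV and 3.57 mM."""
    return 1.0 / (1.0 + mg_mm * math.exp(-0.062 * v_mv) / 3.57)

def nmda_current(v_mv, g_max=1.0, e_rev_mv=0.0, mg_mm=1.0):
    """Illustrative NMDA current: g_max * unblock(V) * (V - E_rev)."""
    return g_max * nmda_mg_unblock(v_mv, mg_mm) * (v_mv - e_rev_mv)
```

Near rest (−70 mV) only a few percent of the conductance is available, while near 0 mV most of the block is relieved; this is the voltage dependence that a fixed-conductance exponential synapse model cannot reproduce.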
This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal cell firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that there is a clear rhythmic gating effect of the γ-oscillating interneuron network on the pyramidal neuron’s signal transmission. This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and appropriate inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of the coherence and oscillation frequency of network activities. Abstract Background Propagation of simulated action potentials (APs) was previously studied in short single chains and in two-dimensional sheets of myocardial cells [1-3]. The present study was undertaken to examine propagation in a long single chain of cells of various lengths, and with varying numbers of gap-junction (gj) channels, and to compare propagation velocity with cable properties such as the length constant (λ). Methods and Results Simulations were carried out using the PSpice program as previously described. When the electric field (EF) mechanism was dominant (0, 1, and 10 gj channels), the longer the chain length, the faster the overall velocity (θov). There seems to be no simple explanation for this phenomenon. In contrast, when the local-circuit current mechanism was dominant (100 gj channels or more), θov was slightly slowed by lengthening of the chain. Increasing the number of gj channels produced an increase in θov and caused the firing order to become more uniform.
The end-effect was more pronounced at longer chain lengths and at greater numbers of gj channels. When there were no or only few gj channels (namely 0, 10, or 30), the voltage change (ΔVm) in the two cells (#50 and #52) contiguous to the cell injected with current (#51) was nearly zero, i.e., there was a sharp discontinuity in voltage between the adjacent cells. When there were many gj channels (e.g., 300, 1000, or 3000), there was an exponential decay of voltage on either side of the injected cell, with the length constant (λ) increasing at higher numbers of gj channels. The effect of increasing the number of gj channels on increasing λ was relatively small compared to the larger effect on θov. θov became very non-physiological at 300 gj channels or higher. Conclusion Thus, when there were only 0, 1, or 10 gj channels, θov increased with increasing chain length, whereas at 100 gj channels or higher, θov did not increase with chain length. When there were only 0, 10, or 30 gj channels, there was a very sharp decrease in ΔVm in the two contiguous cells on either side of the injected cell, whereas at 300, 1000, or 3000 gj channels, the voltage decay was exponential along the length of the chain. The effect of increasing the number of gj channels on the spread of current was relatively small compared to the large effect on θov. Abstract This article provides a demonstration of an analytical technique that can be used to investigate the causes of perceptual phenomena. The technique is based on the concept of the ideal observer, an optimal signal classifier that makes decisions that maximize the probability of a correct response. To demonstrate the technique, an analysis was conducted to investigate the role of the auditory periphery in the production of temporal masking effects. The ideal observer classified output from four models of the periphery.
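The exponential voltage decay described in the gap-junction study above is the standard steady-state cable profile V(x) = V0·exp(−x/λ), so λ can be read off from any two points of the profile. A small sketch, with distances in units of cell lengths; the numbers are illustrative, not taken from the PSpice runs:

```python
import math

def steady_state_voltage(v0_mv, x, lam):
    """Steady-state cable decay V(x) = V0 * exp(-x / lambda),
    with x and lambda in the same units (here, cell lengths)."""
    return v0_mv * math.exp(-x / lam)

def estimate_lambda(x1, v1, x2, v2):
    """Recover lambda from two points of an exponential profile:
    lambda = (x2 - x1) / ln(V1 / V2)."""
    return (x2 - x1) / math.log(v1 / v2)
```

For example, a profile generated with λ = 3 cell lengths is recovered exactly from samples at x = 1 and x = 4, which is the same two-point readout one would apply to the simulated voltages on either side of the injected cell.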
Since the ideal observer is the best of all possible observers, if it demonstrates masking effects, then all other observers must as well. If it does not demonstrate masking effects, then nothing about the periphery requires masking to occur, and masking must therefore arise somewhere else. The ideal observer exhibited several forward masking effects but did not exhibit backward masking, implying that the periphery has a causal role in forward but not backward masking. A general discussion of the strengths of the technique and supplementary equations are also included. Abstract Understanding the human brain and its function in health and disease represents one of the greatest scientific challenges of our time. In the postgenomic era, an overwhelming accumulation of new data, at all levels of exploration from DNA to human brain imaging, has been acquired. This accumulation of facts has not given rise to a corresponding increase in the understanding of integrated functions in this vast area of research, which involves a large number of fields extending from genetics to psychology. Neuroinformatics (NI) is uniquely placed at the intersection between neuroscience and information technology, and emerges as an area of critical importance for facilitating future conceptual development in neuroscience by creating databases that transcend different organizational levels and allow for the development of computational models from the subcellular to the global brain level. Abstract This paper studied synaptic and dendritic integration with different spatial distributions of synapses on the dendrites of a biophysically detailed layer 5 pyramidal neuron model. It has been observed that temporally synchronous and spatially clustered synaptic inputs make dendrites perform a highly nonlinear integration.
The effect of the clustering degree of the synaptic distribution on neuronal responsiveness is investigated by changing the number of top apical dendrites to which active synapses are allocated. The neuron shows maximum responsiveness to synaptic inputs with an intermediate clustering degree of spatial distribution, indicating complex interactions among dendrites in the presence of nonlinear synaptic and dendritic integration. Abstract This paper describes a pilot query interface that has been constructed to help us explore a “concept-based” approach for searching the Neuroscience Information Framework (NIF). The query interface is concept-based in the sense that the search terms submitted through the interface are selected from a standardized vocabulary of terms (concepts) that are structured in the form of an ontology. The NIF contains three primary resources: the NIF Resource Registry, the NIF Document Archive, and the NIF Database Mediator. These NIF resources are very different in nature and therefore pose challenges when designing a single interface from which searches can be automatically launched against all three resources simultaneously. The paper first briefly discusses several background issues involving the use of standardized biomedical vocabularies in biomedical information retrieval, and then presents a detailed example that illustrates how the pilot concept-based query interface operates. The paper concludes by discussing certain lessons learned in the development of the current version of the interface. Abstract Simulations of orientation selectivity in visual cortex have shown that layer 4 complex cells lacking orientation tuning are ideal for providing global inhibition that scales with contrast in order to produce simple cells with contrast-invariant orientation tuning (Lauritzen and Miller in J Neurosci 23:10201–10213, 2003).
Inhibitory cortical cells have been shown to be electrically coupled by gap junctions (Fukuda and Kosaka in J Neurosci 120:5–20, 2003). Such coupling promotes, among other effects, spike synchronization and coordination of postsynaptic IPSPs (Beierlein et al. in Nat Neurosci 3:904–910, 2000; Galarreta and Hestrin in Nat Rev Neurosci 2:425–433, 2001). Consequently, it was expected (Miller in Cereb Cortex 13:73–82, 2003) that electrical coupling would promote nonspecific functional responses consistent with the complex inhibitory cells seen in layer 4 which provide broad inhibition in response to stimuli of all orientations (Miller et al. in Curr Opin Neurobiol 11:488–497, 2001). This was tested using a mechanistic modeling approach. The orientation selectivity model of Lauritzen and Miller (J Neurosci 23:10201–10213, 2003) was reproduced with and without electrical coupling between complex inhibitory neurons. Although extensive coupling promotes uniform firing in complex cells, there were no detectable improvements in contrast-invariant orientation selectivity unless there were coincident changes in complex cell firing rates to offset the untuned excitatory component that grows with contrast. Thus, changes in firing rates (with or without coupling) could improve contrast-invariant orientation tuning of simple cells, whereas synchronization of complex inhibitory neurons alone could not. Abstract Coral polyps contract when electrically stimulated and a wave of contraction travels from the site of stimulation at a constant speed. Models of coral nerve networks were optimized to match one of three different experimentally observed behaviors. To search for model parameters that reproduce the experimental observations, we applied genetic algorithms to increasingly more complex models of a coral nerve net. In a first stage of optimization, individual neurons responded with spikes to multiple, but not single, pulses of activation.
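Staged parameter searches of this kind rest on a simple evolutionary loop: score a population, keep the fitter candidates, and generate offspring by crossover and mutation. A minimal sketch (the two-parameter fitness function is a hypothetical stand-in, not the coral-model objective):

```python
import random

def evolve(fitness, n_params, pop_size=40, generations=60,
           mutation=0.1, seed=1):
    # Minimal genetic algorithm: truncation selection, midpoint crossover,
    # Gaussian mutation. `fitness` is maximized; the best half is kept
    # unchanged each generation (elitism), so fitness never regresses.
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(n_params)]
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            children.append([(x + y) / 2 + rng.gauss(0, mutation)
                             for x, y in zip(a, b)])
        pop = parents + children
    return max(pop, key=fitness)

# Hypothetical objective: recover two target parameter values.
target = [0.3, -0.7]
best = evolve(lambda p: -sum((x - t) ** 2 for x, t in zip(p, target)), 2)
```

In the staged strategy the abstract describes, the fitness function would first score single-neuron spiking behavior and then, in a second stage, the spread of excitation through the net.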
In a second stage, we used these neurons as the starting point for the optimization of a two-dimensional nerve net. This strategy yielded a network with parameters that reproduced the experimentally observed spread of excitation. Abstract Spike-wave discharges are a distinctive feature of epileptic seizures. So far, they have not been reported in spatially extended neural field models. We study a space-independent version of the Amari neural field model with two competing inhibitory populations. We show that this competition leads to robust spike-wave dynamics if the inhibitory populations operate on different timescales. The spike-wave oscillations present a fold/homoclinic-type bursting. From this result we predict parameters of the extended Amari system where spike-wave oscillations produce a spatially homogeneous pattern. We propose this mechanism as a prototype of macroscopic epileptic spike-wave discharges. To our knowledge this is the first example of robust spike-wave patterns in a spatially extended neural field model. Abstract Cortical gamma-frequency (30–80 Hz) oscillations have been suggested to underlie many aspects of cognitive functions. In this paper we compare the f–I curves modulated by a gamma-frequency-modulated stimulus and Poisson synaptic input at distal dendrites of a layer V pyramidal neuron model. The results show that gamma-frequency distal input amplifies the sensitivity of the neural response to basal input, and enhances gain modulation of the neuron. Multiple Proteolytic Enzymes in the Human Mast Cells Nature In several recent publications1,2 we have reported the existence of a very active proteolytic enzyme in a human spleen which contained an excessive number of mast cells.
Proteolytic activity was first discovered using bovine fibrin as substrate, and subsequent experimentation revealed that mast-cell homogenates hydrolysed the ester linkage of p-toluene-sulphonyl arginine methyl ester (TAMe) and acetyltyrosine ethyl ester (ATEe) at appreciable rates3,4. Minimum information requested in the annotation of biochemical models (MIRIAM) Nature Biotechnology Most of the published quantitative models in biology are lost for the community because they are either not made available or they are insufficiently characterized to allow them to be reused. The lack of a standard description format, lack of stringent reviewing and authors' carelessness are the main causes for incomplete model descriptions. With today's increased interest in detailed biochemical models, it is necessary to define a minimum quality standard for the encoding of those models. We propose a set of rules for curating quantitative models of biological systems. These rules define procedures for encoding and annotating models represented in machine-readable form. We believe their application will enable users to (i) have confidence that curated models are an accurate reflection of their associated reference descriptions, (ii) search collections of curated models with precision, (iii) quickly identify the biological phenomena that a given curated model or model constituent represents and (iv) facilitate model reuse and composition into large subcellular models. A feedforward model for the formation of a grid field where spatial information is provided solely from place cells Biological Cybernetics Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model.
Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a nonredundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent.
Programming language: Perl, BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI BLAST package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project.
The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity.
However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell includes organelles and intracellular solutions, and it is separated from the outer environment of extracellular liquid surrounding the cell by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. The differences in the charges and concentrations of ions cause, respectively, electrical and chemical potentials, generating transport of materials across the membrane. Here we look at the core ideas of mathematical modeling associated with the dynamic behaviors of single cells, as well as the bases of numerical simulation. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al.
2008, 2009) is a software program on which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command-line interface to manipulate large-scale models based on Python, which is a powerful scripting language. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results be reproducible. This is naturally expected for wet-lab experiments and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the information necessary and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors.
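Automating such citation entry amounts to parsing a free-text reference and querying PubMed with the extracted fields. A hedged Python sketch (the article itself uses Perl; the regexes and the exact search-field tags here are illustrative heuristics, while the E-utilities endpoint is NCBI's real esearch service; no network request is made):

```python
import re
from urllib.parse import urlencode

def citation_query(citation):
    # Pull a leading author surname and a four-digit year out of a
    # free-text citation, then build a PubMed esearch URL.
    author = re.match(r"([A-Z][\w'-]+)", citation)
    year = re.search(r"\b(19|20)\d{2}\b", citation)
    terms = []
    if author:
        terms.append(author.group(1) + "[Author]")
    if year:
        terms.append(year.group(0) + "[Date - Publication]")
    return ("https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?"
            + urlencode({"db": "pubmed", "term": " AND ".join(terms)}))

url = citation_query("Hines ML, Carnevale NT (2004) J Comput Neurosci 17:7-11")
```

Fetching that URL would return candidate PMIDs, from which the full validated record can then be retrieved and formatted.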
We describe a method and make available computer scripts which automate the process of citation entry. We use an Open Citation Project Perl module (PARSER) for parsing citation data, which is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. To understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain (${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies.
These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front ends requires considerable bioinformatic skill. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques have become increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched with a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function.
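The declarative, XML-based style of model description that NeuroML exemplifies can be sketched in a few lines of Python. The element and attribute names below are purely illustrative and do not follow the actual NeuroML schema; the point is only that a morphology becomes data any simulator can parse, rather than code tied to one tool:

```python
import xml.etree.ElementTree as ET

def cell_to_xml(segments):
    # Serialize a morphology (id, parent, length, diameter) into a
    # declarative XML description. Element names are hypothetical,
    # not the real NeuroML schema.
    root = ET.Element("cell", id="example")
    morph = ET.SubElement(root, "morphology")
    for seg in segments:
        attrs = {"id": str(seg["id"]),
                 "length_um": str(seg["length"]),
                 "diameter_um": str(seg["diameter"])}
        if seg["parent"] is not None:
            attrs["parent"] = str(seg["parent"])
        ET.SubElement(morph, "segment", attrs)
    return ET.tostring(root, encoding="unicode")

xml_text = cell_to_xml([
    {"id": 0, "parent": None, "length": 20.0, "diameter": 20.0},  # soma
    {"id": 1, "parent": 0, "length": 300.0, "diameter": 2.0},     # dendrite
])
```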
Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded in empirical evidence, explain (or fail to explain) existing empirical results, and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp. 741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the simulator for neural networks and action potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu . Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and build on them for further research. Its use can aid the field of computational neuroscience to enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition.
PPI reflects GABA A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA A reversal potential (E GABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy.
In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of the reconstruction files is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic types (pyramidal cell dendrites and olivary axonal projection terminals), and labeling methods (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of this study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions arriving at the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductivities of their dendritic membrane.
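The binary-tree-of-truncated-cones representation mentioned above can be sketched as a small recursive data structure, here with one derived morphometric (membrane surface area). Field names are hypothetical, and real reconstruction formats such as SWC or Neurolucida files differ in detail:

```python
from dataclasses import dataclass, field
from math import pi, hypot

@dataclass
class Neurite:
    # One traced segment: a truncated cone (frustum) with up to two
    # children, as in binary-tree reconstruction files. Units: micrometers.
    length: float
    diam_prox: float
    diam_dist: float
    children: list = field(default_factory=list)

    def membrane_area(self):
        # Lateral frustum area of this segment plus all descendants.
        r1, r2 = self.diam_prox / 2, self.diam_dist / 2
        slant = hypot(self.length, r1 - r2)
        return pi * (r1 + r2) * slant + sum(
            c.membrane_area() for c in self.children)

tip    = Neurite(50.0, 1.0, 0.5)
branch = Neurite(100.0, 2.0, 1.0, [tip, Neurite(60.0, 1.0, 0.4)])
```

The same traversal pattern yields total length, branch order, or asymmetry measures, which is why this parsimonious format supports both morphometry and compartmental model building.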
This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high- and low-depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for the synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons.
Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS–US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive biologically plausible framework, without recourse to delay lines or other special-purpose circuits.
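The TD rule at the heart of such dopamine models is compact: the prediction error r + γV(s') − V(s) both updates the value estimate and is the quantity compared to phasic dopamine responses. A generic tabular TD(0) sketch (the episode structure is a hypothetical stand-in, not the paper's LSTM network):

```python
def td_zero(episodes, n_states, alpha=0.1, gamma=0.9):
    # Tabular TD(0): V(s) += alpha * (r + gamma * V(s') - V(s)).
    # Each episode is a list of (state, reward, next_state) transitions;
    # next_state of None marks the terminal step.
    V = [0.0] * n_states
    for episode in episodes:
        for s, r, s_next in episode:
            target = r + (gamma * V[s_next] if s_next is not None else 0.0)
            V[s] += alpha * (target - V[s])
    return V

# Hypothetical trace-conditioning chain: cue (state 0) -> delay (state 1)
# -> reward at the end of the delay.
episode = [(0, 0.0, 1), (1, 1.0, None)]
V = td_zero([episode] * 500, n_states=2)
```

With repeated episodes the delay state's value approaches the reward magnitude and the cue's value approaches γ times that, so the prediction error migrates backward from reward to cue, the signature that TD models match to dopamine recordings.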
Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. Using mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses distributed uniformly along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and function of effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization in general and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns.
The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and with interburst intervals that decreased as the intensity of activation rose. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, the latter cells displayed a variety of output discharge patterns of different complexities, including stochastic ones. A relatively short apical dendritic subtree in layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in shaping the activity of layer 2/3 pyramidal neurons and, thereby, a higher efficiency of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis.
Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using nonhomogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of any involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties.
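The three-channel Hodgkin–Huxley model just described (leak, transient sodium, delayed rectifier potassium) can be sketched directly from its standard rate equations. This is a minimal single-compartment sketch using the textbook squid-axon parameters and simple forward-Euler integration, not the exact formulation of any chapter discussed here.

```python
import math

# Standard Hodgkin-Huxley squid-axon parameters (rest near -65 mV).
GNA, GK, GL = 120.0, 36.0, 0.3     # max conductances, mS/cm^2
ENA, EK, EL = 50.0, -77.0, -54.4   # reversal potentials, mV
CM = 1.0                            # membrane capacitance, uF/cm^2

def a_m(V):                         # Na activation opening rate
    x = V + 40.0                    # removable singularity at x = 0
    return 1.0 if abs(x) < 1e-7 else 0.1 * x / (1.0 - math.exp(-x / 10.0))

def b_m(V): return 4.0 * math.exp(-(V + 65.0) / 18.0)
def a_h(V): return 0.07 * math.exp(-(V + 65.0) / 20.0)
def b_h(V): return 1.0 / (1.0 + math.exp(-(V + 35.0) / 10.0))

def a_n(V):                         # K activation opening rate
    x = V + 55.0
    return 0.1 if abs(x) < 1e-7 else 0.01 * x / (1.0 - math.exp(-x / 10.0))

def b_n(V): return 0.125 * math.exp(-(V + 65.0) / 80.0)

def simulate(i_inj, t_stop=50.0, dt=0.01):
    """Forward-Euler HH run; returns spike count and voltage trace."""
    V = -65.0
    m = a_m(V) / (a_m(V) + b_m(V))   # start gates at steady state
    h = a_h(V) / (a_h(V) + b_h(V))
    n = a_n(V) / (a_n(V) + b_n(V))
    spikes, above, trace = 0, False, []
    for _ in range(int(t_stop / dt)):
        i_na = GNA * m**3 * h * (V - ENA)
        i_k = GK * n**4 * (V - EK)
        i_l = GL * (V - EL)
        m += dt * (a_m(V) * (1 - m) - b_m(V) * m)
        h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
        n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
        V += dt * (i_inj - i_na - i_k - i_l) / CM
        trace.append(V)
        if V > 0.0 and not above:    # count upward 0 mV crossings
            spikes += 1
        above = V > 0.0
    return spikes, trace

n_rest, trace_rest = simulate(0.0)    # quiescent at rest
n_drive, _ = simulate(10.0)           # repetitive firing under drive
```

Adding further channel types amounts to adding more conductance terms of the same `g * gates * (V - E)` form to the current balance.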
The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in computational neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org. In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed data of morphologies when generating those cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators such as the NEURON simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new and available data can be included into NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today.
I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from genomics, transcriptomics, proteomics, and metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data requires both standard data formats and public repositories. Standardization and a public storage system are also important for modeling because they make it possible to share models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks such as the numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given.
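The recurring task named above, numerical integration of rate equations, can be sketched for a toy mass-action model. This is a generic illustration of what such software automates, not any specific package's API; the reaction and rate constants are invented.

```python
# Toy reversible mass-action reaction A <-> B, integrated with the
# classical fourth-order Runge-Kutta scheme. Rate constants arbitrary.

def rhs(state, k1=0.3, k2=0.1):
    a, b = state
    flux = k1 * a - k2 * b          # net forward flux of A -> B
    return (-flux, flux)

def rk4_step(state, dt):
    def add(s, d, f):               # elementwise s + f*d
        return tuple(x + f * y for x, y in zip(s, d))
    k1v = rhs(state)
    k2v = rhs(add(state, k1v, dt / 2))
    k3v = rhs(add(state, k2v, dt / 2))
    k4v = rhs(add(state, k3v, dt))
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1v, k2v, k3v, k4v))

state = (1.0, 0.0)                  # initial concentrations [A], [B]
for _ in range(2000):               # integrate to t = 20 with dt = 0.01
    state = rk4_step(state, 0.01)

# At equilibrium k1*[A] = k2*[B], so [B]/[A] approaches k1/k2 = 3,
# and mass conservation keeps [A] + [B] = 1 throughout.
```

A standards-based workflow would encode the same reaction network declaratively (e.g., in an SBML-like format) and let the simulation package generate and integrate these equations automatically.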
Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques.
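The single-processor baseline mentioned in the splitting method above is worth making concrete: for an unbranched cable, each implicit time step of the discretized cable equation reduces to a tridiagonal linear system, which direct Gaussian elimination solves in linear time (the Thomas algorithm). The coefficients below are arbitrary diagonally dominant numbers standing in for an implicit cable step, not values from the paper.

```python
# Thomas algorithm: O(n) Gaussian elimination for a tridiagonal system
# A x = d, the per-time-step solve for an unbranched passive cable.
# Branched trees add a few off-diagonal entries but remain directly
# solvable, which is what the subtree-splitting method preserves
# across processors.

def thomas(a, b, c, d):
    """a: sub-diagonal (a[0] unused), b: main, c: super (c[-1] unused)."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                     # forward elimination
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):            # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Diagonally dominant illustrative system resembling an implicit step
n = 5
a = [0.0] + [-1.0] * (n - 1)
b = [4.0] * n
c = [-1.0] * (n - 1) + [0.0]
d = [1.0] * n
x = thomas(a, b, c, d)
```

Distributing such a solve means each processor eliminates its own subtree's interior and the small coupled system at the connection points is solved last, which is why accuracy matches the single-processor case.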
We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources, and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. However, the range of current strength needed for a depolarization block could easily be reached with the random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block.
We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, impeding conventional analysis methods from keeping pace. Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed, and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable. PANDORA is open source and can be freely used and modified (http://software.incf.org/software/pandora/home). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then we examine how well we can use L = 1, 2, … of the time series for the state variables of the model to estimate fixed parameters within the model as well as the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop.
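The database-driven workflow described for PANDORA above (raw traces reduced to extracted measures in a relational database, then queried) can be sketched with Python's stdlib sqlite3. PANDORA itself is a Matlab toolbox; this is only an analogue of the idea, and the table, column names, and data are invented for illustration.

```python
import sqlite3

# Hypothetical schema: one row of extracted measures per recorded
# trace, instead of the raw voltage samples themselves.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE traces (cell TEXT, drug TEXT, firing_rate_hz REAL)")

def firing_rate(spike_times_s, duration_s):
    """The kind of characteristic extracted once at import time."""
    return len(spike_times_s) / duration_s

recordings = [
    ("cell1", "control", [0.1, 0.5, 0.9, 1.3], 2.0),
    ("cell1", "TTX",     [],                   2.0),
    ("cell2", "control", [0.2, 0.7],           2.0),
]
for cell, drug, spikes, dur in recordings:
    conn.execute("INSERT INTO traces VALUES (?, ?, ?)",
                 (cell, drug, firing_rate(spikes, dur)))

# Queries over extracted measures then replace ad hoc scripts that
# would otherwise re-read every raw file.
rows = conn.execute(
    "SELECT drug, AVG(firing_rate_hz) FROM traces GROUP BY drug ORDER BY drug"
).fetchall()
```

Storing only the extracted measures is what yields the orders-of-magnitude savings in storage and processing time the abstract reports.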
Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics. First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test the sufficiency of multiple, independent spike initiation zones in the C-interneuron to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made toward understanding cerebellar cortical processing, but neural coding at the output stage of the cerebellum, in the deep cerebellar nuclei (DCN), remains poorly understood. The DCN are unlikely to be just a relay nucleus, because Purkinje cell inhibition has to be turned into an excitatory output signal and DCN neurons exhibit complex intrinsic properties.
In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals. Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. We obtained a good match between our simulations and DCN current clamp data recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow deinactivation of rebound currents. Fast rebound bursts due to the T-type calcium current and slow rebounds due to the persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by the HCN current. Our findings suggest that the active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches.
This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior. This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation of channel gating, a mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform. Abstract We have developed a simulation tool within the NEURON simulator to assist in the organization, verification, and analysis of simulations. This tool, named the Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single-cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB.
Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems. This is similar to how single cells function as systems of protein networks, where, for example, the ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from how it behaves alone. The dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. Interactions are made through different mechanisms. Cardiac cells, forming cardiac tissue and the heart, interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts to produce heart beats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics representing neural information. In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of the limbs can be performed. This chapter illustrates several mathematical techniques through examples from the modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data.
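The gap-junction coupling described above is mathematically the simplest of these interaction mechanisms: each cell's membrane equation gains an ohmic term proportional to the voltage difference with its neighbor. A minimal sketch with two passive cells follows; all parameter values are arbitrary illustrative numbers.

```python
# Two passive cells coupled by an ohmic gap junction. A constant
# current injected into cell 1 creates a voltage difference; the
# coupling term g_gap * (V_other - V_self) then drags cell 2 along.

def simulate(g_gap, i_inj=1.0, t_stop=400.0, dt=0.05,
             g_leak=0.05, e_leak=-60.0, cm=1.0):
    v1 = v2 = e_leak                       # both cells start at rest
    for _ in range(int(t_stop / dt)):
        i1 = g_leak * (e_leak - v1) + g_gap * (v2 - v1) + i_inj
        i2 = g_leak * (e_leak - v2) + g_gap * (v1 - v2)
        v1 += dt * i1 / cm
        v2 += dt * i2 / cm
    return v1, v2

v1_solo, v2_solo = simulate(0.0)   # uncoupled: cell 2 stays at rest
v1_gap, v2_gap = simulate(0.05)    # coupled: current leaks into cell 2
```

With these numbers the uncoupled steady states are -40 mV and -60 mV; coupling pulls the two voltages toward each other, which is the electrotonic spread that lets a sinoatrial action potential recruit neighboring cardiac cells.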
Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation. For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice that of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits, as expected in mature animals. Predictions were made for passive-cell current clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting is obsolete.
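The three summary statistics used in the CA3b fitting above (peak amplitude, time-to-peak, half-height width) are easy to compute from any response waveform. The sketch below uses a standard double-exponential synaptic conductance; the time constants are arbitrary illustrative values, not the paper's fitted parameters.

```python
import math

# Double-exponential synaptic conductance g(t), normalized so the peak
# equals g_max, plus the three response statistics compared above.

def dexp_waveform(g_max=0.9, tau_rise=0.5, tau_decay=3.0,
                  dt=0.01, t_stop=30.0):
    t_peak = (tau_rise * tau_decay / (tau_decay - tau_rise)
              * math.log(tau_decay / tau_rise))   # analytic peak time
    norm = math.exp(-t_peak / tau_decay) - math.exp(-t_peak / tau_rise)
    ts = [i * dt for i in range(int(t_stop / dt))]
    gs = [g_max / norm * (math.exp(-t / tau_decay) - math.exp(-t / tau_rise))
          for t in ts]
    return ts, gs

def response_stats(ts, g):
    peak = max(g)
    t2p = ts[g.index(peak)]                       # time-to-peak
    half = peak / 2.0
    above = [t for t, v in zip(ts, g) if v >= half]
    width = above[-1] - above[0]                  # half-height width
    return peak, t2p, width

ts, g = dexp_waveform()
peak, t2p, width = response_stats(ts, g)
```

Fitting such model statistics against their experimental means and variability is what constrains the conductance and kinetic parameters reported in the abstract.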
Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron. Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials. Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet, the precise input-output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites.
We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature. To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns and the corresponding firing-frequency input-output function was logarithmic. The regular/irregular nature of the input synaptic intervals was only reflected in the regularity of output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input-output relationships. Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input-output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware.
To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats. In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database http://www.ebi.ac.uk/biomodels/ is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller sub-models. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems and to study the clustering of models based upon their annotations. Model deposition to the database today is advised by several publishers of scientific journals.
The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive? Schema theory is a computational framework that allows the simulation of perceptuomotor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We first present its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performance of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world-knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performance of agrammatic aphasics measured using sentence-picture matching tasks.
This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Abstract Background The "inverse" problem is related to the determination of unknown causes on the basis of the observation of their effects. This is the opposite of the corresponding "direct" problem, which relates to the prediction of the effects generated by a complete description of some agencies. The solution of an inverse problem entails the construction of a mathematical model and starts from a number of experimental data. In this respect, inverse problems are often ill-conditioned, as the experimental data available are often insufficient to unambiguously solve the mathematical model. Several approaches to solving inverse problems are possible, both computational and experimental, some of which are mentioned in this article. In this work, we describe in detail an attempt to solve an inverse problem which arose in the study of an intracellular signaling pathway. Results Using a genetic algorithm to find a suboptimal solution to the optimization problem, we have estimated a set of unknown parameters describing a kinetic model of a signaling pathway in the neuronal cell. The model is composed of mass-action ordinary differential equations, where the kinetic parameters describe protein-protein interactions, protein synthesis, and degradation. The algorithm has been implemented on a parallel platform. Several potential solutions of the problem have been computed, each solution being a set of model parameters. A subset of parameters has been selected on the basis of their small coefficient of variation across the ensemble of solutions.
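The genetic-algorithm parameter estimation described above can be reduced to a minimal skeleton: estimate the single rate constant of a toy mass-action decay model from synthetic data. The real study fitted many parameters of a full pathway model on a parallel platform; everything below (model, data, GA settings) is invented for illustration.

```python
import math
import random

# Estimate k in dA/dt = -k*A from synthetic "measurements" using
# truncation selection, elitism, and Gaussian mutation.

random.seed(0)
K_TRUE = 0.7
DATA = [(t, math.exp(-K_TRUE * t)) for t in (0.0, 1.0, 2.0, 4.0)]

def cost(k):
    """Sum of squared residuals between model and data."""
    return sum((math.exp(-k * t) - a) ** 2 for t, a in DATA)

def evolve(pop_size=40, n_gen=60, sigma=0.1):
    pop = [random.uniform(0.0, 5.0) for _ in range(pop_size)]
    for _ in range(n_gen):
        pop.sort(key=cost)
        parents = pop[: pop_size // 4]          # truncation selection
        children = [max(0.0, random.choice(parents) + random.gauss(0.0, sigma))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children                # elitism + mutated offspring
    return min(pop, key=cost)

k_est = evolve()
```

Running many independent GA instances and keeping parameters whose coefficient of variation across the resulting ensemble is small is the selection criterion the abstract describes for deciding which parameters the data actually constrain.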
Conclusion Despite the lack of sufficiently reliable and homogeneous experimental data, the genetic algorithm approach has allowed us to estimate the approximate values of a number of model parameters in a kinetic model of a signaling pathway; these parameters have been assessed to be relevant for the reproduction of the available experimental data. Abstract Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown. Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms. First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axo-dendritic feedback on pyramidal cells is effective only in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase, the theta/gamma ratio.
Moreover, electric fields differentially affect average theta frequency depending on specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. Abstract The brain is extraordinarily complex, containing 10^11 neurons linked by 10^14 connections. We can improve our understanding of individual neurons and neuronal networks by describing their behavior in mathematical and computational models. This chapter provides an introduction to neural modeling, laying the foundation for several basic models and surveying key topics. After some discussion of the motivations of modelers and the uses of neural models, we explore the properties of electrically excitable membranes. We describe in some detail the Hodgkin–Huxley model, the first neural model to biophysically describe the behavior of biological membranes. We explore how this model can be extended to describe a variety of excitable membrane behaviors, including axonal propagation, dendritic processing, and synaptic communication. This chapter also covers mathematical models that replicate basic neural behaviors through more abstract mechanisms. We briefly explore efforts to extend single-neuron models to the network level and provide several examples of insights gained through this process. Finally, we list common resources, including modeling environments and repositories, that provide the guidance and parameter sets necessary to begin building neural models. Abstract We have developed a program, NeuroText, to populate the neuroscience databases in SenseLab (http://senselab.med.yale.edu/senselab) by mining the natural-language text of neuroscience articles. NeuroText uses a two-step approach to identify relevant articles. The first step (preprocessing), aimed at 100% sensitivity, identifies abstracts containing database keywords.
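The two-step retrieval idea (a high-sensitivity keyword screen followed by a stricter context check) can be sketched as below. The keyword and context vocabularies are hypothetical stand-ins for the SenseLab database terms, not the actual NeuroText knowledge base.

```python
import re

# Hypothetical vocabularies standing in for the database keyword set
# and the neuroscience-context terms used in the second parsing step.
KEYWORDS = {"receptor", "channel", "neuron", "dendrite"}
CONTEXT_TERMS = {"hippocampal", "cortical", "pyramidal"}

def tokens(text):
    return set(re.findall(r"[a-z]+", text.lower()))

def preprocess(abstract):
    # Step 1: high-sensitivity screen; keep anything mentioning a keyword.
    return bool(tokens(abstract) & KEYWORDS)

def refine(abstract):
    # Step 2: specificity check; additionally require a context term.
    t = tokens(abstract)
    return bool(t & KEYWORDS) and bool(t & CONTEXT_TERMS)

abstracts = [
    "Potassium channel kinetics in hippocampal neurons.",
    "A channel model for coastal water flow.",
    "Annual report of the department.",
]
stage1 = [a for a in abstracts if preprocess(a)]
stage2 = [a for a in stage1 if refine(a)]
```

Note how the coastal-engineering abstract survives the sensitivity-oriented first pass but is rejected by the context-aware second pass, mirroring the division of labor the abstract describes.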
In the second step, potentially relevant abstracts identified in the first step are processed for specificity dictated by database architecture and by neuroscience, lexical, and semantic contexts. NeuroText results were presented to the experts for validation using a dynamically generated interface that also allows expert-validated articles to be automatically deposited into the databases. Of the test set of 912 articles, 735 were rejected at the preprocessing step. For the remaining articles, the accuracy of predicting database-relevant articles was 85%. Twenty-two articles were erroneously identified. NeuroText deferred decisions on 29 articles to the expert. A comparison of NeuroText results versus the experts' analyses revealed that the program failed to correctly identify articles' relevance due to concepts that did not yet exist in the knowledge base or due to vaguely presented information in the abstracts. NeuroText uses two "evolution" techniques (supervised and unsupervised) that play an important role in the continual improvement of the retrieval results. Software that uses the NeuroText approach can facilitate the creation of curated, special-interest bibliography databases. Abstract Dendrites play an important role in neuronal function and connectivity. This chapter introduces the first section of the book, focusing on the morphological features of dendritic tree structures and the role of dendritic trees in the circuit. We provide an overview of quantitative procedures for data collection, analysis, and modeling of dendrite shape. Our main focus lies on the description of morphological complexity and how one can use this description to unravel neuronal function in dendritic trees and neural circuits. Abstract The chapter is organised in two parts: in the first part, the focus is on a combined power spectral and nonlinear behavioural analysis of a neural mass model of the thalamocortical circuitry.
The objective is to study the effectiveness of such "multimodal" analytical techniques in model-based studies investigating the neural correlates of abnormal brain oscillations in Alzheimer's disease (AD). The power spectral analysis presented here is a study of the "slowing" (decreasing dominant frequency of oscillation) within the alpha frequency band (8–13 Hz), a hallmark of electroencephalogram (EEG) dynamics in AD. Analysis of the nonlinear dynamical behaviour focuses on the bifurcating property of the model. The results show that the alpha rhythmic content is maximal in close proximity to the bifurcation point—an observation made possible by the "multimodal" approach adopted herein. Furthermore, a slowing in alpha rhythm is observed for increasing inhibitory connectivity—a consistent feature of our research into neuropathological oscillations associated with AD. In the second part, we present a power spectral analysis of a model that implements multiple feedforward and feedback connectivities in the thalamo-cortico-thalamic circuitry and is thus more advanced in terms of biological plausibility. This study looks at the effects of synaptic connectivity variation on the power spectra within the delta (1–3 Hz), theta (4–7 Hz), alpha (8–13 Hz), and beta (14–30 Hz) bands. An overall slowing of the EEG with decreasing synaptic connectivity is observed, indicated by a decrease of power within the alpha and beta bands and an increase in power within the theta and delta bands. Thus, the model behaviour conforms to longitudinal studies in AD indicating an overall slowing of the EEG. Abstract Neuronal processes grow under a variety of constraints, both immediate and evolutionary. Their pattern of growth provides insight into their function. This chapter begins by reviewing morphological metrics used in analyses and computational models. Molecular mechanisms underlying growth and plasticity are then discussed, followed by several types of modeling approaches.
Computer simulation of morphology can be used to describe and reproduce the statistics of neuronal types or to evaluate growth and functional hypotheses. For instance, models in which branching is probabilistically determined by diameter produce realistic virtual dendrites of most neuronal types, though more complicated statistical models are required for other types. Virtual dendrites grown under environmental and/or functional constraints are also discussed, offering a broad perspective on dendritic morphology. Abstract Chopper neurons in the cochlear nucleus are characterized by intrinsic oscillations with short average interspike intervals (ISIs) and relative level independence of their response (Pfeiffer, Exp Brain Res 1:220–235, 1966; Blackburn and Sachs, J Neurophysiol 62:1303–1329, 1989), properties which are unattained by models of single chopper neurons (e.g., Rothman and Manis, J Neurophysiol 89:3070–3082, 2003a). In order to achieve short ISIs, we optimized the time constants of the Rothman and Manis single-neuron model with genetic algorithms. Some parameters in the optimization, such as the temperature and the capacitance of the cell, turned out to be crucial for the required acceleration of the response. In order to achieve the relative level independence, we simulated an interconnected network consisting of Rothman and Manis neurons. The results indicate that by stabilization of intrinsic oscillations, it is possible to simulate the physiologically observed level independence of ISIs. As previously reviewed and demonstrated (Bahmer and Langner, Biol Cybern 95:371–379, 2006a), chopper neurons show a preference for ISIs which are multiples of 0.4 ms. It was also demonstrated that a network consisting of two optimized Rothman and Manis neurons which activate each other with synaptic delays of 0.4 ms shows a preference for ISIs of 0.8 ms.
Oscillations with various multiples of 0.4 ms as ISIs may be derived from neurons in a more complex network that is activated by simultaneous input from an onset neuron and several auditory nerve fibers. Abstract Recently, a class of two-dimensional integrate-and-fire models has been used to faithfully model spiking neurons. This class includes the Izhikevich model, the adaptive exponential integrate-and-fire model, and the quartic integrate-and-fire model. The bifurcation types for the individual neurons have been thoroughly analyzed by Touboul (SIAM J Appl Math 68(4):1045–1079, 2008). However, when the models are coupled together to form networks, the networks can display bifurcations that an uncoupled oscillator cannot. For example, the networks can transition from firing with a constant rate to burst firing. This paper introduces a technique to reduce a full network of this class of neurons to a mean-field model, in the form of a system of switching ordinary differential equations. The reduction uses population density methods and a quasi-steady-state approximation to arrive at the mean-field system. Reduced models are derived for networks with different topologies and different model neurons with biologically derived parameters. The mean-field equations are able to qualitatively and quantitatively describe the bifurcations that the full networks display. Extensions and higher-order approximations are discussed. Conclusions Our proposed database schema for managing heterogeneous data is a significant departure from conventional approaches. It is suitable only when the following conditions hold: • the number of classes of entity is numerous, while the number of actual instances in most classes is expected to be very modest; • the number (and nature) of the axes describing an arbitrary fact (as an N-ary association) varies greatly. We believe that nervous system data is an appropriate problem domain to test such an approach.
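A schema meeting the two conditions above (many entity classes, few instances each, facts as N-ary associations of varying arity) is essentially an entity-attribute-value / generic-fact design. The sketch below illustrates one such layout in SQLite; all table and column names are hypothetical, not the schema proposed in the paper.

```python
import sqlite3

# Generic schema: classes, entities, and facts with a variable number of roles.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE entity_class (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE entity (id INTEGER PRIMARY KEY,
                     class_id INTEGER REFERENCES entity_class(id));
CREATE TABLE fact (id INTEGER PRIMARY KEY, predicate TEXT);
CREATE TABLE fact_role (fact_id INTEGER REFERENCES fact(id),
                        role TEXT,
                        entity_id INTEGER REFERENCES entity(id));
""")
con.execute("INSERT INTO entity_class (id, name) VALUES (1, 'Neuron'), (2, 'BrainRegion')")
con.execute("INSERT INTO entity (id, class_id) VALUES (10, 1), (20, 2)")
# One binary fact: entity 10 (a Neuron) is located in entity 20 (a BrainRegion).
# Higher-arity facts simply add more fact_role rows for the same fact_id.
con.execute("INSERT INTO fact (id, predicate) VALUES (100, 'located_in')")
con.executemany("INSERT INTO fact_role VALUES (?, ?, ?)",
                [(100, "subject", 10), (100, "location", 20)])
rows = con.execute("""
SELECT f.predicate, r.role, e.id, c.name
FROM fact f
JOIN fact_role r ON r.fact_id = f.id
JOIN entity e ON e.id = r.entity_id
JOIN entity_class c ON c.id = e.class_id
ORDER BY r.role
""").fetchall()
```

Adding a new class or a new kind of fact requires only rows, not new tables, which is what makes the design attractive when classes are numerous and sparsely populated.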
Abstract Stereotactic human brain atlases, either in print or electronic form, are useful not only in functional neurosurgery, but also in neuroradiology, human brain mapping, and neuroscience education. The existing atlases represent structures on 2D plates taken at variable, often large intervals, which limits their applications. To overcome this problem, we propose a hybrid interpolation approach to build high-resolution brain atlases from the existing ones. In this approach, all section regions of each object are grouped into two types of components: simple and complex. A NURBS-based method is designed for interpolation of the simple components, and a distance-map-based method for the complex components. Once all individual objects in the atlas are interpolated, the results are combined hierarchically in a bottom-up manner to produce the interpolation of the entire atlas. In the procedure, different knowledge-based and heuristic strategies are used to preserve various topological relationships. The proposed approach has been validated quantitatively and used for interpolation of two stereotactic brain atlases: the Talairach-Tournoux atlas and the Schaltenbrand-Wahren atlas. The interpolations produced are of high resolution and feature high accuracy, 3D consistency, smooth surfaces, and preserved topology. They potentially open new applications for electronic stereotactic brain atlases, such as atlas reformatting, accurate 3D display, and 3D nonlinear warping against normal and pathological scans. The proposed approach is also potentially useful in other applications that require interpolation and 3D modeling from sparse data and/or data with variable inter-section intervals. An example of 3D modeling of an infarct from MR diffusion images is presented.
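The distance-map branch of the interpolation can be illustrated on a toy grid: each section's region is converted to a signed distance map (negative inside, positive outside), the maps are blended linearly, and the zero sub-level set of the blend gives the intermediate shape. This is a minimal sketch of the general distance-map idea, not the paper's actual algorithm, and the brute-force Chebyshev distance is purely illustrative.

```python
# Distance-map interpolation between two binary shapes on a small 2D grid.

def signed_distance(shape, grid):
    # Positive outside the shape, negative inside (Chebyshev metric, brute force).
    def dist(p, pts):
        return min(max(abs(p[0] - q[0]), abs(p[1] - q[1])) for q in pts)
    outside = [p for p in grid if p not in shape]
    d = {}
    for p in grid:
        if p in shape:
            d[p] = -dist(p, outside) if outside else 0.0
        else:
            d[p] = dist(p, shape)
    return d

grid = [(x, y) for x in range(7) for y in range(7)]
small = {(3, 3)}                                                   # section A
large = {(x, y) for x, y in grid if 1 <= x <= 5 and 1 <= y <= 5}   # section B

dA = signed_distance(small, grid)
dB = signed_distance(large, grid)
t = 0.5  # interpolation position between the two sections
mid = {p for p in grid if (1 - t) * dA[p] + t * dB[p] <= 0}
```

The interpolated region sits strictly between the two input regions, growing smoothly from the single cell toward the large square as t increases.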
Abstract Quantitative neuroanatomical data are important for the study of many areas of neuroscience, and the complexity of problems associated with neuronal structure requires that research from multiple groups across many disciplines be combined. However, existing neuron-tracing systems, simulation environments, and tools for the visualization and analysis of neuronal morphology data use a variety of data formats, making it difficult to exchange data in a readily usable way. The NeuroML project was initiated to address these issues, and here we describe an extensible markup language standard, MorphML, which defines a common data format for neuronal morphology data and associated metadata to facilitate data and model exchange, database creation, model publication, and data archiving. We describe the elements of the standard in detail and outline the mappings between this format and those used by a number of popular applications for reconstruction, simulation, and visualization of neuronal morphology. Abstract A major part of biology has become a class of physical and mathematical sciences. We have started to feel, though still with some suspicion, that it will become possible to predict biological events that will happen in the future of one's life, and to control some of them if so desired, based upon an understanding of the genomic information of individuals and the physical and chemical principles governing the physiological functions of living organisms at multiple scales and levels, from molecules to cells and organs. Abstract A half-center oscillator (HCO) is a common circuit building block of central pattern generator networks that produce rhythmic motor patterns in animals. Here we constructed an efficient relational database table with the resulting characteristics of Hill et al.'s (J Comput Neurosci 10:281–302, 2001) HCO simple conductance-based model.
The model consists of two reciprocally inhibitory neurons and replicates the electrical activity of the oscillator interneurons of the leech heartbeat central pattern generator under a variety of experimental conditions. Our long-range goal is to understand how this basic circuit building block produces functional activity under a variety of parameter regimes and how different parameter regimes influence stability and modulatability. By using the latest developments in computer technology, we simulated and stored large amounts of data (on the order of terabytes). We systematically explored the parameter space of the HCO and corresponding isolated neuron models using a brute-force approach. We varied a set of selected parameters (maximal conductances of intrinsic and synaptic currents) in all combinations, resulting in about 10 million simulations. We classified these HCO and isolated neuron model simulations by their activity characteristics into identifiable groups and quantified their prevalence. By querying the database, we compared the activity characteristics of the identified groups of our simulated HCO models with those of our simulated isolated neuron models and found that regularly bursting neurons compose only a small minority of functional HCO models; the vast majority was composed of spiking neurons. Abstract This paper describes how an emerging standard neural network modelling language can be used to configure a general-purpose neural multichip system, by describing the process of writing and loading neural network models on the SpiNNaker neuromimetic hardware. It focuses on the implementation of a SpiNNaker module for PyNN, a simulator-independent language for neural network modelling. We successfully extend PyNN to deal with different non-standard (e.g., Izhikevich) cell types, rapidly switch between them, and load applications on parallel hardware by orchestrating the software layers below it, so that they are abstracted from the final user.
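The brute-force sweep-and-classify workflow described for the HCO database can be sketched as follows. A trivial two-parameter stand-in replaces the actual conductance-based simulation, and the classification thresholds are hypothetical; the point is the structure: enumerate all parameter combinations, classify each run, store the result in a queryable table, then tally prevalence.

```python
import itertools
from collections import Counter

def classify(g_na, g_k):
    # Toy classifier standing in for a full conductance-based simulation;
    # the thresholds below are made up for illustration.
    drive = g_na - g_k
    if drive > 0.5:
        return "bursting"
    if drive > 0.0:
        return "spiking"
    return "silent"

# Sweep two "maximal conductances" over a 5 x 5 grid of values.
g_values = [0.0, 0.25, 0.5, 0.75, 1.0]
database = {}  # (g_na, g_k) -> activity class: the "relational table"
for g_na, g_k in itertools.product(g_values, repeat=2):
    database[(g_na, g_k)] = classify(g_na, g_k)

# Query the table: prevalence of each activity class across the sweep.
prevalence = Counter(database.values())
```

In the real study the sweep covered millions of combinations, so the classification results were stored in an actual relational database rather than an in-memory dictionary, but the query pattern (group by activity class, count) is the same.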
Finally, we run some simulations in PyNN and compare them against other simulators, successfully reproducing single-neuron and network dynamics and validating the implementation. Abstract The present study examines the biophysical properties and functional implications of I h in hippocampal area CA3 interneurons with somata in strata radiatum and lacunosum-moleculare. Characterization studies showed a small maximum h-conductance (2.6 ± 0.3 nS, n = 11), shallow voltage dependence with a hyperpolarized half-maximal activation (V 1/2 = −91 mV), and kinetics characterized by double-exponential functions. The functional consequences of I h were examined with regard to temporal summation and impedance measurements. For temporal summation experiments, 5-pulse mossy fiber input trains were activated. Blocking I h with 50 μM ZD7288 resulted in an increase in temporal summation, suggesting that I h supports sensitivity of response amplitude to relative input timing. Impedance was assessed by applying sinusoidal current commands. From impedance measurements, we found that I h did not confer theta-band resonance, but instead flattened the impedance–frequency relations. Double immunolabeling for hyperpolarization-activated cyclic nucleotide-gated proteins and glutamate decarboxylase 67 suggests that all four subunits are present in GABAergic interneurons from the strata considered for electrophysiological studies. Finally, a model of I h was employed in computational analyses to confirm and elaborate upon the contributions of I h to impedance and temporal summation. Abstract Modelling and simulation methods gain increasing importance for the understanding of biological systems. The growing number of available computational models makes support in the maintenance and retrieval of those models essential to the community.
This article discusses which model information is helpful for efficient retrieval, and how existing similarity measures and ranking techniques can be used to enhance the retrieval process, i.e., model reuse. With the development of new tools and modelling formalisms, there is also an increasing demand for performing search independent of the models' encoding. Therefore, the presented approach is not restricted to certain model storage formats. Instead, the model meta-information is used for retrieval and ranking of the search results. Meta-information includes general information about the model and its encoded species and reactions, but also information about the model behaviour and related simulation experiment descriptions. Abstract To understand the details of brain function, a large-scale system model that reflects anatomical and neurophysiological characteristics needs to be implemented. Though numerous computational models of different brain areas have been proposed, their integration into a large-scale model has not yet been accomplished, because these models were described in different programming languages and, above all, used different data formats. This paper introduces a platform for collaborative brain system modeling (PLATO), on which one can construct computational models using several programming languages and connect them at the I/O level with a common data format. As an example, a whole visual system model including eye movement, eye optics, the retinal network, and the visual cortex is being developed. Preliminary results demonstrate that the integrated model successfully simulates the signal processing flow at the different stages of the visual system. Abstract Brain rhythms are the most prominent signal measured noninvasively in humans with magneto-/electroencephalography (MEG/EEG). MEG/EEG-measured rhythms have been shown to be functionally relevant, and signature changes are used as markers of disease states.
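Ranking stored models by meta-information overlap, as proposed for format-independent model retrieval, can be sketched with a simple set-similarity score. The model names, annotations, and the 0.7/0.3 weighting below are invented for illustration; they are not the measures used by the article.

```python
# Rank stored models against a query by overlap of their meta-information
# (encoded species and free-text keywords), independent of storage format.

def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

models = {
    "modelA": {"species": {"Ca", "IP3", "PKC"}, "keywords": {"signalling"}},
    "modelB": {"species": {"Na", "K"}, "keywords": {"action potential"}},
    "modelC": {"species": {"Ca", "PKC"}, "keywords": {"signalling", "neuron"}},
}

def score(model, query):
    # Weighted combination of species overlap and keyword overlap
    # (weights are arbitrary stand-ins).
    return (0.7 * jaccard(model["species"], query["species"])
            + 0.3 * jaccard(model["keywords"], query["keywords"]))

query = {"species": {"Ca", "PKC"}, "keywords": {"signalling"}}
ranking = sorted(models, key=lambda name: score(models[name], query), reverse=True)
```

Because the score depends only on extracted meta-information, the same ranking applies whether the underlying models are stored in SBML, CellML, or any other encoding.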
Despite the importance of understanding the underlying neural mechanisms creating these rhythms, relatively little is known about their in vivo origin in humans. There are obvious challenges in linking the extracranially measured signals directly to neural activity with invasive studies in humans, and although animal models are well suited for such studies, the connection to human brain function under cognitively relevant tasks is often lacking. Biophysically principled computational neural modeling provides an attractive means to bridge this critical gap. Here, we describe a method for creating a computational neural model capturing the laminar structure of cortical columns, and how this model can be used to make predictions on the cellular- and circuit-level mechanisms of brain oscillations measured with MEG/EEG. Specifically, we describe how the model can be used to simulate current dipole activity, the common macroscopic signal inferred from MEG/EEG data. We detail the development and application of the model to study the spontaneous somatosensory mu-rhythm, containing mu-alpha (7–14 Hz) and mu-beta (15–29 Hz) components. We describe a novel prediction on the neural origin of the mu-rhythm that accurately reproduces many characteristic features of MEG data and accounts for changes in the rhythm with attention, detection, and healthy aging. While the details of the model are specific to the somatosensory system, the model design and application are based on general principles of cortical circuitry and MEG/EEG physics, and are thus amenable to the study of rhythms in other frequency bands and sensory systems. Abstract GABAergic interneurons in cortical circuits control the activation of principal cells and orchestrate network activity patterns, including oscillations at different frequency ranges.
Recruitment of interneurons depends on the integration of convergent synaptic inputs along the dendrosomatic axis; however, dendritic processing in these cells is still poorly understood. In this chapter, we summarise our results on the cable properties, electrotonic structure, and dendritic processing of "basket cells" (BCs; Nörenberg et al. 2010), one of the most prevalent types of cortical interneurons mediating perisomatic inhibition. In order to investigate integrative properties, we performed two-electrode whole-cell patch clamp recordings, visualised and reconstructed the recorded interneurons, and created passive single-cell models with biophysical properties derived from the experiments. Our results indicate that membrane properties, in particular membrane resistivity, are inhomogeneous along the somatodendritic axis of the cell. The derived values and the gradient of membrane resistivity are different from those obtained for excitatory principal cells. The divergent passive membrane properties of BCs facilitate rapid signalling from proximal basal dendritic inputs but at the same time increase synapse-to-soma transfer for slow signals from the distal apical dendrites. Our results demonstrate that BCs possess distinct integrative properties. Future computational models investigating the diverse functions of neuronal circuits need to consider this diversity and incorporate realistic dendritic properties not only of excitatory principal cells but also of the various types of inhibitory interneurons. Abstract New surgical and localization techniques allow for precise and personalized evaluation and treatment of intractable epilepsies. These techniques include the use of subdural and depth electrodes for localization, and the potential use of cell-targeted stimulation using optogenetics as part of treatment. Computer modeling of seizures, also individualized to the patient, will be important in order to make full use of the potential of these new techniques.
This is because epilepsy is a complex dynamical disease involving multiple scales across both time and space. These complex dynamics make prediction extremely difficult. Cause and effect are not cleanly separable, as multiple embedded causal loops allow for many scales of unintended consequence. We demonstrate here a small model of sensory neocortex which can be used to look at the effects of microablations or microstimulation. We show that ablations in this network can either prevent spread or prevent occurrence of the seizure. In this example, focal electrical stimulation was not able to terminate a seizure, but selective stimulation of inhibitory cells, a future possibility through the use of optogenetics, was efficacious. Abstract The basal ganglia nuclei form a complex network of nuclei often assumed to perform selection, yet their individual roles and how they influence each other are still largely unclear. In particular, the ties between the external and internal parts of the globus pallidus are paradoxical, as anatomical data suggest a potent inhibitory projection between them while electrophysiological recordings indicate that they have similar activities. Here we introduce a theoretical study that reconciles both views on the intrapallidal projection by providing a plausible characterization of the relationship between the external and internal globus pallidus. Specifically, we developed a mean-field model of the whole basal ganglia, whose parameterization is optimized to best respect a collection of numerous anatomical and electrophysiological data. We first obtained models respecting all our constraints; hence, the anatomical and electrophysiological data on the intrapallidal projection are globally consistent.
This model furthermore predicts that both aforementioned views about the intrapallidal projection may be reconciled when this projection is weakly inhibitory, thus making it possible to support similar neural activity in both nuclei and for the entire basal ganglia to select between actions. Second, we predict that afferent projections are substantially unbalanced towards the external segment, as it receives the strongest excitation from the STN and the weakest inhibition from the striatum. Finally, our study strongly suggests that the intrapallidal connection pattern is not focused but diffuse, as the latter pattern is more efficient for the overall selection performed in the basal ganglia. Abstract Background The information coming from biomedical ontologies and computational pathway models is expanding continuously: research communities keep this process up, and their advances are generally shared by means of dedicated resources published on the web. In fact, such models are shared to provide the characterization of molecular processes, while biomedical ontologies detail a semantic context for the majority of those pathways. Recent advances in both fields pave the way for a scalable information integration based on aggregate knowledge repositories, but the lack of overall standard formats impedes this progress. Indeed, having different objectives and different abstraction levels, most of these resources "speak" different languages. Semantic web technologies are here explored as a means to address some of these problems. Methods Employing an extensible collection of interpreters, we developed OREMP (Ontology Reasoning Engine for Molecular Pathways), a system that abstracts the information from different resources and combines them together into a coherent ontology. Continuing this effort, we present OREMPdb; once different pathways are fed into OREMP, species are linked to the external ontologies referred to and to the reactions in which they participate.
Exploiting these links, the system builds species-sets, which encapsulate species that operate together. Composing all of the reactions together, the system computes all of the reaction paths from and to all of the species-sets. Results OREMP has been applied to the curated branch of BioModels (2011/04/15 release), which overall contains 326 models, 9244 reactions, and 5636 species. OREMPdb is the semantic dictionary created as a result, which is made of 7360 species-sets. For each one of these sets, OREMPdb links the original pathway and the original paper where this information first appeared. Abstract Conductance-based neuron models are frequently employed to study the dynamics of biological neural networks. For speed and ease of use, these models are often reduced in morphological complexity. Simplified dendritic branching structures may, however, process inputs differently than full branching structures, and could thereby fail to reproduce important aspects of biological neural processing. It is not yet well understood which processing capabilities require detailed branching structures. Therefore, we analyzed the processing capabilities of fully or partially branched reduced models. These models were created by collapsing the dendritic tree of a full morphological model of a globus pallidus (GP) neuron while preserving its total surface area and electrotonic length, as well as its passive and active parameters. Dendritic trees were either collapsed into single cables (unbranched models) or the full complement of branch points was preserved (branched models). Both reduction strategies allowed us to compare dynamics between all models using the same channel density settings. Full model responses to somatic inputs were generally preserved by both types of reduced model, while dendritic input responses could be more closely preserved by branched than by unbranched reduced models.
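The reaction-path computation described for OREMPdb amounts to a graph search over a reaction network. The sketch below finds a shortest chain of reactions connecting two species by breadth-first search; the three-reaction network is an invented toy example, not OREMPdb data, and real species-sets would group co-occurring species rather than single names.

```python
from collections import deque

# A tiny reaction network: reaction id -> (reactant set, product set).
reactions = {
    "R1": ({"glucose"}, {"g6p"}),
    "R2": ({"g6p"}, {"f6p"}),
    "R3": ({"f6p"}, {"pyruvate"}),
}

def reaction_path(source, target):
    # BFS over reactions whose reactant set contains the current species;
    # returns the list of reaction ids along a shortest path, or None.
    queue = deque([(source, [])])
    seen = {source}
    while queue:
        species, path = queue.popleft()
        if species == target:
            return path
        for rid, (reactants, products) in reactions.items():
            if species in reactants:
                for prod in products:
                    if prod not in seen:
                        seen.add(prod)
                        queue.append((prod, path + [rid]))
    return None

path = reaction_path("glucose", "pyruvate")
```

Composing all reactions this way, from and to every species-set, yields the path dictionary that OREMPdb exposes for querying.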
However, features strongly influenced by local dendritic input resistance, such as active dendritic sodium spike generation and propagation, could not be accurately reproduced by any reduced model. Based on our analyses, we suggest that there are intrinsic differences in processing capabilities between unbranched and branched models. We also indicate suitable applications for different levels of reduction, including fast searches of full model parameter space. Summary Processing text from the scientific literature has become a necessity due to the burgeoning amounts of information that are fast becoming available, stemming from advances in electronic information technology. We created a program, NeuroText (http://senselab.med.yale.edu/textmine/neurotext.pl), designed specifically to extract information relevant to neuroscience-specific databases, NeuronDB and CellPropDB (http://senselab.med.yale.edu/senselab/), housed at the Yale University School of Medicine. NeuroText extracts relevant information from the neuroscience literature in a two-step process: each step parses text at a different level of granularity. NeuroText uses an expert-mediated knowledge base and combines the techniques of indexing, contextual parsing, semantic and lexical parsing, and supervised and unsupervised learning to extract information. The constraints, metadata elements, and rules for information extraction are stored in the knowledge base. NeuroText was created as a pilot project to process 3 years of publications in the Journal of Neuroscience and was subsequently tested on 40,000 PubMed abstracts. We also present here a template to create a domain-nonspecific knowledge base that, when linked to a text-processing tool like NeuroText, can be used to extract knowledge in other fields of research. Abstract Background We present a software tool called SENB, which allows the geometric and biophysical neuronal properties in a simple computational model of a Hodgkin-Huxley (HH) axon to be changed.
The aim of this work is to develop a didactic and easy-to-use computational tool in the NEURON simulation environment, which allows graphical visualization of both the passive and active conduction parameters and the geometric characteristics of a cylindrical axon with HH properties. Results The SENB software offers several advantages for teaching and learning electrophysiology. First, SENB offers ease and flexibility in determining the number of stimuli. Second, SENB allows immediate and simultaneous visualization, in the same window and time frame, of the evolution of the electrophysiological variables. Third, SENB calculates parameters such as time and space constants, stimulus frequency, cellular area and volume, sodium and potassium equilibrium potentials, and propagation velocity of the action potentials. Furthermore, it allows the user to see all this information immediately in the main window. Finally, with just one click SENB can save an image of the main window as evidence. Conclusions The SENB software is didactic and versatile, and can be used to improve and facilitate the teaching and learning of the underlying mechanisms in the electrical activity of an axon using the biophysical properties of the squid giant axon. Abstract Grid cells (GCs) in the medial entorhinal cortex (mEC) have the property of having their firing activity spatially tuned to a regular triangular lattice. Several theoretical models for grid field formation have been proposed, but most assume that place cells (PCs) are a product of the grid cell system. There is, however, an alternative possibility that is supported by various strands of experimental data. Here we present a novel model for the emergence of grid-like firing patterns that stands on two key hypotheses: (1) spatial information in GCs is provided from PC activity and (2) grid fields result from a combined synaptic plasticity mechanism involving inhibitory and excitatory neurons mediating the connections between PCs and GCs.
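Quantities like those SENB reports can be computed directly from textbook relations. A minimal sketch, assuming the Nernst equation for the equilibrium potentials and the standard passive membrane time constant; the concentration values and function names are illustrative, not SENB's actual code:

```python
import math

# Gas constant R (J/(mol*K)) and Faraday constant F (C/mol)
R, F = 8.314, 96485.0

def nernst(c_out, c_in, z=1, T=279.45):
    """Equilibrium potential in mV (T defaults to 6.3 C, squid)."""
    return 1000.0 * (R * T) / (z * F) * math.log(c_out / c_in)

def time_constant(Rm=20000.0, Cm=1e-6):
    """Membrane time constant tau = Rm*Cm in ms
    (Rm in ohm*cm^2, Cm in F/cm^2)."""
    return Rm * Cm * 1000.0

# Squid-axon-like concentrations (mM), illustrative values:
E_K = nernst(20.0, 400.0)    # negative: K+ concentrated inside
E_Na = nernst(440.0, 50.0)   # positive: Na+ concentrated outside
```

With these values E_K comes out near -72 mV and E_Na near +52 mV, the familiar ordering for the squid giant axon.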
Depending on the spatial location, each PC can contribute excitatory or inhibitory input to GC activity. The nature and magnitude of the PC input is a function of the distance to the place field center, which is inferred from rate decoding. A biologically plausible learning rule drives the evolution of the connection strengths from PCs to a GC. In this model, PCs compete for GC activation, and the plasticity rule favors efficient packing of the space representation. This leads to grid-like firing patterns. In a new environment, GCs continuously recruit new PCs to cover the entire space. The model described here makes important predictions and can represent the feedforward connections from hippocampus CA1 to deeper mEC layers. Functional roles of distributed synaptic clusters in the mitral-granule cell network of the olfactory bulb. Frontiers in integrative neuroscience Odors are encoded in spatio-temporal patterns within the olfactory bulb, but the mechanisms of odor recognition and discrimination are poorly understood. It is reasonable to postulate that the olfactory code is sculpted by lateral and feedforward inhibition mediated by granule cells onto the mitral cells. Recent viral tracing and physiological studies revealed patterns of distributed granule cell synaptic clusters that provide additional clues to the possible mechanisms at the network level. The emerging properties and functional roles of these patterns, however, are unknown. Here, using a realistic model of 5 mitral and 100 granule cells, we show how their synaptic network can dynamically self-organize and interact through an activity-dependent dendrodendritic mechanism. The results suggest that the patterns of distributed mitral-granule cell connectivity may represent the most recent history of odor inputs, and may contribute to the basic processes underlying mixture perception and odor qualities.
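The distance-dependent sign of the PC input described in the grid-cell model above can be sketched with a center-surround profile plus a simple Hebbian weight update. The Gaussian-minus-offset profile, the learning rate, and all names below are illustrative assumptions, not the paper's actual rule:

```python
import math

def pc_input(dist, sigma=0.2, offset=0.4):
    """Effective PC drive to a GC: excitatory near the place-field
    center, inhibitory far from it (Gaussian minus a constant)."""
    return math.exp(-dist**2 / (2.0 * sigma**2)) - offset

def hebbian_step(w, pc_rate, gc_rate, eta=0.01, w_max=1.0):
    """One Hebbian update of a PC->GC weight, clipped to [0, w_max]
    so that PCs effectively compete for a bounded total drive."""
    w = w + eta * pc_rate * gc_rate
    return min(max(w, 0.0), w_max)
```

The sign flip with distance is the ingredient that lets correlated PC activity both build and suppress grid fields at different locations.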
The model predicts how and why the dynamical interactions between the active mitral cells through the granule cell synaptic clusters can account for a variety of puzzling behavioral results on odor mixtures and on the emergence of synthetic or analytic perception. Virtual NEURON: a strategy for merged biochemical and electrophysiological modeling. Journal of Computational Neuroscience Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively.
Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. Programming language: Perl, BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI BLAST package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages.
Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse.
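A broker like the NLB essentially maintains, per NCBI identifier, a set of outbound links that it must serialize for downstream consumption. A schematic sketch of such a per-identifier link record as XML; the element names here are deliberately simplified illustrations and do not reproduce the actual LinkOut DTD or the NLB's internal format:

```python
import xml.etree.ElementTree as ET

def make_link_record(pubmed_id, url, provider="NLB"):
    """Build a simplified per-identifier link record as an XML string.
    (Element names are illustrative, not the real LinkOut schema.)"""
    link = ET.Element("Link")
    ET.SubElement(link, "ProviderId").text = provider
    selector = ET.SubElement(link, "ObjectSelector")
    ET.SubElement(selector, "Database").text = "PubMed"
    ET.SubElement(selector, "ObjId").text = str(pubmed_id)
    ET.SubElement(link, "ObjectUrl").text = url
    return ET.tostring(link, encoding="unicode")
```

Keeping the record generation in one place makes it easy to emit the same links both to the NCBI and to the broker's own dynamically generated Web pages.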
Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell includes organelles and intracellular solutions, and it is separated from the outer environment of extracellular liquid surrounding the cell by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charges and concentrations cause, respectively, electrical and chemical potentials, driving the transport of materials across the membrane. Here we look at the core elements of mathematical modeling associated with the dynamic behaviors of single cells, as well as the basics of numerical simulation. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms.
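The balance of electrical and chemical potentials across the membrane described above is commonly summarized by the Goldman-Hodgkin-Katz voltage equation. A minimal sketch with illustrative relative permeabilities and mammalian-like concentrations; all numeric values are assumptions for demonstration only:

```python
import math

R, F = 8.314, 96485.0  # J/(mol*K), C/mol

def ghk_voltage(pK, pNa, pCl, K_o, K_i, Na_o, Na_i, Cl_o, Cl_i, T=310.0):
    """Goldman-Hodgkin-Katz resting potential in mV.
    Note that Cl- (valence -1) swaps inside/outside concentrations."""
    num = pK * K_o + pNa * Na_o + pCl * Cl_i
    den = pK * K_i + pNa * Na_i + pCl * Cl_o
    return 1000.0 * (R * T / F) * math.log(num / den)

# Illustrative mammalian-like concentrations (mM) and permeabilities:
Vm = ghk_voltage(1.0, 0.05, 0.45,
                 K_o=5, K_i=140, Na_o=145, Na_i=10, Cl_o=110, Cl_i=10)
```

With these numbers the resting potential lands in the usual -70 to -60 mV range, dominated by the high potassium permeability.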
This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program on which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command-line interface, based on Python, a powerful scripting language, for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results.
In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method and make available computer scripts which automate the process of citation entry. We use an Open Citation Project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. In order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain ( ${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance.
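The automated citation-entry workflow above hinges on splitting a free-text citation into fields before querying PubMed for the validated record. A minimal regex-based sketch of that parsing step; the citation format handled, the field names, and the function are illustrative and are not the PARSER module itself:

```python
import re

# Matches a "Journal. Year;Volume:FirstPage-LastPage" style tail,
# an illustrative subset of common medical citation formats.
CITE_RE = re.compile(
    r"(?P<journal>[^.;]+)\.\s*"
    r"(?P<year>\d{4});"
    r"(?P<volume>\d+):"
    r"(?P<first>\d+)-(?P<last>\d+)"
)

def parse_citation_tail(text):
    """Return journal/year/volume/page fields, or None if no match."""
    m = CITE_RE.search(text)
    if m is None:
        return None
    fields = m.groupdict()
    fields["year"] = int(fields["year"])
    return fields
```

The extracted fields can then populate a structured query, with the returned PubMed record serving as the validated reference.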
We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front ends requires high bioinformatic skills. Consequently, the manual inspection of database entries and citations is a time-consuming task for which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field.
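A cornerstone of the IR concepts surveyed above is term weighting. A minimal TF-IDF sketch over a toy collection of token lists; the corpus, tokenization, and smoothing choices are illustrative assumptions:

```python
import math
from collections import Counter

def tf_idf(term, doc_tokens, corpus):
    """TF-IDF of a term in one document, against a corpus given as
    a list of token lists; idf uses the standard log(N/df) weighting."""
    tf = Counter(doc_tokens)[term] / len(doc_tokens)
    df = sum(1 for doc in corpus if term in doc)
    if df == 0:
        return 0.0
    idf = math.log(len(corpus) / df)
    return tf * idf
```

Terms that are frequent in one database entry but rare across the collection score highest, which is exactly the behavior needed to rank heterogeneous biomedical records against a query.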
We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded in empirical evidence, explain (or fail to explain) existing empirical results, and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience.
Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib ( The Handbook of Brain Theory and Neural Networks , pp. 741–745, 2003); Arbib and Grethe ( Computing the Brain: A Guide to Neuroinformatics , 2001); Ascoli ( Computational Neuroanatomy: Principles and Methods , 2002); Bower and Bolouri ( Computational Modeling of Genetic and Biochemical Networks , 2001); Hines et al. ( J. Comput. Neurosci. 17 , 7–11, 2004); Shepherd et al. ( Trends Neurosci. 21 , 460–468, 1998); Sivakumaran et al. ( Bioinformatics 19 , 408–415, 2003); Smolen et al. ( Neuron 26 , 567–580, 2000); Vadigepalli et al. ( OMICS 7 , 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv ( J. Neurophysiol. 71 , 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu .
Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and build on them for further research. Its use can aid the field of computational neuroscience to enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA(A) receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA(A) reversal potential (E_GABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity.
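As a readout, PPI is typically quantified from the two population-spike amplitudes evoked by a pulse pair. A minimal sketch of that computation; the percent-reduction convention and the function name are illustrative, since laboratories differ in how they report it:

```python
def ppi_percent(amp1, amp2):
    """Paired-pulse inhibition as percent reduction of the second
    population spike relative to the first; negative values would
    indicate paired-pulse facilitation instead."""
    if amp1 <= 0:
        raise ValueError("first response amplitude must be positive")
    return 100.0 * (1.0 - amp2 / amp1)
```

Because the measure is a ratio, it is insensitive to overall recording gain, which is part of why it is such a convenient network-inhibition readout.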
Ideally, such interfaces should provide a high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal to elicit/detect the electrical excitation at the single-cell level and to observe the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have been typically correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic type (pyramidal cell dendrites and olivary axonal projection terminals), and labeling method (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida.
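Reconstructions stored as trees of branching cylinders or truncated cones support direct morphometry. A minimal sketch computing total neurite length and lateral membrane area from SWC-style points (coordinates, radius, parent); the data layout and function name are illustrative, not Neuron_Morpho's file format:

```python
import math

def morphometry(points):
    """points: dict id -> (x, y, z, radius, parent_id or None).
    Returns (total_length, total_lateral_area), treating each
    child-parent pair as a truncated cone (frustum)."""
    length = area = 0.0
    for x, y, z, r, parent in points.values():
        if parent is None:
            continue  # root point (e.g., the soma) has no segment
        px, py, pz, pr, _ = points[parent]
        seg = math.dist((x, y, z), (px, py, pz))
        slant = math.hypot(seg, r - pr)
        length += seg
        area += math.pi * (r + pr) * slant  # frustum lateral area
    return length, area
```

For equal radii the frustum formula reduces to the familiar cylinder area 2πrL, which provides a quick sanity check on any reconstruction pipeline.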
None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions coming to the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductivities of their dendritic membrane. This work is a continuation of our previous studies that showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodical impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli.
These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives.
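The compartmental approach introduced above discretizes a neuron into coupled RC compartments. A minimal two-compartment passive sketch with forward-Euler integration; the parameter values, units, and names are illustrative, not from the chapter:

```python
def simulate_two_compartments(I_soma=0.1, t_stop=200.0, dt=0.01,
                              E_L=-65.0, g_L=0.05, g_c=0.1, C=1.0):
    """Two passive compartments (soma, dendrite) coupled by an axial
    conductance g_c; current injected into the soma only.
    Units: nA, uS, nF, mV, ms. Returns final (v_soma, v_dend)."""
    v1 = v2 = E_L
    for _ in range(int(t_stop / dt)):
        dv1 = (-g_L * (v1 - E_L) + g_c * (v2 - v1) + I_soma) / C
        dv2 = (-g_L * (v2 - E_L) + g_c * (v1 - v2)) / C
        v1 += dt * dv1
        v2 += dt * dv2
    return v1, v2
```

Because only the soma is driven, the steady state shows the characteristic voltage attenuation into the passive dendrite; adding more compartments and active conductances follows the same pattern.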
One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS-US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. On mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses uniformly distributed along the dendrites.
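The TD algorithm underlying these dopamine models computes a prediction error that migrates from reward delivery back toward the cue over training. A minimal tabular TD(0) sketch on a fixed CS-to-US state chain; the chain, learning rate, and names are illustrative, and the paper's model replaces the table with an LSTM network:

```python
def train_td0(n_states=5, reward=1.0, alpha=0.1, gamma=1.0, episodes=2000):
    """Tabular TD(0) over a deterministic state chain; reward arrives
    only on the final transition (the 'US'). Returns the learned
    values and the prediction errors of the last episode."""
    V = [0.0] * (n_states + 1)  # one extra terminal state
    for _ in range(episodes):
        deltas = []
        for s in range(n_states):
            r = reward if s == n_states - 1 else 0.0
            delta = r + gamma * V[s + 1] - V[s]  # TD prediction error
            V[s] += alpha * delta
            deltas.append(delta)
    return V, deltas

V, deltas = train_td0()
```

After training, the within-trial prediction errors vanish and the value function is flat at the reward magnitude: the dopamine-like surprise signal has moved out of the trial to cue onset.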
For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and efficacy of somatopetal transfer of current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization in general and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron upon different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals decreasing with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two patterns of activity, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of patterns of output discharges of different complexities, including stochastic ones. A relatively short length of the apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are morphological factors responsible for the predominance of periodic spike doublets.
As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites, and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promote the predominance of network interaction in shaping the activity of layer 2/3 pyramidal neurons and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta-modulated and space-modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using non-homogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of any involvement in spatial tasks.
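The mechanism in the phase-precession abstract rests on integrate-and-fire dynamics with conductance synapses and a theta-modulated inhibitory drive. The sketch below is a minimal toy version of that ingredient, not the published model: all parameter values and units are illustrative assumptions. A leaky integrate-and-fire cell receiving steady excitation and theta-oscillating inhibition fires only around the inhibition trough, and stronger excitatory drive lets it cross threshold earlier in the cycle.

```python
import numpy as np

def lif_theta(g_exc=12.0, t_stop=1.0, dt=1e-4, theta_f=8.0):
    """Leaky integrate-and-fire cell with conductance synapses: steady
    excitation plus theta-modulated inhibition. Toy units (nF, nS, mV, s);
    all parameter values are illustrative assumptions."""
    C, g_L = 1.0, 10.0                      # capacitance, leak conductance
    E_L, E_exc, E_inh = -65.0, 0.0, -75.0   # reversal potentials
    v_th, v_reset = -50.0, -65.0            # spike threshold and reset
    v, spikes = E_L, []
    for k in range(int(t_stop / dt)):
        t = k * dt
        # inhibitory conductance oscillating at theta frequency (0..60 nS)
        g_inh = 30.0 * (1.0 + np.cos(2.0 * np.pi * theta_f * t))
        dv = (-g_L * (v - E_L) - g_exc * (v - E_exc) - g_inh * (v - E_inh)) / C
        v += dt * dv
        if v >= v_th:
            spikes.append(t)
            v = v_reset
    return np.array(spikes)

spikes = lif_theta()
theta_phase = (spikes * 8.0) % 1.0  # position of each spike within the theta cycle
```

Spikes cluster in the low-inhibition part of the cycle because the steady-state voltage only exceeds threshold there; raising `g_exc` widens that window, which is the kind of drive-dependent phase shift the model exploits.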
Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed-rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database of neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in computational neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org. In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed data of morphologies when generating cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators such as the NEURON simulator of Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any newly available data can be included in NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks.
Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently of the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising developments of the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems.
However, sharing such large amounts of frequently heterogeneous and distributed experimental data requires both standard data formats and public repositories. Standardization and a public storage system are also important for modeling, because they make it possible to share models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks such as numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. It is often feasible to divide a 3D-reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available.
However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources, and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models.
However, the range of current strength needed for a depolarization block could easily be reached with a random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block. We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, preventing conventional analysis methods from keeping pace. Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is more efficient by several orders of magnitude in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable.
PANDORA is available to be freely used and modified because it is open-source (http://software.incf.org/software/pandora/home). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then we examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop. Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics. First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test the sufficiency of multiple, independent spike initiation zones in the C-interneuron to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground.
Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made into understanding cerebellar cortical processing, but neural coding at the output stage of the cerebellum in the deep cerebellar nuclei (DCN) remains poorly understood. The DCN are unlikely to be merely a relay nucleus, because Purkinje cell inhibition has to be turned into an excitatory output signal, and DCN neurons exhibit complex intrinsic properties. In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals. Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. Our simulations closely matched the DCN current-clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow deinactivation of rebound currents.
Fast rebound bursts due to the T-type calcium current and slow rebounds due to the persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by the HCN current. Our findings suggest that active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior. This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation of channel gating, a mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing, with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform.
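Several abstracts above (the channel chapter, KInNeSS, the compartmental DCN model) reduce, at the level of a single compartment, to integrating a membrane current balance over voltage-gated conductances. As a minimal sketch of that computation, here is the classic single-compartment Hodgkin–Huxley squid-axon model with standard textbook parameters; forward-Euler integration is used only for brevity, so the small step size matters.

```python
import numpy as np

def hh_sim(i_inj=10.0, t_stop=100.0, dt=0.01):
    """Single-compartment Hodgkin-Huxley model (standard squid-axon
    parameters; units of mV, ms, uA/cm^2). Returns the voltage trace."""
    g_na, g_k, g_l = 120.0, 36.0, 0.3       # max conductances (mS/cm^2)
    e_na, e_k, e_l = 50.0, -77.0, -54.4     # reversal potentials (mV)
    c_m = 1.0                                # membrane capacitance (uF/cm^2)
    v, m, h, n = -65.0, 0.05, 0.6, 0.32      # resting state
    trace = []
    for _ in range(int(t_stop / dt)):
        # voltage-dependent rate functions for the three gating variables
        a_m = 0.1 * (v + 40.0) / (1.0 - np.exp(-(v + 40.0) / 10.0))
        b_m = 4.0 * np.exp(-(v + 65.0) / 18.0)
        a_h = 0.07 * np.exp(-(v + 65.0) / 20.0)
        b_h = 1.0 / (1.0 + np.exp(-(v + 35.0) / 10.0))
        a_n = 0.01 * (v + 55.0) / (1.0 - np.exp(-(v + 55.0) / 10.0))
        b_n = 0.125 * np.exp(-(v + 65.0) / 80.0)
        m += dt * (a_m * (1.0 - m) - b_m * m)
        h += dt * (a_h * (1.0 - h) - b_h * h)
        n += dt * (a_n * (1.0 - n) - b_n * n)
        # total ionic current, then current balance for the voltage update
        i_ion = (g_na * m**3 * h * (v - e_na)
                 + g_k * n**4 * (v - e_k)
                 + g_l * (v - e_l))
        v += dt * (i_inj - i_ion) / c_m
        trace.append(v)
    return np.array(trace)

v = hh_sim()
n_spikes = int(np.sum((v[1:] >= 0.0) & (v[:-1] < 0.0)))  # upward 0 mV crossings
```

With a sustained 10 uA/cm^2 injection the model fires repetitively; with no injection it sits near −65 mV. Production simulators use implicit or exponential-Euler schemes for stiffness, but the current balance being integrated is the same.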
Abstract We have developed a simulation tool within the NEURON simulator to assist in the organization, verification, and analysis of simulations. This tool, termed the Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single-cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB. Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems. This is similar to how single cells function as systems of protein networks, where, for example, the ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from how it behaves alone. The dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. Interactions are made by different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts to produce heart beats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics, representing neural information.
In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of the limbs can be performed. This chapter illustrates several mathematical techniques through examples from the modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data. Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation. For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice those of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits as expected in mature animals.
Predictions were made for passive-cell current-clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron, and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting becomes unnecessary. Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron. Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c) with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials.
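The key representation in the approach above, subthreshold membrane potential as the presynaptic population spike count filtered with a fixed kernel, is easy to reproduce numerically. The sketch below uses hypothetical rates and population sizes, a single-interaction-process surrogate as a stand-in for genuine higher-order correlations, and implements no part of CuBIC itself: it only shows that spikes shared across the whole population leave the mean of the filtered trace unchanged while inflating its variance, which is the footprint such a method reads out.

```python
import numpy as np

rng = np.random.default_rng(0)

def population_counts(n=100, bins=20000, p_ind=0.005, p_common=0.0):
    """Binned spike counts of a population of n neurons. p_common injects
    rare events copied into ALL neurons (single-interaction-process style),
    producing correlations of order n while the mean rate is preserved."""
    counts = rng.binomial(n, p_ind, size=bins).astype(float)
    counts += n * rng.binomial(1, p_common, size=bins)
    return counts

def filtered_potential(counts, dt=1.0, tau=10.0):
    """Leaky-integrator 'membrane potential': presynaptic counts filtered
    with an exponential kernel of time constant tau (arbitrary units)."""
    decay = np.exp(-dt / tau)
    v = np.empty_like(counts)
    acc = 0.0
    for i, c in enumerate(counts):
        acc = acc * decay + c
        v[i] = acc
    return v

# same mean input rate (0.5 spikes/bin), with and without shared events
v_ind = filtered_potential(population_counts(p_ind=0.005))
v_cor = filtered_potential(population_counts(p_ind=0.004, p_common=0.001))
```

Comparing `v_ind` and `v_cor`, the means are nearly identical while the variance of the correlated trace is far larger; cumulant-based methods exploit exactly this kind of signature, order by order.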
Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet the precise input-output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites. We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature. To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns, and the corresponding firing-frequency input-output function was logarithmic. The regular/irregular nature of the input synaptic intervals was only reflected in the regularity of output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input-output relationships.
Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input-output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats. In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database http://www.ebi.ac.uk/biomodels/ is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats.
BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems, and to study the clustering of models based upon their annotations. Model deposition to the database today is advised by several publishers of scientific journals. The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive? Schema theory is a computational framework that allows the simulation of perceptuomotor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We present first its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions.
We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performances of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world-knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performances of agrammatic aphasics measured using sentence-picture matching tasks. This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Abstract Background The "inverse" problem is related to the determination of unknown causes on the basis of the observation of their effects. This is the opposite of the corresponding "direct" problem, which relates to the prediction of the effects generated by a complete description of some causal agents. The solution of an inverse problem entails the construction of a mathematical model and starts from a set of experimental data. In this respect, inverse problems are often ill-conditioned, as the experimental conditions available are often insufficient to solve the mathematical model unambiguously. Several approaches to solving inverse problems are possible, both computational and experimental, some of which are mentioned in this article. In this work, we describe in detail an attempt to solve an inverse problem which arose in the study of an intracellular signaling pathway.
Results Using a genetic algorithm to find a suboptimal solution to the optimization problem, we have estimated a set of unknown parameters describing a kinetic model of a signaling pathway in the neuronal cell. The model is composed of mass-action ordinary differential equations, whose kinetic parameters describe protein-protein interactions, protein synthesis, and degradation. The algorithm has been implemented on a parallel platform. Several potential solutions of the problem have been computed, each solution being a set of model parameters. A subset of parameters has been selected on the basis of their small coefficient of variation across the ensemble of solutions. Conclusion Despite the lack of sufficiently reliable and homogeneous experimental data, the genetic algorithm approach has allowed us to estimate the approximate values of a number of model parameters in a kinetic model of a signaling pathway; these parameters have been assessed to be relevant for the reproduction of the available experimental data. Abstract Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown. Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms.
First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axo-dendritic feedback on pyramidal cells is effective only in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase, the theta/gamma ratio. Moreover, electric fields differentially affect average theta frequency depending on specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. Abstract The brain is extraordinarily complex, containing 10^11 neurons linked by 10^14 connections. We can improve our understanding of individual neurons and neuronal networks by describing their behavior in mathematical and computational models. This chapter provides an introduction to neural modeling, laying the foundation for several basic models and surveying key topics. After some discussion of the motivations of modelers and the uses of neural models, we explore the properties of electrically excitable membranes. We describe in some detail the Hodgkin–Huxley model, the first neural model to describe biophysically the behavior of biological membranes. We explore how this model can be extended to describe a variety of excitable membrane behaviors, including axonal propagation, dendritic processing, and synaptic communication. This chapter also covers mathematical models that replicate basic neural behaviors through more abstract mechanisms.
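The Hodgkin–Huxley formalism discussed above can be illustrated with a minimal simulation. The sketch below integrates the classic squid-axon equations with forward Euler; the conductances, reversal potentials, and rate functions are the standard published ones, but the step size, stimulus amplitude, and onset time are illustrative choices, not taken from the chapter.

```python
import math

def hh_simulate(i_amp=10.0, t_max=50.0, dt=0.01):
    """Forward-Euler integration of the classic Hodgkin-Huxley squid-axon
    model; returns (time, voltage) traces in ms and mV."""
    # standard HH parameters (conductances in mS/cm^2, potentials in mV)
    c_m, g_na, g_k, g_l = 1.0, 120.0, 36.0, 0.3
    e_na, e_k, e_l = 50.0, -77.0, -54.387

    a_m = lambda v: 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
    b_m = lambda v: 4.0 * math.exp(-(v + 65.0) / 18.0)
    a_h = lambda v: 0.07 * math.exp(-(v + 65.0) / 20.0)
    b_h = lambda v: 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
    a_n = lambda v: 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
    b_n = lambda v: 0.125 * math.exp(-(v + 65.0) / 80.0)

    v = -65.0
    # gating variables start at their steady-state values for v = -65 mV
    m = a_m(v) / (a_m(v) + b_m(v))
    h = a_h(v) / (a_h(v) + b_h(v))
    n = a_n(v) / (a_n(v) + b_n(v))

    ts, vs = [], []
    for step in range(int(t_max / dt)):
        t = step * dt
        i_ext = i_amp if t > 5.0 else 0.0      # current step switched on at 5 ms
        i_ion = (g_na * m**3 * h * (v - e_na)
                 + g_k * n**4 * (v - e_k)
                 + g_l * (v - e_l))
        v += dt * (i_ext - i_ion) / c_m
        m += dt * (a_m(v) * (1.0 - m) - b_m(v) * m)
        h += dt * (a_h(v) * (1.0 - h) - b_h(v) * h)
        n += dt * (a_n(v) * (1.0 - n) - b_n(v) * n)
        ts.append(t)
        vs.append(v)
    return ts, vs
```

With a sustained suprathreshold current step such as 10 µA/cm², this model produces a repetitive train of action potentials, which is the behavior the extensions mentioned above (propagation, dendrites, synapses) build upon.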
We briefly explore efforts to extend single-neuron models to the network level and provide several examples of insights gained through this process. Finally, we list common resources, including modeling environments and repositories, that provide the guidance and parameter sets necessary to begin building neural models. Abstract We have developed a program, NeuroText, to populate the neuroscience databases in SenseLab (http://senselab.med.yale.edu/senselab) by mining the natural-language text of neuroscience articles. NeuroText uses a two-step approach to identify relevant articles. The first step (preprocessing), aimed at 100% sensitivity, identifies abstracts containing database keywords. In the second step, potentially relevant abstracts identified in the first step are processed for specificity dictated by database architecture and by neuroscience, lexical, and semantic contexts. NeuroText results were presented to the experts for validation using a dynamically generated interface that also allows expert-validated articles to be automatically deposited into the databases. Of the test set of 912 articles, 735 were rejected at the preprocessing step. For the remaining articles, the accuracy of predicting database-relevant articles was 85%. Twenty-two articles were erroneously identified. NeuroText deferred decisions on 29 articles to the expert. A comparison of NeuroText results against the experts' analyses revealed that the program failed to correctly identify articles' relevance when concepts did not yet exist in the knowledge base or when information was only vaguely presented in the abstracts. NeuroText uses two "evolution" techniques (supervised and unsupervised) that play an important role in the continual improvement of the retrieval results. Software that uses the NeuroText approach can facilitate the creation of curated, special-interest bibliography databases. Abstract Dendrites play an important role in neuronal function and connectivity.
This chapter introduces the first section of the book, focusing on the morphological features of dendritic tree structures and the role of dendritic trees in the circuit. We provide an overview of quantitative procedures for data collection, analysis, and modeling of dendrite shape. Our main focus lies on the description of morphological complexity and how one can use this description to unravel neuronal function in dendritic trees and neural circuits. Abstract The chapter is organised in two parts. In the first part, the focus is on a combined power spectral and nonlinear behavioural analysis of a neural mass model of the thalamocortical circuitry. The objective is to study the effectiveness of such "multimodal" analytical techniques in model-based studies investigating the neural correlates of abnormal brain oscillations in Alzheimer's disease (AD). The power spectral analysis presented here is a study of the "slowing" (decreasing dominant frequency of oscillation) within the alpha frequency band (8–13 Hz), a hallmark of electroencephalogram (EEG) dynamics in AD. Analysis of the nonlinear dynamical behaviour focuses on the bifurcating property of the model. The results show that the alpha rhythmic content is maximal in close proximity to the bifurcation point, an observation made possible by the "multimodal" approach adopted herein. Furthermore, a slowing in alpha rhythm is observed with increasing inhibitory connectivity, a consistent feature of our research into neuropathological oscillations associated with AD. In the second part, we present power spectral analysis on a model that implements multiple feedforward and feedback connectivities in the thalamo-cortico-thalamic circuitry and is thus more advanced in terms of biological plausibility. This study looks at the effects of synaptic connectivity variation on the power spectra within the delta (1–3 Hz), theta (4–7 Hz), alpha (8–13 Hz) and beta (14–30 Hz) bands.
An overall slowing of the EEG with decreasing synaptic connectivity is observed, indicated by a decrease of power within the alpha and beta bands and an increase in power within the theta and delta bands. Thus, the model behaviour conforms to longitudinal studies in AD indicating an overall slowing of the EEG. Abstract Neuronal processes grow under a variety of constraints, both immediate and evolutionary. Their pattern of growth provides insight into their function. This chapter begins by reviewing morphological metrics used in analyses and computational models. Molecular mechanisms underlying growth and plasticity are then discussed, followed by several types of modeling approaches. Computer simulation of morphology can be used to describe and reproduce the statistics of neuronal types or to evaluate growth and functional hypotheses. For instance, models in which branching is probabilistically determined by diameter produce realistic virtual dendrites of most neuronal types, though more complicated statistical models are required for other types. Virtual dendrites grown under environmental and/or functional constraints are also discussed, offering a broad perspective on dendritic morphology. Abstract Chopper neurons in the cochlear nucleus are characterized by intrinsic oscillations with short average interspike intervals (ISIs) and relative level independence of their response (Pfeiffer, Exp Brain Res 1:220–235, 1966; Blackburn and Sachs, J Neurophysiol 62:1303–1329, 1989), properties which are unattained by models of single chopper neurons (e.g., Rothman and Manis, J Neurophysiol 89:3070–3082, 2003a). In order to achieve short ISIs, we optimized the time constants of the Rothman and Manis single-neuron model with genetic algorithms. Some parameters in the optimization, such as the temperature and the capacitance of the cell, turned out to be crucial for the required acceleration of the response.
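Genetic-algorithm parameter fitting of the kind used here, and in the signaling-pathway study earlier in this section, can be sketched as follows. The toy model, its "true" parameters, and the GA settings are all invented for illustration; the idea carried over from the abstracts is that an ensemble of independent runs is collected, and parameters with a small coefficient of variation across the ensemble are treated as well constrained.

```python
import math
import random

def model(params, t):
    """Hypothetical two-parameter kinetic observable: a * exp(-k * t)."""
    a, k = params
    return a * math.exp(-k * t)

# synthetic "experimental" data generated from known parameters (a=2.0, k=0.5)
ts = [0.1 * i for i in range(50)]
data = [model((2.0, 0.5), t) for t in ts]

def cost(params):
    """Sum of squared residuals between model and data."""
    return sum((model(params, t) - d) ** 2 for t, d in zip(ts, data))

def run_ga(seed, pop_size=40, gens=60):
    """One GA run: truncation selection, averaging crossover, 10% mutation."""
    rng = random.Random(seed)
    pop = [(rng.uniform(0.1, 5.0), rng.uniform(0.01, 2.0)) for _ in range(pop_size)]
    for _ in range(gens):
        elite = sorted(pop, key=cost)[: pop_size // 2]   # keep the better half
        children = []
        while len(children) < pop_size - len(elite):
            p1, p2 = rng.sample(elite, 2)
            child = tuple((x + y) / 2.0 for x, y in zip(p1, p2))          # crossover
            child = tuple(max(1e-3, g * (1.0 + rng.gauss(0.0, 0.1)))
                          for g in child)                                 # mutation
            children.append(child)
        pop = elite + children
    return min(pop, key=cost)

# ensemble of solutions from independent runs; parameters with a small
# coefficient of variation across the ensemble are deemed well constrained
solutions = [run_ga(seed) for seed in range(5)]
for i, name in enumerate(("a", "k")):
    vals = [s[i] for s in solutions]
    mean = sum(vals) / len(vals)
    sd = math.sqrt(sum((v - mean) ** 2 for v in vals) / len(vals))
    print(f"{name}: mean={mean:.3f} CV={sd / mean:.3f}")
```

Because the better half of each generation is retained unchanged, the best cost never worsens; the coefficient-of-variation screen at the end mirrors the parameter-selection step described in the signaling-pathway abstract.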
In order to achieve the relative level independence, we simulated an interconnected network of Rothman and Manis neurons. The results indicate that, by stabilization of intrinsic oscillations, it is possible to simulate the physiologically observed level independence of ISIs. As previously reviewed and demonstrated (Bahmer and Langner, Biol Cybern 95:371–379, 2006a), chopper neurons show a preference for ISIs which are multiples of 0.4 ms. It was also demonstrated that a network consisting of two optimized Rothman and Manis neurons which activate each other with synaptic delays of 0.4 ms shows a preference for ISIs of 0.8 ms. Oscillations with various multiples of 0.4 ms as ISIs may be derived from neurons in a more complex network that is activated by simultaneous input from an onset neuron and several auditory nerve fibers. Abstract Recently, a class of two-dimensional integrate-and-fire models has been used to faithfully model spiking neurons. This class includes the Izhikevich model, the adaptive exponential integrate-and-fire model, and the quartic integrate-and-fire model. The bifurcation types for the individual neurons have been thoroughly analyzed by Touboul (SIAM J Appl Math 68(4):1045–1079, 2008). However, when the models are coupled together to form networks, the networks can display bifurcations that an uncoupled oscillator cannot. For example, the networks can transition from firing at a constant rate to burst firing. This paper introduces a technique to reduce a full network of this class of neurons to a mean-field model, in the form of a system of switching ordinary differential equations. The reduction uses population density methods and a quasi-steady-state approximation to arrive at the mean-field system. Reduced models are derived for networks with different topologies and different model neurons with biologically derived parameters.
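A representative member of this two-dimensional integrate-and-fire class is the Izhikevich model: quadratic voltage dynamics, a slow adaptation variable, and a discontinuous reset at threshold. The sketch below uses the published regular-spiking parameter set (a=0.02, b=0.2, c=-65, d=8); the drive current, step size, and duration are illustrative choices.

```python
def izhikevich(a=0.02, b=0.2, c=-65.0, d=8.0, i_drive=10.0,
               dt=0.25, t_max=200.0):
    """Forward-Euler simulation of a single Izhikevich neuron;
    returns the list of spike times in ms."""
    v, u = c, b * c            # start from the reset point
    spikes = []
    for step in range(int(t_max / dt)):
        # the two coupled ODEs of the model
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + i_drive)
        u += dt * a * (b * v - u)
        if v >= 30.0:          # spike detected: apply the discontinuous reset
            spikes.append(step * dt)
            v = c
            u += d
    return spikes

# regular-spiking parameters give tonic firing under constant drive
print(len(izhikevich()), "spikes")
```

The hybrid continuous/discrete structure visible here (smooth flow plus reset map) is what makes the derived mean-field description a system of switching ordinary differential equations rather than a single smooth one.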
The mean-field equations are able to qualitatively and quantitatively describe the bifurcations that the full networks display. Extensions and higher-order approximations are discussed. Conclusions Our proposed database schema for managing heterogeneous data is a significant departure from conventional approaches. It is suitable only when the following conditions hold: • The number of classes of entity is large, while the number of actual instances in most classes is expected to be very modest. • The number (and nature) of the axes describing an arbitrary fact (as an N-ary association) varies greatly. We believe that nervous system data is an appropriate problem domain in which to test such an approach. Abstract Stereotactic human brain atlases, either in print or electronic form, are useful not only in functional neurosurgery, but also in neuroradiology, human brain mapping, and neuroscience education. The existing atlases represent structures on 2D plates taken at variable, often large intervals, which limits their applications. To overcome this problem, we propose a hybrid interpolation approach to build high-resolution brain atlases from the existing ones. In this approach, all section regions of each object are grouped into two types of components: simple and complex. A NURBS-based method is designed for interpolation of the simple components, and a distance-map-based method for the complex components. Once all individual objects in the atlas are interpolated, the results are combined hierarchically in a bottom-up manner to produce the interpolation of the entire atlas. In the procedure, different knowledge-based and heuristic strategies are used to preserve various topological relationships. The proposed approach has been validated quantitatively and used for interpolation of two stereotactic brain atlases: the Talairach-Tournoux atlas and the Schaltenbrand-Wahren atlas.
The interpolations produced are of high resolution and feature high accuracy, 3D consistency, smooth surfaces, and preserved topology. They potentially open new applications for electronic stereotactic brain atlases, such as atlas reformatting, accurate 3D display, and 3D nonlinear warping against normal and pathological scans. The proposed approach is also potentially useful in other applications that require interpolation and 3D modeling from sparse data and/or data with variable inter-section intervals. An example of 3D modeling of an infarct from MR diffusion images is presented. Abstract Quantitative neuroanatomical data are important for the study of many areas of neuroscience, and the complexity of problems associated with neuronal structure requires that research from multiple groups across many disciplines be combined. However, existing neuron-tracing systems, simulation environments, and tools for the visualization and analysis of neuronal morphology data use a variety of data formats, making it difficult to exchange data in a readily usable way. The NeuroML project was initiated to address these issues, and here we describe an extensible markup language standard, MorphML, which defines a common data format for neuronal morphology data and associated metadata to facilitate data and model exchange, database creation, model publication, and data archiving. We describe the elements of the standard in detail and outline the mappings between this format and those used by a number of popular applications for reconstruction, simulation, and visualization of neuronal morphology. Abstract A major part of biology has become a class of physical and mathematical sciences.
We have started to feel, though still with some caution, that it may become possible to predict biological events in the future of one's life, and to control some of them if so desired, based upon an understanding of the genomic information of individuals and the physical and chemical principles governing the physiological functions of living organisms at multiple scales and levels, from molecules to cells and organs. Abstract A half-center oscillator (HCO) is a common circuit building block of central pattern generator networks that produce rhythmic motor patterns in animals. Here we constructed an efficient relational database table with the resulting characteristics of the simple conductance-based HCO model of Hill et al. (J Comput Neurosci 10:281–302, 2001). The model consists of two reciprocally inhibitory neurons and replicates the electrical activity of the oscillator interneurons of the leech heartbeat central pattern generator under a variety of experimental conditions. Our long-range goal is to understand how this basic circuit building block produces functional activity under a variety of parameter regimes and how different parameter regimes influence stability and modulatability. By using the latest developments in computer technology, we simulated and stored large amounts of data (on the order of terabytes). We systematically explored the parameter space of the HCO and corresponding isolated neuron models using a brute-force approach. We varied a set of selected parameters (maximal conductances of intrinsic and synaptic currents) in all combinations, resulting in about 10 million simulations. We classified these HCO and isolated neuron model simulations by their activity characteristics into identifiable groups and quantified their prevalence.
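The workflow of a brute-force sweep whose classified results land in a relational table can be sketched as below. The classification rule is a made-up stand-in (the real pipeline would run each conductance-based simulation and classify its voltage traces), and the grid is tiny; the table layout and the prevalence query reflect the approach described above.

```python
import itertools
import sqlite3

def classify(g_syn, g_h, g_leak):
    """Hypothetical stand-in for simulating one parameter point and
    labelling its activity; a real sweep would run the HCO model here."""
    if g_syn == 0:
        return "silent" if g_leak > g_h else "tonic"
    return "bursting" if g_h >= g_syn else "irregular"

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sims (g_syn REAL, g_h REAL, g_leak REAL, activity TEXT)")

# vary each selected maximal conductance over a grid, in all combinations
grid = [0.0, 0.5, 1.0, 1.5]
for g_syn, g_h, g_leak in itertools.product(grid, repeat=3):
    db.execute("INSERT INTO sims VALUES (?, ?, ?, ?)",
               (g_syn, g_h, g_leak, classify(g_syn, g_h, g_leak)))

# quantify the prevalence of each activity class with a single query
for activity, n in db.execute(
        "SELECT activity, COUNT(*) FROM sims GROUP BY activity "
        "ORDER BY COUNT(*) DESC"):
    print(activity, n)
```

Storing one row per simulation is what makes the later comparisons possible: queries over the table replace re-running millions of simulations.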
By querying the database, we compared the activity characteristics of the identified groups of our simulated HCO models with those of our simulated isolated neuron models and found that regularly bursting neurons compose only a small minority of functional HCO models; the vast majority was composed of spiking neurons. Abstract This paper describes how an emerging standard neural network modelling language can be used to configure a general-purpose neural multichip system, by describing the process of writing and loading neural network models on the SpiNNaker neuromimetic hardware. It focuses on the implementation of a SpiNNaker module for PyNN, a simulator-independent language for neural network modelling. We successfully extend PyNN to deal with different non-standard (e.g. Izhikevich) cell types, rapidly switch between them, and load applications onto parallel hardware by orchestrating the software layers below it, so that they are abstracted away from the final user. Finally, we run some simulations in PyNN and compare them against other simulators, successfully reproducing single-neuron and network dynamics and validating the implementation. Abstract The present study examines the biophysical properties and functional implications of I_h in hippocampal area CA3 interneurons with somata in strata radiatum and lacunosum-moleculare. Characterization studies showed a small maximum h-conductance (2.6 ± 0.3 nS, n = 11), shallow voltage dependence with a hyperpolarized half-maximal activation (V1/2 = −91 mV), and kinetics characterized by double-exponential functions. The functional consequences of I_h were examined with regard to temporal summation and impedance measurements. For temporal summation experiments, 5-pulse mossy fiber input trains were activated. Blocking I_h with 50 μM ZD7288 resulted in an increase in temporal summation, suggesting that I_h supports sensitivity of response amplitude to relative input timing.
Impedance was assessed by applying sinusoidal current commands. From impedance measurements, we found that I_h did not confer theta-band resonance, but instead flattened the impedance–frequency relations. Double immunolabeling for hyperpolarization-activated cyclic nucleotide-gated proteins and glutamate decarboxylase 67 suggests that all four subunits are present in GABAergic interneurons from the strata considered for electrophysiological studies. Finally, a model of I_h was employed in computational analyses to confirm and elaborate upon the contributions of I_h to impedance and temporal summation. Abstract Modelling and simulation methods are gaining importance for the understanding of biological systems. The growing number of available computational models makes support for the maintenance and retrieval of those models essential to the community. This article discusses which model information is helpful for efficient retrieval and how existing similarity measures and ranking techniques can be used to enhance the retrieval process, i.e., model reuse. With the development of new tools and modelling formalisms, there is also an increasing demand for performing searches independent of the models' encoding. Therefore, the presented approach is not restricted to certain model storage formats. Instead, the model meta-information is used for retrieval and ranking of the search results. Meta-information includes general information about the model and its encoded species and reactions, but also information about the model behaviour and related simulation experiment descriptions. Abstract To understand the details of brain function, a large-scale system model that reflects anatomical and neurophysiological characteristics needs to be implemented.
Though numerous computational models of different brain areas have been proposed, their integration into a large-scale model has not yet been accomplished, because these models were described in different programming languages and, above all, used different data formats. This paper introduces a platform for collaborative brain system modeling (PLATO), on which one can construct computational models using several programming languages and connect them at the I/O level with a common data format. As an example, a whole visual system model including eye movement, eye optics, the retinal network, and visual cortex is being developed. Preliminary results demonstrate that the integrated model successfully simulates the flow of signal processing at the different stages of the visual system. Abstract Brain rhythms are the most prominent signal measured noninvasively in humans with magneto-/electroencephalography (MEG/EEG). MEG/EEG-measured rhythms have been shown to be functionally relevant, and signature changes are used as markers of disease states. Despite the importance of understanding the underlying neural mechanisms creating these rhythms, relatively little is known about their in vivo origin in humans. There are obvious challenges in linking the extracranially measured signals directly to neural activity with invasive studies in humans, and although animal models are well suited for such studies, the connection to human brain function under cognitively relevant tasks is often lacking. Biophysically principled computational neural modeling provides an attractive means to bridge this critical gap. Here, we describe a method for creating a computational neural model capturing the laminar structure of cortical columns, and how this model can be used to make predictions on the cellular- and circuit-level mechanisms of brain oscillations measured with MEG/EEG.
Specifically, we describe how the model can be used to simulate current dipole activity, the common macroscopic signal inferred from MEG/EEG data. We detail the development and application of the model to study the spontaneous somatosensory mu rhythm, containing mu-alpha (7–14 Hz) and mu-beta (15–29 Hz) components. We describe a novel prediction on the neural origin of the mu rhythm that accurately reproduces many characteristic features of MEG data and accounts for changes in the rhythm with attention, detection, and healthy aging. While the details of the model are specific to the somatosensory system, the model design and application are based on general principles of cortical circuitry and MEG/EEG physics, and are thus amenable to the study of rhythms in other frequency bands and sensory systems. Abstract GABAergic interneurons in cortical circuits control the activation of principal cells and orchestrate network activity patterns, including oscillations at different frequency ranges. Recruitment of interneurons depends on the integration of convergent synaptic inputs along the dendrosomatic axis; however, dendritic processing in these cells is still poorly understood. In this chapter, we summarise our results on the cable properties, electrotonic structure, and dendritic processing of "basket cells" (BCs; Nörenberg et al. 2010), one of the most prevalent types of cortical interneurons mediating perisomatic inhibition. In order to investigate integrative properties, we performed two-electrode whole-cell patch-clamp recordings, visualised and reconstructed the recorded interneurons, and created passive single-cell models with biophysical properties derived from the experiments. Our results indicate that membrane properties, in particular membrane resistivity, are inhomogeneous along the somatodendritic axis of the cell. The derived values and the gradient of membrane resistivity differ from those obtained for excitatory principal cells.
The divergent passive membrane properties of BCs facilitate rapid signalling from proximal basal dendritic inputs but at the same time increase synapse-to-soma transfer for slow signals from the distal apical dendrites. Our results demonstrate that BCs possess distinct integrative properties. Future computational models investigating the diverse functions of neuronal circuits need to consider this diversity and incorporate realistic dendritic properties not only of excitatory principal cells but also of the various types of inhibitory interneurons. Abstract New surgical and localization techniques allow for precise and personalized evaluation and treatment of intractable epilepsies. These techniques include the use of subdural and depth electrodes for localization, and the potential use of cell-targeted stimulation using optogenetics as part of treatment. Computer modeling of seizures, also individualized to the patient, will be important in order to make full use of the potential of these new techniques. This is because epilepsy is a complex dynamical disease involving multiple scales across both time and space. These complex dynamics make prediction extremely difficult. Cause and effect are not cleanly separable, as multiple embedded causal loops allow for many scales of unintended consequence. We demonstrate here a small model of sensory neocortex which can be used to look at the effects of microablations or microstimulation. We show that ablations in this network can either prevent spread or prevent occurrence of the seizure. In this example, focal electrical stimulation was not able to terminate a seizure, but selective stimulation of inhibitory cells, a future possibility through the use of optogenetics, was efficacious. Abstract The basal ganglia form a complex network of nuclei often assumed to perform selection, yet their individual roles and how they influence each other are still largely unclear.
In particular, the ties between the external and internal parts of the globus pallidus are paradoxical, as anatomical data suggest a potent inhibitory projection between them, while electrophysiological recordings indicate that they have similar activities. Here we introduce a theoretical study that reconciles both views on the intrapallidal projection by providing a plausible characterization of the relationship between the external and internal globus pallidus. Specifically, we developed a mean-field model of the whole basal ganglia whose parameterization is optimized to best respect a collection of numerous anatomical and electrophysiological data. We first obtained models respecting all our constraints; hence anatomical and electrophysiological data on the intrapallidal projection are globally consistent. This model furthermore predicts that both aforementioned views about the intrapallidal projection may be reconciled when this projection is weakly inhibitory, thus making it possible to support similar neural activity in both nuclei and for the entire basal ganglia to select between actions. Second, we predict that afferent projections are substantially unbalanced towards the external segment, as it receives the strongest excitation from the STN and the weakest inhibition from the striatum. Finally, our study strongly suggests that the intrapallidal connection pattern is not focused but diffuse, as the latter pattern is more efficient for the overall selection performed in the basal ganglia. Abstract Background The information coming from biomedical ontologies and computational pathway models is expanding continuously: research communities keep this process up, and their advances are generally shared by means of dedicated resources published on the web. In fact, such models are shared to provide the characterization of molecular processes, while biomedical ontologies detail a semantic context for the majority of those pathways.
Recent advances in both fields pave the way for a scalable information integration based on aggregate knowledge repositories, but the lack of overall standard formats impedes this progress. Indeed, having different objectives and different abstraction levels, most of these resources "speak" different languages. Semantic web technologies are here explored as a means to address some of these problems. Methods Employing an extensible collection of interpreters, we developed OREMP (Ontology Reasoning Engine for Molecular Pathways), a system that abstracts the information from different resources and combines it into a coherent ontology. Continuing this effort, we present OREMPdb; once different pathways are fed into OREMP, species are linked to the external ontologies referred to and to the reactions in which they participate. Exploiting these links, the system builds species-sets, which encapsulate species that operate together. Composing all of the reactions together, the system computes all of the reaction paths from and to all of the species-sets. Results OREMP has been applied to the curated branch of BioModels (2011/04/15 release), which overall contains 326 models, 9244 reactions, and 5636 species. OREMPdb is the semantic dictionary created as a result, which is made of 7360 species-sets. For each one of these sets, OREMPdb links the original pathway and the original paper in which this information first appeared. Abstract Conductance-based neuron models are frequently employed to study the dynamics of biological neural networks. For speed and ease of use, these models are often reduced in morphological complexity. Simplified dendritic branching structures may process inputs differently than full branching structures, however, and could thereby fail to reproduce important aspects of biological neural processing. It is not yet well understood which processing capabilities require detailed branching structures.
Therefore, we analyzed the processing capabilities of fully or partially branched reduced models. These models were created by collapsing the dendritic tree of a full morphological model of a globus pallidus (GP) neuron while preserving its total surface area and electrotonic length, as well as its passive and active parameters. Dendritic trees were either collapsed into single cables (unbranched models) or the full complement of branch points was preserved (branched models). Both reduction strategies allowed us to compare dynamics between all models using the same channel density settings. Full model responses to somatic inputs were generally preserved by both types of reduced model, while dendritic input responses could be more closely preserved by branched than by unbranched reduced models. However, features strongly influenced by local dendritic input resistance, such as active dendritic sodium spike generation and propagation, could not be accurately reproduced by any reduced model. Based on our analyses, we suggest that there are intrinsic differences in processing capabilities between unbranched and branched models. We also indicate suitable applications for different levels of reduction, including fast searches of full model parameter space. Summary Processing text from the scientific literature has become a necessity due to the burgeoning amounts of information that are fast becoming available, stemming from advances in electronic information technology. We created a program, NeuroText (http://senselab.med.yale.edu/textmine/neurotext.pl), designed specifically to extract information relevant to neuroscience-specific databases, NeuronDB and CellPropDB (http://senselab.med.yale.edu/senselab/), housed at the Yale University School of Medicine. NeuroText extracts relevant information from the neuroscience literature in a two-step process: each step parses text at a different level of granularity.
NeuroText uses an expert-mediated knowledge base and combines the techniques of indexing, contextual parsing, semantic and lexical parsing, and supervised and unsupervised learning to extract information. The constraints, metadata elements, and rules for information extraction are stored in the knowledge base. NeuroText was created as a pilot project to process 3 years of publications in the Journal of Neuroscience and was subsequently tested on 40,000 PubMed abstracts. We also present here a template to create a domain-nonspecific knowledge base that, when linked to a text-processing tool like NeuroText, can be used to extract knowledge in other fields of research. Abstract Background We present a software tool called SENB, which allows the geometric and biophysical neuronal properties in a simple computational model of a Hodgkin-Huxley (HH) axon to be changed. The aim of this work is to develop a didactic and easy-to-use computational tool in the NEURON simulation environment, which allows graphical visualization of both the passive and active conduction parameters and the geometric characteristics of a cylindrical axon with HH properties. Results The SENB software offers several advantages for teaching and learning electrophysiology. First, SENB offers ease and flexibility in determining the number of stimuli. Second, SENB allows immediate and simultaneous visualization, in the same window and time frame, of the evolution of the electrophysiological variables. Third, SENB calculates parameters such as time and space constants, stimulus frequency, cellular area and volume, sodium and potassium equilibrium potentials, and propagation velocity of the action potentials. Furthermore, it allows the user to see all this information immediately in the main window. Finally, with just one click SENB can save an image of the main window as evidence.
Conclusions The SENB software is didactic and versatile, and can be used to improve and facilitate the teaching and learning of the underlying mechanisms in the electrical activity of an axon using the biophysical properties of the squid giant axon. Abstract Grid cells (GCs) in the medial entorhinal cortex (mEC) have the property of having their firing activity spatially tuned to a regular triangular lattice. Several theoretical models for grid field formation have been proposed, but most assume that place cells (PCs) are a product of the grid cell system. There is, however, an alternative possibility that is supported by various strands of experimental data. Here we present a novel model for the emergence of grid-like firing patterns that stands on two key hypotheses: (1) spatial information in GCs is provided from PC activity and (2) grid fields result from a combined synaptic plasticity mechanism involving inhibitory and excitatory neurons mediating the connections between PCs and GCs. Depending on the spatial location, each PC can contribute with excitatory or inhibitory inputs to GC activity. The nature and magnitude of the PC input is a function of the distance to the place field center, which is inferred from rate decoding. A biologically plausible learning rule drives the evolution of the connection strengths from PCs to a GC. In this model, PCs compete for GC activation, and the plasticity rule favors efficient packing of the space representation. This leads to grid-like firing patterns. In a new environment, GCs continuously recruit new PCs to cover the entire space. The model described here makes important predictions and can represent the feedforward connections from hippocampus CA1 to deeper mEC layers. Abstract Because of its highly branched dendrite, the Purkinje neuron requires significant computational resources if coupled electrical and biochemical activity are to be simulated. 
To address this challenge, we developed a scheme for reducing the geometric complexity while preserving the essential features of activity in both the soma and a remote dendritic spine. We merged our previously published biochemical model of calcium dynamics and lipid signaling in the Purkinje neuron, developed in the Virtual Cell modeling and simulation environment, with an electrophysiological model based on a Purkinje neuron model available in NEURON. A novel reduction method was applied to the Purkinje neuron geometry to obtain a model with fewer compartments that is tractable in Virtual Cell. Most of the dendritic tree was subject to reduction, but we retained the neuron’s explicit electrical and geometric features along a specified path from spine to soma. Further, unlike previous simplification methods, the dendrites that branch off along the preserved explicit path are retained as reduced branches. We conserved axial resistivity and adjusted passive properties and active channel conductances for the reduction in surface area, and cytosolic calcium for the reduction in volume. Rallpacks are used to validate the reduction algorithm and show that it can be generalized to other complex neuronal geometries. For the Purkinje cell, we found that current injections at the soma were able to produce similar trains of action potentials and membrane potential propagation in the full and reduced models in NEURON; the reduced model produces identical spiking patterns in NEURON and Virtual Cell. Importantly, our reduced model can simulate communication between the soma and a distal spine; an alpha function applied at the spine to represent synaptic stimulation gave similar results in the full and reduced models for potential changes associated with both the spine and the soma. Finally, we combined phosphoinositol signaling and electrophysiology in the reduced model in Virtual Cell. 
Thus, a strategy has been developed to combine electrophysiology and biochemistry as a step toward merging neuronal and systems biology modeling. Control of GABA Release at Mossy Fiber-CA3 Connections in the Developing Hippocampus. Frontiers in synaptic neuroscience In this review some of the recent work carried out in our laboratory concerning the functional role of GABAergic signalling at immature mossy fibre (MF)-CA3 principal cell synapses has been highlighted. While in adulthood MF, the axons of dentate gyrus granule cells, release glutamate onto CA3 principal cells and interneurons, early in postnatal life they release GABA, which exerts a depolarizing and excitatory action on targeted cells. We found that GABA(A)-mediated postsynaptic currents (MF-GPSCs) exhibited a very low probability of release, were sensitive to L-AP4, a group III metabotropic glutamate receptor agonist, and revealed short-term frequency-dependent facilitation. Moreover, MF-GPSCs were downregulated by presynaptic GABA(B) and kainate receptors, activated by spillover of GABA from MF terminals and by glutamate present in the extracellular medium, respectively. Activation of these receptors contributed to the low release probability and in some cases to synapse silencing. Pairing MF activation with calcium transients associated with network-driven giant depolarizing potentials, or GDPs (a hallmark of developmental networks, thought to represent a primordial form of synchrony between neurons, generated by the synergistic action of glutamate and GABA), increased the probability of GABA release and caused the conversion of silent synapses into conductive ones, suggesting that GDPs act as coincidence detector signals for enhancing synaptic efficacy. 
Finally, to compare the relative strength of CA3 pyramidal cell output in relation to their MF glutamatergic or GABAergic inputs in adulthood or in postnatal development, respectively, a realistic model was constructed taking into account the different biophysical properties of these synapses. Accurate and fast simulation of channel noise in conductance-based model neurons by diffusion approximation. PLoS computational biology Stochastic channel gating is the major source of intrinsic neuronal noise, whose functional consequences at the microcircuit and network levels have been only partly explored. A systematic study of this channel noise in large ensembles of biophysically detailed model neurons calls for the availability of fast numerical methods. In fact, exact techniques employ the microscopic simulation of the random opening and closing of individual ion channels, usually based on Markov models, whose computational loads are prohibitive for next-generation massive computer models of the brain. In this work, we operatively define a procedure for translating any Markov model describing voltage- or ligand-gated membrane ion conductances into an effective stochastic version whose computer simulation is efficient, without compromising accuracy. Our approximation is based on an improved Langevin-like approach, which employs stochastic differential equations and no Monte Carlo methods. As opposed to an earlier proposal recently debated in the literature, our approximation accurately reproduces the statistical properties of the exact microscopic simulations under a variety of conditions, from spontaneous to evoked response features. In addition, our method is not restricted to the Hodgkin-Huxley sodium and potassium currents and is general for a variety of voltage- and ligand-gated ion currents. 
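The Langevin-type diffusion approximation that this line of work improves on can be illustrated for a single Hodgkin-Huxley potassium activation gate: the deterministic rate equation gains a noise term whose variance scales inversely with the number of gates N. This is the standard textbook form of the approximation, not the paper's algorithm, and the holding potential and gate count are assumptions.

```python
import math
import random

def hh_n_rates(V):
    """HH potassium activation rates alpha, beta (ms^-1), modern convention."""
    a = 0.01 * (V + 55.0) / (1.0 - math.exp(-(V + 55.0) / 10.0))
    b = 0.125 * math.exp(-(V + 65.0) / 80.0)
    return a, b

def langevin_gate(V=-30.0, N=1000, dt=0.01, t_end=200.0, seed=1):
    """Euler-Maruyama integration of the Langevin (diffusion) approximation
    for the open fraction n of N potassium activation gates at fixed V."""
    rng = random.Random(seed)
    a, b = hh_n_rates(V)
    n = a / (a + b)                    # start at the deterministic steady state
    trace = []
    for _ in range(int(t_end / dt)):
        drift = a * (1.0 - n) - b * n
        sigma = math.sqrt(max(a * (1.0 - n) + b * n, 0.0) / N)
        n += drift * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        n = min(max(n, 0.0), 1.0)      # clip into the physical range [0, 1]
        trace.append(n)
    return trace
```

With N large the fluctuations shrink as 1/sqrt(N) and the trace hugs the deterministic n_inf, which is the sense in which such schemes avoid per-channel Monte Carlo simulation.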
As a by-product, the analysis of the properties emerging in exact Markov schemes by standard probability calculus enables us, for the first time, to analytically identify the sources of inaccuracy of the previous proposal, while providing solid ground for the modification and improvement we present here. Computer Simulation;Ion Channel Gating;Ion Channels;Markov Chains;Membrane Potentials;Models, Neurological;Neurons Emergence of physiological oscillation frequencies in a computer model of neocortex. Frontiers in computational neuroscience Coordination of neocortical oscillations has been hypothesized to underlie the "binding" essential to cognitive function. However, the mechanisms that generate neocortical oscillations in physiological frequency bands remain unknown. We hypothesized that interlaminar relations in neocortex would provide multiple intermediate loops that would play particular roles in generating oscillations, adding different dynamics to the network. We simulated networks from sensory neocortex using nine columns of event-driven rule-based neurons wired according to anatomical data and driven with random white-noise synaptic inputs. We tuned the network to achieve realistic cell firing rates and to avoid population spikes. A physiological frequency spectrum appeared as an emergent property, displaying dominant frequencies that were not present in the inputs or in the intrinsic or activated frequencies of any of the cell groups. We monitored spectral changes while gradually introducing hubs into individual layers, using minimal dynamical perturbation as a methodology. We found that hubs in layer 2/3 excitatory cells had the greatest influence on overall network activity, suggesting that this subpopulation was a primary generator of theta/beta strength in the network. Similarly, layer 2/3 interneurons appeared largely responsible for gamma activation through preferential attenuation of the rest of the spectrum. 
The network showed evidence of frequency homeostasis: increased activation of supragranular layers increased firing rates in the network without altering the spectral profile, and alteration in synaptic delays did not significantly shift spectral peaks. Direct comparison of the power spectra with experimentally recorded local field potentials from prefrontal cortex of awake rat showed substantial similarities, including comparable patterns of cross-frequency coupling. Dipole characterization of single neurons from their extracellular action potentials Journal of Computational Neuroscience Summary This chapter constitutes mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was a part of the Annual Computational Neuroscience Meeting CNS*2007 that took place in July 2007 in Toronto, Canada (http://www.cnsorg.org). The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Selected discussed topics, including the review of some current and potential applications of Computational Intelligence (CI) in electrophysiology, database and electrophysiological data exchange platforms, languages, and formats, as well as exemplary analysis problems, are presented in this chapter. The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers, whose interest will be sparked by its contents. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. 
We are particularly interested in the slower timescale dynamics that determine overall excitability, and set the stage for transient episodes of persistent oscillations, working memory, or seizures. In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics on the stability of persistent activity. We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions. Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in neonatal brain or spinal cord, up states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potentials (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from the EAP data sampled by an extracellular probe at multiple independent recording locations. We used a dipole for the model source because there is extensive evidence it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell-probe distance, is a more accurate alternative to approaches based on monopole source models. 
Dipole characterization is separable into a linear dipole moment optimization where the dipole location is fixed, and a second, nonlinear, global optimization of the source location. We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis. We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among the various model error contributions to the residual, we address especially the error in probe geometry, and the extent to which it biases estimates of dipole parameters. This dipole characterization method can be applied to any recording technique that can take multiple independent measurements of the same single units. Models of neocortical layer 5b pyramidal cells capturing a wide range of dendritic and perisomatic active properties. PLoS computational biology The thick-tufted layer 5b pyramidal cell extends its dendritic tree to all six layers of the mammalian neocortex and serves as a major building block for the cortical column. L5b pyramidal cells have been the subject of extensive experimental and modeling studies, yet conductance-based models of these cells that faithfully reproduce both their perisomatic Na(+)-spiking behavior as well as key dendritic active properties, including Ca(2+) spikes and back-propagating action potentials, are still lacking. 
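The linear step of the dipole fit, with the source location held fixed, reduces to a Tikhonov-regularized least-squares problem for the three moment components: p = (GᵀG + λI)⁻¹ Gᵀv, where the rows of G are the probe's lead fields and v the measured potentials. The sketch below illustrates only this linear step with synthetic numbers (the lead-field values and λ are assumptions, not the authors' grid search over candidate locations).

```python
def ridge_dipole_moment(G, v, lam=1e-6):
    """Solve (G^T G + lam*I) p = G^T v for the 3-component dipole moment p.
    G: list of [gx, gy, gz] lead-field rows; v: measured potentials."""
    m = len(G)
    # Build the regularized normal equations.
    A = [[sum(G[k][i] * G[k][j] for k in range(m)) + (lam if i == j else 0.0)
          for j in range(3)] for i in range(3)]
    b = [sum(G[k][i] * v[k] for k in range(m)) for i in range(3)]
    # Gaussian elimination with partial pivoting on the 3x3 system.
    for c in range(3):
        piv = max(range(c, 3), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        b[c], b[piv] = b[piv], b[c]
        for r in range(c + 1, 3):
            f = A[r][c] / A[c][c]
            for j in range(c, 3):
                A[r][j] -= f * A[c][j]
            b[r] -= f * b[c]
    p = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        p[r] = (b[r] - sum(A[r][j] * p[j] for j in range(r + 1, 3))) / A[r][r]
    return p
```

The regularizer λ penalizes dipole size, which is the role Tikhonov regularization plays in the authors' joint minimization of model error and moment magnitude.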
Based on a large body of experimental recordings from both the soma and dendrites of L5b pyramidal cells in adult rats, we characterized key features of the somatic and dendritic firing and quantified their statistics. We used these features to constrain the density of a set of ion channels over the soma and dendritic surface via multi-objective optimization with an evolutionary algorithm, thus generating a set of detailed conductance-based models that faithfully replicate the back-propagating action potential-activated Ca(2+) spike firing and the perisomatic firing response to current steps, as well as the experimental variability of the properties. Furthermore, we show a useful way to analyze model parameters with our sets of models, which enabled us to identify some of the mechanisms responsible for the dynamic properties of L5b pyramidal cells as well as mechanisms that are sensitive to morphological changes. This automated framework can be used to develop a database of faithful models for other neuron types. The models we present provide several experimentally testable predictions and can serve as a powerful tool for theoretical investigations of the contribution of single-cell dynamics to network activity and its computational capabilities. Action Potentials;Algorithms;Animals;Computational Biology;Dendrites;Ion Channels;Models, Neurological;Neocortex;Pyramidal Cells;Rats;Rats, Wistar;Single-Cell Analysis Efficacy of synaptic inhibition depends on multiple, dynamically interacting mechanisms implicated in chloride homeostasis. PLoS computational biology Chloride homeostasis is a critical determinant of the strength and robustness of inhibition mediated by GABA(A) receptors (GABA(A)Rs). The impact of changes in the steady-state Cl(-) gradient is relatively straightforward to understand, but how the dynamic interplay between Cl(-) influx, diffusion, extrusion and interaction with other ion species affects synaptic signaling remains uncertain. 
Here we used electrodiffusion modeling to investigate the nonlinear interactions between these processes. Results demonstrate that diffusion is crucial for redistributing intracellular Cl(-) load on a fast time scale, whereas Cl(-) extrusion controls steady-state levels. Interaction between diffusion and extrusion can result in a somato-dendritic Cl(-) gradient even when KCC2 is distributed uniformly across the cell. Reducing KCC2 activity led to decreased efficacy of GABA(A)R-mediated inhibition, but increasing GABA(A)R input failed to fully compensate for this form of disinhibition because of activity-dependent accumulation of Cl(-). Furthermore, if spiking persisted despite the presence of GABA(A)R input, Cl(-) accumulation became accelerated because of the large Cl(-) driving force that occurs during spikes. The resulting positive feedback loop caused catastrophic failure of inhibition. Simulations also revealed other feedback loops, such as competition between Cl(-) and pH regulation. Several model predictions were tested and confirmed by [Cl(-)](i) imaging experiments. Our study has thus uncovered how Cl(-) regulation depends on a multiplicity of dynamically interacting mechanisms. Furthermore, the model revealed that enhancing KCC2 activity beyond normal levels did not negatively impact firing frequency or cause overt extracellular K(+) accumulation, demonstrating that enhancing KCC2 activity is a valid strategy for therapeutic intervention. 
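The activity-dependent Cl(-) accumulation at the heart of this mechanism can be caricatured in a single compartment: a tonic GABA(A) conductance loads Cl(-) while first-order KCC2-like extrusion opposes it, so E_Cl creeps toward the clamped membrane potential and the inhibitory driving force shrinks. This toy model is far simpler than the paper's electrodiffusion model, and every parameter below is an illustrative assumption.

```python
import math

F = 96485.0        # Faraday constant, C/mol
RT_F = 26.7        # thermal voltage, mV at ~37 C

def simulate_cl_loading(g_gaba=2e-9, tau_kcc2=5.0, t_end=20.0, dt=0.001):
    """Toy chloride-loading model: a tonic GABA_A conductance g_gaba (S)
    carries Cl- into a ~1000 um^3 cell; KCC2-like extrusion relaxes
    [Cl-]_i back to rest with time constant tau_kcc2 (s).
    Returns final [Cl-]_i (mM) and E_Cl (mV)."""
    cl_i, cl_o, cl_rest = 8.0, 130.0, 8.0   # mM
    vm = -65e-3                              # clamped membrane potential, V
    vol = 1e-12                              # cell volume, liters
    for _ in range(int(t_end / dt)):
        e_cl = -RT_F * 1e-3 * math.log(cl_o / cl_i)      # Nernst, z = -1, in V
        # Outward Cl- current (vm > e_cl) corresponds to Cl- entry:
        influx = g_gaba * (vm - e_cl) / (F * vol) * 1e3  # mM/s
        cl_i += (influx - (cl_i - cl_rest) / tau_kcc2) * dt
    return cl_i, e_cl * 1e3

cl_final, e_cl_final = simulate_cl_loading()
```

Even this caricature reproduces the qualitative point of the abstract: sustained GABA(A) input raises [Cl-]_i above rest and depolarizes E_Cl, weakening subsequent inhibition.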
Animals;Cell Membrane;Cells, Cultured;Chlorides;Computational Biology;Computer Simulation;Diffusion;Electrical Synapses;GABA-A Receptor Antagonists;Hippocampus;Hydrogen-Ion Concentration;Immunohistochemistry;Intracellular Space;Microscopy, Fluorescence;Models, Biological;Neurons;Potassium;Rats;Rats, Sprague-Dawley;Receptors, GABA-A;Reproducibility of Results;Sodium;Symporters;gamma-Aminobutyric Acid Relative spike time coding and STDP-based orientation selectivity in the early visual system in natural continuous and saccadic vision: a computational model Journal of Computational Neuroscience Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics. We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. 
We find that competition between intrinsic neuronal currents, sodium-potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. These models reveal several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy. Abstract This paper introduces dyadic brain modeling – the simultaneous, computational modeling of the brains of two interacting agents – to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the ‘dyad’). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists and modelers that will help speed the emergence of what may be called comparative neuroprimatology. Abstract The phase response curve (PRC) reflects the dynamics of the interplay between diverse intrinsic conductances that lead to spike generation. 
PRCs measure the spike time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddle-node) dynamics, while type II (subcritical Hopf) dynamics yield a biphasic PRC with both negative and positive lobes. Previous computational work hypothesized that cholinergic modulation of the M-type potassium current can switch a neuron with type II dynamics to type I dynamics. We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with downregulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential specific K-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to downregulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent I(AHP) is downregulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified at low ACh levels, while the PRC became type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may shift neuronal responses between “resonator” and “integrator” modes. Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double precision values by each subtree at each time step. Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. 
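The type I case described above (a purely positive PRC) can be demonstrated with a closed-form leaky integrate-and-fire sketch: a small depolarizing kick always advances the next spike, and more so late in the cycle. This is an illustrative integrator model with assumed parameters, not the layer 2/3 pyramidal cell models used in the study.

```python
import math

def lif_prc(I=1.5, tau=10.0, dv=0.01, n_phases=8):
    """Phase response curve of a leaky integrate-and-fire neuron
    (threshold 1, reset 0, dV/dt = (I - V)/tau) probed with a small
    voltage kick dv. Time-to-spike from V is tau * ln((I - V)/(I - 1)),
    so the phase advance has a closed form."""
    T0 = tau * math.log(I / (I - 1.0))          # unperturbed period
    prc = []
    for k in range(n_phases):
        t = (k / n_phases) * T0                 # time at probed phase
        V = I * (1.0 - math.exp(-t / tau))      # voltage at that phase
        t_rest = tau * math.log((I - V) / (I - 1.0))
        t_rest_kicked = tau * math.log((I - V - dv) / (I - 1.0))
        prc.append((t_rest - t_rest_kicked) / T0)   # normalized advance
    return prc
```

Every entry is positive and the curve rises monotonically toward the end of the cycle, the signature of type I dynamics; a type II (biphasic) PRC requires subthreshold resonance that this integrator lacks.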
For compute-bound simulations, load balance results in almost ideal runtime scaling. Application of the cell splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing. Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts. It has been recently reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. Ca(2+)-activated K(+) currents (I(K[Ca])) of cultured human cardiac fibroblasts were characterized in this study. In whole-cell configuration, depolarizing pulses evoked outwardly rectifying I(K[Ca]) in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance, Ca(2+)-activated K(+) (BK(Ca)) channel with a single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BK(Ca) channels. The dynamic Luo-Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; however, they were electrically inert with no gap-junctional coupling. The simulation predicts that changes in gap junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. I(K[Ca]) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes. This study demonstrates that a BK(Ca) channel is functionally expressed in human cardiac fibroblasts. 
The activity of these BK(Ca) channels present in human cardiac fibroblasts may contribute to the functional activities of heart cells through transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of its solutions and the transitions between them as parameters are varied. To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multi-compartment models. We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation—tonic spiking, intermediate bursting and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multi-compartment conductance-based model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli. 
Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement. Results We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. Conclusion Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement, and underlies the concentration-independence of odor quality representations. 
Abstract Mathematical neuronal models are normally expressed using differential equations. The Parker-Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker-Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin-Huxley type neuron, comparing the results with those obtained using the Runge-Kutta and Bulirsch-Stoer methods. Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC-granule cell (GC) circuit with and without gap junctions between GoCs. Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change oscillation frequency or the mean firing rate of either GoCs or GCs.
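The Parker-Sochacki scheme described in the integration abstract above builds a truncated power series for the solution at every step, handling polynomial nonlinearities through Cauchy products of the coefficient sequences. A minimal sketch for the polynomial ODE y' = -y^2 (the function names and the choice of test equation are illustrative, not taken from the paper):

```python
def cauchy_product(a, b, n):
    """n-th coefficient of the product of two power series a and b."""
    return sum(a[k] * b[n - k] for k in range(n + 1))

def ps_step(y0, dt, order=12):
    """One Parker-Sochacki step for y' = -y^2, y(t0) = y0.
    Builds Maclaurin coefficients about t0, then evaluates at dt."""
    y = [y0]
    for n in range(order):
        # coefficient recurrence: (n+1) y_{n+1} = -(y * y)_n
        y.append(-cauchy_product(y, y, n) / (n + 1))
    # evaluate the truncated series at dt via Horner's rule
    s = 0.0
    for c in reversed(y):
        s = s * dt + c
    return s

# integrate over [0, 1]; the exact solution is y(t) = 1 / (1 + t)
y, dt = 1.0, 0.1
for _ in range(10):
    y = ps_step(y, dt)
# y is now very close to the exact value 0.5
```

Raising the `order` argument tightens the local error without shrinking `dt`, which is the adaptive-order property the abstract highlights.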
Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs. The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours or days of supercomputing time. Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. As a consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics, and intrinsic noise. We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data.
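One reason indirect current matching of the kind described in the fitting abstract above can be fast is that the membrane equation is linear in the maximal conductances, so their estimation reduces to linear least squares once the voltage and gating time courses are known. A toy sketch with one leak and one gated current (all parameter values are illustrative, not the study's):

```python
import numpy as np

# synthetic "recording": a leaky membrane with one gated K+ current,
# simulated with known (true) maximal conductances
C, E_L, E_K = 1.0, -70.0, -90.0
gL_true, gK_true = 0.1, 0.4
dt, T = 0.01, 1000
t_axis = np.arange(T) * dt
I_inj = 2.0 + np.sin(0.5 * t_axis)      # known injected current
n = 1.0 / (1.0 + np.exp(-t_axis))       # gating time course, assumed known
V = np.empty(T)
V[0] = -65.0
for t in range(T - 1):
    dV = (-gL_true * (V[t] - E_L) - gK_true * n[t] * (V[t] - E_K) + I_inj[t]) / C
    V[t + 1] = V[t] + dt * dV

# current matching: C dV/dt - I_inj = -gL (V - E_L) - gK n (V - E_K),
# which is linear in the unknown conductances gL and gK
y = C * np.gradient(V, dt) - I_inj
A = np.column_stack([-(V - E_L), -n * (V - E_K)])
gL_est, gK_est = np.linalg.lstsq(A, y, rcond=None)[0]
```

The regression recovers the true conductances up to discretization error, with no iterative voltage-trace optimization at all.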
Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes from development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models. Recent studies show that both the conductance and the kinetics of these receptors change voltage-dependently in some parts of the brain. Therefore, several models have been introduced to simulate their current. However, on the one hand, kinetic models, which are able to simulate these voltage-dependent phenomena, are computationally expensive for the modeling of large neural networks. On the other hand, classic exponential models, which are computationally less expensive, are not able to simulate the voltage dependency of these receptors accurately. In this study, we have modified these classic models to endow them with voltage-dependent conductance and time constants. Temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of the NMDA receptor’s behavior using only three to four differential equations, significantly fewer than in previous kinetic models. Consequently, our model is both fast and physiologically plausible, and is therefore a suitable candidate for the modeling of large neural networks. Abstract Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons.
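A standard way to give a classic exponential synapse model the voltage-dependent conductance discussed in the NMDA abstract above is to scale it by the Jahr-Stevens Mg2+ block factor. The time constants and peak conductance below are illustrative placeholders, not the paper's fitted values:

```python
import math

def nmda_current(t, V, g_max=1.0, E=0.0, mg=1.0,
                 tau_rise=2.0, tau_decay=100.0):
    """Bi-exponential NMDA conductance scaled by the Jahr-Stevens
    Mg2+ block factor. t in ms, V in mV; parameters are illustrative."""
    # voltage-dependent relief of the Mg2+ block
    B = 1.0 / (1.0 + (mg / 3.57) * math.exp(-0.062 * V))
    # classic difference-of-exponentials conductance time course
    g = g_max * (math.exp(-t / tau_decay) - math.exp(-t / tau_rise))
    return g * B * (V - E)

# near rest the block suppresses the current; depolarization relieves it,
# so the inward current at -40 mV exceeds the one at -70 mV
i_rest = nmda_current(10.0, -70.0)
i_depol = nmda_current(10.0, -40.0)
```

Because the block factor is a closed-form function of V, this adds voltage dependence without any extra differential equations, in the spirit of the low-cost models the abstract argues for.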
This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal cell firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that there is a clear rhythmic gating effect of the γ-oscillating interneuron network on the pyramidal neuron’s signal transmission. This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and appropriate inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of coherence and oscillation frequency of network activities. Abstract Background Propagation of simulated action potentials (APs) was previously studied in short single chains and in two-dimensional sheets of myocardial cells [1-3]. The present study was undertaken to examine propagation in long single chains of cells of various lengths, and with varying numbers of gap-junction (gj) channels, and to compare propagation velocity with cable properties such as the length constant (λ). Methods and Results Simulations were carried out using the PSpice program as previously described. When the electric field (EF) mechanism was dominant (0, 1, and 10 gj channels), the longer the chain length, the faster the overall velocity (θ_ov). There seems to be no simple explanation for this phenomenon. In contrast, when the local-circuit current mechanism was dominant (100 gj channels or more), θ_ov was slightly slowed with lengthening of the chain. Increasing the number of gj channels produced an increase in θ_ov and caused the firing order to become more uniform.
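The length constant λ mentioned above can be illustrated with a steady-state calculation on a chain of leaky cells coupled by gap-junction conductances: current injected into one cell decays roughly exponentially along the chain, and λ grows with the coupling strength. All conductance values here are illustrative, not taken from the PSpice model:

```python
import numpy as np

def voltage_profile(n_cells=101, g_gj=50.0, g_m=1.0, inj=1.0):
    """Steady-state voltages in a chain of cells with membrane (leak)
    conductance g_m, coupled by gap-junction conductance g_gj, with
    current injected into the middle cell. Solves G V = I."""
    G = np.zeros((n_cells, n_cells))
    for i in range(n_cells):
        G[i, i] = g_m
        if i > 0:
            G[i, i] += g_gj
            G[i, i - 1] = -g_gj
        if i < n_cells - 1:
            G[i, i] += g_gj
            G[i, i + 1] = -g_gj
    I = np.zeros(n_cells)
    I[n_cells // 2] = inj
    return np.linalg.solve(G, I)

V = voltage_profile()
center = len(V) // 2
# fit log V against distance from the injected cell to estimate lambda
slope = np.polyfit(np.arange(20), np.log(V[center:center + 20]), 1)[0]
lam = -1.0 / slope          # in units of cell lengths; ~sqrt(g_gj / g_m)
```

Rerunning with a larger `g_gj` yields a larger `lam`, matching the abstract's observation that λ increases with the number of gj channels.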
The end effect was more pronounced at longer chain lengths and at greater numbers of gj channels. When there were no or only few gj channels (namely 0, 10, or 30), the voltage change (ΔV_m) in the two cells (#50 and #52) contiguous to the cell injected with current (#51) was nearly zero, i.e., there was a sharp discontinuity in voltage between the adjacent cells. When there were many gj channels (e.g., 300, 1000, 3000), there was an exponential decay of voltage on either side of the injected cell, with the length constant (λ) increasing at higher numbers of gj channels. The effect of increasing the number of gj channels on increasing λ was relatively small compared to the larger effect on θ_ov. θ_ov became very non-physiological at 300 gj channels or higher. Conclusion Thus, when there were only 0, 1, or 10 gj channels, θ_ov increased with increase in chain length, whereas at 100 gj channels or higher, θ_ov did not increase with chain length. When there were only 0, 10, or 30 gj channels, there was a very sharp decrease in ΔV_m in the two contiguous cells on either side of the injected cell, whereas at 300, 1000, or 3000 gj channels, the voltage decay was exponential along the length of the chain. The effect of increasing the number of gj channels on the spread of current was relatively small compared to the large effect on θ_ov. Abstract This article provides a demonstration of an analytical technique that can be used to investigate the causes of perceptual phenomena. The technique is based on the concept of the ideal observer, an optimal signal classifier that makes decisions that maximize the probability of a correct response. To demonstrate the technique, an analysis was conducted to investigate the role of the auditory periphery in the production of temporal masking effects. The ideal observer classified output from four models of the periphery.
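For a two-alternative task with known class distributions, the ideal observer just introduced reduces to a likelihood-ratio classifier, and its proportion correct is the theoretical ceiling for any other observer. A toy sketch with univariate Gaussian classes (all parameters illustrative):

```python
import random

def ideal_observer(x, mu0, mu1, sigma):
    """Return the class with the higher likelihood (equal priors)."""
    ll0 = -((x - mu0) ** 2) / (2 * sigma ** 2)
    ll1 = -((x - mu1) ** 2) / (2 * sigma ** 2)
    return 0 if ll0 > ll1 else 1

rng = random.Random(0)
mu0, mu1, sigma = 0.0, 2.0, 1.0   # separation d' = 2
trials = 40000
correct = 0
for _ in range(trials):
    label = rng.randrange(2)
    x = rng.gauss(mu1 if label else mu0, sigma)
    correct += ideal_observer(x, mu0, mu1, sigma) == label
pc = correct / trials
# for two equal-variance Gaussians, pc approaches Phi(d'/2) = Phi(1) ~ 0.841
```

If even this optimal decision rule exhibits a masking effect on a periphery model's output, the effect must originate in the periphery, which is the logic the abstract's analysis exploits.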
Since the ideal observer is the best of all possible observers, if it demonstrates masking effects, then all other observers must as well. If it does not demonstrate masking effects, then nothing about the periphery requires masking to occur, and therefore masking must arise somewhere else. The ideal observer exhibited several forward masking effects but did not exhibit backward masking, implying that the periphery has a causal role in forward but not backward masking. A general discussion of the strengths of the technique and supplementary equations are also included. Abstract Understanding the human brain and its function in health and disease represents one of the greatest scientific challenges of our time. In the post-genomic era, an overwhelming accumulation of new data, at all levels of exploration from DNA to human brain imaging, has been acquired. This accumulation of facts has not given rise to a corresponding increase in the understanding of integrated functions in this vast area of research, which involves a large number of fields extending from genetics to psychology. Neuroinformatics (NI) is uniquely placed at the intersection between neuroscience and information technology, and emerges as an area of critical importance for facilitating future conceptual development in neuroscience by creating databases which transcend different organizational levels and allow for the development of computational models from the subcellular to the global brain level. Abstract This paper studied synaptic and dendritic integration with different spatial distributions of synapses on the dendrites of a biophysically detailed layer 5 pyramidal neuron model. It has been observed that temporally synchronous and spatially clustered synaptic inputs make dendrites perform a highly nonlinear integration.
The effect of the clustering degree of the synaptic distribution on neuronal responsiveness is investigated by changing the number of top apical dendrites to which active synapses are allocated. The neuron shows maximum responsiveness to synaptic inputs with an intermediate clustering degree of spatial distribution, indicating complex interactions among dendrites in the presence of nonlinear synaptic and dendritic integration. Abstract This paper describes a pilot query interface that has been constructed to help us explore a “concept-based” approach for searching the Neuroscience Information Framework (NIF). The query interface is concept-based in the sense that the search terms submitted through the interface are selected from a standardized vocabulary of terms (concepts) that are structured in the form of an ontology. The NIF contains three primary resources: the NIF Resource Registry, the NIF Document Archive, and the NIF Database Mediator. These NIF resources are very different in their nature and therefore pose challenges when designing a single interface from which searches can be automatically launched against all three resources simultaneously. The paper first discusses briefly several background issues involving the use of standardized biomedical vocabularies in biomedical information retrieval, and then presents a detailed example that illustrates how the pilot concept-based query interface operates. The paper concludes by discussing certain lessons learned in the development of the current version of the interface. Abstract Simulations of orientation selectivity in visual cortex have shown that layer 4 complex cells lacking orientation tuning are ideal for providing global inhibition that scales with contrast in order to produce simple cells with contrast-invariant orientation tuning (Lauritzen and Miller in J Neurosci 23:10201–10213, 2003).
Inhibitory cortical cells have been shown to be electrically coupled by gap junctions (Fukuda and Kosaka in J Neurosci 120:5–20, 2003). Such coupling promotes, among other effects, spike synchronization and coordination of postsynaptic IPSPs (Beierlein et al. in Nat Neurosci 3:904–910, 2000; Galarreta and Hestrin in Nat Rev Neurosci 2:425–433, 2001). Consequently, it was expected (Miller in Cereb Cortex 13:73–82, 2003) that electrical coupling would promote nonspecific functional responses consistent with the complex inhibitory cells seen in layer 4, which provide broad inhibition in response to stimuli of all orientations (Miller et al. in Curr Opin Neurobiol 11:488–497, 2001). This was tested using a mechanistic modeling approach. The orientation selectivity model of Lauritzen and Miller (J Neurosci 23:10201–10213, 2003) was reproduced with and without electrical coupling between complex inhibitory neurons. Although extensive coupling promotes uniform firing in complex cells, there were no detectable improvements in contrast-invariant orientation selectivity unless there were coincident changes in complex cell firing rates to offset the untuned excitatory component that grows with contrast. Thus, changes in complex cell firing rates (with or without coupling) could improve the contrast-invariant orientation tuning of simple cells, whereas synchronization of the complex inhibitory neurons alone could not. Abstract Coral polyps contract when electrically stimulated, and a wave of contraction travels from the site of stimulation at a constant speed. Models of coral nerve networks were optimized to match one of three different experimentally observed behaviors. To search for model parameters that reproduce the experimental observations, we applied genetic algorithms to increasingly more complex models of a coral nerve net. In a first stage of optimization, individual neurons responded with spikes to multiple, but not single, pulses of activation.
In a second stage, we used these neurons as the starting point for the optimization of a two-dimensional nerve net. This strategy yielded a network with parameters that reproduced the experimentally observed spread of excitation. Abstract Spike-wave discharges are a distinctive feature of epileptic seizures. So far, they have not been reported in spatially extended neural field models. We study a space-independent version of the Amari neural field model with two competing inhibitory populations. We show that this competition leads to robust spike-wave dynamics if the inhibitory populations operate on different timescales. The spike-wave oscillations present a fold/homoclinic-type bursting. From this result we predict parameters of the extended Amari system where spike-wave oscillations produce a spatially homogeneous pattern. We propose this mechanism as a prototype of macroscopic epileptic spike-wave discharges. To our knowledge this is the first example of robust spike-wave patterns in a spatially extended neural field model. Abstract Cortical gamma frequency (30–80 Hz) oscillations have been suggested to underlie many aspects of cognitive function. In this paper we compare the f-I curves modulated by gamma-frequency-modulated stimuli and Poisson synaptic input at the distal dendrites of a layer V pyramidal neuron model. The results show that gamma-frequency distal input amplifies the sensitivity of the neural response to basal input and enhances gain modulation of the neuron. Abstract Inward rectifying potassium (K_IR) currents in medium spiny (MS) neurons of the nucleus accumbens inactivate significantly in ~40% of the neurons but not in the rest, which may lead to differences in input processing by these two groups. Using a 189-compartment computational model of the MS neuron, we investigate the influence of this property using injected current as well as spatiotemporally distributed synaptic inputs.
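The staged genetic-algorithm parameter search used for the coral nerve-net models above can be sketched with a minimal real-valued GA: tournament-free elitist selection, blend crossover, and Gaussian mutation. The operators and the toy objective below are illustrative stand-ins, not the published optimization setup:

```python
import random

def evolve(fitness, bounds, pop_size=40, gens=60,
           mut_rate=0.2, mut_sd=0.1, seed=1):
    """Minimal elitist GA minimizing `fitness` over box `bounds`
    (a list of (lo, hi) pairs, one per parameter)."""
    rng = random.Random(seed)
    clip = lambda x, lo, hi: max(lo, min(hi, x))
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        elite = sorted(pop, key=fitness)[:pop_size // 2]   # keep best half
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]    # blend crossover
            for i, (lo, hi) in enumerate(bounds):          # Gaussian mutation
                if rng.random() < mut_rate:
                    child[i] = clip(child[i] + rng.gauss(0, mut_sd * (hi - lo)),
                                    lo, hi)
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

# toy objective standing in for "distance from observed conduction behavior"
best = evolve(lambda p: (p[0] - 0.3) ** 2 + (p[1] - 0.7) ** 2,
              [(0.0, 1.0), (0.0, 1.0)])
```

In the staged strategy of the abstract, the fitness function would first score single-neuron spike responses and then, seeded with those solutions, score the spread of excitation across the net.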
Our study demonstrates that K_IR current inactivation facilitates depolarization, firing frequency, and firing onset in these neurons. These effects may be attributed to the higher input resistance of the cell as well as the more depolarized resting/downstate potential induced by the inactivation of this current. In view of reports that dendritic intracellular calcium levels depend closely on burst strength and spike onset time, our findings suggest that inactivation of K_IR currents may offer a means of modulating both excitability and synaptic plasticity in MS neurons. Abstract Epileptic seizures in diabetic hyperglycemia (DH) are not uncommon. This study aimed to determine the acute behavioral, pathological, and electrophysiological effects of status epilepticus (SE) on diabetic animals. Adult male Sprague-Dawley rats were first divided into groups with and without streptozotocin (STZ)-induced diabetes, and then into treatment groups given normal saline (NS) (STZ-only and NS-only) or a lithium-pilocarpine injection to induce status epilepticus (STZ + SE and NS + SE). Seizure susceptibility, severity, and mortality were evaluated. Serial Morris water maze tests and hippocampal histopathology were examined before and 24 h after SE. Tetanic-stimulation-induced long-term potentiation (LTP) in hippocampal slices was recorded in a multi-electrode dish system. We also used a simulation model to evaluate intracellular adenosine triphosphate (ATP) and neuroexcitability. The STZ + SE group had a significantly higher percentage of severe seizures and SE-related death, and worse learning and memory performance, than the other three groups 24 h after SE. The STZ + SE group, followed by the NS + SE group, showed the most severe neuronal loss and mossy fiber sprouting in the hippocampal CA3 area. In addition, LTP was markedly attenuated in the STZ + SE group, followed by the NS + SE group.
In the simulation, increased intracellular ATP concentration promoted action potential firing. The finding that rats with DH had more brain damage after SE than rats without diabetes suggests the importance of intensively treating hyperglycemia and seizures in diabetic patients with epilepsy. Neuroinformatics is a multifaceted field. It is as broad as the field of neuroscience. The various domains of NI share common features such as databases, data mining systems, and data modeling tools. NI projects are often coordinated by user groups or research organizations. Large-scale infrastructure supporting NI development is also a vital aspect of the field. Abstract Channelrhodopsin-2 (ChR2) is a light-sensitive protein that offers the ability to use light stimulation to regulate neural activity with millisecond precision. In order to address the limitations in the efficacy of wild-type ChR2 (ChRwt) in achieving this objective, new variants of ChR2 that exhibit fast monoexponential photocurrent decay characteristics have recently been developed and validated. In this paper, we investigate whether the framework of a transition rate model with 4 states, primarily developed to mimic the biexponential photocurrent decay kinetics of ChRwt, as opposed to the lower-complexity 3-state model, is warranted to mimic the monoexponential photocurrent decay kinetics of the newly developed fast ChR2 variants: ChETA (Gunaydin et al., Nature Neurosci. 13:387–392, 2010) and ChRET/TC (Berndt et al., Proc. Natl. Acad. Sci. 108:7595–7600, 2011). We begin by estimating the parameters of the 3-state and 4-state models from experimental data on the photocurrent kinetics of ChRwt, ChETA, and ChRET/TC. We then incorporate these models into a fast-spiking interneuron model (Wang and Buzsaki, J. Neurosci. 16:6402–6413, 1996) and a hippocampal pyramidal cell model (Golomb et al., J. Neurophysiol.
96:1912–1926, 2006) and investigate the extent to which the experimentally observed neural response to various optostimulation protocols can be captured by these models. We demonstrate that for all ChR2 variants investigated, the 4-state model implementation is better able to capture neural responses consistent with experiments across a wide range of optostimulation protocols. We conclude by analytically investigating the conditions under which the characteristic specific to the 3-state model, namely the monoexponential photocurrent decay of the newly developed variants of ChR2, can occur in the framework of the 4-state model. Abstract In cerebellar Purkinje cells, the β4 subunit of voltage-dependent Na+ channels has been proposed to serve as an open-channel blocker giving rise to a “resurgent” Na+ current (I_NaR) upon membrane repolarization. Notably, the β4 subunit was recently identified as a novel substrate of the β-secretase BACE1, a key enzyme of the amyloidogenic pathway in Alzheimer's disease. Here, we asked whether BACE1-mediated cleavage of the β4 subunit has an impact on I_NaR and, consequently, on the firing properties of Purkinje cells. In cerebellar tissue of BACE1−/− mice, mRNA levels of Na+ channel α-subunits 1.1, 1.2, and 1.6 and of β-subunits 1–4 remained unchanged, but processing of the β4 peptide was profoundly altered. Patch-clamp recordings from acutely isolated Purkinje cells of BACE1−/− and WT mice did not reveal any differences in the steady-state properties and current densities of transient, persistent, and resurgent Na+ currents. However, I_NaR was found to decay significantly faster in BACE1-deficient Purkinje cells than in WT cells. In modeling studies, the altered time course of I_NaR decay could be replicated when we decreased the efficiency of open-channel block. In current-clamp recordings, BACE1−/− Purkinje cells displayed a lower spontaneous firing rate than normal cells.
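The low-complexity 3-state ChR2 framework discussed above (closed, open, desensitized) produces exactly the monoexponential photocurrent decay at issue: once the light is off, the open state relaxes at a single rate. A minimal Euler-integrated sketch with illustrative rate constants (not the fitted values from the paper):

```python
import numpy as np

def chr2_3state(t_end=300.0, dt=0.01, light_off=100.0,
                k_a=0.5, k_d=0.05, k_r=0.004):
    """3-state ChR2 kinetics: Closed -> Open (light-driven),
    Open -> Desensitized, Desensitized -> Closed.
    Rates in 1/ms, times in ms; all values illustrative."""
    C, O, D = 1.0, 0.0, 0.0
    ts = np.arange(0.0, t_end, dt)
    open_frac = np.empty_like(ts)
    for i, t in enumerate(ts):
        light = 1.0 if t < light_off else 0.0
        dC = k_r * D - k_a * light * C
        dO = k_a * light * C - k_d * O
        dD = k_d * O - k_r * D
        C += dt * dC
        O += dt * dO
        D += dt * dD
        open_frac[i] = O   # photocurrent is proportional to O
    return ts, open_frac

ts, open_frac = chr2_3state()
# after light offset, O(t) decays as exp(-k_d * t): strictly monoexponential
```

Capturing the biexponential decay of ChRwt requires splitting the open state in two, which is what the 4-state model the paper analyzes does.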
Computer simulations supported the hypothesis that the accelerated decay kinetics of I_NaR are responsible for the slower firing rate. Our study elucidates a novel function of BACE1 in the regulation of neuronal excitability that serves to tune the firing pattern of Purkinje cells and presumably other neurons endowed with I_NaR. Abstract The role of cortical feedback in the thalamocortical processing loop has been extensively investigated over the last decades. With the exception of several cases, these studies focused on the cortical feedback exerted onto thalamocortical relay (TC) cells of the dorsal lateral geniculate nucleus (LGN). In a previous physiological study, we showed in the cat visual system that cessation of cortical input, despite decreasing the spontaneous activity of TC cells, increased the spontaneous firing of their recurrent inhibitory interneurons located in the perigeniculate nucleus (PGN). To identify the mechanisms underlying such functional changes we conducted a modeling study in NEURON on several networks of point neurons with varied model parameters, such as membrane properties, synaptic weights, and axonal delays. We considered six network topologies of the retino-geniculo-cortical system. All models were robust against changes of axonal delays except for the delay between the LGN feedforward interneuron and the TC cell. The best representation of physiological results was obtained with models containing reciprocally connected PGN cells driven by the cortex and with relatively slow decay of intracellular calcium. This strongly indicates that the thalamic reticular nucleus plays an essential role in the cortical influence over thalamocortical relay cells, while the thalamic feedforward interneurons are not essential in this process. Further, we suggest that the dependence of the activity of PGN cells on the rate of calcium removal can be one of the key factors determining individual cell responses to elimination of cortical input.
Abstract The nucleus accumbens (NAc), a critical structure of the brain reward circuit, is implicated in normal goal-directed behaviour and learning as well as in pathological conditions like schizophrenia and addiction. Its major cellular substrates, the medium spiny (MS) neurons, possess a wide variety of dendritic active conductances that may modulate excitatory postsynaptic potentials (EPSPs) and cell excitability. We examine this issue using a biophysically detailed 189-compartment stylized model of the NAc MS neuron, incorporating all the known active conductances. We find that, of all the active channels, inward rectifying K+ (K_IR) channels play the primary role in modulating the resting membrane potential (RMP) and EPSPs in the downstate of the neuron. Reduction in the conductance of K_IR channels evokes facilitatory effects on EPSPs, accompanied by rises in local input resistance and membrane time constant. At depolarized membrane potentials closer to upstate levels, the slowly inactivating A-type potassium (K_As) conductance also plays a strong role in determining synaptic potential parameters and cell excitability. We discuss the implications of our results for the regulation of accumbal MS neuron biophysics and synaptic integration by intrinsic factors and extrinsic agents such as dopamine. Abstract The computer-assisted three-dimensional reconstruction of neuronal morphology is becoming an increasingly popular technique for quantifying the arborization patterns of dendrites and axons. The resulting digital files are suitable for comprehensive morphometric analyses as well as for building anatomically realistic compartmental models of membrane biophysics and neuronal electrophysiology. The digital tracings acquired in a lab for a specific purpose can often be reused by a different research group to address a completely unrelated scientific question, if the original investigators are willing to share the data.
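The central mechanism in the MS-neuron abstract above, that reducing K_IR conductance raises input resistance and thereby facilitates EPSPs, follows directly from the channel's voltage dependence: the conductance is large at hyperpolarized (downstate) potentials and shuts off with depolarization. A minimal sketch with an illustrative Boltzmann activation curve (parameters are placeholders, not the model's fitted values):

```python
import math

def kir_conductance(V, g_max=1.0, v_half=-82.0, k=13.0):
    """Inward-rectifier K+ conductance: a Boltzmann function of voltage
    that is near-maximal at hyperpolarized V and closes on depolarization.
    Parameters are illustrative placeholders."""
    return g_max / (1.0 + math.exp((V - v_half) / k))

def input_resistance(V, g_leak=0.05):
    """Steady-state input resistance of a point membrane with a leak
    plus the KIR conductance; less KIR means higher resistance."""
    return 1.0 / (g_leak + kir_conductance(V))
```

Evaluating the two functions at downstate (~-90 mV) and near-upstate (~-60 mV) potentials reproduces the qualitative effect: the same synaptic current produces a larger voltage deflection wherever K_IR has closed.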
Since reconstructing neuronal morphology is a labor-intensive process, data sharing and reanalysis are particularly advantageous for the neuroscience and biomedical communities. Here we present numerous cases of “success stories” in which digital reconstructions of neuronal morphology were shared and reused, leading to additional, independent discoveries and publications, and thus amplifying the impact of the “source” study for which the data set was first collected. In particular, we overview four main applications of this kind of data: comparative morphometric analyses, statistical estimation of potential synaptic connectivity, morphologically accurate electrophysiological simulations, and computational models of neuronal shape and development. Abstract The chapter describes a novel computational approach to modeling cortex dynamics that integrates gene–protein regulatory networks with a neural network model. Interaction of genes and proteins in neurons affects the dynamics of the whole neural network. We have adopted an exploratory approach of investigating many randomly generated gene regulatory matrices, out of which we kept those that generated interesting dynamics. This naïve brute-force approach allowed us to explore the potential application of computational neurogenetic models in relation to gene knockout neurogenetics experiments. The knockout of a hypothetical gene for fast inhibition in our artificial genome led to interesting neural activity. In spite of the fact that the artificial gene/protein network had been altered by the knockout of one gene, the dynamics of the SNN in terms of spiking activity was most of the time very similar to the result obtained with the complete gene/protein network. However, from time to time the neurons spontaneously temporarily synchronized their spiking into coherent global oscillations.
In our model, the fluctuations in the values of neuronal parameters lead to the spontaneous development of seizure-like global synchronizations. These very same fluctuations also lead to termination of the seizure-like neural activity and maintenance of the interictal periods of normal activity. Based on our model, we would like to suggest the hypothesis that parameter changes due to gene–protein dynamics should also be included as a serious factor determining transitions in neural dynamics, especially when the cause of disease is known to be genetic. Abstract The local field potential (LFP) is among the most important experimental measures when probing neural population activity, but a proper understanding of the link between the underlying neural activity and the LFP signal is still missing. Here we investigate this link by mathematical modeling of contributions to the LFP from a single layer-5 pyramidal neuron and a single layer-4 stellate neuron receiving synaptic input. An intrinsic dendritic low-pass filtering effect of the LFP signal, previously demonstrated for extracellular signatures of action potentials, is seen to strongly affect the LFP power spectra, even for frequencies as low as 10 Hz for the example pyramidal neuron. Further, the LFP signal is found to depend sensitively on both the recording position and the position of the synaptic input: the LFP power spectra recorded close to the active synapse are typically found to be less low-pass filtered than spectra recorded farther away. Some recording positions display striking band-pass characteristics of the LFP. The frequency dependence of the properties of the current dipole moment set up by the synaptic input current is found to qualitatively account for several salient features of the observed LFP.
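The current dipole moment just mentioned underlies the standard far-field shortcut for such calculations: far from the neuron, the potential from a source-sink pair approaches the point-dipole expression φ = p·r̂ / (4πσr²). A minimal sketch comparing the exact two-monopole potential with the dipole approximation (geometry and conductivity values are illustrative):

```python
import numpy as np

def two_monopole(r_obs, r_src, r_snk, I, sigma=0.3):
    """Exact extracellular potential from a current source/sink pair
    in a homogeneous volume conductor of conductivity sigma (S/m)."""
    d_src = np.linalg.norm(r_obs - r_src)
    d_snk = np.linalg.norm(r_obs - r_snk)
    return I / (4 * np.pi * sigma) * (1.0 / d_src - 1.0 / d_snk)

def dipole(r_obs, r_mid, p, sigma=0.3):
    """Far-field dipole approximation: phi = p . r_hat / (4 pi sigma r^2)."""
    r = r_obs - r_mid
    dist = np.linalg.norm(r)
    return np.dot(p, r) / (4 * np.pi * sigma * dist ** 3)

# source/sink separated by 0.2 mm along z, observed 5 mm away on the axis
src = np.array([0.0, 0.0, 0.1])
snk = np.array([0.0, 0.0, -0.1])
I = 1.0
p = I * (src - snk)                 # dipole moment points sink -> source
obs = np.array([0.0, 0.0, 5.0])
phi_exact = two_monopole(obs, src, snk, I)
phi_dip = dipole(obs, (src + snk) / 2, p)
```

At observation distances much larger than the source-sink separation the two agree to well under a percent, which is why the dipole form is attractive for mapping network models onto EEG/ECoG predictions.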
Two approximate schemes for calculating the LFP, the dipole approximation and the two-monopole approximation, are tested and found to be potentially useful for translating results from large-scale neural network models into predictions for results from electroencephalographic (EEG) or electrocorticographic (ECoG) recordings. Abstract Dopaminergic (DA) neurons of the mammalian midbrain exhibit unusually low firing frequencies in vitro. Furthermore, injection of depolarizing current induces depolarization block before high frequencies are achieved. The maximum steady and transient rates are about 10 and 20 Hz, respectively, despite the ability of these neurons to generate bursts at higher frequencies in vivo. We use a three-compartment model calibrated to reproduce DA neuron responses to several pharmacological manipulations to uncover mechanisms of frequency limitation. The model exhibits a slow oscillatory potential (SOP) dependent on the interplay between the L-type Ca2+ current and the small-conductance K+ (SK) current that is unmasked by fast Na+ current block. Contrary to previous theoretical work, the SOP does not pace the steady spiking frequency in our model. The main currents that determine the spontaneous firing frequency are the subthreshold L-type Ca2+ and A-type K+ currents. The model identifies the channel densities of the fast Na+ and delayed rectifier K+ currents as critical parameters limiting the maximal steady frequency evoked by a depolarizing pulse. We hypothesize that the low maximal steady frequencies result from a low safety factor for action potential generation. In the model, the rate of Ca2+ accumulation in the distal dendrites controls the transient initial frequency in response to a depolarizing pulse.
Similar results are obtained when the same model parameters are used in a multicompartmental model with a realistic reconstructed morphology, indicating that the salient contributions of the dendritic architecture have been captured by the simpler model. Abstract Background As interest in adopting the Semantic Web in the biomedical domain continues to grow, Semantic Web technology has been evolving and maturing. A variety of technological approaches including triplestore technologies, SPARQL endpoints, Linked Data, and Vocabulary of Interlinked Datasets have emerged in recent years. In addition to the data warehouse construction, these technological approaches can be used to support dynamic query federation. As a community effort, the BioRDF task force, within the Semantic Web for Health Care and Life Sciences Interest Group, is exploring how these emerging approaches can be utilized to execute distributed queries across different neuroscience data sources. Methods and results We have created two health care and life science knowledge bases. We have explored a variety of Semantic Web approaches to describe, map, and dynamically query multiple datasets. We have demonstrated several federation approaches that integrate diverse types of information about neurons and receptors that play an important role in basic, clinical, and translational neuroscience research. Particularly, we have created a prototype receptor explorer which uses OWL mappings to provide an integrated list of receptors and executes individual queries against different SPARQL endpoints. We have also employed the AIDA Toolkit, which is directed at groups of knowledge workers who cooperatively search, annotate, interpret, and enrich large collections of heterogeneous documents from diverse locations. We have explored a tool called "FeDeRate", which enables a global SPARQL query to be decomposed into subqueries against the remote databases offering either SPARQL or SQL query interfaces. 
Finally, we have explored how to use the Vocabulary of Interlinked Datasets (voiD) to create metadata for describing datasets exposed as Linked Data URIs or SPARQL endpoints. Conclusion We have demonstrated the use of a set of novel and state-of-the-art Semantic Web technologies in support of a neuroscience query federation scenario. We have identified both the strengths and weaknesses of these technologies. While the Semantic Web offers a global data model including the use of Uniform Resource Identifiers (URIs), the proliferation of semantically equivalent URIs hinders large-scale data integration. Our work helps direct research and tool development, which will be of benefit to this community. Abstract Injury to neural tissue renders voltage-gated Na+ (Nav) channels leaky. Even mild axonal trauma initiates Na+ loading, leading to secondary Ca2+ loading and white matter degeneration. The nodal isoform is Nav1.6, and for Nav1.6-expressing HEK cells, traumatic whole-cell stretch causes an immediate tetrodotoxin-sensitive Na+ leak. In stretch-damaged oocyte patches, Nav1.6 current undergoes damage-intensity-dependent hyperpolarizing (left) shifts, but whether left-shift underlies injured-axon Nav leak is uncertain. Nav1.6 inactivation (availability) is kinetically limited by (coupled to) Nav activation, yielding coupled left-shift (CLS) of the two processes: CLS should move the steady-state Nav1.6 “window conductance” closer to typical firing thresholds. Here we simulated excitability and ion homeostasis in free-running nodes of Ranvier to assess whether hallmark injured-axon behaviors—Na+ loading, ectopic excitation, propagation block—would occur with Nav-CLS. Intact/traumatized axolemma ratios were varied, and for some simulations Na/K pumps were included, with varied inside/outside volumes. We simulated saltatory propagation with one mid-axon node variously traumatized. 
While dissipating the [Na+] gradient and hyperactivating the Na/K pump, Nav-CLS generated neuropathic pain-like ectopic bursts. Depending on CLS magnitude, the fraction of Nav channels affected, and pump intensity, tonic or burst firing or nodal inexcitability occurred, with [Na+] and [K+] fluctuating. Severe CLS-induced inexcitability did not preclude Na+ loading; in fact, the steady-state Na+ leaks elicited large pump currents. At a mid-axon node, mild CLS perturbed normal anterograde propagation, and severe CLS blocked saltatory propagation. These results suggest that in damaged excitable cells, Nav-CLS could initiate cellular deterioration with attendant hyper- or hypoexcitability. Healthy-cell versions of Nav-CLS, however, could contribute to physiological rhythmic firing. Abstract Lateral inhibition of cells surrounding an excited area is a key property of sensory systems, sharpening the preferential tuning of individual cells in the presence of closely related input signals. In the olfactory pathway, a dendrodendritic synaptic microcircuit between mitral and granule cells in the olfactory bulb has been proposed to mediate this type of interaction through granule cell inhibition of surrounding mitral cells. However, it is becoming evident that odor inputs result in broad activation of the olfactory bulb with interactions that go beyond neighboring cells. Using a realistic modeling approach we show how backpropagating action potentials in the long lateral dendrites of mitral cells, together with granule cell actions on mitral cells within narrow columns forming glomerular units, can provide a mechanism to activate strong local inhibition between arbitrarily distant mitral cells. The simulations predict a new role for the dendrodendritic synapses in the multicolumnar organization of the granule cells. This new paradigm gives insight into the functional significance of the patterns of connectivity revealed by recent viral tracing studies. 
Together they suggest a functional wiring of the olfactory bulb that could greatly expand the computational roles of the mitral–granule cell network. Abstract Spinal motor neurons have voltage-gated ion channels localized in their dendrites that generate plateau potentials. The physical separation of ion channels for spiking from plateau-generating channels can result in nonlinear bistable firing patterns. The physical separation and geometry of the dendrites result in asymmetric coupling between dendrites and soma that has not been addressed in reduced models of nonlinear phenomena in motor neurons. We measured voltage attenuation properties of six anatomically reconstructed and type-identified cat spinal motor neurons to characterize asymmetric coupling between the dendrites and soma. We showed that the voltage attenuation at any distance from the soma was direction-dependent and could be described as a function of the input resistance at the soma. An analytical solution for the lumped cable parameters in a two-compartment model was derived based on this finding. This is the first two-compartment modeling approach that directly derives lumped cable parameters from the geometrical and passive electrical properties of anatomically reconstructed neurons. Abstract Models for temporary information storage in neuronal populations are dominated by mechanisms directly dependent on synaptic plasticity. There are nevertheless other mechanisms available that are well suited for creating short-term memories. Here we present a model for working memory which relies on the modulation of the intrinsic excitability properties of neurons, instead of synaptic plasticity, to retain novel information for periods of seconds to minutes. We show that it is possible to effectively use this mechanism to store the serial order in a sequence of patterns of activity. 
For this we introduce a functional class of neurons, named gate interneurons, which can store information in their membrane dynamics and can literally act as gates routing the flow of activations in the principal neuron population. The presented model exhibits properties which are in close agreement with experimental results in working memory. Namely, the recall process plays an important role in stabilizing and prolonging the memory trace. This means that the stored information is correctly maintained as long as it is being used. Moreover, the working memory model is adequate for storing completely new information, in time windows compatible with the notion of “one-shot” learning (hundreds of milliseconds). Abstract For the analysis of neuronal cooperativity, simultaneously recorded extracellular signals from neighboring neurons need to be sorted reliably by a spike sorting method. Many algorithms have been developed to this end; however, to date, none of them manages to fulfill a set of demanding requirements. In particular, it is desirable to have an algorithm that operates online, detects and classifies overlapping spikes in real time, and that adapts to non-stationary data. Here, we present a combined spike detection and classification algorithm which explicitly addresses these issues. Our approach makes use of linear filters to find a new representation of the data and to optimally enhance the signal-to-noise ratio. We introduce a method called “Deconfusion” which decorrelates the filter outputs and provides source separation. Finally, a set of well-defined thresholds is applied and leads to simultaneous spike detection and spike classification. By incorporating a direct feedback, the algorithm adapts to non-stationary data and is, therefore, well suited for acute recordings. 
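The filtering-plus-threshold idea behind such spike detection can be illustrated with a toy matched filter. This is only a stand-in sketch: the method described above learns optimal linear filters and decorrelates their outputs ("Deconfusion"), whereas here the template, trace, and threshold are all hypothetical:

```python
import numpy as np

def detect_spikes(signal, template, thresh):
    """Cross-correlate a trace with a spike template and report rising-edge
    indices where the filter output crosses a threshold."""
    out = np.correlate(signal, template, mode="same")
    above = out > thresh
    onsets = np.flatnonzero(above & ~np.roll(above, 1))  # rising edges only
    return onsets, out

template = np.array([1.0, 2.0, 3.0, 2.0, 1.0])  # toy spike waveform
trace = np.zeros(100)
trace[50:55] = template                          # one embedded "spike"
onsets, out = detect_spikes(trace, template, thresh=12.0)
```

With a symmetric template, the filter output peaks where the template is centered on the embedded waveform, so a single threshold crossing marks the spike.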
We evaluate our method on simulated and experimental data, including simultaneous intra-/extracellular recordings made in slices of rat cortex and recordings from the prefrontal cortex of awake behaving macaques. We compare the results to existing spike detection as well as spike sorting methods. We conclude that our algorithm meets all of the mentioned requirements and outperforms other methods under realistic signal-to-noise ratios and in the presence of overlapping spikes. Abstract Avian nucleus isthmi pars parvocellularis (Ipc) neurons are reciprocally connected with the layer 10 (L10) neurons in the optic tectum and respond with oscillatory bursts to visual stimulation. Our in vitro experiments show that both neuron types respond with regular spiking to somatic current injection and that the feedforward and feedback synaptic connections are excitatory, but of different strength and time course. To elucidate mechanisms of oscillatory bursting in this network of regularly spiking neurons, we investigated an experimentally constrained model of coupled leaky integrate-and-fire neurons with spike-rate adaptation. The model reproduces the observed Ipc oscillatory bursting in response to simulated visual stimulation. A scan through the model parameter volume reveals that Ipc oscillatory burst generation can be caused by strong and brief feedforward synaptic conductance changes. The mechanism is sensitive to the parameter values of spike-rate adaptation. In conclusion, we show that a network of regular-spiking neurons with feedforward excitation and spike-rate adaptation can generate oscillatory bursting in response to a constant input. Abstract Electrical stimulation of the central nervous system creates both orthodromically propagating action potentials, by stimulation of local cells and passing axons, and antidromically propagating action potentials, by stimulation of presynaptic axons and terminals. 
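A minimal sketch of the model class used in the Ipc/L10 study above, a leaky integrate-and-fire neuron with spike-rate adaptation, is easy to write down. The parameters here are generic illustrative values (dimensionless voltage, millisecond time constants), not the experimentally constrained ones of the study:

```python
def lif_adapt(I=1.5, T=500.0, dt=0.1, tau_m=20.0, tau_w=100.0, b=0.1,
              v_th=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron with a spike-rate adaptation current w.
    All parameters are illustrative, not fitted to any data."""
    v, w, spikes = 0.0, 0.0, []
    for step in range(int(T / dt)):
        v += dt / tau_m * (-v + I - w)   # leak + drive - adaptation
        w += dt / tau_w * (-w)           # adaptation decays between spikes
        if v >= v_th:
            v = v_reset
            w += b                       # each spike increments adaptation
            spikes.append(step * dt)
    return spikes
```

Under constant drive the interspike intervals lengthen toward a steady value, which is the signature of spike-rate adaptation that the burst mechanism above depends on.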
Our aim was to understand how antidromic action potentials navigate through complex arborizations, such as those of thalamic and basal ganglia afferents—sites of electrical activation during deep brain stimulation. We developed computational models to study the propagation of antidromic action potentials past the bifurcation in branched axons. In both unmyelinated and myelinated branched axons, when the diameters of each axon branch remained under a specific threshold (set by the antidromic geometric ratio), antidromic propagation occurred robustly; action potentials traveled both antidromically into the primary segment as well as “re-orthodromically” into the terminal secondary segment. Propagation occurred across a broad range of stimulation frequencies, axon segment geometries, and concentrations of extracellular potassium, but was strongly dependent on the geometry of the node of Ranvier at the axonal bifurcation. Thus, antidromic activation of axon terminals can, through axon collaterals, lead to widespread activation or inhibition of targets remote from the site of stimulation. These effects should be included when interpreting the results of functional imaging or evoked potential studies on the mechanisms of action of DBS. Abstract The response of an oscillator to perturbations is described by its phase-response curve (PRC), which is related to the type of bifurcation leading from rest to tonic spiking. In a recent experimental study, we have shown that the type of PRC in cortical pyramidal neurons can be switched by cholinergic neuromodulation from type II (biphasic) to type I (monophasic). We explored how intrinsic mechanisms affected by acetylcholine influence the PRC using three different types of neuronal models: a theta neuron, single-compartment neurons and a multi-compartment neuron. 
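The branch-point condition discussed above has a classical analogue in Rall's geometric ratio, which compares the 3/2-power diameters of daughter branches to that of the parent. Note this is the textbook quantity, not the paper's own "antidromic geometric ratio", though the two play analogous roles:

```python
def geometric_ratio(d_parent, d_children):
    """Rall's geometric ratio at a branch point.
    GR = 1: impedance match; GR > 1: the daughters load the parent and
    propagation across the bifurcation becomes harder."""
    return sum(d ** 1.5 for d in d_children) / d_parent ** 1.5
```

For two equal daughters, the matched case requires each daughter diameter to be (1/2)^(2/3) of the parent, i.e. roughly 0.63 of it.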
In all of these models a decrease in the amount of a spike-frequency adaptation current was a necessary and sufficient condition for the shape of the PRC to change from biphasic (type II) to purely positive (type I). Abstract Small-conductance (SK) calcium-activated potassium channels are found in many tissues throughout the body and open in response to elevations in intracellular calcium. In hippocampal neurons, SK channels are spatially colocalized with L-type calcium channels. Due to the restriction of calcium transients into microdomains, only a limited number of L-type Ca2+ channels can activate SK and, thus, stochastic gating becomes relevant. Using a stochastic model with calcium microdomains, we predict that intracellular Ca2+ fluctuations resulting from Ca2+ channel gating can increase SK2 subthreshold activity by 1–2 orders of magnitude. This effectively reduces the value of the Hill coefficient. To explain the underlying mechanism, we show how short, high-amplitude calcium pulses associated with stochastic gating of calcium channels are much more effective at activating SK2 channels than the steady calcium signal produced by a deterministic simulation. This stochastic amplification results from two factors: first, a supralinear rise in the SK2 channel’s steady-state activation curve at low calcium levels and, second, a momentary reduction in the channel’s time constant during the calcium pulse, causing the channel to approach its steady-state activation value much faster than it decays. Stochastic amplification can potentially explain subthreshold SK2 activation in unified models of both sub- and suprathreshold regimes. Furthermore, we expect it to be a general phenomenon relevant to many proteins that are activated nonlinearly by stochastic ligand release. Abstract A tonic-clonic seizure transitions from high-frequency asynchronous activity to low-frequency coherent oscillations, yet the mechanism of transition remains unknown. 
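The first factor behind the stochastic amplification described above, the supralinear (convex) rise of the steady-state activation curve, can be demonstrated with a plain Hill function. The Hill coefficient and half-activation used here are assumed, textbook-style SK values, not the study's fitted parameters, and the kinetic (time-constant) contribution is deliberately ignored:

```python
def hill(ca, n=4.0, k=0.3):
    """Steady-state SK activation as a Hill function of [Ca2+] in uM.
    n and k are assumed illustrative values."""
    return ca ** n / (ca ** n + k ** n)

# A steady low calcium level versus pulses with the same time-average:
steady = hill(0.1)
pulsed = 0.5 * hill(0.01) + 0.5 * hill(0.19)  # same mean [Ca2+] = 0.1 uM
```

Because the activation curve is convex at low calcium, the time-averaged activation under fluctuating calcium exceeds the activation at the mean calcium by several-fold, which is the Jensen's-inequality core of the amplification argument.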
We propose a shift in network synchrony due to changes in cellular response. Here we use phase-response curves (PRCs) from Morris-Lecar (ML) model neurons with synaptic depression and gradually decrease the input current to cells within a network simulation. This method effectively decreases firing rates, resulting in a shift to greater network synchrony and illustrating a possible mechanism of the transition phenomenon. PRCs are measured from the ML conductance-based model cell with a range of input currents within the limit cycle. A large network of 3000 excitatory neurons is simulated with a network topology generated from second-order statistics which allows a range of population synchrony. The population synchrony of the oscillating cells is measured with the Kuramoto order parameter, which reveals a transition from the tonic to the clonic phase exhibited by our model network. The cellular response shift mechanism for the tonic-clonic seizure transition reproduces the population behavior closely when compared to EEG data. Abstract We have built a phenomenological spiking model of the cat early visual system comprising the retina, the Lateral Geniculate Nucleus (LGN) and V1’s layer 4, and established four main results: (1) When exposed to videos that reproduce with high fidelity what a cat experiences under natural conditions, adjacent Retinal Ganglion Cells (RGCs) have spike-time correlations at a short timescale (~30 ms), despite neuronal noise and possible jitter accumulation. (2) In accordance with recent experimental findings, the LGN filters out some noise. It thus increases the spike reliability and temporal precision, the sparsity, and, importantly, further decreases down to ~15 ms adjacent cells’ correlation timescale. 
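The synchrony measure named above, the Kuramoto order parameter, is a one-liner worth stating explicitly; this sketch is the standard definition, independent of the specific network model:

```python
import numpy as np

def kuramoto_order(phases):
    """Kuramoto order parameter R = |<exp(i*theta)>| over a phase population.
    R -> 1 for a fully synchronized population, R -> 0 for uniformly
    scattered phases."""
    return float(np.abs(np.mean(np.exp(1j * np.asarray(phases)))))
```

A tonic-to-clonic transition of the kind described above would appear as R climbing from near 0 toward 1 as the oscillators' phases align.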
(3) Downstream simple cells in V1’s layer 4, if equipped with Spike Timing-Dependent Plasticity (STDP), may detect these fine-scale cross-correlations, and thus connect principally to ON- and OFF-centre cells with Receptive Fields (RFs) aligned in the visual space, and thereby become orientation selective, in accordance with Hubel and Wiesel’s (Journal of Physiology 160:106–154, 1962) classic model. Up to this point we dealt with continuous vision, and there was no absolute time reference such as a stimulus onset, yet information was encoded and decoded in the relative spike times. (4) We then simulated saccades to a static image and benchmarked relative spike time coding and time-to-first-spike coding w.r.t. saccade landing in the context of orientation representation. In both the retina and the LGN, relative spike times are more precise, less affected by pre-landing history and global contrast than absolute ones, and lead to robust contrast-invariant orientation representations in V1. A multi-compartment model for interneurons in the dorsal lateral geniculate nucleus. PLoS computational biology GABAergic interneurons (INs) in the dorsal lateral geniculate nucleus (dLGN) shape the information flow from retina to cortex, presumably by controlling the number of visually evoked spikes in geniculate thalamocortical (TC) neurons and refining their receptive fields. The INs exhibit a rich variety of firing patterns: depolarizing current injections to the soma may induce tonic firing, periodic bursting or an initial burst followed by tonic spiking, sometimes with prominent spike-time adaptation. When released from hyperpolarization, some INs elicit rebound bursts, while others return more passively to the resting potential. A full mechanistic understanding that explains the function of the dLGN on the basis of neuronal morphology, physiology and circuitry is currently lacking. 
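The STDP rule invoked in point (3) is usually written as an exponential pairing window. The sketch below is the generic pair-based form with illustrative amplitudes and time constant, not the specific rule of the model above:

```python
import numpy as np

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP window: dt = t_post - t_pre in ms.
    Pre-before-post (dt > 0) potentiates; post-before-pre depresses.
    Amplitudes and tau are generic illustrative values."""
    dt = np.asarray(dt, dtype=float)
    return np.where(dt > 0, a_plus * np.exp(-dt / tau),
                    -a_minus * np.exp(dt / tau))
```

Because the window rewards presynaptic spikes that reliably precede postsynaptic ones, inputs with aligned fine-scale correlations win the competition, which is how the simple cells above become orientation selective.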
One way to approach such an understanding is by developing a detailed mathematical model of the involved cells and their interactions. Limitations of the previous models for the INs of the dLGN region prevent an accurate representation of the conceptual framework needed to understand the computational properties of this region. We here present a detailed compartmental model of INs using, for the first time, a morphological reconstruction and a set of active dendritic conductances constrained by experimental somatic recordings from INs under several different current-clamp conditions. The model makes a number of experimentally testable predictions about the role of specific mechanisms for the firing properties observed in these neurons. In addition to accounting for the significant features of all experimental traces, it quantitatively reproduces the experimental recordings of the action-potential firing frequency as a function of injected current. We show how and why relative differences in conductance values, rather than differences in ion channel composition, could account for the distinct differences between the responses observed in two different neurons, suggesting that INs may be individually tuned to optimize network operation under different input conditions. Action Potentials;Animals;Calcium;Computational Biology;Computer Simulation;Dendrites;Electrophysiological Phenomena;Geniculate Bodies;Interneurons;Ion Channels;Kinetics;Mice;Mice, Transgenic;Models, Neurological;Nerve Net;Patch-Clamp Techniques;Synapses A database of computational models of a half-center oscillator for analyzing how neuronal parameters influence network activity Journal of Biological Physics Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. 
Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on the knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess which is the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a nonredundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. 
Programming language: Perl/BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI Blast package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that often it is futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. 
The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to create dynamically its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. 
However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell includes organelles and intracellular solutions, and it is separated from the outer environment of extracellular liquid surrounding the cell by its cell membrane (plasma membrane), generating differences in concentrations of ions and molecules including enzymes. The differences in charges of ions and concentrations cause, respectively, electrical and chemical potentials, generating transport of materials across the membrane. Here we look at the cores of mathematical modeling associated with the dynamic behaviors of single cells as well as the bases of numerical simulations. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 
2008, 2009) is a software program with which users can simulate and/or create a model using graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command-line interface to manipulate large-scale models based on Python, which is a powerful scripting language. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results be reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the information necessary and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. 
We describe a method and make available computer scripts which automate the process of citation entry. We use a Perl module (PARSER) from the Open Citation project for parsing citation data, which is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. For, in order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain ( ${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate and fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. 
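The time-domain reduction named above, Balanced Truncation, can be illustrated on a toy system. The sketch below applies the standard square-root algorithm to a hypothetical 10-compartment passive RC chain, not the authors' quasi-active cell models, and all dimensions are illustrative:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

def balanced_truncation(A, B, C, k):
    """Square-root balanced truncation of a stable LTI system (A, B, C)."""
    Wc = solve_continuous_lyapunov(A, -B @ B.T)    # controllability Gramian
    Wo = solve_continuous_lyapunov(A.T, -C.T @ C)  # observability Gramian
    Lc = cholesky(Wc, lower=True)
    Lo = cholesky(Wo, lower=True)
    U, s, Vt = svd(Lo.T @ Lc)                      # s = Hankel singular values
    S = np.diag(s[:k] ** -0.5)
    T = Lc @ Vt[:k].T @ S                          # right projection
    Ti = S @ U[:, :k].T @ Lo.T                     # left projection (Ti @ T = I)
    return Ti @ A @ T, Ti @ B, C @ T

# Toy stand-in for a passive cable: a 10-compartment chain,
# input current at one end, voltage read out at the other.
n = 10
A = -2 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
B = np.zeros((n, 1)); B[0, 0] = 1.0
C = np.zeros((1, n)); C[0, -1] = 1.0
Ar, Br, Cr = balanced_truncation(A, B, C, k=4)
```

Because the Hankel singular values of diffusion-like systems decay rapidly, a handful of balanced states reproduces the input-output map, e.g. the DC gain, almost exactly, which is the effect the authors exploit at much larger scale.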
These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. Making full use of the available data resources across this high number of heterogeneous query methods and front-ends requires considerable bioinformatic skill. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing these challenges. While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is an XML-based language for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells, positioned and synaptically connected in 3D, can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function.
Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded in empirical evidence, explain (or fail to explain) existing empirical results, and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of a model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as for experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to representing and integrating complex data sets is to develop mathematical models [Arbib ( The Handbook of Brain Theory and Neural Networks , pp.
741–745, 2003); Arbib and Grethe ( Computing the Brain: A Guide to Neuroinformatics , 2001); Ascoli ( Computational Neuroanatomy: Principles and Methods , 2002); Bower and Bolouri ( Computational Modeling of Genetic and Biochemical Networks , 2001); Hines et al. ( J. Comput. Neurosci. 17 , 7–11, 2004); Shepherd et al. ( Trends Neurosci. 21 , 460–468, 1998); Sivakumaran et al. ( Bioinformatics 19 , 408–415, 2003); Smolen et al. ( Neuron 26 , 567–580, 2000); Vadigepalli et al. ( OMICS 7 , 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv ( J. Neurophysiol. 71 , 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu . Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and to build on them for further research. Its use can help the field of computational neuroscience enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition.
PPI reflects GABAA receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABAA reversal potential (EGABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and of observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy.
In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and the code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic types (pyramidal cell dendrites and olivary axonal projection terminals), and labeling methods (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstruction systems. The aim of this study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions arriving at the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductivities of their dendritic membrane.
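The reconstruction files described above (binary trees of branching cylinders) are commonly exchanged as parent-indexed point tables, for example in the SWC convention (columns: id, type, x, y, z, radius, parent). A minimal sketch of computing simple morphometrics from such a table; the toy morphology is made up for illustration:

```python
import math
from collections import Counter

# Each row: (id, type, x, y, z, radius, parent_id); parent -1 marks the root.
# Hand-made toy morphology: a soma and one dendrite that bifurcates.
swc = [
    (1, 1, 0.0, 0.0, 0.0, 5.0, -1),
    (2, 3, 10.0, 0.0, 0.0, 1.0, 1),
    (3, 3, 20.0, 0.0, 0.0, 0.8, 2),
    (4, 3, 20.0, 10.0, 0.0, 0.6, 3),
    (5, 3, 20.0, -10.0, 0.0, 0.6, 3),
]

def total_length(rows):
    """Total cable length: sum of distances from each point to its parent."""
    pos = {r[0]: r[2:5] for r in rows}
    return sum(math.dist(pos[r[0]], pos[r[6]]) for r in rows if r[6] != -1)

def branch_points(rows):
    """IDs referenced as parent by more than one child (bifurcations)."""
    counts = Counter(r[6] for r in rows if r[6] != -1)
    return [pid for pid, n in counts.items() if n > 1]
```

The same parent-linked representation supports the tree traversals needed for the morphometric comparisons described in the abstract (path lengths, branch orders, asymmetry indices).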
This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high- and low-depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in Biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single-neuron models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons.
Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS–US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive, biologically plausible framework, without recourse to delay lines or other special-purpose circuits.
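The TD error underlying the dopamine signal discussed above can be sketched in its simplest tabular form (the chapter's model uses an LSTM network instead; this shows only the prediction-error quantity δ = r + γV(s') − V(s) that phasic dopamine activity is compared to, on a fixed CS-to-reward sequence with illustrative parameters):

```python
import numpy as np

# Tabular TD(0) on a fixed CS -> ... -> reward state sequence.
n_states, gamma, alpha = 4, 0.9, 0.1
V = np.zeros(n_states + 1)          # V[-1] is the terminal state, kept at 0

def run_trial(V):
    """One pass through the sequence; returns the TD error at each step."""
    deltas = []
    for s in range(n_states):
        r = 1.0 if s == n_states - 1 else 0.0  # reward on the last transition
        delta = r + gamma * V[s + 1] - V[s]    # "phasic dopamine" analogue
        V[s] += alpha * delta
        deltas.append(delta)
    return deltas

first = run_trial(V)     # naive network: large delta at reward time only
for _ in range(2000):
    last = run_trial(V)  # after learning: reward fully predicted, delta ~ 0
```

After training, V decays geometrically back from the reward (V(s) ≈ γ^(3−s)), so the prediction error vanishes at the reward and, in richer tasks, transfers to the earliest reward-predicting cue.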
Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. Using mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses distributed uniformly along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and effectiveness of somatopetal transmission of current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization as a whole and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns.
The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and with interburst intervals decreasing as the intensity of activation rose. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably more limited than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of patterns of output discharges of different complexities, including stochastic ones. The relatively short length of the apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites, and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in the processes of formation of the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis.
Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta-modulated and space-modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using non-homogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture that is subject to a clocking rhythm, independently of its involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed-rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties.
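As an instance of the channel formulas mentioned above, the Hodgkin–Huxley delayed-rectifier potassium gate n can be written out directly (standard rate functions in the modern convention with rest near −65 mV; the conductance and reversal values are the classic squid-axon numbers):

```python
import math

def alpha_n(v):
    """Opening rate of the K+ gate n (v in mV, rate in 1/ms).
    Note the removable singularity at v = -55 mV."""
    x = v + 55.0
    return 0.1 if abs(x) < 1e-9 else 0.01 * x / (1.0 - math.exp(-x / 10.0))

def beta_n(v):
    """Closing rate of the K+ gate n."""
    return 0.125 * math.exp(-(v + 65.0) / 80.0)

def n_inf(v):
    """Steady-state activation: alpha / (alpha + beta)."""
    a, b = alpha_n(v), beta_n(v)
    return a / (a + b)

def tau_n(v):
    """Time constant of the gate (ms)."""
    return 1.0 / (alpha_n(v) + beta_n(v))

def i_k(v, n, g_k=36.0, e_k=-77.0):
    """Delayed-rectifier current (uA/cm^2): g_k * n^4 * (v - e_k)."""
    return g_k * n**4 * (v - e_k)
```

At rest (−65 mV) the gate sits near n ≈ 0.32, and depolarization drives n toward 1 with the familiar n⁴ delay that repolarizes the action potential.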
The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, a database of neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in computational neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org. In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed morphological data when generating cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to incorporate known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators such as the NEURON simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any newly available data can be included in NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionality to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today.
I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation, in the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently of the means of running the model simulation. This allows sharing not only of models but also of research tools. Other promising developments of the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from genomics, transcriptomics, proteomics, and metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data requires both standard data formats and public repositories. Standardization and a public storage system are also important for modeling because they allow models to be shared irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks such as the numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed, and proposals for the application of such tools are given.
Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. It is often feasible to divide a 3D-reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balancing in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques.
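The forward-model inversion idea behind inverse CSD can be sketched in one dimension: assume a parametrized source distribution, compute the forward matrix mapping sources to electrode potentials, then invert it. This sketch uses constant-CSD slabs modeled as discs (in the spirit of the 1D step-iCSD variant, not the paper's 2D method), with the on-axis disc-source potential formula; all geometry and conductivity values are illustrative.

```python
import numpy as np

# Electrode grid geometry (mm) and tissue conductivity (arbitrary units).
n, h, R, sigma = 8, 0.1, 0.5, 0.3
z = np.arange(n) * h                     # electrode depths

# Forward matrix F: potential at depth z_j produced by a constant-CSD slab
# (disc of radius R, thickness h) centred at z_k, using the on-axis formula
# phi(d) = (c*h / (2*sigma)) * (sqrt(d^2 + R^2) - d).
d = np.abs(z[:, None] - z[None, :])
F = (h / (2.0 * sigma)) * (np.sqrt(d**2 + R**2) - d)

# Known "ground truth" CSD (a sink/source dipole), its forward-modeled
# potentials, and the inverse estimate obtained by solving F @ c = phi.
csd_true = np.zeros(n)
csd_true[2], csd_true[5] = 1.0, -1.0     # source at z[2], sink at z[5]
phi = F @ csd_true
csd_est = np.linalg.solve(F, phi)
```

The advantage stated in the abstract follows from this construction: the assumptions (source shape, extent R, conductivity) appear explicitly in F, and refining them changes only the forward matrix, not the inversion step.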
We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility of including system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. However, the range of current strength needed for depolarization block could easily be reached with random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to depolarization block.
We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, impeding conventional analysis methods from keeping pace. Such large datasets can be analyzed automatically by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and of exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable. PANDORA is free to use and modify because it is open-source ( http://software.incf.org/software/pandora/home ). Abstract This chapter is devoted to a detailed discussion of several numerical simulations in which we use a model to generate data, and then examine how well we can use L = 1, 2, … of the time series for the state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop.
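The database idea in the PANDORA abstract above (store extracted characteristics such as firing rate rather than raw traces, then query them) can be sketched in plain Python as a stand-in for the Matlab toolbox; the trace generator and field names are illustrative, not PANDORA's API:

```python
import numpy as np

def extract_features(t, v, thresh=0.0):
    """Reduce a raw voltage trace to summary features: spike count and
    mean firing rate from upward threshold crossings."""
    crossings = np.flatnonzero((v[:-1] < thresh) & (v[1:] >= thresh))
    duration = t[-1] - t[0]
    return {"n_spikes": int(crossings.size),
            "rate_hz": crossings.size / duration}

# Build a small "database": one row of extracted features per trial.
t = np.linspace(0.0, 1.0, 10001)              # 1 s sampled at 0.1 ms
rows = []
for trial, f in enumerate([5.0, 20.0, 50.0]): # spike frequencies per trial
    # Crude synthetic spike train: narrow pulses at f Hz on a -65 mV baseline.
    v = -65.0 + 80.0 * (np.sin(2 * np.pi * f * t) > 0.999)
    rows.append({"trial": trial, "freq_param": f, **extract_features(t, v)})

# A query over the feature table: which trials fire faster than 10 Hz?
fast = [r["trial"] for r in rows if r["rate_hz"] > 10.0]
```

The storage saving claimed in the abstract is visible even here: each 10,001-sample trace collapses to a handful of numbers that can be queried without reloading the raw data.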
Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics. First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test the sufficiency of multiple, independent spike initiation zones in the C-interneuron to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made into understanding cerebellar cortical processing, but neural coding at the output stage of the cerebellum, in the deep cerebellar nuclei (DCN), remains poorly understood. The DCN are unlikely to be just a relay nucleus, because Purkinje cell inhibition has to be turned into an excitatory output signal, and DCN neurons exhibit complex intrinsic properties.
In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals. Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. We obtained a good match between our simulations and DCN current-clamp data we recorded in acute slices, including the heterogeneity of the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV, to allow de-inactivation of rebound currents. Fast rebound bursts due to the T-type calcium current and slow rebounds due to the persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by the HCN current. Our findings suggest that the active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches.
This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior. This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multi-compartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation of channel gating, a mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plug-in development is also presented. Further development of KInNeSS is ongoing, with the ultimate goal of creating a modular framework that will help researchers across different disciplines to collaborate effectively using a modern neural simulation platform. Abstract No Abstract Available Abstract We have developed a simulation tool within the NEURON simulator to assist in the organization, verification, and analysis of simulations. This tool, termed the Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single-cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB.
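The NQS workflow described above (a relational table of simulation parameters plus SELECT-style queries) can be illustrated outside NEURON with a small relational sketch. This is only an analogy: NQS itself runs inside the NEURON environment, and the table schema and values below are hypothetical.

```python
import sqlite3

# Toy analogue of an NQS-style workflow: store per-cell simulation
# parameters in a relational table, then retrieve subsets with SELECT.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE params (cell TEXT, gna REAL, gk REAL, rate REAL)")
rows = [
    ("cell_a", 0.12, 0.036, 14.0),
    ("cell_b", 0.10, 0.036, 9.5),
    ("cell_c", 0.12, 0.030, 21.0),
]
conn.executemany("INSERT INTO params VALUES (?, ?, ?, ?)", rows)

# "Query function based on SELECT": cells whose firing rate exceeds 10 Hz.
fast = [r[0] for r in conn.execute(
    "SELECT cell FROM params WHERE rate > 10 ORDER BY rate DESC")]
print(fast)
```

The same SELECT idiom extends naturally to relating simulation outputs back to the parameters that produced them, as the abstract describes.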
Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems. This is similar to how single cells function as systems of protein networks, where, for example, ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from how it behaves alone. The dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. Interactions are mediated by different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts to produce heart beats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics representing neural information. In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of the limbs can be performed. This chapter illustrates several mathematical techniques through examples from the modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data.
Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation. For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice that of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits, as expected in mature animals. Predictions were made for passive-cell current-clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting is unnecessary.
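Unitary conductance waveforms like the AMPA responses quantified in the CA3 study above are commonly described by a dual-exponential function, peak-normalised to a maximal conductance. The sketch below uses the 0.9 nS perforant-path conductance from the abstract; the rise and decay time constants are assumptions chosen only to give roughly the reported 1.0 ms time-to-peak, not values from the paper.

```python
import math

def dual_exp(t, gmax, tau_r, tau_d):
    """Dual-exponential conductance, normalised so its peak equals gmax."""
    tp = tau_r * tau_d / (tau_d - tau_r) * math.log(tau_d / tau_r)  # peak time
    norm = math.exp(-tp / tau_d) - math.exp(-tp / tau_r)
    return gmax * (math.exp(-t / tau_d) - math.exp(-t / tau_r)) / norm

# Illustrative time constants (assumed): tau_r = 0.6 ms, tau_d = 2.0 ms
ts = [i * 0.01 for i in range(2000)]            # 0-20 ms grid
g_pp = [dual_exp(t, 0.9, 0.6, 2.0) for t in ts]  # "perforant path"-like
peak = max(g_pp)
t_peak = ts[g_pp.index(peak)]
print(peak, t_peak)   # ~0.9 nS peaking near ~1.0 ms
```

The closed-form peak time tp in the helper is what lets fitted time constants be compared directly against the measured times-to-peak.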
Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron. Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials. Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet the precise input-output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites.
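The leaky-integrator assumption above (membrane potential as presynaptic population activity filtered with a fixed exponential kernel) can be sketched directly. The bin width, time constant, and spike pattern below are illustrative; the point is that near-synchronous presynaptic events sum into a visibly larger deflection than isolated ones, which is what makes higher-order (synchrony) structure readable from the membrane potential.

```python
# Membrane fluctuation as population spike count filtered by an exponential
# kernel (leaky integrator). All numbers are illustrative.
dt, tau = 0.1, 5.0            # ms; tau is the kernel/membrane time constant
spike_counts = [0] * 1000     # population spike count per 0.1 ms bin
for b in (100, 300, 305, 310):  # a lone spike, then a near-synchronous burst
    spike_counts[b] += 1

v = [0.0]
for s in spike_counts:
    # forward-Euler leaky integration: dv/dt = -v/tau + input
    v.append(v[-1] + dt * (-v[-1] / tau + s / dt))
v = v[1:]

# The synchronous burst yields a clearly larger deflection than the lone spike.
print(max(v[300:400]), max(v[100:200]))
```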
We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature. To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns and the corresponding firing-frequency input-output function was logarithmic. The regular/irregular nature of the input synaptic intervals was only reflected in the regularity of output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input-output relationships. Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input-output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware.
To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats. In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database http://www.ebi.ac.uk/biomodels/ is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller sub-models. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems, and to study the clustering of models based upon their annotations. Model deposition to the database today is advised by several publishers of scientific journals.
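Models distributed by repositories like the one described above are encoded in standardised XML formats (SBML in the case of BioModels Database). A minimal sketch of programmatically inspecting such an encoding is shown below; note that real SBML documents carry XML namespaces, units, and much more metadata, all omitted here, and the toy model fragment is invented for illustration.

```python
import xml.etree.ElementTree as ET

# Toy SBML-like fragment (namespaces and most SBML structure omitted).
sbml = """
<sbml><model id="enzyme_toy">
  <listOfSpecies>
    <species id="E"/><species id="S"/><species id="ES"/><species id="P"/>
  </listOfSpecies>
  <listOfReactions>
    <reaction id="binding"/><reaction id="catalysis"/>
  </listOfReactions>
</model></sbml>
"""
root = ET.fromstring(sbml)
species = [s.get("id") for s in root.iter("species")]
reactions = [r.get("id") for r in root.iter("reaction")]
print(species, reactions)
```

Because the encoding is standardised, the same few lines work for any model retrieved from the repository, which is precisely what makes benchmarking and cross-referencing across tools feasible.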
The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive? Schema theory is a computational framework that allows the simulation of perceptuomotor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We present first its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performances of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world-knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performances of agrammatic aphasics measured using sentence-picture matching tasks.
This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Abstract Background The "inverse" problem is related to the determination of unknown causes on the basis of the observation of their effects. This is the opposite of the corresponding "direct" problem, which relates to the prediction of the effects generated by a complete description of some agencies. The solution of an inverse problem entails the construction of a mathematical model and starts from a number of experimental data. In this respect, inverse problems are often ill-conditioned, as the amount of experimental data available is often insufficient to unambiguously solve the mathematical model. Several approaches to solving inverse problems are possible, both computational and experimental, some of which are mentioned in this article. In this work, we describe in detail an attempt to solve an inverse problem which arose in the study of an intracellular signaling pathway. Results Using a genetic algorithm to find a suboptimal solution to the optimization problem, we have estimated a set of unknown parameters describing a kinetic model of a signaling pathway in the neuronal cell. The model is composed of mass-action ordinary differential equations, where the kinetic parameters describe protein-protein interactions, protein synthesis and degradation. The algorithm has been implemented on a parallel platform. Several potential solutions of the problem have been computed, each solution being a set of model parameters. A subset of parameters has been selected on the basis of their small coefficient of variation across the ensemble of solutions.
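The genetic-algorithm estimation described above can be sketched in miniature: recover a single rate constant of a decay ODE from noisy synthetic observations. The real study fit many parameters of a mass-action system on a parallel platform; the data, GA settings, and one-parameter problem below are purely illustrative.

```python
import math, random

# Toy GA parameter estimation: recover k in dy/dt = -k*y from noisy data.
random.seed(1)
K_TRUE = 0.5
times = [0.0, 1.0, 2.0, 4.0, 8.0]
data = [math.exp(-K_TRUE * t) + random.gauss(0, 0.01) for t in times]

def cost(k):
    # sum of squared residuals between model prediction and observations
    return sum((math.exp(-k * t) - d) ** 2 for t, d in zip(times, data))

pop = [random.uniform(0.0, 2.0) for _ in range(40)]
for _ in range(60):
    pop.sort(key=cost)
    parents = pop[:10]                         # truncation selection
    pop = parents + [max(0.0, random.choice(parents) + random.gauss(0, 0.05))
                     for _ in range(30)]       # Gaussian mutation
best = min(pop, key=cost)
print(best)   # close to the true value 0.5
```

Repeating such runs from different seeds and keeping only parameters with a small coefficient of variation across the ensemble mirrors the selection criterion used in the abstract.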
Conclusion Despite the lack of sufficiently reliable and homogeneous experimental data, the genetic algorithm approach has allowed us to estimate the approximate values of a number of model parameters in a kinetic model of a signaling pathway; these parameters have been assessed to be relevant for the reproduction of the available experimental data. Abstract Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown. Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms. First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axo-dendritic feedback on pyramidal cells is only effective in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase, the theta/gamma ratio.
Moreover, electric fields differentially affect average theta frequency depending on specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. Abstract The brain is extraordinarily complex, containing 10^11 neurons linked with 10^14 connections. We can improve our understanding of individual neurons and neuronal networks by describing their behavior in mathematical and computational models. This chapter provides an introduction to neural modeling, laying the foundation for several basic models and surveying key topics. After some discussion of the motivations of modelers and the uses of neural models, we explore the properties of electrically excitable membranes. We describe in some detail the Hodgkin–Huxley model, the first neural model to describe biophysically the behavior of biological membranes. We explore how this model can be extended to describe a variety of excitable membrane behaviors, including axonal propagation, dendritic processing, and synaptic communication. This chapter also covers mathematical models that replicate basic neural behaviors through more abstract mechanisms. We briefly explore efforts to extend single-neuron models to the network level and provide several examples of insights gained through this process. Finally, we list common resources, including modeling environments and repositories, that provide the guidance and parameter sets necessary to begin building neural models. Abstract We have developed a program, NeuroText, to populate the neuroscience databases in SenseLab (http://senselab.med.yale.edu/senselab) by mining the natural-language text of neuroscience articles. NeuroText uses a two-step approach to identify relevant articles. The first step (preprocessing), aimed at 100% sensitivity, identifies abstracts containing database keywords.
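The Hodgkin–Huxley model surveyed above can be illustrated with a minimal simulation: the standard squid-axon equations integrated with forward Euler, driven by a constant current that elicits action potentials. Parameters are the textbook values; the stimulus amplitude and duration are illustrative.

```python
import math

# Minimal Hodgkin-Huxley simulation (standard squid-axon parameters).
C, G_NA, G_K, G_L = 1.0, 120.0, 36.0, 0.3     # uF/cm^2, mS/cm^2
E_NA, E_K, E_L = 50.0, -77.0, -54.4           # mV

def vtrap(x, y):
    # x / (1 - exp(-x/y)), with the 0/0 limit handled near x = 0
    return y if abs(x / y) < 1e-6 else x / (1.0 - math.exp(-x / y))

def rates(v):
    am = 0.1 * vtrap(v + 40.0, 10.0)
    bm = 4.0 * math.exp(-(v + 65.0) / 18.0)
    ah = 0.07 * math.exp(-(v + 65.0) / 20.0)
    bh = 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
    an = 0.01 * vtrap(v + 55.0, 10.0)
    bn = 0.125 * math.exp(-(v + 65.0) / 80.0)
    return am, bm, ah, bh, an, bn

v = -65.0
am, bm, ah, bh, an, bn = rates(v)
m, h, n = am / (am + bm), ah / (ah + bh), an / (an + bn)  # rest steady state
dt, i_inj, v_trace = 0.01, 10.0, []
for _ in range(int(50.0 / dt)):               # 50 ms of constant stimulation
    am, bm, ah, bh, an, bn = rates(v)
    m += dt * (am * (1 - m) - bm * m)
    h += dt * (ah * (1 - h) - bh * h)
    n += dt * (an * (1 - n) - bn * n)
    i_ion = (G_NA * m**3 * h * (v - E_NA) + G_K * n**4 * (v - E_K)
             + G_L * (v - E_L))
    v += dt * (i_inj - i_ion) / C
    v_trace.append(v)

print(max(v_trace))   # overshooting spikes: peak membrane potential above 0 mV
```

Extensions mentioned in the chapter (axonal propagation, dendritic processing) amount to coupling many such compartments through axial resistances.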
In the second step, potentially relevant abstracts identified in the first step are processed for specificity, as dictated by database architecture and by neuroscience, lexical and semantic contexts. NeuroText results were presented to the experts for validation using a dynamically generated interface that also allows expert-validated articles to be automatically deposited into the databases. Of the test set of 912 articles, 735 were rejected at the preprocessing step. For the remaining articles, the accuracy of predicting database-relevant articles was 85%. Twenty-two articles were erroneously identified. NeuroText deferred decisions on 29 articles to the expert. A comparison of NeuroText results versus the experts’ analyses revealed that the program failed to correctly identify articles’ relevance due to concepts that did not yet exist in the knowledge base or due to vaguely presented information in the abstracts. NeuroText uses two “evolution” techniques (supervised and unsupervised) that play an important role in the continual improvement of the retrieval results. Software that uses the NeuroText approach can facilitate the creation of curated, special-interest bibliography databases. Abstract Dendrites play an important role in neuronal function and connectivity. This chapter introduces the first section of the book, focusing on the morphological features of dendritic tree structures and the role of dendritic trees in the circuit. We provide an overview of quantitative procedures for data collection, analysis, and modeling of dendrite shape. Our main focus lies on the description of morphological complexity and how one can use this description to unravel neuronal function in dendritic trees and neural circuits. Abstract The chapter is organised in two parts: in the first part, the focus is on a combined power spectral and nonlinear behavioural analysis of a neural mass model of the thalamocortical circuitry.
The objective is to study the effectiveness of such “multimodal” analytical techniques in model-based studies investigating the neural correlates of abnormal brain oscillations in Alzheimer’s disease (AD). The power spectral analysis presented here is a study of the “slowing” (decreasing dominant frequency of oscillation) within the alpha frequency band (8–13 Hz), a hallmark of electroencephalogram (EEG) dynamics in AD. Analysis of the nonlinear dynamical behaviour focuses on the bifurcating property of the model. The results show that the alpha rhythmic content is maximal in close proximity to the bifurcation point—an observation made possible by the “multimodal” approach adopted herein. Furthermore, a slowing in alpha rhythm is observed for increasing inhibitory connectivity—a consistent feature of our research into neuropathological oscillations associated with AD. In the second part, we present power spectral analysis on a model that implements multiple feedforward and feedback connectivities in the thalamo-cortico-thalamic circuitry, and is thus more advanced in terms of biological plausibility. This study looks at the effects of synaptic connectivity variation on the power spectra within the delta (1–3 Hz), theta (4–7 Hz), alpha (8–13 Hz) and beta (14–30 Hz) bands. An overall slowing of the EEG with decreasing synaptic connectivity is observed, indicated by a decrease of power within the alpha and beta bands and an increase in power within the theta and delta bands. Thus, the model behaviour conforms to longitudinal studies in AD indicating an overall slowing of the EEG. Abstract Neuronal processes grow under a variety of constraints, both immediate and evolutionary. Their pattern of growth provides insight into their function. This chapter begins by reviewing morphological metrics used in analyses and computational models. Molecular mechanisms underlying growth and plasticity are then discussed, followed by several types of modeling approaches.
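The band-power comparison underlying the EEG "slowing" analysis above can be sketched with a plain discrete Fourier transform over the standard delta/theta/alpha/beta bands. The toy signal (a pure 10 Hz rhythm) and sampling settings are illustrative; a real analysis of model output would typically use Welch's method.

```python
import math

fs, n = 128.0, 256                      # 2 s of "EEG" sampled at 128 Hz
x = [math.sin(2 * math.pi * 10.0 * i / fs) for i in range(n)]  # 10 Hz rhythm

def band_power(x, fs, lo, hi):
    # sum |X_k|^2 over DFT bins whose frequency falls in [lo, hi] Hz
    n = len(x)
    total = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if lo <= f <= hi:
            re = sum(x[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
            im = sum(x[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
            total += re * re + im * im
    return total

bands = {"delta": (1, 3), "theta": (4, 7), "alpha": (8, 13), "beta": (14, 30)}
power = {name: band_power(x, fs, lo, hi) for name, (lo, hi) in bands.items()}
print(max(power, key=power.get))   # the 10 Hz rhythm lands in the alpha band
```

A "slowing" of the EEG then shows up in such numbers as power shifting from the alpha and beta entries toward theta and delta.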
Computer simulation of morphology can be used to describe and reproduce the statistics of neuronal types or to evaluate growth and functional hypotheses. For instance, models in which branching is probabilistically determined by diameter produce realistic virtual dendrites for most neuronal types, though more complicated statistical models are required for other types. Virtual dendrites grown under environmental and/or functional constraints are also discussed, offering a broad perspective on dendritic morphology. Abstract Chopper neurons in the cochlear nucleus are characterized by intrinsic oscillations with short average interspike intervals (ISIs) and relative level independence of their response (Pfeiffer, Exp Brain Res 1:220–235, 1966; Blackburn and Sachs, J Neurophysiol 62:1303–1329, 1989), properties which are unattained by models of single chopper neurons (e.g., Rothman and Manis, J Neurophysiol 89:3070–3082, 2003a). In order to achieve short ISIs, we optimized the time constants of the Rothman and Manis single-neuron model with genetic algorithms. Some parameters in the optimization, such as the temperature and the capacitance of the cell, turned out to be crucial for the required acceleration of the response. In order to achieve the relative level independence, we simulated an interconnected network consisting of Rothman and Manis neurons. The results indicate that by stabilization of intrinsic oscillations, it is possible to simulate the physiologically observed level independence of ISIs. As previously reviewed and demonstrated (Bahmer and Langner, Biol Cybern 95:371–379, 2006a), chopper neurons show a preference for ISIs which are multiples of 0.4 ms. It was also demonstrated that a network consisting of two optimized Rothman and Manis neurons which activate each other with synaptic delays of 0.4 ms shows a preference for ISIs of 0.8 ms.
Oscillations with various multiples of 0.4 ms as ISIs may be derived from neurons in a more complex network that is activated by simultaneous input of an onset neuron and several auditory nerve fibers. Abstract Recently, a class of two-dimensional integrate-and-fire models has been used to faithfully model spiking neurons. This class includes the Izhikevich model, the adaptive exponential integrate-and-fire model, and the quartic integrate-and-fire model. The bifurcation types for the individual neurons have been thoroughly analyzed by Touboul (SIAM J Appl Math 68(4):1045–1079, 2008). However, when the models are coupled together to form networks, the networks can display bifurcations that an uncoupled oscillator cannot. For example, the networks can transition from firing with a constant rate to burst firing. This paper introduces a technique to reduce a full network of this class of neurons to a mean field model, in the form of a system of switching ordinary differential equations. The reduction uses population density methods and a quasi-steady-state approximation to arrive at the mean field system. Reduced models are derived for networks with different topologies and different model neurons with biologically derived parameters. The mean field equations are able to qualitatively and quantitatively describe the bifurcations that the full networks display. Extensions and higher-order approximations are discussed. Conclusions Our proposed database schema for managing heterogeneous data is a significant departure from conventional approaches. It is suitable only when the following conditions hold: • The number of classes of entity is numerous, while the number of actual instances in most classes is expected to be very modest. • The number (and nature) of the axes describing an arbitrary fact (as an N-ary association) varies greatly. We believe that nervous system data is an appropriate problem domain to test such an approach.
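The two conditions above (many entity classes with few instances each, and facts whose descriptive axes vary in number) point toward a generic fact table with a side table of axes, rather than one table per class. The sketch below shows this idea in miniature; the table and column names are hypothetical, not taken from the paper.

```python
import sqlite3

# Generic schema: every fact is a row, and its variable set of axes
# (an N-ary association) lives in a companion table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE fact (id INTEGER PRIMARY KEY, cls TEXT, value TEXT)")
db.execute("CREATE TABLE axis (fact_id INTEGER, name TEXT, ref TEXT)")

def add_fact(cls, value, **axes):
    cur = db.execute("INSERT INTO fact (cls, value) VALUES (?, ?)", (cls, value))
    for name, ref in axes.items():
        db.execute("INSERT INTO axis VALUES (?, ?, ?)", (cur.lastrowid, name, ref))
    return cur.lastrowid

# Two facts of different classes with different numbers of axes:
add_fact("conductance", "36 mS/cm2", neuron="CA1_pyramid", channel="K_DR")
add_fact("projection", "present", source="CA3", target="CA1", technique="tracer")

n_axes = {cls: n for cls, n in db.execute(
    "SELECT f.cls, COUNT(*) FROM fact f JOIN axis a ON a.fact_id = f.id "
    "GROUP BY f.cls")}
print(n_axes)
```

The trade-off is typical of entity-attribute-value designs: schema changes become cheap, while queries must reassemble each fact from its axis rows.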
Abstract Stereotactic human brain atlases, either in print or electronic form, are useful not only in functional neurosurgery, but also in neuroradiology, human brain mapping, and neuroscience education. The existing atlases represent structures on 2D plates taken at variable, often large intervals, which limits their applications. To overcome this problem, we propose a hybrid interpolation approach to build high-resolution brain atlases from the existing ones. In this approach, all section regions of each object are grouped into two types of components: simple and complex. A NURBS-based method is designed for interpolation of the simple components, and a distance-map-based method for the complex components. Once all individual objects in the atlas are interpolated, the results are combined hierarchically in a bottom-up manner to produce the interpolation of the entire atlas. In the procedure, different knowledge-based and heuristic strategies are used to preserve various topological relationships. The proposed approach has been validated quantitatively and used for interpolation of two stereotactic brain atlases: the Talairach–Tournoux atlas and the Schaltenbrand–Wahren atlas. The interpolations produced are of high resolution and feature high accuracy, 3D consistency, smooth surfaces, and preserved topology. They potentially open new applications for electronic stereotactic brain atlases, such as atlas reformatting, accurate 3D display, and 3D nonlinear warping against normal and pathological scans. The proposed approach is also potentially useful in other applications which require interpolation and 3D modeling from sparse and/or variable inter-section interval data. An example of 3D modeling of an infarct from MR diffusion images is presented.
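The distance-map interpolation used above for "complex" components can be illustrated in one dimension: represent each section's region by a signed distance function (negative inside), blend the two distance maps, and threshold the blend at zero. Real atlas sections are 2D regions; the intervals below are a deliberately reduced stand-in.

```python
# 1-D sketch of distance-map-based shape interpolation.

def signed_dist(x, a, b):
    # signed distance to the interval [a, b]: negative inside, positive outside
    return max(a - x, x - b)

def interp_inside(x, shape0, shape1, t):
    d = (1 - t) * signed_dist(x, *shape0) + t * signed_dist(x, *shape1)
    return d <= 0.0

# Interpolate halfway between the overlapping regions [2, 6] and [4, 8]:
xs = [i * 0.01 for i in range(1001)]          # 0..10 grid
inside = [x for x in xs if interp_inside(x, (2, 6), (4, 8), 0.5)]
print(min(inside), max(inside))   # the blended region is approximately [3, 7]
```

One known limitation, which motivates grouping regions into simple and complex components in the first place, is that blending distance maps of regions that barely overlap can shrink or even eliminate the intermediate shape.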
Abstract Quantitative neuroanatomical data are important for the study of many areas of neuroscience, and the complexity of problems associated with neuronal structure requires that research from multiple groups across many disciplines be combined. However, existing neuron-tracing systems, simulation environments, and tools for the visualization and analysis of neuronal morphology data use a variety of data formats, making it difficult to exchange data in a readily usable way. The NeuroML project was initiated to address these issues, and here we describe an extensible markup language standard, MorphML, which defines a common data format for neuronal morphology data and associated metadata to facilitate data and model exchange, database creation, model publication, and data archiving. We describe the elements of the standard in detail and outline the mappings between this format and those used by a number of popular applications for reconstruction, simulation, and visualization of neuronal morphology. Abstract A major part of biology has become a class of physical and mathematical sciences. We have started to feel, though still with some suspicion, that it will become possible to predict biological events that will happen in the future of one's life and to control some of them if so desired, based upon the understanding of the genomic information of individuals and the physical and chemical principles governing the physiological functions of living organisms at multiple scales and levels, from molecules to cells and organs. Abstract A half-center oscillator (HCO) is a common circuit building block of central pattern generator networks that produce rhythmic motor patterns in animals. Here we constructed an efficient relational database table with the resulting characteristics of the Hill et al. (J Comput Neurosci 10:281–302, 2001) HCO simple conductance-based model.
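An XML exchange format for morphology of the kind MorphML standardises typically records points (position, diameter) connected by parent links. The fragment below illustrates that idea only; its element and attribute names are simplified stand-ins, not the actual MorphML/NeuroML schema.

```python
import xml.etree.ElementTree as ET

# Toy morphology document: points with parent links, as in tracing formats.
doc = """
<morphology cell="pyramidal_toy">
  <point id="0" parent="-1" x="0" y="0" z="0" diam="20"/>
  <point id="1" parent="0" x="0" y="30" z="0" diam="2.0"/>
  <point id="2" parent="1" x="10" y="55" z="0" diam="1.2"/>
</morphology>
"""
root = ET.fromstring(doc)
points = {p.get("id"): p for p in root.iter("point")}

def path_length(pid):
    # walk parent links back to the root, accumulating Euclidean distance
    total = 0.0
    p = points[pid]
    while p.get("parent") != "-1":
        q = points[p.get("parent")]
        total += sum((float(p.get(k)) - float(q.get(k))) ** 2
                     for k in ("x", "y", "z")) ** 0.5
        p = q
    return total

print(path_length("2"))
```

Because every tool can parse the same tree of points, quantities like path length, branch order, or total cable can be computed identically across reconstruction, simulation, and visualization software, which is the interoperability the standard aims for.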
The model consists of two reciprocally inhibitory neurons and replicates the electrical activity of the oscillator interneurons of the leech heartbeat central pattern generator under a variety of experimental conditions. Our long-range goal is to understand how this basic circuit building block produces functional activity under a variety of parameter regimes and how different parameter regimes influence stability and modulatability. By using the latest developments in computer technology, we simulated and stored large amounts of data (on the order of terabytes). We systematically explored the parameter space of the HCO and corresponding isolated neuron models using a brute-force approach. We varied a set of selected parameters (maximal conductance of intrinsic and synaptic currents) in all combinations, resulting in about 10 million simulations. We classified these HCO and isolated neuron model simulations by their activity characteristics into identifiable groups and quantified their prevalence. By querying the database, we compared the activity characteristics of the identified groups of our simulated HCO models with those of our simulated isolated neuron models and found that regularly bursting neurons compose only a small minority of functional HCO models; the vast majority was composed of spiking neurons. Spine calcium transients induced by synaptically-evoked action potentials can predict synapse location and establish synaptic democracy. PLoS computational biology CA1 pyramidal neurons receive hundreds of synaptic inputs at different distances from the soma. Distance-dependent synaptic scaling enables distal and proximal synapses to influence the somatic membrane equally, a phenomenon called "synaptic democracy". How this is established is unclear. The backpropagating action potential (BAP) is hypothesised to provide distance-dependent information to synapses, allowing synaptic strengths to scale accordingly.
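The brute-force sweep-and-classify pipeline of the HCO study above can be sketched in a few lines: enumerate all combinations of selected maximal conductances, classify each simulation's activity, and quantify group prevalence. The classifier and parameter values below are toy stand-ins for the real burst/spike/silent analysis, which required millions of conductance-based simulations.

```python
from itertools import product
from collections import Counter

def classify(g_syn, g_h):
    # Toy stand-in for activity classification of one simulation.
    if g_syn == 0.0:
        return "silent"
    return "bursting" if g_h > g_syn else "spiking"

g_syn_values = [0.0, 0.2, 0.4, 0.6]   # illustrative conductance grids
g_h_values = [0.1, 0.3, 0.5]

groups = Counter(classify(gs, gh)
                 for gs, gh in product(g_syn_values, g_h_values))
total = sum(groups.values())
prevalence = {k: v / total for k, v in groups.items()}
print(prevalence)
```

Storing each (parameters, class) row in a database, as in the abstract, then reduces questions like "what fraction of functional HCOs are built from bursting neurons?" to a GROUP BY query.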
Experimental measurements show that a BAP evoked by current injection at the soma causes calcium currents in the apical shaft whose amplitudes decay with distance from the soma. However, in vivo action potentials are not induced by somatic current injection but by synaptic inputs along the dendrites, which creates a different excitable state of the dendrites. Due to technical limitations, it is not possible to study experimentally whether distance information can also be provided by synaptically-evoked BAPs. Therefore we adapted a realistic morphological and electrophysiological model to measure BAP-induced voltage and calcium signals in spines after Schaffer collateral synapse stimulation. We show that peak calcium concentration is highly correlated with soma-synapse distance under a number of physiologically-realistic suprathreshold stimulation regimes and for a range of dendritic morphologies. Peak calcium levels also predicted the attenuation of the EPSP across the dendritic tree. Furthermore, we show that peak calcium can be used to set up a synaptic democracy in a homeostatic manner, whereby synapses regulate their synaptic strength on the basis of the difference between peak calcium and a uniform target value. We conclude that information derived from synaptically-generated BAPs can indicate synapse location and can subsequently be utilised to implement a synaptic democracy. The generation of phase differences and frequency changes in a network model of inferior olive subthreshold oscillations. PLoS computational biology It is commonly accepted that the Inferior Olive (IO) provides a timing signal to the cerebellum.
Stable subthreshold oscillations in the IO can facilitate accurate timing by phase-locking spikes to the peaks of the oscillation. Several theoretical models accounting for the synchronized subthreshold oscillations have been proposed; however, two experimental observations remain an enigma. The first is the observation of frequent alterations in the frequency of the oscillations. The second is the observation of constant phase differences between simultaneously recorded neurons. In order to account for these two observations, we constructed a canonical network model based on anatomical and physiological data from the IO. The constructed network is characterized by clustering of neurons with similar conductance densities, and by electrical coupling between neurons. Neurons inside a cluster are densely connected with weak strengths, while neurons belonging to different clusters are sparsely connected with stronger connections. We found that this type of network can robustly display stable subthreshold oscillations. The overall frequency of the network changes with the strength of the inter-cluster connections, and phase differences occur between neurons of different clusters. Moreover, the phase differences provide a mechanistic explanation for the experimentally observed propagating waves of activity in the IO. We conclude that the architecture of the network of electrically coupled neurons in combination with modulation of the inter-cluster coupling strengths can account for the experimentally observed frequency changes and the phase differences. Parametric computation predicts a multiplicative interaction between synaptic strength parameters that control gamma oscillations. Frontiers in computational neuroscience Gamma oscillations are thought to be critical for a number of behavioral functions; they occur in many regions of the brain and arise through a variety of mechanisms.
Fast repetitive bursting (FRB) neurons in layer 2 of the cortex are able to drive gamma oscillations over long periods of time. Even though the oscillation is driven by FRB neurons, strong feedback within the rest of the cortex must modulate properties of the oscillation such as frequency and power. We used a highly detailed model of the cortex to determine how a cohort of 33 parameters controlling synaptic drive might modulate gamma oscillation properties. We were interested not only in the effects of individual parameters but also in revealing interactions between parameters beyond additive effects. To prevent a combinatorial explosion in parameter combinations that might need to be simulated, we used a fractional factorial design (FFD) that estimated the effects of individual parameters and two-parameter interactions. This experiment required only 4096 model runs. We found that the largest effects on both gamma power and frequency came from a complex interaction between the efficacy of synaptic connections from layer 2 inhibitory neurons to layer 2 excitatory neurons and the parameter for the reciprocal connection. As well as the effect of the individual parameters determining synaptic efficacy, there was an interaction between these parameters beyond the additive effects of the parameters alone. The magnitude of this effect was similar to that of the individual parameters, predicting that it is physiologically important in setting gamma oscillation properties. Impact of gamma-oscillatory inhibition on the signal transmission of a cortical pyramidal neuron Cognitive Neurodynamics Summary This chapter constitutes mini-proceedings of the Workshop on Physiology Databases and Analysis Software, held as part of the Annual Computational Neuroscience Meeting CNS*2007 in July 2007 in Toronto, Canada (http://www.cnsorg.org).
The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Selected topics from the discussions are presented in this chapter, including a review of current and potential applications of Computational Intelligence (CI) in electrophysiology; databases and platforms, languages, and formats for electrophysiological data exchange; and exemplary analysis problems. The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers, whose interest will be sparked by its contents. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. We are particularly interested in the slower-timescale dynamics that determine overall excitability, and set the stage for transient episodes of persistent oscillations, working memory, or seizures. In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley-type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics on the stability of persistent activity. We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions.
Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in neonatal brain or spinal cord, up states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potentials (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from the EAP data sampled by an extracellular probe at multiple independent recording locations. We used a dipole for the model source because there is extensive evidence that it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell-probe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole moment optimization where the dipole location is fixed, and a second, nonlinear, global optimization of the source location. We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis.
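For the fixed-location linear step described above, the dipole moment follows from a regularized least-squares fit of the measured amplitudes against the probe's lead fields; a minimal NumPy sketch with a synthetic stand-in lead-field matrix (all values are illustrative, not from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Lead-field matrix: maps a 3-component dipole moment at a fixed location
# to EAP amplitudes at n recording sites (synthetic stand-in values).
n_sites = 12
L = rng.normal(size=(n_sites, 3))
true_moment = np.array([2.0, -1.0, 0.5])
v = L @ true_moment + 0.01 * rng.normal(size=n_sites)  # "measured" amplitudes

# Tikhonov regularization jointly penalizes model error and dipole size:
# minimize ||L p - v||^2 + lam ||p||^2  =>  p = (L^T L + lam I)^-1 L^T v
lam = 1e-3
p = np.linalg.solve(L.T @ L + lam * np.eye(3), L.T @ v)

residual_power = np.sum((L @ p - v) ** 2) / np.sum(v ** 2)
print(p, residual_power)
```

In the full method this linear fit is repeated over a grid of candidate source locations, with the nonlinear location search wrapped around it.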
We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among various model error contributions to the residual, we address especially the error in probe geometry, and the extent to which it biases estimates of dipole parameters. This dipole characterization method can be applied to any recording technique that can take multiple independent measurements of the same single units. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics. We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodium-potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. These models reveal several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy.
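The competition between neuronal efflux, the sodium-potassium pump, glial buffering, and diffusion can be caricatured by a single balance equation for extracellular potassium; a toy sketch with illustrative rate constants (not the paper's model or values):

```python
# d[K]o/dt = neuronal efflux - pump uptake - glial buffering - diffusion
def dKo_dt(Ko, efflux=0.5, pump_max=1.0, Km=3.0, glia=0.2, Ko_bath=4.0, D=0.1):
    pump = pump_max * Ko / (Ko + Km)   # saturating Na+/K+ pump uptake
    buffering = glia * (Ko - Ko_bath)  # glial uptake toward the bath level
    diffusion = D * (Ko - Ko_bath)     # lateral diffusion to the bath
    return efflux - pump - buffering - diffusion

# Forward-Euler integration toward the steady state set by the competition
Ko, dt = 4.0, 0.01
for _ in range(200_000):
    Ko += dt * dKo_dt(Ko)
print(round(Ko, 3))
```

In the full model the efflux term is itself activity-dependent, which is what closes the slow feedback loop that can produce the large-amplitude concentration oscillations described above.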
Abstract This paper introduces dyadic brain modeling – the simultaneous, computational modeling of the brains of two interacting agents – to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the ‘dyad’). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists and modelers that will help speed the emergence of what may be called comparative neuroprimatology. Abstract The phase response curve (PRC) reflects the dynamics of the interplay between diverse intrinsic conductances that lead to spike generation. PRCs measure the spike time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddle-node) dynamics, while type II (subcritical Hopf) dynamics yield a biphasic PRC with both negative and positive lobes. Previous computational work hypothesized that cholinergic modulation of the M-type potassium current can switch a neuron with type II dynamics to type I dynamics.
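A PRC of the kind described above can be estimated numerically by delivering a small depolarizing perturbation at successive phases of the spike cycle and measuring the resulting advance of the next spike; a minimal sketch using a leaky integrate-and-fire neuron as a hypothetical stand-in (its purely positive PRC illustrates the type I signature):

```python
import numpy as np

def spike_time(I=1.5, tau=1.0, v_th=1.0, dt=1e-4, kick_at=None, kick=0.05):
    """First spike time of a leaky integrate-and-fire neuron dv/dt = (I - v)/tau,
    optionally perturbed by a small depolarizing kick at time kick_at."""
    v, t = 0.0, 0.0
    while v < v_th:
        if kick_at is not None and abs(t - kick_at) < dt / 2:
            v += kick
        v += dt * (I - v) / tau
        t += dt
    return t

T0 = spike_time()  # unperturbed period (reset to threshold)
phases = np.linspace(0.05, 0.95, 10)
# PRC: relative spike advance as a function of perturbation phase
prc = [(T0 - spike_time(kick_at=phi * T0)) / T0 for phi in phases]
print([round(x, 4) for x in prc])
```

Because a depolarizing kick always brings this model closer to threshold, every PRC value is positive, which is the type I signature the abstract refers to; a type II neuron would show a negative lobe at early phases.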
We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with downregulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential specific K-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to downregulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent I_AHP is downregulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified at low ACh levels, while the PRC becomes type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may switch neuronal responses between “resonator” and “integrator” modes. Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double precision values by each subtree at each time step. Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For compute-bound simulations, load balance results in almost ideal runtime scaling. Application of the cell splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing. Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts.
It has been recently reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. Ca²⁺-activated K⁺ currents (I_K(Ca)) of cultured human cardiac fibroblasts were characterized in this study. In whole-cell configuration, depolarizing pulses evoked an outwardly rectifying I_K(Ca) in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance Ca²⁺-activated K⁺ (BK_Ca) channel with a single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BK_Ca channels. The dynamic Luo–Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; without gap-junctional coupling, however, they were electrically inert. The simulation predicts that changes in gap junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. I_K(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes. This study demonstrates that a BK_Ca channel is functionally expressed in human cardiac fibroblasts. The activity of these BK_Ca channels in human cardiac fibroblasts may contribute to the functional activities of heart cells through the transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of their solutions and the transitions between them as parameters are varied. To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multicompartment models.
We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation—tonic spiking, intermediate bursting and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multicompartment conductance-based model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli. Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement.
Results We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require a built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. Conclusion Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement and underlies the concentration-independence of odor quality representations. Abstract Mathematical neuronal models are normally expressed using differential equations. The Parker–Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker–Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin–Huxley-type neuron, comparing the results with those obtained using the Runge–Kutta and Bulirsch–Stoer methods.
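For polynomial ODEs, the core of the Parker–Sochacki scheme is a per-step recursion for the Maclaurin coefficients of the solution, with the series order extended until the terms fall below a tolerance; a toy sketch for dy/dt = −y (the neuronal models above require the same machinery applied to their polynomialized right-hand sides):

```python
import math

def parker_sochacki_step(y, h, tol=1e-12, max_order=30):
    """Advance dy/dt = -y by one step of size h, adapting the series order.
    Coefficient recursion for this ODE: c[k+1] = -c[k] / (k + 1)."""
    c = y        # current Maclaurin coefficient c[k], starting with c[0] = y
    total = y    # partial sum of c[k] * h**k
    hk = 1.0
    for k in range(max_order):
        c = -c / (k + 1)
        hk *= h
        term = c * hk
        total += term
        if abs(term) < tol:  # adaptive order: stop when terms are negligible
            break
    return total

y, h, t = 1.0, 0.1, 0.0
while t < 1.0 - 1e-9:
    y = parker_sochacki_step(y, h)
    t += h
print(y, math.exp(-1.0))  # the series solution tracks the exact e^{-t}
```

The time step stays fixed while the order adapts, which is the error-control property the abstract highlights.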
Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC-granule cell (GC) circuit with and without gap junctions between GoCs. Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change oscillation frequency or the mean firing rate of either GoCs or GCs. Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs. The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley-type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours/days of supercomputing time.
Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. In consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics and intrinsic noise. We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data. Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate changes from development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models. Recent studies show that both the conductance and the kinetics of these receptors change in a voltage-dependent manner in some parts of the brain. Therefore, several models have been introduced to simulate their current. However, on the one hand, kinetic models—which are able to simulate these voltage-dependent phenomena—are computationally expensive for modeling of large neural networks. On the other hand, classic exponential models, which are computationally less expensive, cannot accurately simulate the voltage dependency of these receptors.
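For reference, the classic exponential approach amounts to a dual-exponential conductance scaled by the standard magnesium-block factor of Jahr and Stevens (1990); a minimal sketch with illustrative parameter values (the voltage-dependent time constants of the modified model described here are deliberately not included):

```python
import math

def mg_block(v, mg=1.0):
    """Voltage-dependent Mg2+ unblock factor (Jahr & Stevens 1990); v in mV,
    mg is the external Mg2+ concentration in mM."""
    return 1.0 / (1.0 + (mg / 3.57) * math.exp(-0.062 * v))

def nmda_current(t, v, g_max=1.0, e_rev=0.0, tau_rise=2.0, tau_decay=100.0):
    """Classic dual-exponential NMDA synapse with fixed time constants
    (t in ms, v in mV, g_max in nS; illustrative values)."""
    if t < 0:
        return 0.0
    g = g_max * (math.exp(-t / tau_decay) - math.exp(-t / tau_rise))
    return g * mg_block(v) * (v - e_rev)

# The block factor relieves with depolarization:
print(round(mg_block(-70.0), 3), round(mg_block(0.0), 3))
```

Because the time constants here are fixed, this form cannot reproduce the voltage-dependent kinetics the abstract discusses; that is precisely the limitation the modified model addresses.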
In this study, we have modified these classic models to endow them with voltage-dependent conductance and time constants. Temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of the NMDA receptor's behavior using only three to four differential equations, significantly fewer than in previous kinetic models. Consequently, our model is both fast and physiologically plausible, and is therefore a suitable candidate for the modeling of large neural networks. Abstract Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons. This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal-cell firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that there is a clear rhythmic gating effect of the γ-oscillating interneuron network on the pyramidal neuron's signal transmission. This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and appropriate inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of coherence and oscillation frequency of network activities.
Modulation of hippocampal rhythms by subthreshold electric fields and network topology Journal of Computational Neuroscience Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis.
Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php. Operating system(s): Platform independent. Programming languages: Perl/BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI Blast package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have encountered problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies.
Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to duplication of effort and hinders model reuse.
Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and functions. They make up tissues and our bodies. A single cell includes organelles and intracellular solutions, and it is separated from the extracellular fluid surrounding it by its cell membrane (plasma membrane), giving rise to differences in concentrations of ions and molecules, including enzymes. The differences in ionic charge and concentration give rise, respectively, to electrical and chemical potentials, driving the transport of materials across the membrane. Here we look at the core elements of mathematical modeling of the dynamic behavior of single cells, as well as the basics of numerical simulation. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. 
This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program on which users can simulate and/or create a model with graphical representations corresponding to the concept of ISML, such as modules and edges. ISIDE also has a command-line interface based on Python, a powerful scripting language, to manipulate large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the information necessary and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. 
In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method and make available computer scripts which automate the process of citation entry. We use an Open Citation Project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. For, in order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (balanced truncation) and frequency-domain (${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. 
We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front-ends requires high bioinformatic skills. Consequently, the manual inspection of database entries and citations is a time-consuming task for which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in life sciences. Enriched by numerous selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. 
We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded in empirical evidence, explain (or fail to explain) existing empirical results and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. 
Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp. 741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu . 
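The core computation a neurosimulator such as SNNAP performs is the numerical integration of membrane and ionic-current equations. The sketch below is not SNNAP code; it integrates the classic Hodgkin–Huxley squid-axon model (standard published parameters) with forward Euler, and the stimulus amplitude, duration and time step are illustrative choices.

```python
import math

# Classic Hodgkin-Huxley squid-axon model, forward-Euler integration.
C = 1.0                       # membrane capacitance, uF/cm^2
G_NA, E_NA = 120.0, 50.0      # sodium conductance (mS/cm^2) and reversal (mV)
G_K,  E_K  = 36.0, -77.0      # delayed-rectifier potassium
G_L,  E_L  = 0.3, -54.4       # leak

def safe_ratio(x, scale):
    """x / (1 - exp(-x/scale)), with the x -> 0 limit handled."""
    return scale if abs(x) < 1e-7 else x / (1.0 - math.exp(-x / scale))

def rates(v):
    am = 0.1 * safe_ratio(v + 40.0, 10.0)
    bm = 4.0 * math.exp(-(v + 65.0) / 18.0)
    ah = 0.07 * math.exp(-(v + 65.0) / 20.0)
    bh = 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
    an = 0.01 * safe_ratio(v + 55.0, 10.0)
    bn = 0.125 * math.exp(-(v + 65.0) / 80.0)
    return am, bm, ah, bh, an, bn

def simulate(i_stim=10.0, t_stop=100.0, dt=0.01):
    v, m, h, n = -65.0, 0.05, 0.6, 0.32   # approximate resting state
    spikes = []
    for k in range(int(t_stop / dt)):
        am, bm, ah, bh, an, bn = rates(v)
        i_ion = (G_NA * m**3 * h * (v - E_NA)
                 + G_K * n**4 * (v - E_K)
                 + G_L * (v - E_L))
        v_new = v + dt * (i_stim - i_ion) / C
        m += dt * (am * (1.0 - m) - bm * m)
        h += dt * (ah * (1.0 - h) - bh * h)
        n += dt * (an * (1.0 - n) - bn * n)
        if v < 0.0 <= v_new:              # upward crossing of 0 mV = spike
            spikes.append(k * dt)
        v = v_new
    return spikes

spikes = simulate()
print(f"{len(spikes)} spikes in 100 ms")
```

With a sustained suprathreshold current the model fires repetitively; tools like SNNAP wrap exactly this kind of loop in a GUI and add synapses, modulation and networks on top.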
Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by allowing them to run the models as published and build on them for further research. Its use can help the field of computational neuroscience enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in GABA A reversal potential (E GABA) and by alterations in intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. 
Ideally, such interfaces should provide a high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal to elicit/detect the electrical excitation at the single-cell level and to observe the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have been typically correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of the two preparations representing different brain areas (hippocampus and cerebellum), neuritic type (pyramidal cell dendrites and olivary axonal projection terminals), and labeling method (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. 
None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions coming to the dendrites. We studied mathematical models of the neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductivities of their dendritic membrane. This work is a continuation of our previous studies that showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodical impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by their spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. 
These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in Biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. 
One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS–US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. On mathematical models of pyramidal neurons localized in the neocortical layers 2/3, whose reconstructed dendritic arborization possessed passive linear or active nonlinear membrane properties, we studied the effect of morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via uniformly distributed excitatory synapses along the dendrites. 
For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and function of effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization in general and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron upon different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals decreasing with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two patterns of the activity, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of the neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of patterns of output discharges of different complexities, including stochastic ones. A relatively short length of the apical dendrite subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are morphological factors responsible for the predominance of periodic spike doublets. 
As a result, there were two combinations of different electrical states of the sites of dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites and a reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in the cells, apparently, promotes the predominance of network interaction in the processes of formation of the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta-modulated and space-modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using non-homogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent from rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of the involvement in spatial tasks. 
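The building blocks of the phase-precession model above — an integrate-and-fire cell with a conductance synapse driven by a theta-modulated, non-homogeneous Poisson input — can be sketched in a few lines. This is a hypothetical single-cell reduction, not the published PC/IC model; every parameter value below is an illustrative assumption.

```python
import numpy as np

# Leaky integrate-and-fire cell with one conductance-based excitatory
# synapse, driven by a theta-modulated inhomogeneous Poisson input.
# All parameter values are illustrative assumptions.
rng = np.random.default_rng(1)

dt, t_stop = 0.1, 1000.0        # time step and duration, ms
C, g_leak = 200.0, 10.0         # pF, nS  (membrane time constant 20 ms)
E_leak, E_exc = -65.0, 0.0      # mV
v_thresh, v_reset = -50.0, -65.0
tau_syn, w = 5.0, 1.5           # synaptic decay (ms) and weight (nS)
f_theta, r0 = 8.0, 0.8          # theta frequency (Hz), base rate (events/ms)

v, g_exc = E_leak, 0.0
spike_times = []
for k in range(int(t_stop / dt)):
    t = k * dt
    # theta-modulated Poisson input rate (events per ms)
    rate = r0 * (1.0 + np.cos(2.0 * np.pi * f_theta * t / 1000.0))
    if rng.random() < rate * dt:
        g_exc += w                          # one presynaptic event
    g_exc -= dt * g_exc / tau_syn           # exponential synaptic decay
    v += dt * (-g_leak * (v - E_leak) - g_exc * (v - E_exc)) / C
    if v >= v_thresh:
        spike_times.append(t)
        v = v_reset

print(f"{len(spike_times)} spikes in 1 s")
```

Because the input rate peaks once per theta cycle, the output spikes cluster around the theta peaks; in the full model it is the interplay of such excitation with theta-modulated inhibition that shifts spike phase as the animal crosses the place field.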
Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in Computational Neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org. In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed data of morphologies when generating those cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators, such as the NEURON Simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new and available data can be included into NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. 
Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks, with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. New simulator architectures such as GENESIS 3 allow the use of standard well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising recent developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. 
However, sharing such large amounts of frequently heterogeneous and distributed experimental data needs both standard data formats and public repositories. Standardization and a public storage system are also important for modeling due to the possibility of sharing models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks like numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. 
However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources, and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. 
However, the range of current strength needed for a depolarization block could easily be reached with a random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents, both on an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block. We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, impeding conventional analysis methods from keeping pace. Such large datasets can be analyzed automatically by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and of exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable. 
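The database representation described above — storing extracted characteristics such as firing rate instead of raw traces — can be sketched as follows. This is a hypothetical Python feature extractor for illustration only (PANDORA itself is a Matlab toolbox; the function and field names here are invented):

```python
import numpy as np

def extract_features(spike_times, duration):
    """Reduce a raw spike train to summary characteristics (e.g., firing rate),
    yielding one compact database row per recording."""
    spike_times = np.asarray(spike_times, dtype=float)
    n = spike_times.size
    rate = n / duration               # mean firing rate (Hz)
    isis = np.diff(spike_times)       # inter-spike intervals
    # Coefficient of variation of the ISIs, a common irregularity measure.
    cv = float(isis.std() / isis.mean()) if isis.size > 1 else 0.0
    return {"n_spikes": int(n), "rate": rate, "isi_cv": cv}

# One row per recording; a table of such rows can then be queried and
# filtered like any relational database, instead of reloading raw traces.
row = extract_features([0.1, 0.3, 0.5, 0.9], duration=1.0)
```

The point of the design is that a table of such rows is orders of magnitude smaller than the raw voltage traces, which is what makes database-style querying of large recording sets practical.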
PANDORA is available to be freely used and modified because it is open-source ( http://software.incf.org/software/pandora/home ). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then we examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop. Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics. First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test the sufficiency of multiple, independent spike initiation zones in the C-interneuron to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. 
Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made into understanding cerebellar cortical processing, but neural coding at the output stage of the cerebellum, in the deep cerebellar nuclei (DCN), remains poorly understood. The DCN are unlikely to be just a relay nucleus, because Purkinje cell inhibition has to be turned into an excitatory output signal, and DCN neurons exhibit complex intrinsic properties. In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals. Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation, we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. We obtained a good match between our simulations and DCN current-clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV, allowing de-inactivation of rebound currents. 
Fast rebound bursts due to the T-type calcium current and slow rebounds due to the persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by the HCN current. Our findings suggest that active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior. This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation of channel gating, the mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform. 
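Simulators of the kind just described integrate the membrane potential compartment by compartment. As a minimal illustration of that core computation (this is not KInNeSS code, and all parameter values are hypothetical), a single passive compartment with a leak and one synaptic conductance can be stepped with forward Euler:

```python
# Hypothetical parameters for a single passive compartment:
C = 1.0      # membrane capacitance (uF/cm^2)
g_L = 0.1    # leak conductance (mS/cm^2)
E_L = -65.0  # leak reversal potential (mV)
g_syn = 0.5  # synaptic conductance (mS/cm^2)
E_syn = 0.0  # excitatory synaptic reversal potential (mV)

def step(V, dt, syn_open):
    """Forward-Euler update of
    C dV/dt = -g_L (V - E_L) - g_syn * syn_open * (V - E_syn)."""
    I = -g_L * (V - E_L) - g_syn * syn_open * (V - E_syn)
    return V + dt * I / C

V = -65.0
for _ in range(1000):          # 100 ms with dt = 0.1 ms, synapse held open
    V = step(V, 0.1, syn_open=1.0)
# V relaxes toward the conductance-weighted reversal potential
# (g_L*E_L + g_syn*E_syn) / (g_L + g_syn).
```

A full multicompartment simulator adds voltage-gated channel kinetics and axial currents between neighboring compartments, but each compartment's update has this same structure.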
Abstract No Abstract Available Abstract We have developed a simulation tool within the NEURON simulator to assist in the organization, verification, and analysis of simulations. This tool, denominated Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single-cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB. Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems. This is similar to how single cells function as systems of protein networks, where, for example, ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from how it behaves alone. The dynamics of a single cell affect those of other cells and vice versa; that is, cells interact with each other. These interactions occur through different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts mechanically to produce heart beats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics, representing neural information. 
In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of the limbs can be performed. This chapter illustrates several mathematical techniques through examples from the modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data. Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation. For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice that of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits, as expected in mature animals. 
Predictions were made for passive-cell current clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron, and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting is obsolete. Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron. Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials. 
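The filtered-input representation used above can be illustrated with a short sketch (the bin width, input rate, and time constant are assumed for illustration; this is not the CuBIC implementation itself): the pooled presynaptic spike count is convolved with an exponential kernel, as for a leaky integrator neuron.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pooled presynaptic population spike counts in 1 ms bins (hypothetical rate).
dt = 1e-3                                # bin width (s)
counts = rng.poisson(5.0, size=2000)     # spike count per bin across the population

# Leaky-integrator kernel: the membrane potential fluctuation is modeled as
# the population spike count filtered with a fixed exponential kernel.
tau = 10e-3                              # membrane time constant (s)
t = np.arange(0, 5 * tau, dt)
kernel = np.exp(-t / tau)

# Causal convolution, truncated to the recording length.
v = np.convolve(counts, kernel)[: counts.size] * dt
```

Under this model, cumulants of the membrane potential trace `v` carry information about cumulants of the pooled presynaptic spike count, which is what lets a CuBIC-style test bound the order of correlation in the input population.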
Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet the precise input-output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites. We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature. To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns and the corresponding firing-frequency input-output function was logarithmic. The regular/irregular nature of the input synaptic intervals was only reflected in the regularity of output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input-output relationships. 
Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input-output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats. In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database http://www.ebi.ac.uk/biomodels/ is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. 
BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems, and to study the clustering of models based upon their annotations. Model deposition to the database today is advised by several publishers of scientific journals. The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive? Schema theory is a computational framework that allows the simulation of perceptuomotor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We present first its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. 
We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performance of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world-knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performance of agrammatic aphasics measured using sentence-picture matching tasks. This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Abstract Background The "inverse" problem is related to the determination of unknown causes on the basis of the observation of their effects. This is the opposite of the corresponding "direct" problem, which relates to the prediction of the effects generated by a complete description of some agencies. The solution of an inverse problem entails the construction of a mathematical model and starts from a number of experimental data. In this respect, inverse problems are often ill-conditioned, as the experimental data available are often insufficient to unambiguously solve the mathematical model. Several approaches to solving inverse problems are possible, both computational and experimental, some of which are mentioned in this article. In this work, we will describe in detail the attempt to solve an inverse problem which arose in the study of an intracellular signaling pathway. 
Results Using a genetic algorithm to find a suboptimal solution to the optimization problem, we have estimated a set of unknown parameters describing a kinetic model of a signaling pathway in the neuronal cell. The model is composed of mass-action ordinary differential equations, where the kinetic parameters describe protein-protein interactions, protein synthesis and degradation. The algorithm has been implemented on a parallel platform. Several potential solutions of the problem have been computed, each solution being a set of model parameters. A subset of parameters has been selected on the basis of their small coefficient of variation across the ensemble of solutions. Conclusion Despite the lack of sufficiently reliable and homogeneous experimental data, the genetic algorithm approach has allowed us to estimate the approximate values of a number of model parameters in a kinetic model of a signaling pathway; these parameters have been assessed to be relevant for the reproduction of the available experimental data. Abstract Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown. Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms. 
First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axodendritic feedback on pyramidal cells is only effective in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase, the theta/gamma ratio. Moreover, electric fields differentially affect average theta frequency depending on the specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. Computational modeling reveals dendritic origins of GABA(A)-mediated excitation in CA1 pyramidal neurons. PloS one GABA is the key inhibitory neurotransmitter in the adult central nervous system, but in some circumstances can lead to a paradoxical excitation that has been causally implicated in diverse pathologies, from endocrine stress responses to diseases of excitability including neuropathic pain and temporal lobe epilepsy. We undertook a computational modeling approach to determine plausible ionic mechanisms of GABA(A)-dependent excitation in isolated post-synaptic CA1 hippocampal neurons, because it may constitute a trigger for pathological synchronous epileptiform discharge. In particular, the interplay between intracellular chloride accumulation via the GABA(A) receptor and extracellular potassium accumulation via the K/Cl co-transporter KCC2 in promoting GABA(A)-mediated excitation is complex. 
Experimentally, it is difficult to determine the ionic mechanisms of the depolarizing current, since potassium transients are challenging to isolate pharmacologically and much GABA signaling occurs in small, difficult-to-measure dendritic compartments. To address this problem and determine plausible ionic mechanisms of GABA(A)-mediated excitation, we built a detailed biophysically realistic model of the CA1 pyramidal neuron that includes processes critical for ion homeostasis. Our results suggest that in dendritic compartments, but not in the somatic compartments, chloride buildup is sufficient to cause dramatic depolarization of the GABA(A) reversal potential and dominating bicarbonate currents that provide a substantial current source to drive whole-cell depolarization. The model simulations predict that extracellular K(+) transients can augment GABA(A)-mediated excitation, but not cause it. Our model also suggests the potential for GABA(A)-mediated excitation to promote network synchrony depending on interneuron synapse location: excitatory positive feedback can occur when interneurons synapse onto distal dendritic compartments, while interneurons projecting to the perisomatic region will cause inhibition. CA1 Region, Hippocampal; Cell Polarity; Chlorides; Computer Simulation; Dendrites; Homeostasis; Potassium; Potassium Channels, Voltage-Gated; Symporters; Voltage-Gated Sodium Channels; gamma-Aminobutyric Acid Evidence for Opponent Process Analysis of Sound Source Location in Humans. Journal of the Association for Research in Otolaryngology Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. 
Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means of facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. 
Programming language: Perl/BioPerl (program); mySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI BLAST package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. 
The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to create dynamically its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. 
However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell includes organelles and intracellular solutions, and it is separated from the outer environment of extracellular fluid surrounding it by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charges and concentrations cause, respectively, electrical and chemical potentials, driving the transport of materials across the membrane. Here we look at the core mathematical modeling associated with the dynamic behavior of single cells, as well as the foundations of numerical simulation. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al.
2008, 2009) is a software program on which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command-line interface, based on Python (a powerful scripting language), for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp. Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors.
We describe a method and make available computer scripts which automate the process of citation entry. We use an open citation project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. For, in order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (balanced truncation) and frequency-domain ( ${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies.
These data repositories differ substantially in the granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front-ends requires strong bioinformatic skills. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function.
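To give a concrete flavour of the XML morphology descriptions that NeuroML provides, the sketch below parses a simplified, NeuroML-style fragment and sums the segment lengths. The element and attribute names (segment, proximal, distal, x/y/z/diameter) follow NeuroML conventions, but the fragment itself is an illustrative toy of my construction, not a complete or schema-validated NeuroML document.

```python
# Hedged sketch: reading a NeuroML-style morphology fragment and computing
# total neurite length. The XML below is simplified and illustrative only.
import math
import xml.etree.ElementTree as ET

MORPHOLOGY_XML = """
<morphology id="example_cell">
  <segment id="0" name="soma">
    <proximal x="0" y="0" z="0" diameter="10"/>
    <distal x="10" y="0" z="0" diameter="10"/>
  </segment>
  <segment id="1" name="dend0">
    <proximal x="10" y="0" z="0" diameter="2"/>
    <distal x="10" y="30" z="0" diameter="1"/>
  </segment>
</morphology>
"""

def segment_length(seg):
    """Euclidean distance between a segment's proximal and distal points."""
    p = seg.find("proximal").attrib
    d = seg.find("distal").attrib
    return math.dist(
        [float(p[k]) for k in ("x", "y", "z")],
        [float(d[k]) for k in ("x", "y", "z")],
    )

root = ET.fromstring(MORPHOLOGY_XML)
total = sum(segment_length(s) for s in root.iter("segment"))
print(f"total length: {total:.1f} um")  # 10 + 30 = 40.0 um
```

A declarative description of this kind is what allows the same cell to be loaded by different simulators, as the surrounding text emphasizes.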
Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded in empirical evidence, explain (or fail to explain) existing empirical results, and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to representing and integrating complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp.
741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu. Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and build on them for further research. Its use can aid the field of computational neuroscience to enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition.
PPI reflects GABA_A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA_A reversal potential (E_GABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide a high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal to elicit/detect electrical excitation at the single-cell level and to observe the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy.
In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic types (pyramidal cell dendrites and olivary axonal projection terminals), and labeling methods (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions arriving at the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductivities of their dendritic membrane.
This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high- and low-depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons.
Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS–US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive, biologically plausible framework, without recourse to delay lines or other special-purpose circuits.
Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. Using mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses distributed uniformly along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and function of effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization as a whole and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns.
The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals that decreased with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of output discharge patterns of different complexities, including stochastic ones. A relatively short apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those of layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in the formation of the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the most well-known examples within the temporal coding hypothesis.
Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using nonhomogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of its involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties.
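The three-conductance squid-axon model just described fits in a few dozen lines. The sketch below is a minimal single-compartment Hodgkin–Huxley implementation using the classic parameter values (with the modern −65 mV resting convention) and forward-Euler integration; the injected current amplitude, simulation length, and 0 mV spike-detection threshold are illustrative choices, not values from the text.

```python
# Hedged sketch: passive leak + transient Na+ + delayed-rectifier K+,
# i.e. the three Hodgkin-Huxley conductances, integrated with forward Euler.
import math

# Maximal conductances (mS/cm^2), reversal potentials (mV), capacitance (uF/cm^2)
G_NA, G_K, G_L = 120.0, 36.0, 0.3
E_NA, E_K, E_L = 50.0, -77.0, -54.4
C_M = 1.0

def rates(v):
    """Voltage-dependent opening/closing rates (1/ms) for gates m, h, n."""
    a_m = 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
    b_m = 4.0 * math.exp(-(v + 65.0) / 18.0)
    a_h = 0.07 * math.exp(-(v + 65.0) / 20.0)
    b_h = 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
    a_n = 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
    b_n = 0.125 * math.exp(-(v + 65.0) / 80.0)
    return a_m, b_m, a_h, b_h, a_n, b_n

def simulate(i_inj=10.0, t_stop=50.0, dt=0.01):
    """Count action potentials during a constant current step (uA/cm^2)."""
    v = -65.0
    a_m, b_m, a_h, b_h, a_n, b_n = rates(v)
    m, h, n = a_m / (a_m + b_m), a_h / (a_h + b_h), a_n / (a_n + b_n)
    spikes, above = 0, False
    for _ in range(int(t_stop / dt)):
        a_m, b_m, a_h, b_h, a_n, b_n = rates(v)
        i_ion = (G_NA * m**3 * h * (v - E_NA)
                 + G_K * n**4 * (v - E_K)
                 + G_L * (v - E_L))
        v += dt * (i_inj - i_ion) / C_M
        m += dt * (a_m * (1.0 - m) - b_m * m)
        h += dt * (a_h * (1.0 - h) - b_h * h)
        n += dt * (a_n * (1.0 - n) - b_n * n)
        if v > 0.0 and not above:   # count upward crossings of 0 mV
            spikes += 1
        above = v > 0.0
    return spikes

print(simulate())   # suprathreshold step: repetitive firing
print(simulate(0.0))  # no input: the cell rests near -65 mV
```

With the step current on, the model fires repetitively; with no input it stays at rest, which is the qualitative behavior the text attributes to these three channels.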
The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database of neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in computational neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org. In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed data of morphologies when generating those cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators, such as the NEURON simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new and available data can be included into NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today.
I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data requires both standard data formats and public repositories. Standardization and a public storage system are also important for modeling because they make it possible to share models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks such as the numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed, and proposals for the application of such tools are given.
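One of the recurring tasks mentioned above, numerical integration of rate equations, is easy to demonstrate on a toy problem. The sketch below implements the classical fourth-order Runge–Kutta step and integrates a first-order decay rate equation (A → B with a hypothetical rate constant k = 0.5 s⁻¹), comparing the result against the analytic solution e^(−kt); the reaction and all numbers are illustrative, not taken from the text.

```python
# Hedged sketch: fixed-step RK4 integration of a simple mass-action
# rate equation, checked against the known analytic solution.
import math

def rk4_step(f, t, y, dt):
    """One classical 4th-order Runge-Kutta step for dy/dt = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + dt / 2, y + dt / 2 * k1)
    k3 = f(t + dt / 2, y + dt / 2 * k2)
    k4 = f(t + dt, y + dt * k3)
    return y + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

K = 0.5  # hypothetical first-order rate constant (1/s)

def decay(t, a):
    # Rate equation for A -> B under mass action: d[A]/dt = -k[A]
    return -K * a

a, t, dt = 1.0, 0.0, 0.1   # initial concentration, time, step size (s)
while t < 10.0 - 1e-9:
    a = rk4_step(decay, t, a, dt)
    t += dt

print(f"RK4: {a:.6f}   exact: {math.exp(-K * 10.0):.6f}")
```

At this step size the numerical and analytic values agree to roughly six decimal places, which is why general-purpose integrators of this kind can be handed off to the software packages the chapter surveys.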
Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. It is often feasible to divide a 3D-reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multi-electrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multi-electrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques.
We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multi-electrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. However, the range of current strength needed for a depolarization block could easily be reached with a random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block.
We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, impeding conventional analysis methods from keeping pace. Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable. PANDORA is available to be freely used and modified because it is open-source ( http://software.incf.org/software/pandora/home ). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then we examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop.
Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics. First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test the sufficiency of multiple, independent spike initiation zones in the C-interneuron to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made to understand cerebellar cortical processing, but neural coding at the output stage of the cerebellum in the deep cerebellar nuclei (DCN) remains poorly understood. The DCN are unlikely to be just a relay nucleus, because Purkinje cell inhibition has to be turned into an excitatory output signal, and DCN neurons exhibit complex intrinsic properties.
In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals. Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. We generated a good match of our simulations with DCN current clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow deinactivation of rebound currents. Fast rebound bursts due to T-type calcium current and slow rebounds due to persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by HCN current. Our findings suggest that active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches.
This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior. This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation channel gating, the mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform. Abstract We have developed a simulation tool within the NEURON simulator to assist in organization, verification, and analysis of simulations. This tool, denominated Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB.
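The NQS workflow just described, storing simulation parameters together with extracted outputs in relational tables and querying them with SELECT-style statements, can be sketched outside NEURON as well. The following Python/sqlite3 example is a hypothetical analogue, not the NQS API itself; the table, column names, and numbers are invented for illustration.

```python
import sqlite3

# In-memory database standing in for a simulation parameter/output store.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE sims (
    sim_id INTEGER PRIMARY KEY,
    g_na REAL,        -- sodium conductance parameter (hypothetical)
    g_k REAL,         -- potassium conductance parameter (hypothetical)
    firing_rate REAL  -- extracted output characteristic, in Hz
)""")

# Parameter sets and the firing rate each run produced (made-up numbers).
runs = [(1, 120.0, 36.0, 14.2), (2, 100.0, 36.0, 9.8),
        (3, 120.0, 20.0, 22.5), (4, 80.0, 20.0, 6.1)]
con.executemany("INSERT INTO sims VALUES (?, ?, ?, ?)", runs)

# SELECT-style query: which parameter sets produced firing above 10 Hz?
fast = con.execute(
    "SELECT sim_id, g_na, g_k FROM sims WHERE firing_rate > 10 "
    "ORDER BY firing_rate DESC").fetchall()
print(fast)  # relates outputs back to the parameters that produced them
```

The point of the pattern is that, once outputs are reduced to table rows, relating them back to parameters is a one-line query rather than a custom script.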
Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems. This is similar to how single cells function as systems of protein networks, where, for example, ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from how it behaves alone. Dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. Interactions occur through different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts to produce heartbeats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics, representing neural information. In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of limbs can be performed. This chapter illustrates several mathematical techniques through examples from modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data.
Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation. For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice that of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits as expected in mature animals. Predictions were made for passive-cell current clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron, and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting is obsolete.
Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron. Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials. Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet, the precise input-output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites.
We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature. To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns, and the corresponding firing frequency input-output function was logarithmic. The regular/irregular nature of the input synaptic intervals was only reflected in the regularity of output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input-output relationships. Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input-output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware.
To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats. In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database http://www.ebi.ac.uk/biomodels/ is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller sub-models. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems, and to study the clustering of models based upon their annotations. Model deposition to the database today is advised by several publishers of scientific journals.
The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive? Schema theory is a computational framework that allows the simulation of perceptuomotor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We present first its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performances of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world-knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performances of agrammatic aphasics measured using sentence-picture matching tasks.
This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Abstract Background The "inverse" problem is related to the determination of unknown causes on the basis of the observation of their effects. This is the opposite of the corresponding "direct" problem, which relates to the prediction of the effects generated by a complete description of some agencies. The solution of an inverse problem entails the construction of a mathematical model and starts from a set of experimental data. In this respect, inverse problems are often ill-conditioned, as the experimental conditions available are often insufficient to unambiguously solve the mathematical model. Several approaches to solving inverse problems are possible, both computational and experimental, some of which are mentioned in this article. In this work, we describe in detail an attempt to solve an inverse problem which arose in the study of an intracellular signaling pathway. Results Using a genetic algorithm to find a suboptimal solution to the optimization problem, we have estimated a set of unknown parameters describing a kinetic model of a signaling pathway in the neuronal cell. The model is composed of mass-action ordinary differential equations, where the kinetic parameters describe protein-protein interactions, protein synthesis and degradation. The algorithm has been implemented on a parallel platform. Several potential solutions of the problem have been computed, each solution being a set of model parameters. A subset of parameters has been selected on the basis of their small coefficient of variation across the ensemble of solutions.
Conclusion Despite the lack of sufficiently reliable and homogeneous experimental data, the genetic algorithm approach has allowed us to estimate the approximate values of a number of model parameters in a kinetic model of a signaling pathway; these parameters have been assessed to be relevant for the reproduction of the available experimental data. Abstract Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown. Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms. First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axo-dendritic feedback on pyramidal cells is only effective in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase, the theta/gamma ratio.
Moreover, electric fields differentially affect average theta frequency depending on specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. Abstract The brain is extraordinarily complex, containing 10^11 neurons linked with 10^14 connections. We can improve our understanding of individual neurons and neuronal networks by describing their behavior in mathematical and computational models. This chapter provides an introduction to neural modeling, laying the foundation for several basic models and surveying key topics. After some discussion on the motivations of modelers and the uses of neural models, we explore the properties of electrically excitable membranes. We describe in some detail the Hodgkin–Huxley model, the first neural model to describe biophysically the behavior of biological membranes. We explore how this model can be extended to describe a variety of excitable membrane behaviors, including axonal propagation, dendritic processing, and synaptic communication. This chapter also covers mathematical models that replicate basic neural behaviors through more abstract mechanisms. We briefly explore efforts to extend single-neuron models to the network level and provide several examples of insights gained through this process. Finally, we list common resources, including modeling environments and repositories, that provide the guidance and parameter sets necessary to begin building neural models. Abstract We have developed a program, NeuroText, to populate the neuroscience databases in SenseLab (http://senselab.med.yale.edu/senselab) by mining the natural language text of neuroscience articles. NeuroText uses a two-step approach to identify relevant articles. The first step (preprocessing), aimed at 100% sensitivity, identifies abstracts containing database keywords.
In the second step, potentially relevant abstracts identified in the first step are processed for specificity, dictated by database architecture and by neuroscience, lexical, and semantic contexts. NeuroText results were presented to the experts for validation using a dynamically generated interface that also allows expert-validated articles to be automatically deposited into the databases. Of the test set of 912 articles, 735 were rejected at the preprocessing step. For the remaining articles, the accuracy of predicting database-relevant articles was 85%. Twenty-two articles were erroneously identified. NeuroText deferred decisions on 29 articles to the expert. A comparison of NeuroText results versus the experts’ analyses revealed that the program failed to correctly identify articles’ relevance due to concepts that did not yet exist in the knowledge base or due to vaguely presented information in the abstracts. NeuroText uses two “evolution” techniques (supervised and unsupervised) that play an important role in the continual improvement of the retrieval results. Software that uses the NeuroText approach can facilitate the creation of curated, special-interest bibliography databases. Abstract Dendrites play an important role in neuronal function and connectivity. This chapter introduces the first section of the book, focusing on the morphological features of dendritic tree structures and the role of dendritic trees in the circuit. We provide an overview of quantitative procedures for data collection, analysis, and modeling of dendrite shape. Our main focus lies on the description of morphological complexity and how one can use this description to unravel neuronal function in dendritic trees and neural circuits. Abstract The chapter is organised in two parts: in the first part, the focus is on a combined power spectral and nonlinear behavioural analysis of a neural mass model of the thalamocortical circuitry.
The objective is to study the effectiveness of such “multimodal” analytical techniques in model-based studies investigating the neural correlates of abnormal brain oscillations in Alzheimer’s disease (AD). The power spectral analysis presented here is a study of the “slowing” (decreasing dominant frequency of oscillation) within the alpha frequency band (8–13 Hz), a hallmark of electroencephalogram (EEG) dynamics in AD. Analysis of the nonlinear dynamical behaviour focuses on the bifurcating property of the model. The results show that the alpha rhythmic content is maximal in close proximity to the bifurcation point, an observation made possible by the “multimodal” approach adopted herein. Furthermore, a slowing in alpha rhythm is observed for increasing inhibitory connectivity, a consistent feature of our research into neuropathological oscillations associated with AD. In the second part, we present power spectral analysis on a model that implements multiple feedforward and feedback connectivities in the thalamo-cortico-thalamic circuitry, and is thus more advanced in terms of biological plausibility. This study looks at the effects of synaptic connectivity variation on the power spectra within the delta (1–3 Hz), theta (4–7 Hz), alpha (8–13 Hz) and beta (14–30 Hz) bands. An overall slowing of the EEG with decreasing synaptic connectivity is observed, indicated by a decrease of power within the alpha and beta bands and an increase in power within the theta and delta bands. Thus, the model behaviour conforms to longitudinal studies in AD indicating an overall slowing of the EEG. Abstract Neuronal processes grow under a variety of constraints, both immediate and evolutionary. Their pattern of growth provides insight into their function. This chapter begins by reviewing morphological metrics used in analyses and computational models. Molecular mechanisms underlying growth and plasticity are then discussed, followed by several types of modeling approaches.
Computer simulation of morphology can be used to describe and reproduce the statistics of neuronal types or to evaluate growth and functional hypotheses. For instance, models in which branching is probabilistically determined by diameter produce realistic virtual dendrites of most neuronal types, though more complicated statistical models are required for other types. Virtual dendrites grown under environmental and/or functional constraints are also discussed, offering a broad perspective on dendritic morphology. Abstract Chopper neurons in the cochlear nucleus are characterized by intrinsic oscillations with short average interspike intervals (ISIs) and relative level independence of their response (Pfeiffer, Exp Brain Res 1:220–235, 1966; Blackburn and Sachs, J Neurophysiol 62:1303–1329, 1989), properties which are unattained by models of single chopper neurons (e.g., Rothman and Manis, J Neurophysiol 89:3070–3082, 2003a). In order to achieve short ISIs, we optimized the time constants of the Rothman and Manis single-neuron model with genetic algorithms. Some parameters in the optimization, such as the temperature and the capacitance of the cell, turned out to be crucial for the required acceleration of the response. In order to achieve the relative level independence, we simulated an interconnected network consisting of Rothman and Manis neurons. The results indicate that, by stabilization of intrinsic oscillations, it is possible to simulate the physiologically observed level independence of ISIs. As previously reviewed and demonstrated (Bahmer and Langner, Biol Cybern 95:371–379, 2006a), chopper neurons show a preference for ISIs which are multiples of 0.4 ms. It was also demonstrated that a network consisting of two optimized Rothman and Manis neurons which activate each other with synaptic delays of 0.4 ms shows a preference for ISIs of 0.8 ms.
Oscillations with various multiples of 0.4 ms as ISIs may be derived from neurons in a more complex network that is activated by simultaneous input from an onset neuron and several auditory nerve fibers. Abstract Recently, a class of two-dimensional integrate-and-fire models has been used to faithfully model spiking neurons. This class includes the Izhikevich model, the adaptive exponential integrate-and-fire model, and the quartic integrate-and-fire model. The bifurcation types for the individual neurons have been thoroughly analyzed by Touboul (SIAM J Appl Math 68(4):1045–1079, 2008). However, when the models are coupled together to form networks, the networks can display bifurcations that an uncoupled oscillator cannot. For example, the networks can transition from firing with a constant rate to burst firing. This paper introduces a technique to reduce a full network of this class of neurons to a mean field model, in the form of a system of switching ordinary differential equations. The reduction uses population density methods and a quasi-steady-state approximation to arrive at the mean field system. Reduced models are derived for networks with different topologies and different model neurons with biologically derived parameters. The mean field equations are able to qualitatively and quantitatively describe the bifurcations that the full networks display. Extensions and higher order approximations are discussed. Conclusions Our proposed database schema for managing heterogeneous data is a significant departure from conventional approaches. It is suitable only when the following conditions hold: • the number of classes of entity is numerous, while the number of actual instances in most classes is expected to be very modest; • the number (and nature) of the axes describing an arbitrary fact (as an N-ary association) varies greatly. We believe that nervous system data is an appropriate problem domain to test such an approach.
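A schema with many entity classes, few instances per class, and facts whose descriptive axes vary from fact to fact resembles an entity-attribute-value (EAV) layout rather than one table per class. The sqlite3 sketch below illustrates that general idea under this reading of the conclusions; all table, class, and attribute names are hypothetical, not taken from the proposal itself.

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Two generic tables instead of one table per entity class:
# heterogeneous classes share storage, and each fact can carry
# a different set of descriptive axes (attributes).
con.executescript("""
CREATE TABLE entity (id INTEGER PRIMARY KEY, class TEXT, name TEXT);
CREATE TABLE fact (entity_id INTEGER, attribute TEXT, value TEXT);
""")
con.execute("INSERT INTO entity VALUES (1, 'neuron', 'CA1_pyramidal_01')")
con.execute("INSERT INTO entity VALUES (2, 'recording', 'slice_session_a')")
facts = [(1, "resting_potential_mV", "-65"),
         (1, "morphology_source", "NeuroMorpho.Org"),
         (2, "temperature_C", "34"),    # note: axes differ per class
         (2, "electrode", "patch")]
con.executemany("INSERT INTO fact VALUES (?, ?, ?)", facts)

# Query every axis recorded for one entity, whatever they happen to be.
axes = con.execute(
    "SELECT attribute, value FROM fact WHERE entity_id = 1 "
    "ORDER BY attribute").fetchall()
print(axes)
```

The trade-off is the usual EAV one: adding a new class or a new axis needs no schema change, at the cost of weaker typing and more joins at query time.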
Abstract Stereotactic human brain atlases, whether in print or electronic form, are useful not only in functional neurosurgery but also in neuroradiology, human brain mapping, and neuroscience education. The existing atlases represent structures on 2D plates taken at variable, often large intervals, which limits their applications. To overcome this problem, we propose a hybrid interpolation approach to build high-resolution brain atlases from the existing ones. In this approach, all section regions of each object are grouped into two types of components: simple and complex. A NURBS-based method is designed for interpolation of the simple components, and a distance map-based method for the complex components. Once all individual objects in the atlas are interpolated, the results are combined hierarchically in a bottom-up manner to produce the interpolation of the entire atlas. In the procedure, different knowledge-based and heuristic strategies are used to preserve various topological relationships. The proposed approach has been validated quantitatively and used for interpolation of two stereotactic brain atlases: the Talairach–Tournoux atlas and the Schaltenbrand–Wahren atlas. The interpolations produced are of high resolution and feature high accuracy, 3D consistency, smooth surfaces, and preserved topology. They potentially open new applications for electronic stereotactic brain atlases, such as atlas reformatting, accurate 3D display, and 3D nonlinear warping against normal and pathological scans. The proposed approach is also potentially useful in other applications that require interpolation and 3D modeling from sparse data and/or data with variable inter-section intervals. An example of 3D modeling of an infarct from MR diffusion images is presented.
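The distance map-based interpolation of complex components can be illustrated on binary section masks: blend the signed distance maps of two sections and re-threshold at zero. This is a minimal numpy sketch on a toy grid (brute-force distance computation), not the authors' implementation:

```python
import numpy as np

def signed_distance(mask):
    # Brute-force signed Euclidean distance map of a binary mask:
    # negative inside the region, positive outside.
    h, w = mask.shape
    ys, xs = np.mgrid[0:h, 0:w]
    inside = np.argwhere(mask)
    outside = np.argwhere(~mask)
    pts = np.stack([ys.ravel(), xs.ravel()], axis=1)
    d_in = np.sqrt(((pts[:, None, :] - inside[None]) ** 2).sum(-1)).min(1)
    d_out = np.sqrt(((pts[:, None, :] - outside[None]) ** 2).sum(-1)).min(1)
    sd = np.where(mask.ravel(), -d_out, d_in)
    return sd.reshape(h, w)

def interpolate_section(mask_a, mask_b, t=0.5):
    # Intermediate section: blend the distance maps, re-threshold at zero.
    sd = (1 - t) * signed_distance(mask_a) + t * signed_distance(mask_b)
    return sd < 0

# Two concentric circular "sections"; the interpolated one lies in between.
grid = np.mgrid[0:16, 0:16]
r2 = (grid[0] - 8) ** 2 + (grid[1] - 8) ** 2
small, large = r2 <= 3 ** 2, r2 <= 6 ** 2
mid = interpolate_section(small, large)
```

For production use, an exact Euclidean distance transform (e.g. from an image-processing library) replaces the quadratic brute-force step.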
Abstract Quantitative neuroanatomical data are important for the study of many areas of neuroscience, and the complexity of problems associated with neuronal structure requires that research from multiple groups across many disciplines be combined. However, existing neuron-tracing systems, simulation environments, and tools for the visualization and analysis of neuronal morphology data use a variety of data formats, making it difficult to exchange data in a readily usable way. The NeuroML project was initiated to address these issues, and here we describe an extensible markup language standard, MorphML, which defines a common data format for neuronal morphology data and associated metadata to facilitate data and model exchange, database creation, model publication, and data archiving. We describe the elements of the standard in detail and outline the mappings between this format and those used by a number of popular applications for reconstruction, simulation, and visualization of neuronal morphology. Abstract A major part of biology has become a class of physical and mathematical sciences. We have started to feel, though still with some reservations, that it will become possible to predict biological events that will happen in one's future life, and to control some of them if so desired, based upon the understanding of the genomic information of individuals and the physical and chemical principles governing the physiological functions of living organisms at multiple scales and levels, from molecules to cells and organs. Abstract A half-center oscillator (HCO) is a common circuit building block of central pattern generator networks that produce rhythmic motor patterns in animals. Here we constructed an efficient relational database table with the resulting characteristics of the Hill et al. (J Comput Neurosci 10:281–302, 2001) HCO simple conductance-based model.
The model consists of two reciprocally inhibitory neurons and replicates the electrical activity of the oscillator interneurons of the leech heartbeat central pattern generator under a variety of experimental conditions. Our long-range goal is to understand how this basic circuit building block produces functional activity under a variety of parameter regimes and how different parameter regimes influence stability and modulatability. By using the latest developments in computer technology, we simulated and stored large amounts of data (on the order of terabytes). We systematically explored the parameter space of the HCO and corresponding isolated neuron models using a brute-force approach. We varied a set of selected parameters (maximal conductances of intrinsic and synaptic currents) in all combinations, resulting in about 10 million simulations. We classified these HCO and isolated neuron model simulations by their activity characteristics into identifiable groups and quantified their prevalence. By querying the database, we compared the activity characteristics of the identified groups of our simulated HCO models with those of our simulated isolated neuron models and found that regularly bursting neurons compose only a small minority of functional HCO models; the vast majority was composed of spiking neurons. Abstract This paper describes how an emerging standard neural network modelling language can be used to configure a general-purpose neural multichip system, by describing the process of writing and loading neural network models on the SpiNNaker neuromimetic hardware. It focuses on the implementation of a SpiNNaker module for PyNN, a simulator-independent language for neural network modelling. We successfully extend PyNN to deal with different non-standard (e.g. Izhikevich) cell types, rapidly switch between them, and load applications on parallel hardware by orchestrating the software layers below it, so that they are abstracted from the final user.
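The brute-force sweep with relational storage described above can be sketched at toy scale with the standard-library `sqlite3` module; the `classify` rule below is a deliberately fake surrogate for running each conductance-based simulation and inspecting its voltage trace:

```python
import itertools, sqlite3

def classify(g_h, g_syn):
    # Toy surrogate for activity classification: a real study would run
    # the conductance-based simulation and classify the voltage trace.
    if g_syn == 0:
        return "silent" if g_h == 0 else "spiking"
    return "bursting" if g_h > g_syn else "spiking"

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sweep(g_h REAL, g_syn REAL, activity TEXT)")

# Vary the selected maximal conductances in all combinations.
grid = [0.0, 0.5, 1.0, 1.5, 2.0]
db.executemany(
    "INSERT INTO sweep VALUES (?, ?, ?)",
    [(g_h, g_syn, classify(g_h, g_syn))
     for g_h, g_syn in itertools.product(grid, grid)],
)

# Query the database for the prevalence of each activity group.
prevalence = dict(db.execute(
    "SELECT activity, COUNT(*) FROM sweep GROUP BY activity"))
```

The same two-step pattern (exhaustive enumeration into a table, then SQL aggregation over activity groups) scales from this 25-row toy to the millions of simulations reported above.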
Finally, we run some simulations in PyNN and compare them against other simulators, successfully reproducing single neuron and network dynamics and validating the implementation. Abstract The present study examines the biophysical properties and functional implications of I_h in hippocampal area CA3 interneurons with somata in strata radiatum and lacunosum-moleculare. Characterization studies showed a small maximum h-conductance (2.6 ± 0.3 nS, n = 11), shallow voltage dependence with a hyperpolarized half-maximal activation (V_1/2 = −91 mV), and kinetics characterized by double-exponential functions. The functional consequences of I_h were examined with regard to temporal summation and impedance measurements. For temporal summation experiments, 5-pulse mossy fiber input trains were activated. Blocking I_h with 50 μM ZD7288 resulted in an increase in temporal summation, suggesting that I_h supports sensitivity of response amplitude to relative input timing. Impedance was assessed by applying sinusoidal current commands. From impedance measurements, we found that I_h did not confer theta-band resonance, but instead flattened the impedance–frequency relations. Double immunolabeling for hyperpolarization-activated cyclic nucleotide-gated proteins and glutamate decarboxylase 67 suggests that all four subunits are present in GABAergic interneurons from the strata considered for the electrophysiological studies. Finally, a model of I_h was employed in computational analyses to confirm and elaborate upon the contributions of I_h to impedance and temporal summation. Abstract Modelling and simulation methods gain increasing importance for the understanding of biological systems. The growing number of available computational models makes support for the maintenance and retrieval of those models essential to the community.
This article discusses which model information is helpful for efficient retrieval, and how existing similarity measures and ranking techniques can be used to enhance the retrieval process, i.e. model reuse. With the development of new tools and modelling formalisms, there is also an increasing demand for performing searches independent of the models' encoding. Therefore, the presented approach is not restricted to particular model storage formats. Instead, the model meta-information is used for retrieval and ranking of the search results. Meta-information includes general information about the model and its encoded species and reactions, but also information about the model's behaviour and related simulation experiment descriptions. Abstract To understand the details of brain function, a large-scale system model that reflects anatomical and neurophysiological characteristics needs to be implemented. Though numerous computational models of different brain areas have been proposed, their integration into a large-scale model has not yet been accomplished, because these models were described in different programming languages and, mostly, because they used different data formats. This paper introduces a platform for collaborative brain system modeling (PLATO), on which one can construct computational models in several programming languages and connect them at the I/O level with a common data format. As an example, a whole visual system model including eye movement, eye optics, the retinal network, and the visual cortex is being developed. Preliminary results demonstrate that the integrated model successfully simulates the signal processing flow at the different stages of the visual system. Abstract Brain rhythms are the most prominent signal measured noninvasively in humans with magneto/electroencephalography (MEG/EEG). MEG/EEG-measured rhythms have been shown to be functionally relevant, and signature changes are used as markers of disease states.
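The encoding-independent retrieval described above (ranking by model meta-information rather than by storage format) might look like the following sketch; the Jaccard similarity, the equal weighting, and the two example models are illustrative assumptions:

```python
# Rank stored models against a query using only encoding-independent
# meta-information (names of species and reactions); Jaccard similarity
# is one simple stand-in for the similarity measures discussed above.
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

models = {
    "glycolysis_v1": {"species": {"glucose", "ATP", "pyruvate"},
                      "reactions": {"hexokinase", "pyruvate_kinase"}},
    "calcium_osc":   {"species": {"Ca2+", "IP3"},
                      "reactions": {"IP3R_flux"}},
}

def rank(query_species, query_reactions):
    # Score each model by a weighted mix of species and reaction overlap.
    scored = [
        (0.5 * jaccard(query_species, m["species"])
         + 0.5 * jaccard(query_reactions, m["reactions"]), name)
        for name, m in models.items()
    ]
    return [name for score, name in sorted(scored, reverse=True)]

best = rank({"glucose", "ATP"}, {"hexokinase"})[0]
```

Behavioural annotations and simulation experiment descriptions would enter the same scoring function as additional weighted terms.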
Despite the importance of understanding the underlying neural mechanisms creating these rhythms, relatively little is known about their in vivo origin in humans. There are obvious challenges in linking the extracranially measured signals directly to neural activity with invasive studies in humans, and although animal models are well suited for such studies, the connection to human brain function under cognitively relevant tasks is often lacking. Biophysically principled computational neural modeling provides an attractive means to bridge this critical gap. Here, we describe a method for creating a computational neural model capturing the laminar structure of cortical columns, and how this model can be used to make predictions on the cellular- and circuit-level mechanisms of brain oscillations measured with MEG/EEG. Specifically, we describe how the model can be used to simulate current dipole activity, the common macroscopic signal inferred from MEG/EEG data. We detail the development and application of the model to study the spontaneous somatosensory mu-rhythm, containing mu-alpha (7–14 Hz) and mu-beta (15–29 Hz) components. We describe a novel prediction on the neural origin of the mu-rhythm that accurately reproduces many characteristic features of MEG data and accounts for changes in the rhythm with attention, detection, and healthy aging. While the details of the model are specific to the somatosensory system, the model design and application are based on general principles of cortical circuitry and MEG/EEG physics, and are thus amenable to the study of rhythms in other frequency bands and sensory systems. Abstract GABAergic interneurons in cortical circuits control the activation of principal cells and orchestrate network activity patterns, including oscillations at different frequency ranges.
Recruitment of interneurons depends on the integration of convergent synaptic inputs along the dendrosomatic axis; however, dendritic processing in these cells is still poorly understood. In this chapter, we summarise our results on the cable properties, electrotonic structure, and dendritic processing of basket cells (BCs; Nörenberg et al. 2010), one of the most prevalent types of cortical interneurons mediating perisomatic inhibition. In order to investigate integrative properties, we performed two-electrode whole-cell patch clamp recordings, visualised and reconstructed the recorded interneurons, and created passive single-cell models with biophysical properties derived from the experiments. Our results indicate that membrane properties, in particular membrane resistivity, are inhomogeneous along the somatodendritic axis of the cell. The derived values and the gradient of membrane resistivity differ from those obtained for excitatory principal cells. The divergent passive membrane properties of BCs facilitate rapid signalling from proximal basal dendritic inputs but at the same time increase synapse-to-soma transfer for slow signals from the distal apical dendrites. Our results demonstrate that BCs possess distinct integrative properties. Future computational models investigating the diverse functions of neuronal circuits need to consider this diversity and incorporate realistic dendritic properties not only of excitatory principal cells but also of the various types of inhibitory interneurons. Abstract New surgical and localization techniques allow for precise and personalized evaluation and treatment of intractable epilepsies. These techniques include the use of subdural and depth electrodes for localization, and the potential use of cell-targeted stimulation with optogenetics as part of treatment. Computer modeling of seizures, also individualized to the patient, will be important in order to make full use of the potential of these new techniques.
This is because epilepsy is a complex dynamical disease involving multiple scales across both time and space. These complex dynamics make prediction extremely difficult. Cause and effect are not cleanly separable, as multiple embedded causal loops allow for many scales of unintended consequence. We demonstrate here a small model of sensory neocortex which can be used to look at the effects of microablations or microstimulation. We show that ablations in this network can either prevent the spread or prevent the occurrence of the seizure. In this example, focal electrical stimulation was not able to terminate a seizure, but selective stimulation of inhibitory cells, a future possibility through the use of optogenetics, was efficacious. Abstract The basal ganglia form a complex network of nuclei often assumed to perform selection, yet their individual roles and how they influence each other are still largely unclear. In particular, the ties between the external and internal parts of the globus pallidus are paradoxical, as anatomical data suggest a potent inhibitory projection between them while electrophysiological recordings indicate that they have similar activities. Here we introduce a theoretical study that reconciles both views of the intrapallidal projection by providing a plausible characterization of the relationship between the external and internal globus pallidus. Specifically, we developed a mean-field model of the whole basal ganglia, whose parameterization is optimized to best respect a collection of numerous anatomical and electrophysiological data. We first obtained models respecting all our constraints; hence the anatomical and electrophysiological data on the intrapallidal projection are globally consistent.
This model furthermore predicts that both aforementioned views of the intrapallidal projection may be reconciled when this projection is weakly inhibitory, making it possible to support similar neural activity in both nuclei and for the entire basal ganglia to select between actions. Second, we predict that afferent projections are substantially unbalanced towards the external segment, as it receives the strongest excitation from the STN and the weakest inhibition from the striatum. Finally, our study strongly suggests that the intrapallidal connection pattern is not focused but diffuse, as the latter pattern is more efficient for the overall selection performed in the basal ganglia. Abstract Background The information coming from biomedical ontologies and computational pathway models is expanding continuously: research communities keep this process up, and their advances are generally shared by means of dedicated resources published on the web. In fact, such models are shared to provide the characterization of molecular processes, while biomedical ontologies detail a semantic context for the majority of those pathways. Recent advances in both fields pave the way for a scalable information integration based on aggregate knowledge repositories, but the lack of overall standard formats impedes this progress. Indeed, having different objectives and different abstraction levels, most of these resources "speak" different languages. Semantic web technologies are explored here as a means to address some of these problems. Methods Employing an extensible collection of interpreters, we developed OREMP (Ontology Reasoning Engine for Molecular Pathways), a system that abstracts the information from different resources and combines it into a coherent ontology. Continuing this effort, we present OREMPdb; once different pathways are fed into OREMP, species are linked to the external ontologies referenced and to the reactions in which they participate.
Exploiting these links, the system builds species-sets, which encapsulate species that operate together. Composing all of the reactions together, the system computes all of the reaction paths from and to all of the species-sets. Results OREMP has been applied to the curated branch of BioModels (2011/04/15 release), which overall contains 326 models, 9244 reactions, and 5636 species. OREMPdb is the semantic dictionary created as a result, which is made of 7360 species-sets. For each of these sets, OREMPdb stores a link to the original pathway and to the original paper where this information first appeared. Abstract Conductance-based neuron models are frequently employed to study the dynamics of biological neural networks. For speed and ease of use, these models are often reduced in morphological complexity. Simplified dendritic branching structures may process inputs differently than full branching structures, however, and could thereby fail to reproduce important aspects of biological neural processing. It is not yet well understood which processing capabilities require detailed branching structures. Therefore, we analyzed the processing capabilities of fully or partially branched reduced models. These models were created by collapsing the dendritic tree of a full morphological model of a globus pallidus (GP) neuron while preserving its total surface area and electrotonic length, as well as its passive and active parameters. Dendritic trees were either collapsed into single cables (unbranched models) or the full complement of branch points was preserved (branched models). Both reduction strategies allowed us to compare dynamics between all models using the same channel density settings. Full model responses to somatic inputs were generally preserved by both types of reduced model, while dendritic input responses could be more closely preserved by branched than by unbranched reduced models.
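Collapsing dendritic branches into a single cable while preserving total surface area and electrotonic length, as in the reduction above, amounts to solving two equations for an equivalent length and diameter. A sketch under the assumption that the space constant scales as the square root of diameter (the proportionality constant `lam_coeff` and the branch geometries are illustrative):

```python
import math

def collapse(branches, lam_coeff=100.0):
    # branches: list of (length_um, diam_um) cylinders to merge into one
    # equivalent cable. Assumed passive scaling: lambda = lam_coeff * sqrt(d).
    area = sum(math.pi * d * L for L, d in branches)   # total membrane area
    # Electrotonic length of each branch, and its area-weighted mean.
    elec = [L / (lam_coeff * math.sqrt(d)) for L, d in branches]
    w = [math.pi * d * L for L, d in branches]
    elec_eq = sum(e * wi for e, wi in zip(elec, w)) / sum(w)
    # Solve area = pi * d_eq * L_eq  with  L_eq = elec_eq * lam_coeff * sqrt(d_eq).
    d_eq = (area / (math.pi * lam_coeff * elec_eq)) ** (2.0 / 3.0)
    L_eq = elec_eq * lam_coeff * math.sqrt(d_eq)
    return L_eq, d_eq

L_eq, d_eq = collapse([(120.0, 1.2), (90.0, 0.8), (150.0, 1.0)])
```

By construction the equivalent cable has exactly the summed membrane area and the area-weighted mean electrotonic length of the originals; channel densities can then be kept unchanged, as in the study.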
However, features strongly influenced by local dendritic input resistance, such as active dendritic sodium spike generation and propagation, could not be accurately reproduced by any reduced model. Based on our analyses, we suggest that there are intrinsic differences in processing capabilities between unbranched and branched models. We also indicate suitable applications for different levels of reduction, including fast searches of full model parameter space. Summary Processing text from the scientific literature has become a necessity due to the burgeoning amounts of information that are fast becoming available, stemming from advances in electronic information technology. We created a program, NeuroText ( http://senselab.med.yale.edu/textmine/neurotext.pl ), designed specifically to extract information relevant to neuroscience-specific databases, NeuronDB and CellPropDB ( http://senselab.med.yale.edu/senselab/ ), housed at the Yale University School of Medicine. NeuroText extracts relevant information from the neuroscience literature in a two-step process: each step parses text at a different level of granularity. NeuroText uses an expert-mediated knowledge base and combines the techniques of indexing, contextual parsing, semantic and lexical parsing, and supervised and unsupervised learning to extract information. The constraints, metadata elements, and rules for information extraction are stored in the knowledge base. NeuroText was created as a pilot project to process 3 years of publications in the Journal of Neuroscience and was subsequently tested on 40,000 PubMed abstracts. We also present here a template to create a domain-nonspecific knowledge base that, when linked to a text-processing tool like NeuroText, can be used to extract knowledge in other fields of research. Abstract Background We present a software tool called SENB, which allows the geometric and biophysical neuronal properties in a simple computational model of a Hodgkin-Huxley (HH) axon to be changed.
The aim of this work is to develop a didactic and easy-to-use computational tool in the NEURON simulation environment, which allows graphical visualization of both the passive and active conduction parameters and the geometric characteristics of a cylindrical axon with HH properties. Results The SENB software offers several advantages for teaching and learning electrophysiology. First, SENB offers ease and flexibility in determining the number of stimuli. Second, SENB allows immediate and simultaneous visualization, in the same window and time frame, of the evolution of the electrophysiological variables. Third, SENB calculates parameters such as time and space constants, stimulus frequency, cellular area and volume, sodium and potassium equilibrium potentials, and the propagation velocity of the action potentials. Furthermore, it allows the user to see all this information immediately in the main window. Finally, with just one click SENB can save an image of the main window as evidence. Conclusions The SENB software is didactic and versatile, and can be used to improve and facilitate the teaching and learning of the underlying mechanisms in the electrical activity of an axon using the biophysical properties of the squid giant axon. Abstract Grid cells (GCs) in the medial entorhinal cortex (mEC) have the property of having their firing activity spatially tuned to a regular triangular lattice. Several theoretical models for grid field formation have been proposed, but most assume that place cells (PCs) are a product of the grid cell system. There is, however, an alternative possibility that is supported by various strands of experimental data. Here we present a novel model for the emergence of grid-like firing patterns that stands on two key hypotheses: (1) spatial information in GCs is provided by PC activity, and (2) grid fields result from a combined synaptic plasticity mechanism involving inhibitory and excitatory neurons mediating the connections between PCs and GCs.
Depending on the spatial location, each PC can contribute excitatory or inhibitory inputs to GC activity. The nature and magnitude of the PC input is a function of the distance to the place field center, which is inferred from rate decoding. A biologically plausible learning rule drives the evolution of the connection strengths from PCs to a GC. In this model, PCs compete for GC activation, and the plasticity rule favors efficient packing of the space representation. This leads to grid-like firing patterns. In a new environment, GCs continuously recruit new PCs to cover the entire space. The model described here makes important predictions and can represent the feedforward connections from hippocampal CA1 to the deeper mEC layers. Abstract Because of its highly branched dendrite, the Purkinje neuron requires significant computational resources if coupled electrical and biochemical activity are to be simulated. To address this challenge, we developed a scheme for reducing the geometric complexity while preserving the essential features of activity in both the soma and a remote dendritic spine. We merged our previously published biochemical model of calcium dynamics and lipid signaling in the Purkinje neuron, developed in the Virtual Cell modeling and simulation environment, with an electrophysiological model based on a Purkinje neuron model available in NEURON. A novel reduction method was applied to the Purkinje neuron geometry to obtain a model with fewer compartments that is tractable in Virtual Cell. Most of the dendritic tree was subject to reduction, but we retained the neuron's explicit electrical and geometric features along a specified path from spine to soma. Further, unlike previous simplification methods, the dendrites that branch off along the preserved explicit path are retained as reduced branches.
We conserved axial resistivity and adjusted passive properties and active channel conductances for the reduction in surface area, and cytosolic calcium for the reduction in volume. Rallpacks are used to validate the reduction algorithm and show that it can be generalized to other complex neuronal geometries. For the Purkinje cell, we found that current injections at the soma produced similar trains of action potentials and membrane potential propagation in the full and reduced models in NEURON; the reduced model produces identical spiking patterns in NEURON and Virtual Cell. Importantly, our reduced model can simulate communication between the soma and a distal spine; an alpha function applied at the spine to represent synaptic stimulation gave similar results in the full and reduced models for potential changes associated with both the spine and the soma. Finally, we combined phosphoinositol signaling and electrophysiology in the reduced model in Virtual Cell. Thus, a strategy has been developed to combine electrophysiology and biochemistry as a step toward merging neuronal and systems biology modeling. Abstract The advent of techniques with the ability to scan massive changes in cellular makeup (genomics, proteomics, etc.) has revealed the compelling need for analytical methods to interpret and make sense of those changes. Computational models built on a sound physicochemical mechanistic basis are unavoidable when integrating, interpreting, and simulating high-throughput experimental data. Another powerful role of computational models is predicting new behavior, provided they are adequately validated. Mitochondrial energy transduction has traditionally been studied with thermodynamic models. More recently, kinetic or thermokinetic models have been proposed, leading the path toward an understanding of the control and regulation of mitochondrial energy metabolism and its interaction with cytoplasmic and other compartments.
In this work, we outline, step by step, the methods that should be followed to build a computational model of mitochondrial energetics in isolation or integrated into a network of cellular processes. Depending on the question addressed by the modeler, the methodology explained herein can be applied with different levels of detail, from the mitochondrial energy-producing machinery in a network of cellular processes to the dynamics of a single enzyme during its catalytic cycle. Abstract The voltage and time dependence of ion channels can be regulated, notably by phosphorylation, interaction with phospholipids, and binding to auxiliary subunits. Many parameter variation studies have set conductance densities free while leaving kinetic channel properties fixed, as the experimental constraints on the latter are usually better than on the former. Because individual cells can tightly regulate their ion channel properties, we suggest that kinetic parameters may be profitably set free during model optimization in order both to improve matches to data and to refine kinetic parameters. To this end, we analyzed the parameter optimization of reduced models of three electrophysiologically characterized and morphologically reconstructed globus pallidus neurons. We performed two automated searches with different types of free parameters. First, conductance density parameters were set free. Even the best resulting models exhibited unavoidable problems which were due to limitations in our channel kinetics. We next set channel kinetics free for the optimized density matches and obtained significantly improved model performance. Some kinetic parameters consistently shifted to similar new values in multiple runs across the three models, suggesting the possibility of tailored improvements to channel models.
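Setting channel kinetics free during optimization can be illustrated by fitting the half-activation voltage and slope factor of a Boltzmann activation curve; the exhaustive grid search below is a simple stand-in for the automated optimizer used in such studies, and the "recorded" data are synthesized for the example:

```python
import math

def boltzmann(v, v_half, k):
    # Steady-state activation of a voltage-gated channel.
    return 1.0 / (1.0 + math.exp(-(v - v_half) / k))

# "Recorded" activation curve (here synthesized with v_half=-45 mV, k=6 mV).
volts = list(range(-80, 1, 5))
data = [boltzmann(v, -45.0, 6.0) for v in volts]

def fit_kinetics(volts, data):
    # Set the kinetic parameters free: grid search over v_half and slope,
    # minimizing the squared error against the recorded curve.
    best = None
    for vh10 in range(-700, -200):          # v_half from -70.0 to -20.1 mV
        for k10 in range(20, 121):          # slope factor from 2.0 to 12.0 mV
            vh, k = vh10 / 10.0, k10 / 10.0
            err = sum((boltzmann(v, vh, k) - d) ** 2
                      for v, d in zip(volts, data))
            if best is None or err < best[0]:
                best = (err, vh, k)
    return best[1], best[2]

v_half, k = fit_kinetics(volts, data)
```

In a full workflow the conductance densities would first be optimized with kinetics held fixed, and this kinetic refinement applied afterward, mirroring the two-stage search described above.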
These results suggest that optimized channel kinetics can improve model matches to experimental voltage traces, particularly for channels characterized under different experimental conditions than the recorded data to be matched by a model. The resulting shifts in channel kinetics from the original template provide valuable guidance for future experimental efforts to determine the detailed kinetics of channel isoforms and possible modulated states in particular types of neurons. Abstract Electrical synapses continuously transfer signals bidirectionally from one cell to another, directly or indirectly via intermediate cells. Electrical synapses are common in many brain structures, such as the inferior olive, the subcoeruleus nucleus, and the neocortex, between neurons and between glial cells. In the cortex, interneurons have been shown to be electrically coupled and proposed to participate in large, continuous cortical syncytia, as opposed to smaller spatial domains of electrically coupled cells. However, to explore the significance of these findings it is imperative to map the electrical synaptic microcircuits, in analogy with in vitro studies on monosynaptic and disynaptic chemical coupling. Since "walking" from cell to cell over large distances with a glass pipette is challenging, microinjection of (fluorescent) dyes diffusing through gap junctions remains so far the only method available to decipher such microcircuits, even though technical limitations exist. Based on circuit theory, we derive analytical descriptions of the AC electrical coupling in networks of isopotential cells. We then suggest an operative electrophysiological protocol to distinguish between direct electrical connections and connections involving one or more intermediate cells. This method allows inferring the number of intermediate cells, generalizing the conventional coupling coefficient, which provides limited information.
We validate our method through computer simulations, theoretical and numerical methods, and electrophysiological paired recordings. Abstract Because electrical coupling among the neurons of the brain is much faster than chemical synaptic coupling, it is natural to hypothesize that gap junctions may play a crucial role in the mechanisms underlying very fast oscillations (VFOs), i.e., oscillations at more than 80 Hz. There is now a substantial body of experimental and modeling literature supporting this hypothesis. A series of modeling papers, starting with work by Roger Traub and collaborators, have suggested that VFOs may arise from expanding waves propagating through an "axonal plexus", a large random network of electrically coupled axons. Traub et al. also proposed a cellular automaton (CA) model to study the mechanisms of VFOs in the axonal plexus. In this model, the expanding waves take the appearance of topologically circular "target patterns". Random external stimuli initiate each wave. We therefore call this kind of VFO "externally driven". Using a computational model, we show that an axonal plexus can also exhibit a second, distinctly different kind of VFO in a wide parameter range. These VFOs arise from activity propagating around cycles in the network. Once triggered, they persist without any source of excitation. With idealized, regular connectivity, they take the appearance of spiral waves. We call these VFOs "reentrant". The behavior of the axonal plexus depends on the reliability with which action potentials propagate from one axon to the next, which, in turn, depends on the somatic membrane potential V_s and the gap junction conductance g_gj. To study these dependencies, we impose a fixed value of V_s, then study the effects of varying V_s and g_gj. Not surprisingly, propagation becomes more reliable with rising V_s and g_gj. Externally driven VFOs occur when V_s and g_gj are so high that propagation never fails.
For lower V_s or g_gj, propagation is nearly reliable, but fails in rare circumstances. Surprisingly, the parameter regime where this occurs is fairly large. Even a single propagation failure can trigger reentrant VFOs in this regime. Lowering V_s and g_gj further, one finds a third parameter regime in which propagation is unreliable, and no VFOs arise. We analyze these three parameter regimes by means of computations using model networks adapted from Traub et al., as well as much smaller model networks. Abstract Research with barn owls suggested that sound source location is represented topographically in the brain by an array of neurons each tuned to a narrow range of locations. However, research with small-headed mammals has offered an alternative view in which location is represented by the balance of activity in two opponent channels broadly tuned to the left and right auditory space. Both channels may be present in each auditory cortex, although the channel representing contralateral space may be dominant. Recent studies have suggested that opponent channel coding of space may also apply in humans, although these studies have used a restricted set of spatial cues or probed a restricted set of spatial locations, and there have been contradictory reports as to the relative dominance of the ipsilateral and contralateral channels in each cortex. The current study used electroencephalography (EEG) in conjunction with sound field stimulus presentation to address these issues and to inform the development of an explicit computational model of human sound source localization. Neural responses were compatible with the opponent channel account of sound source localization and with contralateral channel dominance in the left, but not the right, auditory cortex. A computational opponent channel model reproduced every important aspect of the EEG data and allowed inferences about the width of tuning in the spatial channels.
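The opponent-channel readout described above can be sketched as two broadly tuned hemifield channels whose balance encodes azimuth. The sigmoid tuning and its slope here are hypothetical illustration parameters, not values fitted to the EEG data:

```python
import math

def hemifield_channel(azimuth_deg, preferred_sign, slope=0.04):
    # One broadly tuned hemifield channel: a sigmoid over azimuth that
    # rises toward its preferred hemifield (+1 = right, -1 = left).
    return 1.0 / (1.0 + math.exp(-preferred_sign * slope * azimuth_deg))

def opponent_code(azimuth_deg):
    # Location is read out from the balance of the two channels.
    return (hemifield_channel(azimuth_deg, +1)
            - hemifield_channel(azimuth_deg, -1))
```

Because the difference signal is steepest near the midline and flattens laterally, a fixed azimuth change produces a smaller change in the code at lateral reference positions, which is the qualitative basis for reduced spatial acuity away from the midline.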
Moreover, the model predicted the oft-reported decrease in spatial acuity measured psychophysically with increasing reference azimuth. Predictions of spatial acuity closely matched those measured psychophysically by previous authors. Reconstructing mammalian sleep dynamics with data assimilation. PLoS computational biology Data assimilation is a valuable tool in the study of any complex system, where measurements are incomplete, uncertain, or both. It enables the user to take advantage of all available information including experimental measurements and short-term model forecasts of a system. Although data assimilation has been used to study other biological systems, the study of the sleep-wake regulatory network has yet to benefit from this toolset. We present a data assimilation framework based on the unscented Kalman filter (UKF) for combining sparse measurements together with a relatively high-dimensional nonlinear computational model to estimate the state of a model of the sleep-wake regulatory system. We demonstrate with simulation studies that a few noisy variables can be used to accurately reconstruct the remaining hidden variables. We introduce a metric for ranking relative partial observability of computational models, within the UKF framework, that allows us to choose the optimal variables for measurement and also provides a methodology for optimizing framework parameters such as UKF covariance inflation. In addition, we demonstrate a parameter estimation method that allows us to track non-stationary model parameters and accommodate slow dynamics not included in the UKF filter model. Finally, we show that we can even use observed discretized sleep-state, which is not one of the model variables, to reconstruct model state and estimate unknown parameters. Sleep is implicated in many neurological disorders from epilepsy to schizophrenia, but simultaneous observation of the many brain components that regulate this behavior is difficult.
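At the core of any UKF-based framework like the one above is the unscented transform: a small set of deterministically chosen sigma points is pushed through a nonlinearity and reweighted to estimate the transformed mean and variance. A minimal one-dimensional sketch, using standard Merwe-style scaling parameters (alpha, beta, kappa), not the authors' full sleep-model implementation:

```python
import math

def unscented_transform(mean, var, f, alpha=1e-3, beta=2.0, kappa=0.0):
    # 1-D unscented transform: propagate three sigma points through f,
    # then reconstruct the transformed mean and variance by weighting.
    n = 1
    lam = alpha ** 2 * (n + kappa) - n
    s = math.sqrt((n + lam) * var)
    sigmas = [mean, mean + s, mean - s]
    wm = [lam / (n + lam), 1 / (2 * (n + lam)), 1 / (2 * (n + lam))]
    wc = [wm[0] + (1 - alpha ** 2 + beta), wm[1], wm[2]]
    ys = [f(x) for x in sigmas]
    y_mean = sum(w * y for w, y in zip(wm, ys))
    y_var = sum(w * (y - y_mean) ** 2 for w, y in zip(wc, ys))
    return y_mean, y_var
```

For a Gaussian with mean 1 and variance 0.5 pushed through x², the transform recovers the exact mean E[x²] = mean² + var = 1.5, which a naive linearization around the mean (giving 1.0) misses; for linear maps it is exact in both mean and variance.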
We anticipate that this data assimilation framework will enable better understanding of the detailed interactions governing sleep and wake behavior and provide for better, more targeted, therapies. Algorithms;Animals;Computational Biology;Computer Simulation;Models, Biological;Rats;Sleep;Wakefulness A mechanism for robust circadian timekeeping via stoichiometric balance. Molecular systems biology Circadian (∼24 h) timekeeping is essential for the lives of many organisms. To understand the biochemical mechanisms of this timekeeping, we have developed a detailed mathematical model of the mammalian circadian clock. Our model can accurately predict diverse experimental data including the phenotypes of mutations or knockdown of clock genes as well as the time courses and relative expression of clock transcripts and proteins. Using this model, we show how a universal motif of circadian timekeeping, where repressors tightly bind activators rather than directly binding to DNA, can generate oscillations when activators and repressors are in stoichiometric balance. Furthermore, we find that an additional slow negative feedback loop preserves this stoichiometric balance and maintains timekeeping with a fixed period. The role of this mechanism in generating robust rhythms is validated by analysis of a simple and general model and a previous model of the Drosophila circadian clock. We propose a double-negative feedback loop design for biological clocks whose period needs to be tightly regulated even with large changes in gene dosage. Animals;CLOCK Proteins;Circadian Clocks;Circadian Rhythm;Drosophila;Drosophila Proteins;Gene Dosage;Gene Knockout Techniques;Mammals;Models, Biological;Mutation;Period Circadian Proteins;Phenotype;Reproducibility of Results;Transcription Factors ModelDB: making models publicly accessible to support computational neuroscience. 
Neuroinformatics Computer Communication Networks;Internet;Models, Neurological;Neurosciences;Publishing;Reproducibility of Results ModelDB, neuroinformatics, and computational neuroscience. Neuroinformatics Computational Biology;Computer Communication Networks;Models, Neurological;Neurons;Neurosciences;Software ModelDB: A Database to Support Computational Neuroscience. Journal of computational neuroscience Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Animals;Computational Biology;Computer Communication Networks;Databases, Factual;Humans;Neurosciences;Systems Integration Neural query system Neuroinformatics Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on the knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead.
On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. Programming language: Perl, BioPerl (program); mySQL, Perl DBI and DBD modules (database); php, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI Blast package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist.
However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have encountered problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that often it is futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database.
This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies.
A single cell includes organelles and intracellular solutions, and it is separated from the outer environment of extracellular fluid surrounding it by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charges and concentrations give rise, respectively, to electrical and chemical potentials, driving the transport of materials across the membrane. Here we look at the core elements of mathematical modeling of the dynamic behaviors of single cells, as well as the foundations of numerical simulation. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program on which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command-line interface for manipulating large-scale models, based on Python, a powerful scripting language. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e.
database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the information necessary and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the Cell Cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method and make available computer scripts which automate the process of citation entry. We use an open citation project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations.
To understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain (${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front-ends requires considerable bioinformatic skill. Consequently, the manual inspection of database entries and citations is a time-consuming task for which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges.
While originally developed to manage and query less structured data, information retrieval techniques have become increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to the relevant literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or not) existing empirical results and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations.
We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib ( The Handbook of Brain Theory and Neural Networks , pp. 741–745, 2003); Arbib and Grethe ( Computing the Brain: A Guide to Neuroinformatics , 2001); Ascoli ( Computational Neuroanatomy: Principles and Methods , 2002); Bower and Bolouri ( Computational Modeling of Genetic and Biochemical Networks , 2001); Hines et al. ( J. Comput. Neurosci. 17 , 7–11, 2004); Shepherd et al. ( Trends Neurosci. 21 , 460–468, 1998); Sivakumaran et al. ( Bioinformatics 19 , 408–415, 2003); Smolen et al. ( Neuron 26 , 567–580, 2000); Vadigepalli et al. ( OMICS 7 , 235–252, 2003)]. 
Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the simulator for neural networks and action potentials (SNNAP) [Ziv ( J. Neurophysiol. 71 , 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu . Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and build on them for further research. Its use can help the field of computational neuroscience enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA_A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties.
Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA_A reversal potential (E_GABA) and by alterations in intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide a high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal to elicit/detect the electrical excitation at the single-cell level and to observe the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have been typically correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology.
Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of the two preparations representing different brain areas (hippocampus and cerebellum), neuritic type (pyramidal cell dendrites and olivary axonal projection terminals), and labeling method (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions coming to the dendrites. We studied mathematical models of the neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductivities of their dendritic membrane. This work is a continuation of our previous studies that showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand.
This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by their spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling.
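At its simplest, the multicompartmental approach referred to above reduces to a few coupled ordinary differential equations, one per compartment, integrated numerically. A minimal sketch with two passive compartments joined by an axial conductance and stepped with forward Euler (illustrative parameters and units, not a model from the chapter):

```python
def simulate_two_compartments(i_inj=1.0, g_leak=0.1, g_couple=0.05,
                              c_m=1.0, dt=0.01, steps=20_000):
    # Two passive compartments, voltages relative to rest.
    # Compartment 1 receives injected current; compartment 2 only
    # sees current flowing through the coupling conductance.
    v1 = v2 = 0.0
    for _ in range(steps):
        dv1 = (-g_leak * v1 + g_couple * (v2 - v1) + i_inj) / c_m
        dv2 = (-g_leak * v2 + g_couple * (v1 - v2)) / c_m
        v1 += dt * dv1
        v2 += dt * dv2
    return v1, v2
```

For these parameters the steady state can be checked by hand: the coupling equation gives v2 = v1/3, and the current balance in compartment 1 then gives v1 = 7.5 and v2 = 2.5, so the distal compartment sees an attenuated copy of the injected signal.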
Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS–US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network.
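For contrast, the conventional TD scheme that the abstract argues against, with a tapped delay-line (complete serial compound) representation of time, can be sketched in a few lines. The parameters and the `train_td` helper are illustrative assumptions, not the published model:

```python
def train_td(n_steps=10, n_trials=200, alpha=0.1, gamma=0.98):
    # Tapped delay line: one value weight per time step since CS onset.
    # The reward arrives at the final step of each trial.
    w = [0.0] * n_steps
    deltas_at_reward = []
    for _ in range(n_trials):
        for t in range(n_steps):
            r = 1.0 if t == n_steps - 1 else 0.0
            v_next = w[t + 1] if t + 1 < n_steps else 0.0
            delta = r + gamma * v_next - w[t]   # TD prediction error
            w[t] += alpha * delta
            if t == n_steps - 1:
                deltas_at_reward.append(delta)
    return w, deltas_at_reward
```

As training proceeds, the prediction error at the time of reward shrinks toward zero, mirroring the textbook shift of the phasic dopamine-like signal away from the reward; the biological criticism above is that the delay-line states this scheme assumes have no clear substrate at seconds-long intervals.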
The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. Using mathematical models of pyramidal neurons located in neocortical layers 2/3, whose reconstructed dendritic arborization possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses distributed uniformly along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and function of effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization in general and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals decreasing with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials.
Such a repertoire (consisting of two patterns of the activity, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of patterns of output discharges of different complexities, including stochastic ones. A relatively short length of the apical dendrite subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites and a reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in the processes of formation of the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated.
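Theta- and space-modulated inputs of this kind are commonly generated as inhomogeneous Poisson spike trains. A minimal sketch using the thinning method (the rate function and all parameters below are illustrative assumptions, not the model's actual inputs):

```python
import math
import random

def inhomogeneous_poisson(rate_fn, rate_max, duration, seed=0):
    """Spike times from a time-varying rate via thinning (Lewis-Shedler).

    Candidate events are drawn from a homogeneous Poisson process at
    rate_max and kept with probability rate_fn(t) / rate_max.
    """
    rng = random.Random(seed)
    t, spikes = 0.0, []
    while True:
        t += rng.expovariate(rate_max)  # next candidate event time
        if t > duration:
            return spikes
        if rng.random() < rate_fn(t) / rate_max:
            spikes.append(t)

# Example: an 8 Hz theta-modulated rate oscillating between 5 and 25 Hz
theta = lambda t: 20.0 * (1.0 + math.cos(2.0 * math.pi * 8.0 * t)) / 2.0 + 5.0
spikes = inhomogeneous_poisson(theta, rate_max=25.0, duration=10.0)
```

Thinning is exact as long as `rate_max` bounds the rate function from above; spatial (place-field) modulation can be added by multiplying in a position-dependent envelope.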
The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using nonhomogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent from rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of the involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in Computational Neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org.
In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed data of morphologies when generating those cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators, such as the NEURON Simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new and available data can be included into NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation.
New simulator architectures such as GENESIS 3 allow the use of standard well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising recent developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data needs both standard data formats and public repositories. Standardization and a public storage system are also important for modeling because they make it possible to share models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks like numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor.
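On an unbranched cable, the direct Gaussian elimination referred to above specializes to the classic tridiagonal (Thomas) solve: one forward-elimination sweep and one back-substitution sweep. A self-contained sketch (the example system is invented for illustration, not drawn from NEURON's internals):

```python
def solve_tridiagonal(a, b, c, d):
    """Solve a tridiagonal system: a = sub-, b = main, c = super-diagonal, d = rhs.

    Gaussian elimination specialized to the banded structure that an
    implicit time step of the cable equation produces on an unbranched cable.
    """
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                       # forward elimination
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):              # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# 4-compartment example: a diagonally dominant system typical of implicit cable steps
x = solve_tridiagonal(a=[0.0, -1.0, -1.0, -1.0],
                      b=[3.0, 3.0, 3.0, 3.0],
                      c=[-1.0, -1.0, -1.0, 0.0],
                      d=[1.0, 1.0, 1.0, 1.0])
```

The cost is O(n) per time step; branch points and the multisplit scheme of the abstract add bookkeeping but preserve this direct-elimination structure.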
It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources, and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum.
The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. However, the range of current strength needed for a depolarization block could be easily reached with a random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block. We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, impeding conventional analysis methods from keeping pace.
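A common remedy for such data growth is to reduce each raw trace to a handful of summary features before analysis. A minimal sketch extracting firing rate and interspike-interval statistics from a list of spike times (the feature names are illustrative, not PANDORA's actual measures):

```python
import statistics

def spike_features(spike_times, duration):
    """Reduce a spike-time list (seconds) to a few summary features.

    Feature names here are illustrative placeholders; a real analysis
    toolbox defines its own, much richer, measure set.
    """
    isis = [b - a for a, b in zip(spike_times, spike_times[1:])]
    return {
        "firing_rate_hz": len(spike_times) / duration,
        "mean_isi_s": statistics.mean(isis) if isis else None,
        # coefficient of variation of the interspike intervals
        "cv_isi": (statistics.stdev(isis) / statistics.mean(isis))
                  if len(isis) > 1 else None,
    }

features = spike_features([0.1, 0.3, 0.5, 0.9, 1.5], duration=2.0)
```

Rows of such feature dictionaries, one per recording or simulation run, are what a relational store can index and query orders of magnitude faster than the raw traces.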
Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable. PANDORA is available to be freely used and modified because it is open-source ( http://software.incf.org/software/pandora/home ). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then we examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop. Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics.
First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test the sufficiency of multiple, independent spike initiation zones in the C-interneuron to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made to understand cerebellar cortical processing, but neural coding at the output stage of the cerebellum in the deep cerebellar nuclei (DCN) remains poorly understood. The DCN are unlikely to be just a relay nucleus because Purkinje cell inhibition has to be turned into an excitatory output signal, and DCN neurons exhibit complex intrinsic properties. In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals.
Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. We generated a good match of our simulations with DCN current clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow deinactivation of rebound currents. Fast rebound bursts due to T-type calcium current and slow rebounds due to persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by HCN current. Our findings suggest that active properties of DCN neurons could play a crucial role for signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior.
This open source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation channel gating, the mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform. Abstract No Abstract Available Abstract We have developed a simulation tool within the NEURON simulator to assist in organization, verification, and analysis of simulations. This tool, denominated Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB.
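The SELECT-style querying that NQS provides inside NEURON can be mimicked with any relational engine. An illustrative sqlite3 sketch relating simulation parameters to an extracted output measure (the schema, column names, and values are invented for the example):

```python
import sqlite3

# In-memory database relating model parameters to a measured output.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE sims (
    sim_id INTEGER PRIMARY KEY,
    g_na REAL,          -- sodium conductance parameter (illustrative)
    g_k REAL,           -- potassium conductance parameter (illustrative)
    firing_rate REAL    -- extracted output measure
)""")
rows = [(1, 120.0, 36.0, 55.0),
        (2, 100.0, 36.0, 40.0),
        (3, 120.0, 20.0, 72.0)]
conn.executemany("INSERT INTO sims VALUES (?, ?, ?, ?)", rows)

# Query: which parameter sets produced a firing rate above 50 Hz?
fast = conn.execute(
    "SELECT sim_id, g_na, g_k FROM sims "
    "WHERE firing_rate > 50 ORDER BY firing_rate DESC").fetchall()
```

Relating outputs back to the parameters that produced them, as the abstract describes, is then a matter of ordinary WHERE clauses and joins.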
Computational simulation of the input-output relationship in hippocampal pyramidal cells Journal of Computational Neuroscience Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on the knowledge of the three-dimensional structure of the protein of interest. However it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess which is the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis.
Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. Programming language: Perl/BioPerl (program); mySQL, Perl DBI and DBD modules (database); php, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI Blast package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have encountered problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that often it is futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies.
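The numerical sensitivity noted above is easy to demonstrate: integrating the same leaky integrate-and-fire model with two different time steps shifts the spike times measurably (the model and all parameters below are illustrative, not taken from the study):

```python
def lif_spike_times(dt, T=1.0):
    """Forward-Euler leaky integrate-and-fire; returns spike times.

    Parameters (tau, threshold, drive) are illustrative assumptions.
    """
    tau, v_th, v_reset, I = 0.02, 1.0, 0.0, 1.5
    v, t, spikes = 0.0, 0.0, []
    while t < T:
        v += dt * (-v + I) / tau   # Euler step of dv/dt = (-v + I)/tau
        t += dt
        if v >= v_th:
            spikes.append(t)
            v = v_reset
    return spikes

coarse = lif_spike_times(dt=1e-3)
fine = lif_spike_times(dt=1e-5)
# Both runs integrate the same equations, yet the final spike times drift apart.
drift = abs(coarse[-1] - fine[-1])
```

Both integrations are legitimate, yet their spike trains diverge with simulated time; in recurrent networks such per-step differences compound, which is why exact bitwise replicability across simulators is usually unattainable and reproduction criteria must be defined statistically.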
Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse.
Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages, or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell includes organelles and intracellular solutions, and it is separated from the outer environment of extracellular fluid surrounding it by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charges and concentrations cause, respectively, electrical and chemical potentials, driving the transport of materials across the membrane. Here we look at the cores of mathematical modeling associated with the dynamic behaviors of single cells as well as the bases of numerical simulations. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms.
This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program on which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command line interface for manipulating large-scale models, based on Python, a powerful scripting language. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the information necessary and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results.
In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method and make available computer scripts which automate the process of citation entry. We use an open citation project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. For, in order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain ( ${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance.
We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front ends demands considerable bioinformatics skill. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques have become increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in life sciences. Enriched by numerous selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field.
We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or not) existing empirical results and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. 
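The NeuroML abstract above describes a declarative, XML-based encoding of cells with morphologies and channel densities. The fragment below sketches what such a document looks like, built programmatically with Python's ElementTree; the element and attribute names are simplified stand-ins inspired by NeuroML, not the actual NeuroML schema, so treat the structure as illustrative only.

```python
import xml.etree.ElementTree as ET

# Build a toy XML cell description: one soma segment, one dendrite
# segment attached to it, and a channel density on the cell. Names are
# illustrative, not schema-valid NeuroML.
root = ET.Element("neuroml", id="exampleDocument")
cell = ET.SubElement(root, "cell", id="pyramidal0")
morph = ET.SubElement(cell, "morphology")
ET.SubElement(morph, "segment", id="0", name="soma")
ET.SubElement(morph, "segment", id="1", name="dend0", parent="0")
ET.SubElement(cell, "channelDensity",
              ionChannel="na", condDensity="120 mS_per_cm2")

xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)
```

The point of such a format is that the same document can be consumed by multiple simulators, which is what makes the standardisation effort valuable.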
Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib ( The Handbook of Brain Theory and Neural Networks , pp. 741–745, 2003); Arbib and Grethe ( Computing the Brain: A Guide to Neuroinformatics , 2001); Ascoli ( Computational Neuroanatomy: Principles and Methods , 2002); Bower and Bolouri ( Computational Modeling of Genetic and Biochemical Networks , 2001); Hines et al. ( J. Comput. Neurosci. 17 , 7–11, 2004); Shepherd et al. ( Trends Neurosci. 21 , 460–468, 1998); Sivakumaran et al. ( Bioinformatics 19 , 408–415, 2003); Smolen et al. ( Neuron 26 , 567–580, 2000); Vadigepalli et al. ( OMICS 7 , 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv ( J. Neurophysiol. 71 , 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu .
Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to deepen their understanding of published models by allowing them to run the models as published and to build on them for further research. Its use can help the field of computational neuroscience enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA-A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA-A reversal potential (E_GABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity.
Ideally, such interfaces should provide high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic type (pyramidal cell dendrites and olivary axonal projection terminals), and labeling method (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida.
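The "binary tree of branching cylinders" representation described above is commonly serialized in the SWC text format (one line per sample: id, type, x, y, z, radius, parent), the de facto standard used by NeuroMorpho.org. Whether Neuron_Morpho emits exactly this format is an assumption here; the sketch below parses a tiny SWC fragment and computes one morphometric measure, total neurite length.

```python
import math

# A four-point toy morphology: a soma sample followed by three dendrite
# samples. SWC columns: id, type, x, y, z, radius, parent (-1 = root).
SWC = """\
1 1 0.0 0.0 0.0 5.0 -1
2 3 0.0 10.0 0.0 1.0 1
3 3 0.0 20.0 0.0 0.8 2
4 3 8.0 26.0 0.0 0.6 3
"""

def total_neurite_length(swc_text):
    """Sum the Euclidean distances from each sample to its parent."""
    pts = {}
    total = 0.0
    for line in swc_text.strip().splitlines():
        i, t, x, y, z, r, parent = line.split()
        pts[int(i)] = (float(x), float(y), float(z))
        p = int(parent)
        if p != -1:
            total += math.dist(pts[int(i)], pts[p])
    return total

print(total_neurite_length(SWC))  # 10 + 10 + 10 = 30.0
```

Morphometric comparisons like the Neuron_Morpho vs. Neurolucida test above reduce to computing such measures on both reconstructions and checking for systematic differences.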
None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstruction systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions coming to the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductances of their dendritic membrane. This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli.
These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in Biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives.
One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS-US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive, biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. On mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of dendritic morphology on the passive electrical transfer characteristics of the dendrites and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses uniformly distributed along the dendrites.
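The dopamine-like prediction-error signal that the TD abstract above refers to can be illustrated with the textbook TD(0) algorithm on a single conditioning trial structure: a CS at the first time step and a reward (US) at the last. Over repeated trials the prediction error at reward time shrinks and value builds up at the CS, which is the signature the paper's LSTM model reproduces. This sketch is the standard algorithm, not the paper's network model.

```python
# One "trial" is T time steps; reward of 1.0 arrives at the final step.
T, alpha, gamma = 10, 0.1, 1.0
V = [0.0] * (T + 1)          # learned value of each within-trial time step

def run_trial():
    """Run one trial of TD(0), returning the prediction errors (deltas)."""
    deltas = []
    for t in range(T):
        r = 1.0 if t == T - 1 else 0.0
        delta = r + gamma * V[t + 1] - V[t]   # reward-prediction error
        V[t] += alpha * delta
        deltas.append(delta)
    return deltas

first = run_trial()
for _ in range(500):
    last = run_trial()
# Error at reward time: large on the first trial, near zero after training;
# value at the CS (V[0]) approaches the full predicted reward.
print(first[-1], last[-1], V[0])
```

The shift of the error signal from US time to CS time over training is the classic correspondence with recorded dopamine activity.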
For this purpose, we calculated morphometric characteristics of the size, complexity, and metric asymmetry of the reconstructed dendritic arborization in general, and of its apical and basal subtrees, as well as the efficacy of somatopetal current transfer (with estimates of the sensitivity of this efficacy to changes in the uniform membrane conductance). Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals decreasing with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two patterns of activity, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of patterns of output discharges of different complexities, including stochastic ones. A relatively short length of the apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are morphological factors responsible for the predominance of periodic spike doublets.
As a result, there were two combinations of different electrical states of the sites of dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites, and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in shaping the activity of layer 2/3 pyramidal neurons and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta-modulated and space-modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using non-homogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of involvement in spatial tasks.
Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed-rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in Computational Neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org. In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed data on morphologies when generating cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators such as the NEURON Simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new available data can be included in NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks.
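The three-channel Hodgkin–Huxley scheme named above (leak, transient Na+, delayed-rectifier K+) can be sketched as a single-compartment forward-Euler simulation. Parameters are the classic squid-axon values shifted so that rest sits near -65 mV; the simulation settings (dt, duration, injected current) are illustrative choices, not values from any of the abstracts.

```python
import math

C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3      # uF/cm^2 and mS/cm^2
ENa, EK, EL = 50.0, -77.0, -54.4            # reversal potentials, mV

# Standard HH rate functions (modern -65 mV resting convention).
def a_m(V): return 0.1 * (V + 40) / (1 - math.exp(-(V + 40) / 10))
def b_m(V): return 4.0 * math.exp(-(V + 65) / 18)
def a_h(V): return 0.07 * math.exp(-(V + 65) / 20)
def b_h(V): return 1 / (1 + math.exp(-(V + 35) / 10))
def a_n(V): return 0.01 * (V + 55) / (1 - math.exp(-(V + 55) / 10))
def b_n(V): return 0.125 * math.exp(-(V + 65) / 80)

def simulate(I=10.0, dt=0.01, t_end=50.0):
    """Return the peak voltage reached under constant current I (uA/cm^2)."""
    V, m, h, n = -65.0, 0.05, 0.6, 0.32     # rest state
    Vmax = V
    for _ in range(int(t_end / dt)):
        INa = gNa * m**3 * h * (V - ENa)
        IK = gK * n**4 * (V - EK)
        IL = gL * (V - EL)
        V += dt * (I - INa - IK - IL) / C
        m += dt * (a_m(V) * (1 - m) - b_m(V) * m)
        h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
        n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
        Vmax = max(Vmax, V)
    return Vmax

print(simulate())  # with 10 uA/cm^2 the peak overshoots 0 mV: a spike
```

Adding further channel types, as the chapter goes on to do, amounts to appending more current terms of the same `g * gates * (V - E)` form.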
Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising recent developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems.
However, sharing such large amounts of frequently heterogeneous and distributed experimental data needs both standard data formats and public repositories. Standardization and a public storage system are also important for modeling, because they make it possible to share models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks like numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available.
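The direct Gaussian elimination on a branched neuron that the multicompartment abstract above relies on exploits tree structure: each compartment couples only to its parent, so one leaf-to-root elimination pass and one root-to-leaf back-substitution solve the system exactly (the Hines ordering used in NEURON). The sketch below shows the serial version on a toy Y-shaped tree; the parallel variant described above splits the tree at a few connection points. The numbers are illustrative.

```python
def solve_tree(d, a, b, parent):
    """Solve a symmetric tree-structured system.

    Nodes are ordered so parent[i] < i, node 0 is the root; d[i] is the
    diagonal entry, a[i] the (symmetric) coupling of node i to its parent.
    """
    d, b = d[:], b[:]                    # keep the caller's arrays intact
    n = len(d)
    for i in range(n - 1, 0, -1):        # eliminate from the leaves upward
        f = a[i] / d[i]
        d[parent[i]] -= f * a[i]
        b[parent[i]] -= f * b[i]
    x = [0.0] * n
    x[0] = b[0] / d[0]
    for i in range(1, n):                # back-substitute downward
        x[i] = (b[i] - a[i] * x[parent[i]]) / d[i]
    return x

# Small Y-shaped tree: soma (node 0) with two daughter branches.
parent = [-1, 0, 1, 1, 0]
d = [4.0, 5.0, 3.0, 3.0, 3.0]           # diagonal terms
a = [0.0, -1.0, -1.0, -1.0, -1.0]       # coupling to parent
b = [1.0, 0.0, 0.0, 0.0, 0.0]
x = solve_tree(d, a, b, parent)
print(x)
```

Because each node is touched a constant number of times, the solve is O(n), the same cost as the unbranched (tridiagonal) case.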
However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources, and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility of including system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models.
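The forward-modeling idea behind the iCSD method above can be shown in a deliberately tiny one-dimensional toy: assume the CSD lives at known grid positions, build the forward matrix F that maps source amplitudes to electrode potentials via the point-source solution phi = C / (4*pi*sigma*r) (with a small offset standing in for finite source size), then estimate the CSD by inverting F. The geometry and numbers are illustrative assumptions, not the paper's 2D spline variant.

```python
import numpy as np

sigma, h = 0.3, 0.05                        # conductivity; source "radius"
src = np.array([0.0, 1.0, 2.0, 3.0])        # source positions (mm)
ele = src.copy()                            # electrodes on the same grid

# Forward matrix: potential at electrode j from a unit source at i.
F = 1.0 / (4 * np.pi * sigma *
           np.sqrt((ele[:, None] - src[None, :]) ** 2 + h ** 2))

true_csd = np.array([0.0, 1.0, -1.0, 0.0])  # a dipole-like source pattern
phi = F @ true_csd                          # forward model: potentials
est_csd = np.linalg.solve(F, phi)           # inverse step: recover the CSD
print(np.round(est_csd, 6))
```

The practical advantages listed in the abstract (explicit assumptions, boundary estimates, system-specific information) all enter through how F is constructed before this inversion step.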
However, the range of current strength needed for depolarization block could easily be reached with the random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to depolarization block. We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, preventing conventional analysis methods from keeping pace. Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, making analysis simpler and highly interoperable.
PANDORA is free to use and modify because it is open-source ( http://software.incf.org/software/pandora/home ). Abstract This chapter is devoted to a detailed discussion of several numerical simulations wherein we use a model to generate data, and then examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known by the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop. Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics. First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test the sufficiency of multiple, independent spike initiation zones in the C-interneuron to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground.
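The database-driven analysis that the PANDORA abstract above describes (reduce raw recordings to summary features such as firing rate, store them relationally, then query the derived table instead of the raw traces) can be sketched in a few lines with Python's built-in sqlite3. PANDORA itself is a Matlab toolbox; the schema, names, and numbers below are invented for illustration.

```python
import sqlite3

# Raw data stand-in: spike times (s) from two cells over a 2 s recording.
recordings = {
    "cell_a": [0.1, 0.4, 0.9, 1.3, 1.8],
    "cell_b": [0.5, 1.5],
}

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE features (cell TEXT, firing_rate_hz REAL)")
for cell, spikes in recordings.items():
    rate = len(spikes) / 2.0               # duration assumed to be 2 s
    db.execute("INSERT INTO features VALUES (?, ?)", (cell, rate))

# Query the compact derived table rather than re-reading the raw traces.
rows = db.execute(
    "SELECT cell FROM features WHERE firing_rate_hz > 2 ORDER BY cell"
).fetchall()
print(rows)  # only cell_a fires above 2 Hz
```

The storage and speed gains claimed in the abstract come from exactly this reduction: queries touch a small feature table rather than the full raw dataset.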
Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made into understanding cerebellar cortical processing, but neural coding at the output stage of the cerebellum, in the deep cerebellar nuclei (DCN), remains poorly understood. The DCN are unlikely to be a mere relay nucleus, because Purkinje cell inhibition has to be turned into an excitatory output signal, and DCN neurons exhibit complex intrinsic properties. In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals. Computer modeling provides an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. Our simulations produced a good match with DCN current-clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates, and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow deinactivation of rebound currents.
Fast rebound bursts due to T-type calcium current and slow rebounds due to persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by HCN current. Our findings suggest that active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior. This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multi-compartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation of channel gating, the mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform.
Abstract We have developed a simulation tool within the NEURON simulator to assist in organization, verification, and analysis of simulations. This tool, denominated the Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single-cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB. Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems. This is similar to how single cells function as systems of protein networks, where, for example, ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from how it behaves alone. Dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. Interactions are made by different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts to produce heart beats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics representing neural information.
In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of the limbs can be performed. This chapter illustrates several mathematical techniques through examples from modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data. Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation. For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice that of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits as expected in mature animals.
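The AMPA response parameters quoted above (peak conductances of 0.9 vs. 0.5 nS, times-to-peak of 1.0 vs. 3.3 ms) can be pictured with a simple conductance waveform. The study itself fits detailed synaptic models; the alpha function below is only a minimal stand-in parameterized by those two numbers, and the function and variable names are illustrative.

```python
import numpy as np

def alpha_conductance(t, g_peak, t_peak):
    """Alpha-function synaptic conductance: rises to g_peak at t_peak,
    then decays; a common minimal waveform for AMPA-type responses."""
    t = np.asarray(t, dtype=float)
    g = g_peak * (t / t_peak) * np.exp(1.0 - t / t_peak)
    return np.where(t >= 0.0, g, 0.0)

t = np.linspace(0.0, 20.0, 2001)                       # time in ms
g_pp = alpha_conductance(t, g_peak=0.9, t_peak=1.0)    # perforant-path-like, nS
g_ac = alpha_conductance(t, g_peak=0.5, t_peak=3.3)    # assoc./commissural-like, nS
```

Plotting the two traces makes the faster, larger perforant-path response immediately visible.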
Predictions were made for passive-cell current-clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting is not required. Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron. Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials.
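The core representation used by this approach, subthreshold membrane potential as presynaptic population activity filtered with a fixed kernel, can be sketched in a few lines. This is not the CuBIC inference step itself, only the leaky-integrator forward model; the kernel time constant, synaptic weight, and Poisson input statistics are illustrative assumptions.

```python
import numpy as np

def leaky_integrator_trace(spike_counts, tau=10.0, dt=1.0, weight=0.1):
    """Filter a presynaptic population spike-count series with an
    exponential kernel, as a leaky integrator neuron model would."""
    kernel_t = np.arange(0.0, 10 * tau, dt)
    kernel = weight * np.exp(-kernel_t / tau)
    return np.convolve(spike_counts, kernel)[: len(spike_counts)] * dt

rng = np.random.default_rng(0)
counts = rng.poisson(lam=5.0, size=1000)  # pooled population spike counts per bin
v = leaky_integrator_trace(counts)        # synthetic subthreshold fluctuation
```

In the actual method, cumulants of a trace like `v` are related back to the cumulants of the pooled input to bound the order of correlation present.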
Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet the precise input–output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites. We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature. To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns and the corresponding firing-frequency input–output function was logarithmic. The regular/irregular nature of the input synaptic intervals was only reflected in the regularity of output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input–output relationships.
Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input–output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Axonal site of spike initiation enhances auditory coincidence detection. Nature. Neurons initiate spikes in the axon initial segment or at the first node in the axon. However, it is not yet understood how the site of spike initiation affects neuronal activity and function. In the nucleus laminaris of birds, neurons behave as coincidence detectors for sound source localization and encode interaural time differences (ITDs) separately at each characteristic frequency (CF). Here we show, in the nucleus laminaris of the chick, that the site of spike initiation in the axon is arranged at a distance from the soma so as to achieve the highest ITD sensitivity at each CF. Na+ channels were not found in the soma of high-CF (2.5–3.3 kHz) and middle-CF (1.0–2.5 kHz) neurons but were clustered within a short segment of the axon separated by 20–50 μm from the soma; in low-CF (0.4–1.0 kHz) neurons they were clustered in a longer stretch of the axon closer to the soma. Thus, neurons initiate spikes at a more remote site as the CF of neurons increases. Consequently, the somatic amplitudes of both orthodromic and antidromic spikes were small in high-CF and middle-CF neurons and were large in low-CF neurons. Computer simulation showed that the geometry of the initiation site was optimized to reduce the threshold of spike generation and to increase the ITD sensitivity at each CF.
Especially in high-CF neurons, a distant localization of the spike initiation site improved the ITD sensitivity because of electrical isolation of the initiation site from the soma and dendrites, and because of reduction of Na+-channel inactivation by attenuating the temporal summation of synaptic potentials through low-pass filtering along the axon. Antidromic propagation of action potentials in branched axons: implications for the mechanisms of action of deep brain stimulation. Journal of Computational Neuroscience. Summary This chapter constitutes mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was a part of the Annual Computational Neuroscience Meeting CNS*2007 that took place in July 2007 in Toronto, Canada (http://www.cnsorg.org). The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Selected discussed topics, including a review of some current and potential applications of Computational Intelligence (CI) in electrophysiology, database and electrophysiological data exchange platforms, languages, and formats, as well as exemplary analysis problems, are presented in this chapter. The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers, whose interest will be sparked by its contents. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. We are particularly interested in the slower timescale dynamics that determine overall excitability, and set the stage for transient episodes of persistent oscillations, working memory, or seizures.
In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley-type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics on the stability of persistent activity. We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions. Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in the neonatal brain or spinal cord, up states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potentials (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from the EAP data sampled by an extracellular probe at multiple independent recording locations. We used a dipole for the model source because there is extensive evidence that it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell–probe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole-moment optimization where the dipole location is fixed, and a second, nonlinear, global optimization of the source location.
We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis. We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among various model error contributions to the residual, we address especially the error in probe geometry, and the extent to which it biases estimates of the dipole parameters. This dipole characterization method can be applied to any recording technique that has the capability of taking multiple independent measurements of the same single units. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics. We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodium–potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena.
These models reveal several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy. Abstract This paper introduces dyadic brain modeling – the simultaneous, computational modeling of the brains of two interacting agents – to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the 'dyad'). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists, and modelers that will help speed the emergence of what may be called comparative neuroprimatology. Abstract The phase response curve (PRC) reflects the dynamics of the interplay between diverse intrinsic conductances that lead to spike generation. PRCs measure the spike-time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddle-node) dynamics, while type II dynamics (subcritical Hopf) yield a biphasic PRC with both negative and positive lobes.
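The PRC measurement just described can be made concrete numerically: perturb a tonically spiking model at a given phase and record the normalized shift of the next spike time. The sketch below uses a leaky integrate-and-fire neuron, which (like a type I cell) yields a purely positive PRC for depolarizing kicks; it is not one of the models from the study, and all parameter values are arbitrary illustrations.

```python
def lif_prc(phase, eps=0.05, i_drive=1.5, tau=10.0, v_th=1.0, dt=0.001):
    """Normalized phase advance caused by a small depolarizing kick `eps`
    delivered at fraction `phase` of the spike cycle of a tonically
    firing leaky integrate-and-fire neuron (explicit Euler)."""
    def period(kick_time=None):
        v, t, kicked = 0.0, 0.0, False
        while v < v_th:
            v += dt * (i_drive - v) / tau
            t += dt
            if kick_time is not None and not kicked and t >= kick_time:
                v += eps          # instantaneous voltage perturbation
                kicked = True
        return t
    t0 = period()                 # unperturbed interspike interval
    return (t0 - period(t0 * phase)) / t0

prc = [lif_prc(p) for p in (0.2, 0.5, 0.8)]   # positive at every phase
```

For this model the advance grows with phase, since a late kick decays less before threshold; a biphasic (type II) PRC would instead show delays at early phases.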
Previous computational work hypothesized that cholinergic modulation of the M-type potassium current can switch a neuron with type II dynamics to type I dynamics. We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with down-regulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential specific K-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to down-regulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent AHP current (I_AHP) is down-regulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified by low ACh levels, while the PRC became type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may shift neuronal responses between "resonator" and "integrator" modes. Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double-precision values by each subtree at each time step. Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For compute-bound simulations, load balance results in almost ideal runtime scaling. Application of the cell-splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing.
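The two-values-per-step communication pattern in the cell-splitting abstract above can be illustrated on a toy problem. The sketch integrates a leaky passive cable with explicit Euler, once whole and once split into two halves that exchange only their two border voltages each step, and the two runs agree exactly. NEURON's actual splitting machinery works with implicit integration and is considerably more involved, so the compartment count, parameters, and function names here are purely illustrative.

```python
import numpy as np

def step(v, left, right, dt=0.01, g=1.0):
    """One explicit-Euler step of a leaky passive cable segment; `left`
    and `right` are the neighbor (ghost) voltages at the segment borders."""
    vp = np.concatenate(([left], v, [right]))
    lap = vp[:-2] - 2.0 * v + vp[2:]      # discrete second difference
    return v + dt * (g * lap - v)

def run(n=20, steps=200, split_at=None):
    v = np.zeros(n)
    v[0] = 1.0                            # depolarize one end
    if split_at is None:
        for _ in range(steps):
            v = step(v, v[0], v[-1])      # sealed (reflecting) ends
        return v
    a, b = v[:split_at].copy(), v[split_at:].copy()
    for _ in range(steps):
        # the only inter-process "communication": two scalars per time step
        a_border, b_border = a[-1], b[0]
        a = step(a, a[0], b_border)
        b = step(b, a_border, b[-1])
    return np.concatenate([a, b])

whole, halves = run(), run(split_at=10)   # identical trajectories
```

Because each half sees the other only through one boundary value per step, the halves could run on separate processors while reproducing the monolithic result.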
Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts. It has recently been reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. The Ca2+-activated K+ currents (I K(Ca)) of cultured human cardiac fibroblasts were characterized in this study. In whole-cell configuration, depolarizing pulses evoked outwardly rectifying I K(Ca) in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance, Ca2+-activated K+ (BKCa) channel with a single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BKCa channels. The dynamic Luo–Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; however, they were electrically inert with no gap-junctional coupling. The simulation predicts that changes in gap-junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. I K(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes. This study demonstrates that a BKCa channel is functionally expressed in human cardiac fibroblasts. The activity of these BKCa channels in human cardiac fibroblasts may contribute to the functional activities of heart cells through transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of their solutions and the transitions between them as parameters are varied.
To address this issue, we extend a novel model reduction method, based on "scales of dominance," to multi-compartment models. We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation: tonic spiking, intermediate bursting, and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multi-compartment conductance-based model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli. Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb.
While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement. Results We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. Conclusion Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement, and underlies the concentration-independence of odor quality representations. Abstract Mathematical neuronal models are normally expressed using differential equations. The Parker–Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models.
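The flavor of the Parker–Sochacki approach can be shown on the simplest polynomial ODE. For y′ = y², the Maclaurin coefficients of the local solution obey a convolution recurrence, and each step sums the series to whatever order the tolerance demands. The toy integrator below is a sketch under those assumptions, not the paper's implementation; the step size, tolerance, and test equation are arbitrary choices.

```python
def ps_step(y, h, tol=1e-12, max_order=40):
    """One Parker–Sochacki step for y' = y**2: build the Maclaurin
    coefficients a_k of the local solution and sum the series at t = h.
    The order adapts: stop once the next term falls below `tol`."""
    a = [y]
    out = y
    for k in range(max_order):
        # coefficient recurrence: (k+1) * a_{k+1} = sum_j a_j * a_{k-j}
        a.append(sum(a[j] * a[k - j] for j in range(k + 1)) / (k + 1))
        term = a[-1] * h ** (k + 1)
        out += term
        if abs(term) < tol:
            break
    return out

def integrate(y0=1.0, t_end=0.5, h=0.05):
    y, t = y0, 0.0
    while t < t_end - 1e-12:
        y = ps_step(y, h)
        t += h
    return y

y = integrate()   # exact solution is 1/(1 - t), so y(0.5) = 2
```

Note how the error control comes from truncating the power series, not from shrinking the time step, which is the property the abstract highlights.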
Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker–Sochacki method to the Izhikevich 'simple' model and a Hodgkin–Huxley-type neuron, comparing the results with those obtained using the Runge–Kutta and Bulirsch–Stoer methods. Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC–granule cell (GC) circuit with and without gap junctions between GoCs. Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change the oscillation frequency or the mean firing rate of either GoCs or GCs. Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs.
The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley-type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours or days of supercomputing time. Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. In consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics, and intrinsic noise. We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data. Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes from development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models.
Recent studies show that both the conductance and kinetics of these receptors change in a voltage-dependent manner in some parts of the brain. Therefore, several models have been introduced to simulate their current. However, on the one hand, kinetic models (which are able to simulate these voltage-dependent phenomena) are computationally expensive for modeling of large neural networks. On the other hand, classic exponential models, which are computationally less expensive, are not able to simulate the voltage dependency of these receptors accurately. In this study, we have modified these classic models to endow them with voltage-dependent conductance and time constants. Temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of NMDA receptor behavior using only three to four differential equations, which is significantly fewer than in previous kinetic models. Consequently, it seems that our model is both fast and physiologically plausible, and is therefore a suitable candidate for the modeling of large neural networks. Abstract Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons. This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal cell firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that there is an obvious rhythmic gating effect of the γ-oscillating interneuron networks on the pyramidal neuron's signal transmission.
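For contrast with the kinetic schemes discussed in the NMDA-receptor abstract above, a classic exponential-style NMDA model can be written in a few lines: a double-exponential conductance time course multiplied by the standard Jahr–Stevens magnesium-block factor. This baseline is voltage-dependent only through the block, with fixed time constants, which is exactly the limitation the modified models address; all parameter values below are placeholders.

```python
import math

def nmda_current(t, v, g_max=1.0, tau_rise=2.0, tau_decay=100.0,
                 mg=1.0, e_rev=0.0):
    """Classic double-exponential NMDA synaptic current (t in ms, v in mV)
    with the Jahr–Stevens Mg2+-block factor; parameters are placeholders."""
    b = 1.0 / (1.0 + mg / 3.57 * math.exp(-0.062 * v))   # unblock, 0..1
    g = g_max * (math.exp(-t / tau_decay) - math.exp(-t / tau_rise))
    return g * b * (v - e_rev)

i_rest = nmda_current(t=10.0, v=-70.0)   # strongly blocked near rest
i_dep  = nmda_current(t=10.0, v=-20.0)   # largely unblocked when depolarized
```

The voltage dependence of the block, not of the kinetics, is what makes this classic form cheap but only approximately accurate.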
This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and appropriate inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of the coherence and oscillation frequency of network activities. Abstract Background Propagation of simulated action potentials (APs) was previously studied in short single chains and in two-dimensional sheets of myocardial cells [1–3]. The present study was undertaken to examine propagation in a long single chain of cells of various lengths, and with varying numbers of gap-junction (gj) channels, and to compare propagation velocity with cable properties such as the length constant (λ). Methods and Results Simulations were carried out using the PSpice program as previously described. When the electric field (EF) mechanism was dominant (0, 1, and 10 gj channels), the longer the chain length, the faster the overall velocity (θ_ov). There seems to be no simple explanation for this phenomenon. In contrast, when the local-circuit current mechanism was dominant (100 gj channels or more), θ_ov was slightly slowed with lengthening of the chain. Increasing the number of gj channels produced an increase in θ_ov and caused the firing order to become more uniform. The end-effect was more pronounced at longer chain lengths and at greater numbers of gj channels. When there were no or only few gj channels (namely 0, 10, or 30), the voltage change (ΔV_m) in the two cells contiguous to the current-injected cell (#51), namely cells #50 and #52, was nearly zero; i.e., there was a sharp discontinuity in voltage between the adjacent cells.
When there were many gj channels (e.g., 300, 1000, 3000), there was an exponential decay of voltage on either side of the injected cell, with the length constant (λ) increasing at higher numbers of gj channels. The effect of increasing the number of gj channels on increasing λ was relatively small compared to the larger effect on θ_ov. θ_ov became very non-physiological at 300 gj channels or higher. Conclusion Thus, when there were only 0, 1, or 10 gj channels, θ_ov increased with increasing chain length, whereas at 100 gj channels or higher, θ_ov did not increase with chain length. When there were only 0, 10, or 30 gj channels, there was a very sharp decrease in ΔV_m in the two contiguous cells on either side of the injected cell, whereas at 300, 1000, or 3000 gj channels, the voltage decay was exponential along the length of the chain. The effect of increasing the number of gj channels on the spread of current was relatively small compared to the large effect on θ_ov. Abstract This article provides a demonstration of an analytical technique that can be used to investigate the causes of perceptual phenomena. The technique is based on the concept of the ideal observer, an optimal signal classifier that makes decisions that maximize the probability of a correct response. To demonstrate the technique, an analysis was conducted to investigate the role of the auditory periphery in the production of temporal masking effects. The ideal observer classified output from four models of the periphery. Since the ideal observer is the best of all possible observers, if it demonstrates masking effects, then all other observers must as well. If it does not demonstrate masking effects, then nothing about the periphery requires masking to occur, and therefore masking must arise somewhere else. The ideal observer exhibited several forward masking effects but did not exhibit backward masking, implying that the periphery has a causal role in forward but not backward masking.
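The exponential voltage decay and length constant λ described in the cable study above follow the standard steady-state infinite-cable relations; a minimal sketch, in which the gj-channel count enters only through the effective axial (junctional) resistance, with all numbers illustrative:

```python
import math

def steady_state_voltage(x, v0, lam):
    """Steady-state voltage at distance x from the injection site in an
    infinite passive cable: V(x) = V0 * exp(-|x| / lambda)."""
    return v0 * math.exp(-abs(x) / lam)

def length_constant(r_m, r_i):
    """Length constant lambda = sqrt(r_m / r_i), with r_m the membrane
    resistance and r_i the axial resistance per unit length. More gj
    channels lower the effective axial resistance and thus raise lambda,
    consistent with the simulations described above."""
    return math.sqrt(r_m / r_i)
```

Because λ scales only with the square root of the axial resistance, large changes in gj-channel number produce comparatively modest changes in λ, in line with the abstract's observation that λ grew much less than θ_ov.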
A general discussion of the strengths of the technique and supplementary equations are also included. Abstract Understanding the human brain and its function in health and disease represents one of the greatest scientific challenges of our time. In the post-genomic era, an overwhelming accumulation of new data, at all levels of exploration from DNA to human brain imaging, has been acquired. This accumulation of facts has not given rise to a corresponding increase in the understanding of integrated functions in this vast area of research, which involves a large number of fields extending from genetics to psychology. Neuroinformatics (NI) is uniquely placed at the intersection between neuroscience and information technology, and emerges as an area of critical importance for facilitating the future conceptual development of neuroscience by creating databases that transcend different organizational levels and allow for the development of different computational models, from the subcellular to the global brain level. Abstract This paper studied synaptic and dendritic integration with different spatial distributions of synapses on the dendrites of a biophysically detailed layer 5 pyramidal neuron model. It has been observed that temporally synchronous and spatially clustered synaptic inputs make dendrites perform a highly nonlinear integration. The effect of the clustering degree of the synaptic distribution on neuronal responsiveness is investigated by changing the number of top apical dendrites to which active synapses are allocated. The neuron shows maximum responsiveness to synaptic inputs with an intermediate clustering degree of spatial distribution, indicating complex interactions among dendrites in the presence of nonlinear synaptic and dendritic integration.
Abstract This paper describes a pilot query interface that has been constructed to help us explore a “concept-based” approach for searching the Neuroscience Information Framework (NIF). The query interface is concept-based in the sense that the search terms submitted through the interface are selected from a standardized vocabulary of terms (concepts) that are structured in the form of an ontology. The NIF contains three primary resources: the NIF Resource Registry, the NIF Document Archive, and the NIF Database Mediator. These NIF resources are very different in their nature and therefore pose challenges when designing a single interface from which searches can be automatically launched against all three resources simultaneously. The paper first discusses briefly several background issues involving the use of standardized biomedical vocabularies in biomedical information retrieval, and then presents a detailed example that illustrates how the pilot concept-based query interface operates. The paper concludes by discussing certain lessons learned in the development of the current version of the interface. Abstract Simulations of orientation selectivity in visual cortex have shown that layer 4 complex cells lacking orientation tuning are ideal for providing global inhibition that scales with contrast in order to produce simple cells with contrast-invariant orientation tuning (Lauritzen and Miller in J Neurosci 23:10201–10213, 2003). Inhibitory cortical cells have been shown to be electrically coupled by gap junctions (Fukuda and Kosaka in J Neurosci 120:5–20, 2003). Such coupling promotes, among other effects, spike synchronization and coordination of postsynaptic IPSPs (Beierlein et al. in Nat Neurosci 3:904–910, 2000; Galarreta and Hestrin in Nat Rev Neurosci 2:425–433, 2001).
Consequently, it was expected (Miller in Cereb Cortex 13:73–82, 2003) that electrical coupling would promote nonspecific functional responses consistent with the complex inhibitory cells seen in layer 4, which provide broad inhibition in response to stimuli of all orientations (Miller et al. in Curr Opin Neurobiol 11:488–497, 2001). This was tested using a mechanistic modeling approach. The orientation selectivity model of Lauritzen and Miller (J Neurosci 23:10201–10213, 2003) was reproduced with and without electrical coupling between complex inhibitory neurons. Although extensive coupling promotes uniform firing in complex cells, there were no detectable improvements in contrast-invariant orientation selectivity unless there were coincident changes in complex cell firing rates to offset the untuned excitatory component that grows with contrast. Thus, changes in complex cell firing rates (with or without coupling) could improve contrast-invariant orientation tuning of simple cells, whereas synchronization of complex inhibitory neurons alone could not. Abstract Coral polyps contract when electrically stimulated, and a wave of contraction travels from the site of stimulation at a constant speed. Models of coral nerve networks were optimized to match one of three different experimentally observed behaviors. To search for model parameters that reproduce the experimental observations, we applied genetic algorithms to increasingly complex models of a coral nerve net. In a first stage of optimization, individual neurons responded with spikes to multiple, but not single, pulses of activation. In a second stage, we used these neurons as the starting point for the optimization of a two-dimensional nerve net. This strategy yielded a network with parameters that reproduced the experimentally observed spread of excitation. Abstract Spike-wave discharges are a distinctive feature of epileptic seizures. So far, they have not been reported in spatially extended neural field models.
We study a space-independent version of the Amari neural field model with two competing inhibitory populations. We show that this competition leads to robust spike-wave dynamics if the inhibitory populations operate on different timescales. The spike-wave oscillations present fold/homoclinic-type bursting. From this result we predict parameters of the extended Amari system for which spike-wave oscillations produce a spatially homogeneous pattern. We propose this mechanism as a prototype of macroscopic epileptic spike-wave discharges. To our knowledge, this is the first example of robust spike-wave patterns in a spatially extended neural field model. Abstract Cortical gamma-frequency (30–80 Hz) oscillations have been suggested to underlie many aspects of cognitive function. In this paper we compare the f–I curves modulated by a gamma-frequency-modulated stimulus and by Poisson synaptic input at the distal dendrites of a layer V pyramidal neuron model. The results show that gamma-frequency distal input amplifies the sensitivity of the neural response to basal input and enhances gain modulation of the neuron. Abstract Inward-rectifying potassium (K_IR) currents in medium spiny (MS) neurons of the nucleus accumbens inactivate significantly in ~40% of the neurons but not in the rest, which may lead to differences in input processing between these two groups. Using a 189-compartment computational model of the MS neuron, we investigate the influence of this property using injected current as well as spatiotemporally distributed synaptic inputs. Our study demonstrates that K_IR current inactivation facilitates depolarization, firing frequency, and firing onset in these neurons. These effects may be attributed to the higher input resistance of the cell as well as a more depolarized resting/down-state potential induced by the inactivation of this current.
In view of the reports that dendritic intracellular calcium levels depend closely on burst strength and spike onset time, our findings suggest that inactivation of K_IR currents may offer a means of modulating both excitability and synaptic plasticity in MS neurons. Abstract Epileptic seizures in diabetic hyperglycemia (DH) are not uncommon. This study aimed to determine the acute behavioral, pathological, and electrophysiological effects of status epilepticus (SE) on diabetic animals. Adult male Sprague-Dawley rats were first divided into groups with and without streptozotocin (STZ)-induced diabetes, and then into treatment groups given normal saline (NS) (STZ-only and NS-only) or a lithium-pilocarpine injection to induce status epilepticus (STZ + SE and NS + SE). Seizure susceptibility, severity, and mortality were evaluated. Serial Morris water maze test and hippocampal histopathology results were examined before and 24 h after SE. Tetanic stimulation-induced long-term potentiation (LTP) in a hippocampal slice was recorded in a multielectrode dish system. We also used a simulation model to evaluate intracellular adenosine triphosphate (ATP) and neuroexcitability. The STZ + SE group had a significantly higher percentage of severe seizures and SE-related death, and worse learning and memory performance, than the other three groups 24 h after SE. The STZ + SE group, followed by the NS + SE group, showed the most severe neuronal loss and mossy fiber sprouting in the hippocampal CA3 area. In addition, LTP was markedly attenuated in the STZ + SE group, followed by the NS + SE group. In the simulation, increased intracellular ATP concentration promoted action potential firing. The finding that rats with DH had more brain damage after SE than rats without diabetes suggests the importance of intensively treating hyperglycemia and seizures in diabetic patients with epilepsy. Neuroinformatics is a multifaceted field. It is as broad as the field of neuroscience.
The various domains of NI may also share some common features such as databases, data mining systems, and data modeling tools. NI projects are often coordinated by user groups or research organizations. Large-scale infrastructure supporting NI development is also a vital aspect of the field. Abstract Channelrhodopsin-2 (ChR2) is a class of light-sensitive proteins that offers the ability to use light stimulation to regulate neural activity with millisecond precision. To address the limitations in the efficacy of wild-type ChR2 (ChRwt) for this objective, new variants of ChR2 that exhibit fast monoexponential photocurrent decay characteristics have recently been developed and validated. In this paper, we investigate whether the framework of a 4-state transition rate model, primarily developed to mimic the biexponential photocurrent decay kinetics of ChRwt, is warranted, as opposed to the lower-complexity 3-state model, to mimic the monoexponential photocurrent decay kinetics of the newly developed fast ChR2 variants ChETA (Gunaydin et al., Nature Neurosci. 13:387–392, 2010) and ChRET/TC (Berndt et al., Proc. Natl. Acad. Sci. 108:7595–7600, 2011). We begin by estimating the parameters of the 3-state and 4-state models from experimental data on the photocurrent kinetics of ChRwt, ChETA, and ChRET/TC. We then incorporate these models into a fast-spiking interneuron model (Wang and Buzsaki, J. Neurosci. 16:6402–6413, 1996) and a hippocampal pyramidal cell model (Golomb et al., J. Neurophysiol. 96:1912–1926, 2006) and investigate the extent to which the experimentally observed neural response to various optostimulation protocols can be captured by these models. We demonstrate that for all ChR2 variants investigated, the 4-state model implementation is better able to capture neural responses consistent with experiments across a wide range of optostimulation protocols.
We conclude by analytically investigating the conditions under which the characteristic specific to the 3-state model, namely the monoexponential photocurrent decay of the newly developed ChR2 variants, can occur in the framework of the 4-state model. Abstract In cerebellar Purkinje cells, the β4-subunit of voltage-dependent Na+ channels has been proposed to serve as an open-channel blocker giving rise to a “resurgent” Na+ current (I_NaR) upon membrane repolarization. Notably, the β4-subunit was recently identified as a novel substrate of the β-secretase BACE1, a key enzyme of the amyloidogenic pathway in Alzheimer's disease. Here, we asked whether BACE1-mediated cleavage of the β4-subunit has an impact on I_NaR and, consequently, on the firing properties of Purkinje cells. In cerebellar tissue of BACE1−/− mice, mRNA levels of Na+ channel α-subunits 1.1, 1.2, and 1.6 and of β-subunits 1–4 remained unchanged, but processing of the β4 peptide was profoundly altered. Patch-clamp recordings from acutely isolated Purkinje cells of BACE1−/− and WT mice did not reveal any differences in steady-state properties or in current densities of transient, persistent, and resurgent Na+ currents. However, I_NaR was found to decay significantly faster in BACE1-deficient Purkinje cells than in WT cells. In modeling studies, the altered time course of I_NaR decay could be replicated when we decreased the efficiency of open-channel block. In current-clamp recordings, BACE1−/− Purkinje cells displayed a lower spontaneous firing rate than normal cells. Computer simulations supported the hypothesis that the accelerated decay kinetics of I_NaR are responsible for the slower firing rate. Our study elucidates a novel function of BACE1 in the regulation of neuronal excitability that serves to tune the firing pattern of Purkinje cells and presumably other neurons endowed with I_NaR.
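The 3-state ChR2 scheme discussed in the optogenetics abstract above (closed → open → desensitized → closed) predicts a monoexponential photocurrent decay after light offset, because the open state then empties at a single rate. A forward-Euler sketch with illustrative, unfitted rate constants (not the parameters estimated in the paper):

```python
def chr2_3state(t_on, t_total, dt=1e-5, eps=5.0, gd=100.0, gr=0.5):
    """Integrate the 3-state ChR2 scheme C -> O -> D -> C with forward Euler.

    eps: light-driven C->O rate (1/s, proportional to irradiance)
    gd:  O->D closing rate (1/s); sets the monoexponential decay in the dark
    gr:  D->C recovery rate (1/s)
    Rate values are illustrative only. Returns the open-fraction time course;
    the photocurrent is proportional to the open fraction.
    """
    o, d = 0.0, 0.0
    trace = []
    for k in range(round(t_total / dt)):
        light = eps if k * dt < t_on else 0.0  # light switches off at t_on
        c = 1.0 - o - d                        # conservation of states
        do = light * c - gd * o
        dd = gd * o - gr * d
        o += do * dt
        d += dd * dt
        trace.append(o)
    return trace
```

After light offset the open fraction obeys dO/dt = -gd*O, so successive samples decay geometrically, which is the single-exponential signature of the fast variants; the 4-state model adds a second open/closed pair and hence a second decay component.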
Abstract The role of cortical feedback in the thalamocortical processing loop has been extensively investigated over the last decades. With the exception of a few cases, these studies have focused on the cortical feedback exerted onto thalamocortical relay (TC) cells of the dorsal lateral geniculate nucleus (LGN). In a previous physiological study, we showed in the cat visual system that cessation of cortical input, despite a decrease in the spontaneous activity of TC cells, increased the spontaneous firing of their recurrent inhibitory interneurons located in the perigeniculate nucleus (PGN). To identify mechanisms underlying such functional changes, we conducted a modeling study in NEURON on several networks of point neurons with varied model parameters, such as membrane properties, synaptic weights, and axonal delays. We considered six network topologies of the retino-geniculo-cortical system. All models were robust against changes of axonal delays except for the delay between the LGN feedforward interneuron and the TC cell. The best representation of the physiological results was obtained with models containing reciprocally connected PGN cells driven by the cortex and with a relatively slow decay of intracellular calcium. This strongly indicates that the thalamic reticular nucleus plays an essential role in the cortical influence over thalamocortical relay cells, while the thalamic feedforward interneurons are not essential in this process. Further, we suggest that the dependence of the activity of PGN cells on the rate of calcium removal can be one of the key factors determining an individual cell's response to elimination of cortical input. Abstract The nucleus accumbens (NAc), a critical structure of the brain reward circuit, is implicated in normal goal-directed behaviour and learning as well as pathological conditions like schizophrenia and addiction.
Its major cellular substrates, the medium spiny (MS) neurons, possess a wide variety of dendritic active conductances that may modulate excitatory postsynaptic potentials (EPSPs) and cell excitability. We examine this issue using a biophysically detailed 189-compartment stylized model of the NAc MS neuron, incorporating all the known active conductances. We find that, of all the active channels, inward-rectifying K+ (K_IR) channels play the primary role in modulating the resting membrane potential (RMP) and EPSPs in the down-state of the neuron. Reduction in the conductance of K_IR channels evokes facilitatory effects on EPSPs, accompanied by rises in local input resistance and membrane time constant. At depolarized membrane potentials closer to up-state levels, the slowly inactivating A-type potassium channel (K_As) conductance also plays a strong role in determining synaptic potential parameters and cell excitability. We discuss the implications of our results for the regulation of accumbal MS neuron biophysics and synaptic integration by intrinsic factors and extrinsic agents such as dopamine. Abstract The computer-assisted three-dimensional reconstruction of neuronal morphology is becoming an increasingly popular technique to quantify the arborization patterns of dendrites and axons. The resulting digital files are suitable for comprehensive morphometric analyses as well as for building anatomically realistic compartmental models of membrane biophysics and neuronal electrophysiology. The digital tracings acquired in a lab for a specific purpose can often be reused by a different research group to address a completely unrelated scientific question, if the original investigators are willing to share the data. Since reconstructing neuronal morphology is a labor-intensive process, data sharing and reanalysis are particularly advantageous for the neuroscience and biomedical communities.
Here we present numerous cases of “success stories” in which digital reconstructions of neuronal morphology were shared and reused, leading to additional, independent discoveries and publications, and thus amplifying the impact of the “source” study for which the data set was first collected. In particular, we overview four main applications of this kind of data: comparative morphometric analyses, statistical estimation of potential synaptic connectivity, morphologically accurate electrophysiological simulations, and computational models of neuronal shape and development. Abstract This chapter describes a novel computational approach to modeling cortex dynamics that integrates gene–protein regulatory networks with a neural network model. Interaction of genes and proteins in neurons affects the dynamics of the whole neural network. We have adopted an exploratory approach of investigating many randomly generated gene regulatory matrices, out of which we kept those that generated interesting dynamics. This naïve brute-force approach served to explore the potential application of computational neurogenetic models in relation to gene knockout neurogenetics experiments. The knockout of a hypothetical gene for fast inhibition in our artificial genome led to interesting neural activity. In spite of the fact that the artificial gene/protein network had been altered by one gene knockout, the dynamics of the spiking neural network (SNN) in terms of spiking activity was most of the time very similar to the result obtained with the complete gene/protein network. However, from time to time the neurons spontaneously and temporarily synchronized their spiking into coherent global oscillations. In our model, fluctuations in the values of neuronal parameters lead to the spontaneous development of seizure-like global synchronizations.
These very same fluctuations also lead to termination of the seizure-like neural activity and maintenance of the interictal normal periods of activity. Based on our model, we suggest the hypothesis that parameter changes due to gene–protein dynamics should also be included as a serious factor determining transitions in neural dynamics, especially when the cause of a disease is known to be genetic. Abstract The local field potential (LFP) is among the most important experimental measures when probing neural population activity, but a proper understanding of the link between the underlying neural activity and the LFP signal is still missing. Here we investigate this link by mathematically modeling contributions to the LFP from a single layer-5 pyramidal neuron and a single layer-4 stellate neuron receiving synaptic input. An intrinsic dendritic low-pass filtering effect of the LFP signal, previously demonstrated for extracellular signatures of action potentials, is seen to strongly affect the LFP power spectra, even for frequencies as low as 10 Hz for the example pyramidal neuron. Further, the LFP signal is found to depend sensitively on both the recording position and the position of the synaptic input: the LFP power spectra recorded close to the active synapse are typically found to be less low-pass filtered than spectra recorded further away. Some recording positions display striking band-pass characteristics of the LFP. The frequency dependence of the properties of the current dipole moment set up by the synaptic input current is found to qualitatively account for several salient features of the observed LFP. Two approximate schemes for calculating the LFP, the dipole approximation and the two-monopole approximation, are tested and found to be potentially useful for translating results from large-scale neural network models into predictions for results from electroencephalographic (EEG) or electrocorticographic (ECoG) recordings.
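The dipole approximation mentioned above for translating network models into EEG/ECoG predictions reduces the synaptic source to a single current dipole in an infinite homogeneous volume conductor; a minimal sketch, with the conductivity value merely illustrative:

```python
import math

def dipole_potential(p, r, theta, sigma=0.3):
    """Far-field extracellular potential of a current dipole:
    phi = p * cos(theta) / (4 * pi * sigma * r**2).

    p: dipole moment magnitude (A*m), r: distance to the source (m),
    theta: angle between the dipole axis and the line to the electrode,
    sigma: extracellular conductivity (S/m, illustrative value).
    """
    return p * math.cos(theta) / (4.0 * math.pi * sigma * r ** 2)
```

The 1/r² falloff and the cos(theta) sign reversal across the "equator" of the dipole are the two qualitative features that make this approximation useful at large distances, while closer to the neuron the two-monopole approximation mentioned in the abstract retains the finite source separation.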
Abstract Dopaminergic (DA) neurons of the mammalian midbrain exhibit unusually low firing frequencies in vitro. Furthermore, injection of depolarizing current induces depolarization block before high frequencies are achieved. The maximum steady and transient rates are about 10 and 20 Hz, respectively, despite the ability of these neurons to generate bursts at higher frequencies in vivo. We use a three-compartment model calibrated to reproduce DA neuron responses to several pharmacological manipulations to uncover mechanisms of frequency limitation. The model exhibits a slow oscillatory potential (SOP) dependent on the interplay between the L-type Ca2+ current and the small-conductance K+ (SK) current that is unmasked by fast Na+ current block. Contrary to previous theoretical work, the SOP does not pace the steady spiking frequency in our model. The main currents that determine the spontaneous firing frequency are the subthreshold L-type Ca2+ and A-type K+ currents. The model identifies the channel densities of the fast Na+ and delayed-rectifier K+ currents as critical parameters limiting the maximal steady frequency evoked by a depolarizing pulse. We hypothesize that the low maximal steady frequencies result from a low safety factor for action potential generation. In the model, the rate of Ca2+ accumulation in the distal dendrites controls the transient initial frequency in response to a depolarizing pulse. Similar results are obtained when the same model parameters are used in a multicompartmental model with a realistic reconstructed morphology, indicating that the salient contributions of the dendritic architecture have been captured by the simpler model. Abstract Background As interest in adopting the Semantic Web in the biomedical domain continues to grow, Semantic Web technology has been evolving and maturing.
A variety of technological approaches, including triple-store technologies, SPARQL endpoints, Linked Data, and the Vocabulary of Interlinked Datasets, have emerged in recent years. In addition to data warehouse construction, these technological approaches can be used to support dynamic query federation. As a community effort, the BioRDF task force, within the Semantic Web for Health Care and Life Sciences Interest Group, is exploring how these emerging approaches can be utilized to execute distributed queries across different neuroscience data sources. Methods and results We have created two health care and life science knowledge bases. We have explored a variety of Semantic Web approaches to describe, map, and dynamically query multiple datasets. We have demonstrated several federation approaches that integrate diverse types of information about neurons and receptors that play an important role in basic, clinical, and translational neuroscience research. In particular, we have created a prototype receptor explorer which uses OWL mappings to provide an integrated list of receptors and executes individual queries against different SPARQL endpoints. We have also employed the AIDA Toolkit, which is directed at groups of knowledge workers who cooperatively search, annotate, interpret, and enrich large collections of heterogeneous documents from diverse locations. We have explored a tool called "FeDeRate", which enables a global SPARQL query to be decomposed into subqueries against remote databases offering either SPARQL or SQL query interfaces. Finally, we have explored how to use the Vocabulary of Interlinked Datasets (voiD) to create metadata describing datasets exposed as Linked Data URIs or SPARQL endpoints. Conclusion We have demonstrated the use of a set of novel and state-of-the-art Semantic Web technologies in support of a neuroscience query federation scenario. We have identified both the strengths and weaknesses of these technologies.
While the Semantic Web offers a global data model including the use of Uniform Resource Identifiers (URIs), the proliferation of semantically equivalent URIs hinders large-scale data integration. Our work helps direct research and tool development, which will be of benefit to this community. Abstract Injury to neural tissue renders voltage-gated Na+ (Nav) channels leaky. Even mild axonal trauma initiates Na+ loading, leading to secondary Ca2+ loading and white matter degeneration. The nodal isoform is Nav1.6, and for Nav1.6-expressing HEK cells, traumatic whole-cell stretch causes an immediate tetrodotoxin-sensitive Na+ leak. In stretch-damaged oocyte patches, Nav1.6 current undergoes damage-intensity-dependent hyperpolarizing (left) shifts, but whether left-shift underlies injured-axon Nav leak is uncertain. Nav1.6 inactivation (availability) is kinetically limited by (coupled to) Nav activation, yielding coupled left-shift (CLS) of the two processes: CLS should move the steady-state Nav1.6 “window conductance” closer to typical firing thresholds. Here we simulated excitability and ion homeostasis in free-running nodes of Ranvier to assess whether hallmark injured-axon behaviors (Na+ loading, ectopic excitation, propagation block) would occur with Nav-CLS. Intact/traumatized axolemma ratios were varied, and for some simulations Na/K pumps were included, with varied inside/outside volumes. We simulated saltatory propagation with one mid-axon node variously traumatized. While dissipating the [Na+] gradient and hyperactivating the Na/K pump, Nav-CLS generated neuropathic pain-like ectopic bursts. Depending on CLS magnitude, the fraction of Nav channels affected, and pump intensity, tonic or burst firing or nodal inexcitability occurred, with [Na+] and [K+] fluctuating. Severe CLS-induced inexcitability did not preclude Na+ loading; in fact, the steady-state Na+ leaks elicited large pump currents.
At a mid-axon node, mild CLS perturbed normal anterograde propagation, and severe CLS blocked saltatory propagation. These results suggest that in damaged excitable cells, Nav-CLS could initiate cellular deterioration with attendant hyper- or hypoexcitability. Healthy-cell versions of Nav-CLS, however, could contribute to physiological rhythmic firing. Abstract Lateral inhibition of cells surrounding an excited area is a key property of sensory systems, sharpening the preferential tuning of individual cells in the presence of closely related input signals. In the olfactory pathway, a dendrodendritic synaptic microcircuit between mitral and granule cells in the olfactory bulb has been proposed to mediate this type of interaction through granule cell inhibition of surrounding mitral cells. However, it is becoming evident that odor inputs result in broad activation of the olfactory bulb, with interactions that go beyond neighboring cells. Using a realistic modeling approach, we show how backpropagating action potentials in the long lateral dendrites of mitral cells, together with granule cell actions on mitral cells within narrow columns forming glomerular units, can provide a mechanism to activate strong local inhibition between arbitrarily distant mitral cells. The simulations predict a new role for the dendrodendritic synapses in the multicolumnar organization of the granule cells. This new paradigm gives insight into the functional significance of the patterns of connectivity revealed by recent viral tracing studies. Together they suggest a functional wiring of the olfactory bulb that could greatly expand the computational roles of the mitral–granule cell network. Abstract Spinal motor neurons have voltage-gated ion channels localized in their dendrites that generate plateau potentials. The physical separation of ion channels for spiking from plateau-generating channels can result in nonlinear bistable firing patterns.
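The steady-state Nav "window conductance" invoked in the axon-injury abstract above arises from the overlap of the activation and availability curves; a coupled left-shift moves both curves by the same hyperpolarizing amount and enlarges the window near resting potential. A sketch with illustrative Boltzmann parameters (not the Nav1.6 fits used in that study):

```python
import math

def boltzmann(v, v_half, k):
    """Steady-state Boltzmann curve 1 / (1 + exp((v_half - v) / k))."""
    return 1.0 / (1.0 + math.exp((v_half - v) / k))

def window_conductance(v, shift=0.0, g_max=1.0):
    """Relative steady-state Nav 'window' conductance g = g_max * m_inf**3 * h_inf.

    A coupled left-shift (CLS) moves activation and availability by the same
    hyperpolarizing amount (shift < 0). Half-activation voltages and slopes
    below are illustrative placeholders, not fitted Nav1.6 values.
    """
    m_inf = boltzmann(v, -38.0 + shift, 7.0)        # activation
    h_inf = 1.0 - boltzmann(v, -67.0 + shift, 6.0)  # availability
    return g_max * m_inf ** 3 * h_inf
```

Evaluating this near a typical resting potential shows how even a modest hyperpolarizing shift raises the standing Na+ conductance, which is the proposed route from trauma to Na+ loading and ectopic firing.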
The physical separation and geometry of the dendrites result in asymmetric coupling between dendrites and soma that has not been addressed in reduced models of nonlinear phenomena in motor neurons. We measured voltage attenuation properties of six anatomically reconstructed and type-identified cat spinal motor neurons to characterize asymmetric coupling between the dendrites and soma. We showed that the voltage attenuation at any distance from the soma was direction-dependent and could be described as a function of the input resistance at the soma. An analytical solution for the lumped cable parameters in a two-compartment model was derived based on this finding. This is the first two-compartment modeling approach that directly derived lumped cable parameters from the geometrical and passive electrical properties of anatomically reconstructed neurons. Abstract Models for temporary information storage in neuronal populations are dominated by mechanisms directly dependent on synaptic plasticity. There are nevertheless other mechanisms available that are well suited for creating short-term memories. Here we present a model for working memory which relies on the modulation of the intrinsic excitability properties of neurons, instead of synaptic plasticity, to retain novel information for periods of seconds to minutes. We show that it is possible to effectively use this mechanism to store the serial order in a sequence of patterns of activity. For this we introduce a functional class of neurons, named gate interneurons, which can store information in their membrane dynamics and can literally act as gates routing the flow of activations in the principal neuron population. The presented model exhibits properties which are in close agreement with experimental results in working memory. Namely, the recall process plays an important role in stabilizing and prolonging the memory trace. This means that the stored information is correctly maintained as long as it is being used.
Moreover, the working memory model is adequate for storing completely new information, in time windows compatible with the notion of “one-shot” learning (hundreds of milliseconds). Abstract For the analysis of neuronal cooperativity, simultaneously recorded extracellular signals from neighboring neurons need to be sorted reliably by a spike sorting method. Many algorithms have been developed to this end; however, to date, none of them manages to fulfill a set of demanding requirements. In particular, it is desirable to have an algorithm that operates online, detects and classifies overlapping spikes in real time, and that adapts to nonstationary data. Here, we present a combined spike detection and classification algorithm, which explicitly addresses these issues. Our approach makes use of linear filters to find a new representation of the data and to optimally enhance the signal-to-noise ratio. We introduce a method called “Deconfusion” which decorrelates the filter outputs and provides source separation. Finally, a set of well-defined thresholds is applied and leads to simultaneous spike detection and spike classification. By incorporating a direct feedback, the algorithm adapts to nonstationary data and is, therefore, well suited for acute recordings. We evaluate our method on simulated and experimental data, including simultaneous intra-/extracellular recordings made in slices of rat cortex and recordings from the prefrontal cortex of awake behaving macaques. We compare the results to existing spike detection as well as spike sorting methods. We conclude that our algorithm meets all of the mentioned requirements and outperforms other methods under realistic signal-to-noise ratios and in the presence of overlapping spikes. Abstract Avian nucleus isthmi pars parvocellularis (Ipc) neurons are reciprocally connected with the layer 10 (L10) neurons in the optic tectum and respond with oscillatory bursts to visual stimulation.
Our in vitro experiments show that both neuron types respond with regular spiking to somatic current injection and that the feedforward and feedback synaptic connections are excitatory, but of different strength and time course. To elucidate mechanisms of oscillatory bursting in this network of regularly spiking neurons, we investigated an experimentally constrained model of coupled leaky integrate-and-fire neurons with spike-rate adaptation. The model reproduces the observed Ipc oscillatory bursting in response to simulated visual stimulation. A scan through the model parameter volume reveals that Ipc oscillatory burst generation can be caused by strong and brief feedforward synaptic conductance changes. The mechanism is sensitive to the parameter values of spike-rate adaptation. In conclusion, we show that a network of regular-spiking neurons with feedforward excitation and spike-rate adaptation can generate oscillatory bursting in response to a constant input. Abstract Electrical stimulation of the central nervous system creates both orthodromically propagating action potentials, by stimulation of local cells and passing axons, and antidromically propagating action potentials, by stimulation of presynaptic axons and terminals. Our aim was to understand how antidromic action potentials navigate through complex arborizations, such as those of thalamic and basal ganglia afferents—sites of electrical activation during deep brain stimulation. We developed computational models to study the propagation of antidromic action potentials past the bifurcation in branched axons. In both unmyelinated and myelinated branched axons, when the diameters of each axon branch remained under a specific threshold (set by the antidromic geometric ratio), antidromic propagation occurred robustly; action potentials traveled both antidromically into the primary segment as well as “re-orthodromically” into the terminal secondary segment.
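The “antidromic geometric ratio” mentioned above is presumably a variant of Rall's 3/2-power geometric ratio at a branch point, evaluated from the arriving branch's point of view. A sketch under that assumption, with hypothetical diameters:

```python
def geometric_ratio(d_parent, d_daughters):
    """Rall's geometric ratio at a branch point: the 3/2-power sum of the
    daughter diameters over the parent's. GR = 1 means impedance match;
    GR >> 1 loads the branch point and can block propagation."""
    return sum(d ** 1.5 for d in d_daughters) / d_parent ** 1.5

# Orthodromic view: a 2 um parent feeding two 1.26 um daughters (~matched).
assert abs(geometric_ratio(2.0, [1.26, 1.26]) - 1.0) < 0.02

# Antidromic view: a spike arriving along one thin daughter "sees" the
# parent plus the sibling branch as its load, so its effective GR is larger,
# which is why antidromic invasion of the bifurcation is the harder direction.
gr_antidromic = geometric_ratio(1.26, [2.0, 1.26])
assert gr_antidromic > 1.0
```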
Propagation occurred across a broad range of stimulation frequencies, axon segment geometries, and concentrations of extracellular potassium, but was strongly dependent on the geometry of the node of Ranvier at the axonal bifurcation. Thus, antidromic activation of axon terminals can, through axon collaterals, lead to widespread activation or inhibition of targets remote from the site of stimulation. These effects should be included when interpreting the results of functional imaging or evoked potential studies on the mechanisms of action of DBS. Dendritic action potentials connect distributed dendrodendritic microcircuits. Journal of Computational Neuroscience Summary This chapter constitutes mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was part of the Annual Computational Neuroscience Meeting CNS*2007, which took place in July 2007 in Toronto, Canada (http://www.cnsorg.org). The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Selected discussed topics, including a review of some current and potential applications of Computational Intelligence (CI) in electrophysiology, database and electrophysiological data exchange platforms, languages, and formats, as well as exemplary analysis problems, are presented in this chapter. The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers, whose interest will be sparked by its contents. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space.
We are particularly interested in the slower timescale dynamics that determine overall excitability, and set the stage for transient episodes of persistent oscillations, working memory, or seizures. In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley-type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics on the stability of persistent activity. We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions. Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in neonatal brain or spinal cord, up states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potential (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from the EAP data sampled by an extracellular probe at multiple independent recording locations. We used a dipole for the model source because there is extensive evidence that it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell–probe distance, is a more accurate alternative to approaches based on monopole source models.
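The monopole-versus-dipole distinction above comes down to the spatial roll-off of the potential: 1/r for a point source versus 1/r² for a dipole. A minimal sketch using the standard volume-conductor formulas (the conductivity value is illustrative):

```python
import math

def monopole_potential(r, i=1.0, sigma=0.3):
    """Point current source in an infinite homogeneous medium: V ~ 1/r.
    i in amperes, sigma in S/m (0.3 is a typical order for brain tissue)."""
    return i / (4.0 * math.pi * sigma * r)

def dipole_potential(r, theta, p=1.0, sigma=0.3):
    """Current dipole in the same medium: V ~ cos(theta)/r^2, with p the
    dipole moment and theta the angle from the dipole axis."""
    return p * math.cos(theta) / (4.0 * math.pi * sigma * r ** 2)

# Doubling the distance halves a monopole potential but quarters a dipole's;
# this difference in roll-off is what lets multi-site EAP data discriminate
# between the two source models.
v1m, v2m = monopole_potential(1.0), monopole_potential(2.0)
v1d, v2d = dipole_potential(1.0, 0.0), dipole_potential(2.0, 0.0)
assert abs(v2m / v1m - 0.5) < 1e-12
assert abs(v2d / v1d - 0.25) < 1e-12
```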
Dipole characterization is separable into a linear dipole moment optimization, where the dipole location is fixed, and a second, nonlinear, global optimization of the source location. We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis. We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among the various model error contributions to the residual, we address especially the error in probe geometry, and the extent to which it biases estimates of the dipole parameters. This dipole characterization method can be applied to any recording technique that has the capability of taking multiple independent measurements of the same single units. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics. We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model.
We find that competition between intrinsic neuronal currents, sodium-potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. These models reveal several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy. Abstract This paper introduces dyadic brain modeling – the simultaneous, computational modeling of the brains of two interacting agents – to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the ‘dyad’). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists, and modelers that will help speed the emergence of what may be called comparative neuroprimatology. Abstract The phase response curve (PRC) reflects the dynamics of the interplay between diverse intrinsic conductances that lead to spike generation.
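A phase response curve of the kind just introduced can be measured numerically by delivering a small perturbation to a regularly spiking model neuron at different phases of its cycle and recording the spike-time shift. A minimal sketch with a leaky integrate-and-fire cell driven above threshold (all parameters illustrative):

```python
def lif_spike_time(i_ext=1.5, v_th=1.0, tau=1.0, kick_t=None, kick=0.05):
    """Time of the first spike of a leaky integrate-and-fire neuron
    dv/dt = (-v + i_ext)/tau with v(0) = 0, optionally given a small
    voltage kick at time `kick_t`. Simple Euler integration."""
    dt, v, t = 1e-4, 0.0, 0.0
    while v < v_th:
        if kick_t is not None and abs(t - kick_t) < dt / 2:
            v += kick
        v += dt * (-v + i_ext) / tau
        t += dt
    return t

# PRC: spike-time advance caused by the kick, as a function of its phase.
t0 = lif_spike_time()                      # unperturbed period
prc = [t0 - lif_spike_time(kick_t=phi * t0) for phi in (0.2, 0.5, 0.8)]
# Every depolarizing kick advances the next spike, so the PRC is purely
# positive -- the type I signature discussed in the abstract.
assert all(shift > 0 for shift in prc)
```

A type II (biphasic) PRC would require dynamics with a subthreshold resonance, e.g. an adaptation or M-type current, which this passive-leak sketch deliberately omits.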
PRCs measure the spike time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddle-node) dynamics, while type II (subcritical Hopf) dynamics yield a biphasic PRC with both negative and positive lobes. Previous computational work hypothesized that cholinergic modulation of the M-type potassium current can switch a neuron from type II to type I dynamics. We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with downregulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential specific K-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to downregulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent I_AHP is downregulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified at low ACh levels, while the PRC became type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may shift neuronal responding from “resonator” to “integrator.” Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double-precision values by each subtree at each time step. Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors.
For compute-bound simulations, load balance results in almost ideal runtime scaling. Application of the cell splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing. Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts. It has been recently reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. Ca2+-activated K+ currents (I_K(Ca)) of cultured human cardiac fibroblasts were characterized in this study. In whole-cell configuration, depolarizing pulses evoked I_K(Ca) with outward rectification in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance, Ca2+-activated K+ (BK_Ca) channel with a single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BK_Ca channels. The dynamic Luo–Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; however, they were electrically inert with no gap-junctional coupling. The simulation predicts that changes in gap junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. I_K(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes. This study demonstrates that a BK_Ca channel is functionally expressed in human cardiac fibroblasts.
The activity of these BK_Ca channels in human cardiac fibroblasts may contribute to the functional activities of heart cells through the transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of their solutions and the transitions between them as parameters are varied. To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multi-compartment models. We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation—tonic spiking, intermediate bursting, and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multi-compartment conductance-based model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli.
Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons, modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement. Results We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. Conclusion Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement, and underlies the concentration independence of odor quality representations.
Abstract Mathematical neuronal models are normally expressed using differential equations. The Parker-Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker-Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin-Huxley-type neuron, comparing the results with those obtained using the Runge-Kutta and Bulirsch-Stoer methods. Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC-granule cell (GC) circuit with and without gap junctions between GoCs. Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change the oscillation frequency or the mean firing rate of either GoCs or GCs.
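The adaptive-order idea behind the Parker-Sochacki method described above can be seen in a toy case: for dy/dt = -y, the Maclaurin series of the solution over one step is built coefficient by coefficient, and terms are added until they fall below tolerance, so the order (not the step size) adapts to the local error. A sketch:

```python
import math

def ps_step(y0, dt, tol=1e-12, max_order=30):
    """One Parker-Sochacki-style step for dy/dt = -y: the series
    coefficients obey c[n+1] = -c[n]/(n+1), so each scaled term is built
    from the previous one and accumulation stops once terms drop below
    `tol`. The order, not the step size, adapts to the local error."""
    c, y = y0, y0
    for n in range(max_order):
        c = -c * dt / (n + 1)   # next series term, already scaled by dt^(n+1)
        y += c
        if abs(c) < tol:
            break
    return y

# The exact solution over one step is y0 * exp(-dt).
y = ps_step(1.0, 0.5)
assert abs(y - math.exp(-0.5)) < 1e-12
```

For full neuron models the same recurrence idea applies to every state variable at once; the division and power operations the abstract mentions are what let non-polynomial right-hand sides be rewritten into this term-by-term form.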
Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs. The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley-type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours or days of supercomputing time. Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. In consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular-spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics, and intrinsic noise. We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated for by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data.
Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes from development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models. Recent studies show that both the conductance and the kinetics of these receptors change voltage-dependently in some parts of the brain. Therefore, several models have been introduced to simulate their current. However, on the one hand, kinetic models—which are able to simulate these voltage-dependent phenomena—are computationally expensive for the modeling of large neural networks. On the other hand, classic exponential models, which are computationally less expensive, are not able to simulate the voltage dependency of these receptors accurately. In this study, we have modified these classic models to endow them with voltage-dependent conductance and time constants. Temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of the NMDA receptor’s behavior using only three to four differential equations, which is significantly fewer than in previous kinetic models. Consequently, our model is both fast and physiologically plausible, and is therefore a suitable candidate for the modeling of large neural networks. Abstract Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons.
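A classic exponential NMDA model of the kind the abstract above sets out to modify can be sketched as a double-exponential conductance scaled by the standard Jahr-Stevens Mg2+ block factor; the time constants and g_max below are placeholders, not fitted values:

```python
import math

def mg_block(v_mv, mg_mM=1.0):
    """Jahr-Stevens voltage-dependent Mg2+ block of the NMDA receptor:
    the fraction of unblocked channels at membrane potential v (mV)."""
    return 1.0 / (1.0 + (mg_mM / 3.57) * math.exp(-0.062 * v_mv))

def nmda_current(t_ms, v_mv, g_max=1.0, tau_rise=2.0, tau_decay=100.0,
                 e_rev=0.0):
    """Classic double-exponential NMDA conductance scaled by the Mg block.
    Negative return values are inward (depolarizing) current."""
    gate = math.exp(-t_ms / tau_decay) - math.exp(-t_ms / tau_rise)
    return g_max * gate * mg_block(v_mv) * (v_mv - e_rev)

# The block relieves with depolarization: the unblocked fraction grows with V.
assert mg_block(-70.0) < mg_block(-40.0) < mg_block(0.0)
```

The fixed time constants here are exactly the limitation the abstract addresses: making tau_rise, tau_decay, and the effective conductance themselves functions of voltage while keeping the equation count low.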
This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal cell firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that there is a clear rhythmic gating effect of the γ-oscillating interneuron network on the pyramidal neuron’s signal transmission. This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and appropriate inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of the coherence and oscillation frequency of network activities. Abstract Background Propagation of simulated action potentials (APs) was previously studied in short single chains and in two-dimensional sheets of myocardial cells. The present study was undertaken to examine propagation in a long single chain of cells of various lengths, and with varying numbers of gap-junction (gj) channels, and to compare propagation velocity with cable properties such as the length constant (λ). Methods and Results Simulations were carried out using the PSpice program as previously described. When the electric field (EF) mechanism was dominant (0, 1, and 10 gj channels), the longer the chain length, the faster the overall velocity (θ_ov). There seems to be no simple explanation for this phenomenon. In contrast, when the local-circuit current mechanism was dominant (100 gj channels or more), θ_ov was slightly slowed with lengthening of the chain. Increasing the number of gj channels produced an increase in θ_ov and caused the firing order to become more uniform.
The end effect was more pronounced at longer chain lengths and at greater numbers of gj channels. When there were no or only few gj channels (namely 0, 10, or 30), the voltage change (ΔV_m) in the two cells (#50 and #52) contiguous to the cell injected with current (#51) was nearly zero; i.e., there was a sharp discontinuity in voltage between the adjacent cells. When there were many gj channels (e.g., 300, 1000, or 3000), there was an exponential decay of voltage on either side of the injected cell, with the length constant (λ) increasing at higher numbers of gj channels. The effect of increasing the number of gj channels on increasing λ was relatively small compared to the larger effect on θ_ov; θ_ov became very non-physiological at 300 gj channels or higher. Conclusion Thus, when there were only 0, 1, or 10 gj channels, θ_ov increased with increase in chain length, whereas at 100 gj channels or higher, θ_ov did not increase with chain length. When there were only 0, 10, or 30 gj channels, there was a very sharp decrease in ΔV_m in the two contiguous cells on either side of the injected cell, whereas at 300, 1000, or 3000 gj channels, the voltage decay was exponential along the length of the chain. The effect of increasing the number of gj channels on the spread of current was relatively small compared to the large effect on θ_ov. Abstract This article provides a demonstration of an analytical technique that can be used to investigate the causes of perceptual phenomena. The technique is based on the concept of the ideal observer, an optimal signal classifier that makes decisions that maximize the probability of a correct response. To demonstrate the technique, an analysis was conducted to investigate the role of the auditory periphery in the production of temporal masking effects. The ideal observer classified output from four models of the periphery.
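The length constant λ discussed in the chain-propagation study above can be estimated from the exponential voltage decay along the chain, since V(x) = V0·exp(-x/λ) implies that ln V is linear in distance. A minimal sketch that fits ln V against position (synthetic data, not the PSpice results):

```python
import math

def length_constant(positions, voltages):
    """Estimate the cable length constant lambda from voltages sampled
    along a chain, assuming V(x) = V0 * exp(-x / lambda): a straight-line
    fit of ln(V) against x has slope -1/lambda."""
    ys = [math.log(v) for v in voltages]
    n = len(positions)
    mx, my = sum(positions) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(positions, ys))
             / sum((x - mx) ** 2 for x in positions))
    return -1.0 / slope

# Synthetic decay with lambda = 3 cell lengths.
xs = [0, 1, 2, 3, 4, 5]
vs = [10.0 * math.exp(-x / 3.0) for x in xs]
assert abs(length_constant(xs, vs) - 3.0) < 1e-9
```

With few gj channels the decay profile is a sharp discontinuity rather than an exponential, so this fit is only meaningful in the well-coupled regime the study describes.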
Since the ideal observer is the best of all possible observers, if it demonstrates masking effects, then all other observers must as well. If it does not demonstrate masking effects, then nothing about the periphery requires masking to occur, and therefore masking must arise somewhere else. The ideal observer exhibited several forward masking effects but did not exhibit backward masking, implying that the periphery has a causal role in forward but not backward masking. A general discussion of the strengths of the technique and supplementary equations are also included. Abstract Understanding the human brain and its function in health and disease represents one of the greatest scientific challenges of our time. In the post-genomic era, an overwhelming accumulation of new data, at all levels of exploration from DNA to human brain imaging, has been acquired. This accumulation of facts has not given rise to a corresponding increase in the understanding of integrated functions in this vast area of research, which involves a large number of fields extending from genetics to psychology. Neuroinformatics (NI) is uniquely placed at the intersection between neuroscience and information technology, and emerges as an area of critical importance to facilitate the future conceptual development of neuroscience by creating databases which transcend different organizational levels and allow for the development of different computational models from the subcellular to the global brain level. Abstract This paper studied synaptic and dendritic integration with different spatial distributions of synapses on the dendrites of a biophysically detailed layer 5 pyramidal neuron model. It has been observed that temporally synchronous and spatially clustered synaptic inputs make dendrites perform a highly nonlinear integration.
The effect of the clustering degree of the synaptic distribution on neuronal responsiveness is investigated by changing the number of top apical dendrites where active synapses are allocated. The neuron shows maximum responsiveness to synaptic inputs with an intermediate clustering degree of spatial distribution, indicating complex interactions among dendrites in the presence of nonlinear synaptic and dendritic integration. Abstract This paper describes a pilot query interface that has been constructed to help us explore a “concept-based” approach for searching the Neuroscience Information Framework (NIF). The query interface is concept-based in the sense that the search terms submitted through the interface are selected from a standardized vocabulary of terms (concepts) that are structured in the form of an ontology. The NIF contains three primary resources: the NIF Resource Registry, the NIF Document Archive, and the NIF Database Mediator. These NIF resources are very different in nature and therefore pose challenges when designing a single interface from which searches can be automatically launched against all three resources simultaneously. The paper first discusses briefly several background issues involving the use of standardized biomedical vocabularies in biomedical information retrieval, and then presents a detailed example that illustrates how the pilot concept-based query interface operates. The paper concludes by discussing certain lessons learned in the development of the current version of the interface. Abstract Simulations of orientation selectivity in visual cortex have shown that layer 4 complex cells lacking orientation tuning are ideal for providing global inhibition that scales with contrast in order to produce simple cells with contrast-invariant orientation tuning (Lauritzen and Miller in J Neurosci 23:10201–10213, 2003).
Inhibitory cortical cells have been shown to be electrically coupled by gap junctions (Fukuda and Kosaka in J Neurosci 120:5–20, 2003). Such coupling promotes, among other effects, spike synchronization and coordination of postsynaptic IPSPs (Beierlein et al. in Nat Neurosci 3:904–910, 2000; Galarreta and Hestrin in Nat Rev Neurosci 2:425–433, 2001). Consequently, it was expected (Miller in Cereb Cortex 13:73–82, 2003) that electrical coupling would promote nonspecific functional responses consistent with the complex inhibitory cells seen in layer 4, which provide broad inhibition in response to stimuli of all orientations (Miller et al. in Curr Opin Neurobiol 11:488–497, 2001). This was tested using a mechanistic modeling approach. The orientation selectivity model of Lauritzen and Miller (J Neurosci 23:10201–10213, 2003) was reproduced with and without electrical coupling between complex inhibitory neurons. Although extensive coupling promotes uniform firing in complex cells, there were no detectable improvements in contrast-invariant orientation selectivity unless there were coincident changes in complex cell firing rates to offset the untuned excitatory component that grows with contrast. Thus, changes in firing rates alone (with or without coupling), but not synchronization of complex inhibitory neurons by itself, could improve contrast-invariant orientation tuning of simple cells. Abstract Coral polyps contract when electrically stimulated, and a wave of contraction travels from the site of stimulation at a constant speed. Models of coral nerve networks were optimized to match one of three different experimentally observed behaviors. To search for model parameters that reproduce the experimental observations, we applied genetic algorithms to increasingly more complex models of a coral nerve net. In a first stage of optimization, individual neurons responded with spikes to multiple, but not single, pulses of activation.
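A staged genetic-algorithm parameter search of this kind can be sketched as repeated selection plus mutation over candidate parameter sets; the toy two-parameter model and its fitness target below are hypothetical stand-ins for the coral contraction-wave measurements, not the paper's actual nerve-net model.

```python
import random

random.seed(1)

def fitness(params):
    """Hypothetical objective: closeness of a toy model's conduction speed
    to a target observed speed (a stand-in for matching experimental data)."""
    target_speed = 2.5
    speed = params["gain"] * params["threshold"]  # toy model, not the paper's
    return -abs(speed - target_speed)

def mutate(p):
    """Jitter each parameter by up to +/-10%."""
    return {k: v * random.uniform(0.9, 1.1) for k, v in p.items()}

def evolve(pop, generations=60, keep=5):
    """Elitist GA: keep the best candidates, refill with their mutants."""
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:keep]
        pop = parents + [mutate(random.choice(parents)) for _ in range(len(pop) - keep)]
    return max(pop, key=fitness)

population = [{"gain": random.uniform(0.1, 5.0), "threshold": random.uniform(0.1, 5.0)}
              for _ in range(20)]
best = evolve(population)
```

Staging the search, first fitting single-neuron responses and then reusing those neurons as the starting population for the network fit, shrinks the search space at each step, which is the strategy the abstract describes.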
In a second stage, we used these neurons as the starting point for the optimization of a two-dimensional nerve net. This strategy yielded a network with parameters that reproduced the experimentally observed spread of excitation. Abstract Spike-wave discharges are a distinctive feature of epileptic seizures. So far, they have not been reported in spatially extended neural field models. We study a space-independent version of the Amari neural field model with two competing inhibitory populations. We show that this competition leads to robust spike-wave dynamics if the inhibitory populations operate on different timescales. The spike-wave oscillations present a fold/homoclinic-type bursting. From this result we predict parameters of the extended Amari system where spike-wave oscillations produce a spatially homogeneous pattern. We propose this mechanism as a prototype of macroscopic epileptic spike-wave discharges. To our knowledge this is the first example of robust spike-wave patterns in a spatially extended neural field model. Abstract Cortical gamma-frequency (30–80 Hz) oscillations have been suggested to underlie many aspects of cognitive function. In this paper we compare the f–I curves modulated by gamma-frequency-modulated stimuli and Poisson synaptic input at the distal dendrites of a layer V pyramidal neuron model. The results show that gamma-frequency distal input amplifies the sensitivity of the neural response to basal input and enhances the gain modulation of the neuron. Abstract Inward rectifying potassium (K IR ) currents in medium spiny (MS) neurons of the nucleus accumbens inactivate significantly in ~40% of the neurons but not in the rest, which may lead to differences in input processing by these two groups. Using a 189-compartment computational model of the MS neuron, we investigate the influence of this property using injected current as well as spatiotemporally distributed synaptic inputs.
Our study demonstrates that K IR current inactivation facilitates depolarization, firing frequency, and firing onset in these neurons. These effects may be attributed to the higher input resistance of the cell as well as a more depolarized resting/downstate potential induced by the inactivation of this current. In view of reports that dendritic intracellular calcium levels depend closely on burst strength and spike onset time, our findings suggest that inactivation of K IR currents may offer a means of modulating both excitability and synaptic plasticity in MS neurons. Abstract Epileptic seizures in diabetic hyperglycemia (DH) are not uncommon. This study aimed to determine the acute behavioral, pathological, and electrophysiological effects of status epilepticus (SE) on diabetic animals. Adult male Sprague-Dawley rats were first divided into groups with and without streptozotocin (STZ)-induced diabetes, and then into treatment groups given normal saline (NS) (STZ-only and NS-only) or a lithium-pilocarpine injection to induce status epilepticus (STZ + SE and NS + SE). Seizure susceptibility, severity, and mortality were evaluated. Serial Morris water maze tests and hippocampal histopathology were examined before and 24 h after SE. Tetanic stimulation-induced long-term potentiation (LTP) in hippocampal slices was recorded in a multielectrode dish system. We also used a simulation model to evaluate intracellular adenosine triphosphate (ATP) and neuroexcitability. The STZ + SE group had a significantly higher percentage of severe seizures and SE-related death, and worse learning and memory performance, than the other three groups 24 h after SE. The STZ + SE group, followed by the NS + SE group, showed the most severe neuronal loss and mossy fiber sprouting in the hippocampal CA3 area. In addition, LTP was markedly attenuated in the STZ + SE group, followed by the NS + SE group.
In the simulation, increased intracellular ATP concentration promoted action potential firing. The finding that rats with DH had more brain damage after SE than rats without diabetes suggests the importance of intensively treating hyperglycemia and seizures in diabetic patients with epilepsy. Neuroinformatics is a multifaceted field. It is as broad as the field of neuroscience. The various domains of NI may also share some common features such as databases, data mining systems, and data modeling tools. NI projects are often coordinated by user groups or research organizations. Large-scale infrastructure supporting NI development is also a vital aspect of the field. Abstract Channelrhodopsin-2 (ChR2) is a class of light-sensitive proteins that offer the ability to use light stimulation to regulate neural activity with millisecond precision. In order to address the limitations in the efficacy of wild-type ChR2 (ChRwt) in achieving this objective, new variants of ChR2 that exhibit fast monoexponential photocurrent decay characteristics have recently been developed and validated. In this paper, we investigate whether the framework of a 4-state transition rate model, primarily developed to mimic the biexponential photocurrent decay kinetics of ChRwt, as opposed to the lower-complexity 3-state model, is warranted to mimic the monoexponential photocurrent decay kinetics of the newly developed fast ChR2 variants: ChETA (Gunaydin et al., Nature Neurosci. 13:387–392, 2010) and ChRET/TC (Berndt et al., Proc. Natl. Acad. Sci. 108:7595–7600, 2011). We begin by estimating the parameters of the 3-state and 4-state models from experimental data on the photocurrent kinetics of ChRwt, ChETA, and ChRET/TC. We then incorporate these models into a fast-spiking interneuron model (Wang and Buzsaki, J. Neurosci. 16:6402–6413, 1996) and a hippocampal pyramidal cell model (Golomb et al., J. Neurophysiol.
96:1912–1926, 2006) and investigate the extent to which the experimentally observed neural response to various optostimulation protocols can be captured by these models. We demonstrate that for all ChR2 variants investigated, the 4-state model implementation is better able to capture neural responses consistent with experiments across a wide range of optostimulation protocols. We conclude by analytically investigating the conditions under which the characteristic specific to the 3-state model, namely the monoexponential photocurrent decay of the newly developed variants of ChR2, can occur in the framework of the 4-state model. Abstract In cerebellar Purkinje cells, the β4-subunit of voltage-dependent Na + channels has been proposed to serve as an open-channel blocker giving rise to a “resurgent” Na + current ( I NaR ) upon membrane repolarization. Notably, the β4-subunit was recently identified as a novel substrate of the β-secretase BACE1, a key enzyme of the amyloidogenic pathway in Alzheimer's disease. Here, we asked whether BACE1-mediated cleavage of the β4-subunit has an impact on I NaR and, consequently, on the firing properties of Purkinje cells. In cerebellar tissue of BACE1−/− mice, mRNA levels of Na + channel α-subunits 1.1, 1.2, and 1.6 and of β-subunits 1–4 remained unchanged, but processing of the β4 peptide was profoundly altered. Patch-clamp recordings from acutely isolated Purkinje cells of BACE1−/− and WT mice did not reveal any differences in steady-state properties or in current densities of transient, persistent, and resurgent Na + currents. However, I NaR was found to decay significantly faster in BACE1-deficient Purkinje cells than in WT cells. In modeling studies, the altered time course of I NaR decay could be replicated when we decreased the efficiency of open-channel block. In current-clamp recordings, BACE1−/− Purkinje cells displayed a lower spontaneous firing rate than normal cells.
Computer simulations supported the hypothesis that the accelerated decay kinetics of I NaR are responsible for the slower firing rate. Our study elucidates a novel function of BACE1 in the regulation of neuronal excitability that serves to tune the firing pattern of Purkinje cells and presumably other neurons endowed with I NaR . Abstract The role of cortical feedback in the thalamocortical processing loop has been extensively investigated over the last decades. With the exception of a few cases, these studies focused on the cortical feedback exerted onto thalamocortical relay (TC) cells of the dorsal lateral geniculate nucleus (LGN). In a previous physiological study, we showed in the cat visual system that cessation of cortical input, despite decreasing the spontaneous activity of TC cells, increased the spontaneous firing of their recurrent inhibitory interneurons located in the perigeniculate nucleus (PGN). To identify the mechanisms underlying such functional changes, we conducted a modeling study in NEURON on several networks of point neurons with varied model parameters, such as membrane properties, synaptic weights, and axonal delays. We considered six network topologies of the retino-geniculo-cortical system. All models were robust against changes of axonal delays except for the delay between the LGN feedforward interneuron and the TC cell. The best representation of the physiological results was obtained with models containing reciprocally connected PGN cells driven by the cortex and with relatively slow decay of intracellular calcium. This strongly indicates that the thalamic reticular nucleus plays an essential role in the cortical influence over thalamocortical relay cells, while the thalamic feedforward interneurons are not essential in this process. Further, we suggest that the dependence of the activity of PGN cells on the rate of calcium removal can be one of the key factors determining an individual cell's response to elimination of cortical input.
Abstract The nucleus accumbens (NAc), a critical structure of the brain reward circuit, is implicated in normal goal-directed behaviour and learning as well as in pathological conditions like schizophrenia and addiction. Its major cellular substrates, the medium spiny (MS) neurons, possess a wide variety of dendritic active conductances that may modulate excitatory postsynaptic potentials (EPSPs) and cell excitability. We examine this issue using a biophysically detailed 189-compartment stylized model of the NAc MS neuron, incorporating all the known active conductances. We find that, of all the active channels, inward rectifying K + (K IR ) channels play the primary role in modulating the resting membrane potential (RMP) and EPSPs in the downstate of the neuron. Reduction in the conductance of K IR channels evokes facilitatory effects on EPSPs, accompanied by rises in local input resistance and membrane time constant. At depolarized membrane potentials closer to upstate levels, the slowly inactivating A-type potassium channel (K As ) conductance also plays a strong role in determining synaptic potential parameters and cell excitability. We discuss the implications of our results for the regulation of accumbal MS neuron biophysics and synaptic integration by intrinsic factors and extrinsic agents such as dopamine. Abstract The computer-assisted three-dimensional reconstruction of neuronal morphology is becoming an increasingly popular technique for quantifying the arborization patterns of dendrites and axons. The resulting digital files are suitable for comprehensive morphometric analyses as well as for building anatomically realistic compartmental models of membrane biophysics and neuronal electrophysiology. The digital tracings acquired in a lab for a specific purpose can often be reused by a different research group to address a completely unrelated scientific question, if the original investigators are willing to share the data.
Since reconstructing neuronal morphology is a labor-intensive process, data sharing and reanalysis are particularly advantageous for the neuroscience and biomedical communities. Here we present numerous cases of “success stories” in which digital reconstructions of neuronal morphology were shared and reused, leading to additional, independent discoveries and publications, and thus amplifying the impact of the “source” study for which the data set was first collected. In particular, we overview four main applications of this kind of data: comparative morphometric analyses, statistical estimation of potential synaptic connectivity, morphologically accurate electrophysiological simulations, and computational models of neuronal shape and development. Abstract The chapter describes a novel computational approach to modeling cortex dynamics that integrates gene–protein regulatory networks with a neural network model. Interaction of genes and proteins in neurons affects the dynamics of the whole neural network. We have adopted an exploratory approach of investigating many randomly generated gene regulatory matrices, out of which we kept those that generated interesting dynamics. This naïve brute-force approach served to explore the potential application of computational neurogenetic models in relation to gene-knockout neurogenetics experiments. The knockout of a hypothetical gene for fast inhibition in our artificial genome led to interesting neural activity. In spite of the fact that the artificial gene/protein network was altered due to the knockout of one gene, the dynamics of the SNN in terms of spiking activity was most of the time very similar to the result obtained with the complete gene/protein network. However, from time to time the neurons spontaneously and temporarily synchronized their spiking into coherent global oscillations.
In our model, the fluctuations in the values of neuronal parameters lead to the spontaneous development of seizure-like global synchronizations. These very same fluctuations also lead to termination of the seizure-like neural activity and maintenance of the interictal normal periods of activity. Based on our model, we would like to suggest the hypothesis that parameter changes due to gene–protein dynamics should also be included as a serious factor determining transitions in neural dynamics, especially when the cause of disease is known to be genetic. Abstract The local field potential (LFP) is among the most important experimental measures when probing neural population activity, but a proper understanding of the link between the underlying neural activity and the LFP signal is still missing. Here we investigate this link by mathematical modeling of contributions to the LFP from a single layer-5 pyramidal neuron and a single layer-4 stellate neuron receiving synaptic input. An intrinsic dendritic low-pass filtering effect of the LFP signal, previously demonstrated for extracellular signatures of action potentials, is seen to strongly affect the LFP power spectra, even for frequencies as low as 10 Hz for the example pyramidal neuron. Further, the LFP signal is found to depend sensitively on both the recording position and the position of the synaptic input: the LFP power spectra recorded close to the active synapse are typically found to be less low-pass filtered than spectra recorded further away. Some recording positions display striking band-pass characteristics of the LFP. The frequency dependence of the properties of the current dipole moment set up by the synaptic input current is found to qualitatively account for several salient features of the observed LFP.
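The far-field contribution of a current dipole moment to the extracellular potential is conventionally approximated as φ = p·cos θ / (4πσr²); a minimal sketch, with illustrative values for the dipole moment and an assumed tissue conductivity of 0.3 S/m:

```python
import math

def dipole_potential(p, r, theta, sigma=0.3):
    """Far-field dipole approximation of the extracellular potential.

    p     -- current dipole moment magnitude (A*m)
    r     -- distance from the dipole (m)
    theta -- angle between the dipole axis and the recording direction (rad)
    sigma -- extracellular conductivity (S/m); 0.3 is an assumed typical value
    """
    return p * math.cos(theta) / (4 * math.pi * sigma * r ** 2)

near = dipole_potential(1e-9, 1e-3, 0.0)  # 1 mm along the dipole axis
far = dipole_potential(1e-9, 2e-3, 0.0)   # doubling r quarters the potential
```

The 1/r² falloff (versus 1/r for a monopole) and the cos θ angular dependence are what make dipole-style schemes attractive for translating network models into EEG/ECoG predictions, as discussed next.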
Two approximate schemes for calculating the LFP, the dipole approximation and the two-monopole approximation, are tested and found to be potentially useful for translating results from large-scale neural network models into predictions for results from electroencephalographic (EEG) or electrocorticographic (ECoG) recordings. Abstract Dopaminergic (DA) neurons of the mammalian midbrain exhibit unusually low firing frequencies in vitro . Furthermore, injection of depolarizing current induces depolarization block before high frequencies are achieved. The maximum steady and transient rates are about 10 and 20 Hz, respectively, despite the ability of these neurons to generate bursts at higher frequencies in vivo . We use a three-compartment model calibrated to reproduce DA neuron responses to several pharmacological manipulations to uncover mechanisms of frequency limitation. The model exhibits a slow oscillatory potential (SOP) dependent on the interplay between the L-type Ca 2+ current and the small-conductance K + (SK) current that is unmasked by fast Na + current block. Contrary to previous theoretical work, the SOP does not pace the steady spiking frequency in our model. The main currents that determine the spontaneous firing frequency are the subthreshold L-type Ca 2+ and the A-type K + currents. The model identifies the channel densities for the fast Na + and the delayed rectifier K + currents as critical parameters limiting the maximal steady frequency evoked by a depolarizing pulse. We hypothesize that the low maximal steady frequencies result from a low safety factor for action potential generation. In the model, the rate of Ca 2+ accumulation in the distal dendrites controls the transient initial frequency in response to a depolarizing pulse.
Similar results are obtained when the same model parameters are used in a multicompartmental model with a realistic reconstructed morphology, indicating that the salient contributions of the dendritic architecture have been captured by the simpler model. Abstract Background As interest in adopting the Semantic Web in the biomedical domain continues to grow, Semantic Web technology has been evolving and maturing. A variety of technological approaches, including triplestore technologies, SPARQL endpoints, Linked Data, and the Vocabulary of Interlinked Datasets, have emerged in recent years. In addition to data warehouse construction, these technological approaches can be used to support dynamic query federation. As a community effort, the BioRDF task force, within the Semantic Web for Health Care and Life Sciences Interest Group, is exploring how these emerging approaches can be utilized to execute distributed queries across different neuroscience data sources. Methods and results We have created two health care and life science knowledge bases. We have explored a variety of Semantic Web approaches to describe, map, and dynamically query multiple datasets. We have demonstrated several federation approaches that integrate diverse types of information about neurons and receptors that play an important role in basic, clinical, and translational neuroscience research. In particular, we have created a prototype receptor explorer that uses OWL mappings to provide an integrated list of receptors and executes individual queries against different SPARQL endpoints. We have also employed the AIDA Toolkit, which is directed at groups of knowledge workers who cooperatively search, annotate, interpret, and enrich large collections of heterogeneous documents from diverse locations. We have explored a tool called "FeDeRate", which enables a global SPARQL query to be decomposed into subqueries against remote databases offering either SPARQL or SQL query interfaces.
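The kind of decomposition such a mediator performs can be illustrated by composing a query that delegates a graph pattern to each remote endpoint with the SPARQL 1.1 SERVICE keyword; SERVICE is used here only as a stand-in for the mediator's internal rewriting, and the endpoint URLs are hypothetical.

```python
def federated_query(endpoints, pattern):
    """Compose a single SPARQL 1.1 query that delegates the same graph
    pattern to each remote endpoint via a SERVICE clause."""
    services = "\n".join(
        f"  SERVICE <{url}> {{ {pattern} }}" for url in endpoints
    )
    return f"SELECT * WHERE {{\n{services}\n}}"

# Hypothetical neuron and receptor endpoints, as in the federation scenario:
query = federated_query(
    ["http://example.org/neurons/sparql", "http://example.org/receptors/sparql"],
    "?receptor a ?type",
)
```

A real mediator would additionally partition the global pattern so that each endpoint receives only the triples it can answer, and would join the intermediate results.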
Finally, we have explored how to use the Vocabulary of Interlinked Datasets (voiD) to create metadata for describing datasets exposed as Linked Data URIs or SPARQL endpoints. Conclusion We have demonstrated the use of a set of novel and state-of-the-art Semantic Web technologies in support of a neuroscience query federation scenario. We have identified both the strengths and weaknesses of these technologies. While the Semantic Web offers a global data model including the use of Uniform Resource Identifiers (URIs), the proliferation of semantically equivalent URIs hinders large-scale data integration. Our work helps direct research and tool development, which will be of benefit to this community. Abstract Injury to neural tissue renders voltage-gated Na + (Nav) channels leaky. Even mild axonal trauma initiates Na + loading, leading to secondary Ca 2+ loading and white matter degeneration. The nodal isoform is Nav1.6, and for Nav1.6-expressing HEK cells, traumatic whole-cell stretch causes an immediate tetrodotoxin-sensitive Na + leak. In stretch-damaged oocyte patches, Nav1.6 current undergoes damage-intensity-dependent hyperpolarizing (left) shifts, but whether left-shift underlies injured-axon Nav leak is uncertain. Nav1.6 inactivation (availability) is kinetically limited by (coupled to) Nav activation, yielding coupled left-shift (CLS) of the two processes: CLS should move the steady-state Nav1.6 “window conductance” closer to typical firing thresholds. Here we simulated excitability and ion homeostasis in free-running nodes of Ranvier to assess whether hallmark injured-axon behaviors—Na + loading, ectopic excitation, propagation block—would occur with Nav-CLS. Intact/traumatized axolemma ratios were varied, and for some simulations Na/K pumps were included, with varied inside/outside volumes. We simulated saltatory propagation with one mid-axon node variously traumatized.
While dissipating the [Na + ] gradient and hyperactivating the Na/K pump, Nav-CLS generated neuropathic pain-like ectopic bursts. Depending on CLS magnitude, the fraction of Nav channels affected, and pump intensity, tonic or burst firing or nodal inexcitability occurred, with [Na + ] and [K + ] fluctuating. Severe CLS-induced inexcitability did not preclude Na + loading; in fact, the steady-state Na + leaks elicited large pump currents. At a mid-axon node, mild CLS perturbed normal anterograde propagation, and severe CLS blocked saltatory propagation. These results suggest that in damaged excitable cells, Nav-CLS could initiate cellular deterioration with attendant hyper- or hypoexcitability. Healthy-cell versions of Nav-CLS, however, could contribute to physiological rhythmic firing. Abstract Lateral inhibition of cells surrounding an excited area is a key property of sensory systems, sharpening the preferential tuning of individual cells in the presence of closely related input signals. In the olfactory pathway, a dendrodendritic synaptic microcircuit between mitral and granule cells in the olfactory bulb has been proposed to mediate this type of interaction through granule cell inhibition of surrounding mitral cells. However, it is becoming evident that odor inputs result in broad activation of the olfactory bulb, with interactions that go beyond neighboring cells. Using a realistic modeling approach, we show how backpropagating action potentials in the long lateral dendrites of mitral cells, together with granule cell actions on mitral cells within narrow columns forming glomerular units, can provide a mechanism to activate strong local inhibition between arbitrarily distant mitral cells. The simulations predict a new role for the dendrodendritic synapses in the multicolumnar organization of the granule cells. This new paradigm gives insight into the functional significance of the patterns of connectivity revealed by recent viral tracing studies.
Together they suggest a functional wiring of the olfactory bulb that could greatly expand the computational roles of the mitral–granule cell network. Interoperability of Neuroscience Modeling Software: Current Status and Future Directions Neuroinformatics Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means of facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. Comparing the results of a computational experiment on the experimental structure with those on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively.
Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a nonredundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php. Operating system(s): Platform independent. Programming language: Perl/BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI BLAST package, DSSP, Speedfill (Surfnet), and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages.
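The sensitivity to numerical details is easy to demonstrate at the smallest scale: floating-point addition is not associative, so merely regrouping the same terms, as two simulators with different internal accumulation orders effectively do, yields results that differ at machine precision.

```python
# Floating-point addition is not associative: the same three terms summed in
# a different grouping give different IEEE 754 doubles. In a chaotic spiking
# network, such machine-precision differences can grow over many timesteps
# until spike times in otherwise identical simulations diverge.
a = (0.1 + 0.2) + 0.3
b = 0.1 + (0.2 + 0.3)
print(a == b)  # False
```

This is one concrete reason bit-exact replication across simulator packages is generally unattainable, even when the model specification is identical.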
Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can link out directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to NCBI Entrez) to link out to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to duplication of effort and hinders model reuse.
Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. High speed two-photon imaging of calcium dynamics in dendritic spines: consequences for spine calcium kinetics and buffer capacity. PloS one Rapid calcium concentration changes in postsynaptic structures are crucial for synaptic plasticity. Thus far, the determinants of postsynaptic calcium dynamics have been studied predominantly based on the decay kinetics of calcium transients. Calcium rise times in spines in response to single action potentials (AP) are almost never measured due to technical limitations, but they could be crucial for synaptic plasticity. With high-speed, precisely-targeted, two-photon point imaging we measured both calcium rise and decay kinetics in spines and secondary dendrites in neocortical pyramidal neurons. We found that both rise and decay kinetics of changes in calcium-indicator fluorescence are about twice as fast in spines. During AP trains, spine calcium changes follow each AP, but not in dendrites. 
Apart from the higher surface-to-volume ratio (SVR), we observed that neocortical dendritic spines have a markedly smaller endogenous buffer capacity with respect to their parental dendrites. Calcium influx time course and calcium extrusion rate were both in the same range for spines and dendrites when fitted with a dynamic multi-compartment model that included calcium binding kinetics and diffusion. In a subsequent analysis we used this model to investigate which parameters are critical determinants in spine calcium dynamics. The model confirmed the experimental findings: a higher SVR is not sufficient by itself to explain the faster rise time kinetics in spines, but only when paired with a lower buffer capacity in spines. Simulations at zero calcium-dye conditions show that calmodulin is more efficiently activated in spines, which indicates that spine morphology and buffering conditions in neocortical spines favor synaptic plasticity. Neuron splitting in compute-bound parallel network simulations enables runtime scaling with twice as many processors. Journal of computational neuroscience Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double precision values by each subtree at each time step. Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For compute-bound simulations load balance results in almost ideal runtime scaling.
Application of the cell splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing. Fully implicit parallel simulation of single neurons Journal of Computational Neuroscience Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively.
Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a nonredundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php. Operating system(s): Platform independent. Programming language: Perl, BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI BLAST package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have encountered problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages.
Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past 2 decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to create dynamically its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. 
Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell contains organelles and intracellular solution, and it is separated from the extracellular fluid surrounding it by its cell membrane (plasma membrane), which maintains differences in the concentrations of ions and molecules, including enzymes. These differences in ionic charge and concentration give rise to electrical and chemical potentials, respectively, which drive the transport of materials across the membrane. Here we look at the core elements of mathematical modeling of the dynamic behavior of single cells, as well as the foundations of numerical simulation. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms.
This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program on which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command line interface, based on Python, a powerful scripting language, for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB comprises three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp. Abstract Science requires that results be reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases have started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results.
In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method and make available computer scripts which automate the process of citation entry. We use an open citation project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. In order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (balanced truncation) and frequency-domain ( ${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance.
We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front ends requires strong bioinformatic skills. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field.
We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or not) existing empirical results and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. 
Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp. 741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu.
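SNNAP itself is a Java application driven through its GUI, but the class of models it solves, conductance-based synaptic transmission between spiking cells, can be illustrated in a few lines. The following is a minimal sketch and not SNNAP code; the two-cell configuration and every parameter value are illustrative assumptions.

```python
# Minimal sketch (not SNNAP code): two leaky integrate-and-fire neurons
# coupled by an excitatory conductance synapse. All parameter values are
# illustrative assumptions, not taken from any published model.

def simulate(t_max=200.0, dt=0.1, i_drive=2.0):
    """Forward-Euler simulation; returns spike times (ms) of both cells."""
    v_rest, v_thresh, v_reset = -65.0, -50.0, -65.0   # mV
    tau_m = 20.0                                      # membrane time constant (ms)
    e_syn, tau_syn, w_syn = 0.0, 5.0, 2.0             # reversal (mV), decay (ms), weight
    v1, v2, g_syn = v_rest, v_rest, 0.0
    spikes1, spikes2 = [], []
    t = 0.0
    while t < t_max:
        # cell 1: constant current drive makes it fire tonically
        v1 += dt / tau_m * (v_rest - v1 + i_drive * 10.0)
        # cell 2: driven only by the synaptic conductance opened by cell 1
        v2 += dt / tau_m * ((v_rest - v2) + g_syn * (e_syn - v2))
        g_syn -= dt * g_syn / tau_syn                 # exponential synaptic decay
        if v1 >= v_thresh:
            spikes1.append(t); v1 = v_reset
            g_syn += w_syn                            # presynaptic spike opens the synapse
        if v2 >= v_thresh:
            spikes2.append(t); v2 = v_reset
        t += dt
    return spikes1, spikes2
```

Each spike of the first cell increments an exponentially decaying conductance on the second, the simplest form of the synaptic transmission such neurosimulators model; the second cell's first spike therefore follows the first cell's.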
Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and to build on them for further research. Its use can help the field of computational neuroscience enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA_A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA_A reversal potential (E_GABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity.
Ideally, such interfaces should provide high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neurite types (pyramidal cell dendrites and olivary axonal projection terminals), and labeling methods (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida.
None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstruction systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions arriving at the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductances of their dendritic membrane. This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli.
These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single-neuron models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives.
One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS-US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. Using mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses uniformly distributed along the dendrites.
For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and effectiveness of somatopetal current transfer (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization as a whole and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals decreasing with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably more limited than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of output discharge patterns of different complexities, including stochastic ones. The relatively short length of the apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets.
As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites, and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in shaping the activity of layer 2/3 pyramidal neurons and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using non-homogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture that is subject to a clocking rhythm, independently of any involvement in spatial tasks. 
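The model ingredients named in the abstract above, an integrate-and-fire neuron with conductance synapses driven by a theta-modulated (non-homogeneous) Poisson input, can be sketched in a few lines. This is a minimal illustration of that class of model only; all parameter values and the single-neuron setup are illustrative assumptions, not the paper's actual PC/IC circuit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not taken from the paper)
dt = 1e-4                  # time step (s)
T = 1.0                    # simulated time (s)
tau_m = 20e-3              # membrane time constant (s)
E_L, E_e = -70e-3, 0.0     # leak and excitatory reversal potentials (V)
V_th, V_reset = -54e-3, -60e-3
tau_syn = 5e-3             # synaptic conductance decay time constant (s)
g_peak = 0.2               # conductance increment per input spike (relative to leak)
f_theta = 8.0              # theta frequency (Hz)
r0 = 400.0                 # mean input rate (Hz)

n = int(T / dt)
t = np.arange(n) * dt
# Non-homogeneous Poisson input: rate modulated by the theta cycle
rate = r0 * (1.0 + 0.8 * np.sin(2 * np.pi * f_theta * t))
spikes_in = rng.random(n) < rate * dt

V = np.full(n, E_L)
g = 0.0
out_spikes = []
for i in range(1, n):
    # Exponentially decaying conductance, incremented at each input spike
    g += -dt * g / tau_syn + (g_peak if spikes_in[i] else 0.0)
    # Leak plus conductance-based synaptic current (forward Euler)
    dV = (-(V[i - 1] - E_L) - g * (V[i - 1] - E_e)) * dt / tau_m
    V[i] = V[i - 1] + dV
    if V[i] >= V_th:
        out_spikes.append(t[i])
        V[i] = V_reset

print(f"{len(out_spikes)} output spikes in {T} s")
```

Because the synaptic drive is a conductance rather than a fixed current, the effective membrane time constant and the steady-state voltage both depend on the input rate, so firing clusters around the theta peaks; the full model obtains precession from the interplay of such a PC with its theta-modulated interneuron.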
Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in Computational Neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org. In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed data of morphologies when generating those cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators, such as the NEURON Simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new and available data can be included into NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. 
Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. 
However, sharing such large amounts of frequently heterogeneous and distributed experimental data requires both standard data formats and public repositories. Standardization and a public storage system are also important for modeling, because they make it possible to share models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks, such as the numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed, and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. A comparative computer simulation of dendritic morphology. PLoS computational biology Computational modeling of neuronal morphology is a powerful tool for understanding developmental processes and structure-function relationships. 
We present a multifaceted approach based on stochastic sampling of morphological measures from digital reconstructions of real cells. We examined how dendritic elongation, branching, and taper are controlled by three morphometric determinants: Branch Order, Radius, and Path Distance from the soma. Virtual dendrites were simulated starting from 3,715 neuronal trees reconstructed in 16 different laboratories, including morphological classes as diverse as spinal motoneurons and dentate granule cells. Several emergent morphometrics were used to compare real and virtual trees. Relating model parameters to Branch Order best constrained the number of terminations for most morphological classes, except pyramidal cell apical trees, which were better described by a dependence on Path Distance. In contrast, bifurcation asymmetry was best constrained by Radius for apical, but Path Distance for basal trees. All determinants showed similar performance in capturing total surface area, while surface area asymmetry was best determined by Path Distance. Grouping by other characteristics, such as size, asymmetry, arborizations, or animal species, showed smaller differences than observed between apical and basal, pointing to the biological importance of this separation. Hybrid models using combinations of the determinants confirmed these trends and allowed a detailed characterization of morphological relations. The differential findings between morphological groups suggest different underlying developmental mechanisms. By comparing the effects of several morphometric determinants on the simulation of different neuronal classes, this approach sheds light on possible growth mechanism variations responsible for the observed neuronal diversity. Brain;Computer Simulation;Dendrites;Models, Anatomic;Models, Neurological;Morphogenesis;Nerve Net Why are computational neuroscience and systems biology so separate? 
PLoS computational biology Despite similar computational approaches, there is surprisingly little interaction between the computational neuroscience and systems biology research communities. In this review I reconstruct the history of the two disciplines and show that this may explain why they grew up apart. The separation is a pity, as both fields can learn quite a bit from each other. Several examples are given, covering sociological, software-technical, and methodological aspects. Systems biology is a better-organized community which is very effective at sharing resources, while computational neuroscience has more experience in multiscale modeling and the analysis of information processing by biological systems. Finally, I speculate about how the relationship between the two fields may evolve in the near future. Computer Simulation;Models, Neurological;Neurosciences;Systems Biology The effects of cholinergic neuromodulation on neuronal phase-response curves of modeled cortical neurons Journal of Computational Neuroscience Summary This chapter constitutes mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was a part of the Annual Computational Neuroscience Meeting CNS*2007, which took place in July 2007 in Toronto, Canada (http://www.cnsorg.org). The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Selected discussed topics, including a review of some current and potential applications of Computational Intelligence (CI) in electrophysiology; database and electrophysiological data exchange platforms, languages, and formats; as well as exemplary analysis problems, are presented in this chapter. The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers whose interest will be sparked by its contents. 
Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. We are particularly interested in the slower-timescale dynamics that determine overall excitability and set the stage for transient episodes of persistent oscillations, working memory, or seizures. In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley-type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics in the stability of persistent activity. We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions. Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in the neonatal brain or spinal cord, up states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potentials (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from the EAP data sampled by an extracellular probe at multiple independent recording locations. 
We used a dipole for the model source because there is extensive evidence that it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell–probe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole-moment optimization in which the dipole location is fixed, and a second, nonlinear, global optimization of the source location. We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis. We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among the various model-error contributions to the residual, we address especially the error in probe geometry and the extent to which it biases estimates of the dipole parameters. This dipole characterization method can be applied to any recording technique that is capable of taking multiple independent measurements of the same single units. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics. 
We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodium–potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. These models reveal several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy. Abstract This paper introduces dyadic brain modeling, the simultaneous computational modeling of the brains of two interacting agents, to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on the gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the ‘dyad’). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists, and modelers that will help speed the emergence of what may be called comparative neuroprimatology. 
Abstract The phase response curve (PRC) reflects the dynamics of the interplay between the diverse intrinsic conductances that lead to spike generation. PRCs measure the spike-time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddle-node) dynamics, while type II (subcritical Hopf) dynamics yield a biphasic PRC with both negative and positive lobes. Previous computational work hypothesized that cholinergic modulation of the M-type potassium current can switch a neuron from type II dynamics to type I dynamics. We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with downregulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential specific K-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to downregulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent I_AHP is downregulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified at low ACh levels, while the PRC became type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may shift neuronal responses between the “resonator” and “integrator” modes. Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double-precision values by each subtree at each time step. 
Splitting cells is useful for attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For compute-bound simulations, load balance results in almost ideal runtime scaling. Application of the cell-splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing. Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts. It has recently been reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. Ca2+-activated K+ currents (IK(Ca)) of cultured human cardiac fibroblasts were characterized in this study. In whole-cell configuration, depolarizing pulses evoked outwardly rectifying IK(Ca) in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance, Ca2+-activated K+ (BKCa) channel with a single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BKCa channels. The dynamic Luo–Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; however, they were electrically inert with no gap-junctional coupling. The simulation predicts that changes in gap-junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. IK(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes. 
This study demonstrates that a BKCa channel is functionally expressed in human cardiac fibroblasts. The activity of the BKCa channels present in human cardiac fibroblasts may contribute to the functional activities of heart cells through the transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of their solutions and the transitions between them as parameters are varied. To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multicompartment models. We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation: tonic spiking, intermediate bursting, and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multicompartment conductance-based model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli. 
Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons, modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement. Results We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. Conclusion Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement, and underlies the concentration-independence of odor quality representations. 
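One component of the mechanism described above, broad feedback inhibition that needs no topographic (lateral) wiring, can be illustrated with a toy computation. This is only a cartoon of that single ingredient, not the NTCE algorithm itself (which also relies on intraglomerular circuitry); the function name and all values are illustrative assumptions.

```python
import numpy as np

def contrast_enhance(activation, inhibition_gain=1.0):
    """Sharpen a distributed activity pattern using only a global
    feedback inhibition signal, with no lateral, location-dependent
    connections. Each unit's output is its input minus
    gain * population mean, rectified at zero: weakly driven units
    are silenced while strongly driven ones survive."""
    global_inhibition = inhibition_gain * activation.mean()
    return np.maximum(activation - global_inhibition, 0.0)

# A distributed, high-dimensional "odor" pattern (illustrative values)
odor = np.array([0.9, 0.7, 0.3, 0.2, 0.1, 0.05])
sharpened = contrast_enhance(odor)

# Fewer units remain active after the broad inhibition is applied
print(np.count_nonzero(odor), "->", np.count_nonzero(sharpened))
```

Because the inhibition depends only on the population mean, the sharpening works identically however the active units are scattered across the map, which is the property that makes a non-topographic scheme attractive for a chemotopically fragmented representation.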
Abstract Mathematical neuronal models are normally expressed using differential equations. The Parker–Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker–Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin–Huxley-type neuron, comparing the results with those obtained using the Runge–Kutta and Bulirsch–Stoer methods. Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC–granule cell (GC) circuit with and without gap junctions between GoCs. Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change the oscillation frequency or the mean firing rate of either GoCs or GCs. 
Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs. The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley-type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours or days of supercomputing time. Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. In consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular-spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics, and intrinsic noise. We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated for by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data. 
Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes from development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models. Recent studies show that both the conductance and the kinetics of these receptors change voltage-dependently in some parts of the brain. Therefore, several models have been introduced to simulate their current. However, on the one hand, kinetic models, which are able to simulate these voltage-dependent phenomena, are computationally expensive for the modeling of large neural networks. On the other hand, classic exponential models, which are computationally less expensive, are not able to simulate the voltage-dependence of these receptors accurately. In this study, we have modified these classic models to endow them with voltage-dependent conductance and time constants. The temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of the NMDA receptor’s behavior using only three to four differential equations, which is significantly fewer than in the previous kinetic models. Consequently, our model is both fast and physiologically plausible, and is therefore a suitable candidate for the modeling of large neural networks. Abstract Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons. 
This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal cell firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that there is an obvious rhythmic gating effect of the γ-oscillating interneuron networks on the pyramidal neuron’s signal transmission. This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and appropriate inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of the coherence and oscillation frequency of network activities. Abstract Background Propagation of simulated action potentials (APs) was previously studied in short single chains and in two-dimensional sheets of myocardial cells [1, 2, 3]. The present study was undertaken to examine propagation in a long single chain of cells of various lengths, and with varying numbers of gap-junction (gj) channels, and to compare propagation velocity with cable properties such as the length constant (λ). Methods and Results Simulations were carried out using the PSpice program as previously described. When the electric field (EF) mechanism was dominant (0, 1, and 10 gj channels), the longer the chain length, the faster the overall velocity (θov). There seems to be no simple explanation for this phenomenon. In contrast, when the local-circuit current mechanism was dominant (100 gj channels or more), θov was slightly slowed with lengthening of the chain. Increasing the number of gj channels produced an increase in θov and caused the firing order to become more uniform. 
The end effect was more pronounced at longer chain lengths and at greater numbers of gj channels. When there were no or only few gj channels (namely 0, 10, or 30), the voltage change (ΔV_m) in the two cells (#50 and #52) contiguous to the cell injected with current (#51) was nearly zero, i.e., there was a sharp discontinuity in voltage between the adjacent cells. When there were many gj channels (e.g., 300, 1000, 3000), there was an exponential decay of voltage on either side of the injected cell, with the length constant (λ) increasing at higher numbers of gj channels. The effect of increasing the number of gj channels on increasing λ was relatively small compared to the larger effect on θ_ov. θ_ov became very non-physiological at 300 gj channels or higher. Conclusion Thus, when there were only 0, 1, or 10 gj channels, θ_ov increased with increase in chain length, whereas at 100 gj channels or higher, θ_ov did not increase with chain length. When there were only 0, 10, or 30 gj channels, there was a very sharp decrease in ΔV_m in the two contiguous cells on either side of the injected cell, whereas at 300, 1000, or 3000 gj channels, the voltage decay was exponential along the length of the chain. The effect of increasing the number of gj channels on the spread of current was relatively small compared to the large effect on θ_ov. Abstract This article provides a demonstration of an analytical technique that can be used to investigate the causes of perceptual phenomena. The technique is based on the concept of the ideal observer, an optimal signal classifier that makes decisions that maximize the probability of a correct response. To demonstrate the technique, an analysis was conducted to investigate the role of the auditory periphery in the production of temporal masking effects. The ideal observer classified output from four models of the periphery.
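The exponential voltage decay with length constant λ described above follows the standard steady-state cable relation V(x) = V0·exp(-x/λ), and λ can be estimated from any two measurement points. A minimal sketch (function names and units are illustrative assumptions):

```python
import math

def steady_state_voltage(delta_v0_mV, x_um, lam_um):
    """Steady-state voltage change at distance x from the injection site,
    assuming pure cable-like exponential decay with length constant lam."""
    return delta_v0_mV * math.exp(-x_um / lam_um)

def fit_length_constant(x1_um, v1_mV, x2_um, v2_mV):
    """Recover lambda from two points on the decay: lam = dx / ln(v1/v2)."""
    return (x2_um - x1_um) / math.log(v1_mV / v2_mV)
```

At x = λ the voltage has fallen to 1/e of its value at the injection site, which is the usual operational definition of the length constant.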
Since the ideal observer is the best of all possible observers, if it demonstrates masking effects, then all other observers must as well. If it does not demonstrate masking effects, then nothing about the periphery requires masking to occur, and masking must therefore arise somewhere else. The ideal observer exhibited several forward masking effects but did not exhibit backward masking, implying that the periphery has a causal role in forward but not backward masking. A general discussion of the strengths of the technique and supplementary equations are also included. Abstract Understanding the human brain and its function in health and disease represents one of the greatest scientific challenges of our time. In the postgenomic era, an overwhelming accumulation of new data, at all levels of exploration from DNA to human brain imaging, has been acquired. This accumulation of facts has not given rise to a corresponding increase in the understanding of integrated functions in this vast area of research, which involves a large number of fields extending from genetics to psychology. Neuroinformatics (NI) is uniquely placed at the intersection between neuroscience and information technology, and emerges as an area of critical importance for facilitating future conceptual development in neuroscience by creating databases that transcend different organizational levels and allow for the development of computational models ranging from the subcellular to the global brain level. Abstract This paper studied synaptic and dendritic integration with different spatial distributions of synapses on the dendrites of a biophysically detailed layer 5 pyramidal neuron model. It has been observed that temporally synchronous and spatially clustered synaptic inputs make dendrites perform a highly nonlinear integration.
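The ideal-observer rule invoked above (choose the alternative that maximizes the probability of a correct response) reduces, for two equally likely alternatives in Gaussian noise of equal variance, to a likelihood-ratio test, with percent correct given by Φ(d′/2). A minimal sketch of that textbook special case (the function names and the scalar-observation setup are simplifying assumptions):

```python
import math

def ideal_observer_decide(x, mu0, mu1, sigma):
    """Return 0 or 1: the alternative with higher likelihood for observation
    x, assuming equal priors and equal-variance Gaussian noise."""
    # log-likelihood ratio log p(x|1) - log p(x|0)
    llr = ((x - mu0) ** 2 - (x - mu1) ** 2) / (2.0 * sigma ** 2)
    return 1 if llr > 0 else 0

def percent_correct(mu0, mu1, sigma):
    """Ideal-observer probability correct: Phi(d'/2), d' = |mu1-mu0|/sigma."""
    d_prime = abs(mu1 - mu0) / sigma
    return 0.5 * (1.0 + math.erf(d_prime / (2.0 * math.sqrt(2.0))))
```

Any other decision rule applied to the same observations can only do worse, which is what licenses the article's "if the ideal observer shows masking, all observers must" argument.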
The effect of the clustering degree of the synaptic distribution on neuronal responsiveness is investigated by changing the number of top apical dendrites to which active synapses are allocated. The neuron shows maximum responsiveness to synaptic inputs with an intermediate clustering degree of spatial distribution, indicating complex interactions among dendrites in the presence of nonlinear synaptic and dendritic integration. Abstract This paper describes a pilot query interface that has been constructed to help us explore a “concept-based” approach for searching the Neuroscience Information Framework (NIF). The query interface is concept-based in the sense that the search terms submitted through the interface are selected from a standardized vocabulary of terms (concepts) that are structured in the form of an ontology. The NIF contains three primary resources: the NIF Resource Registry, the NIF Document Archive, and the NIF Database Mediator. These NIF resources are very different in their nature and therefore pose challenges when designing a single interface from which searches can be automatically launched against all three resources simultaneously. The paper first discusses briefly several background issues involving the use of standardized biomedical vocabularies in biomedical information retrieval, and then presents a detailed example that illustrates how the pilot concept-based query interface operates. The paper concludes by discussing certain lessons learned in the development of the current version of the interface. Abstract Simulations of orientation selectivity in visual cortex have shown that layer 4 complex cells lacking orientation tuning are ideal for providing global inhibition that scales with contrast in order to produce simple cells with contrast-invariant orientation tuning (Lauritzen and Miller in J Neurosci 23:10201–10213, 2003).
Inhibitory cortical cells have been shown to be electrically coupled by gap junctions (Fukuda and Kosaka in J Neurosci 120:5–20, 2003). Such coupling promotes, among other effects, spike synchronization and coordination of postsynaptic IPSPs (Beierlein et al. in Nat Neurosci 3:904–910, 2000; Galarreta and Hestrin in Nat Rev Neurosci 2:425–433, 2001). Consequently, it was expected (Miller in Cereb Cortex 13:73–82, 2003) that electrical coupling would promote nonspecific functional responses consistent with the complex inhibitory cells seen in layer 4, which provide broad inhibition in response to stimuli of all orientations (Miller et al. in Curr Opin Neurobiol 11:488–497, 2001). This was tested using a mechanistic modeling approach. The orientation selectivity model of Lauritzen and Miller (J Neurosci 23:10201–10213, 2003) was reproduced with and without electrical coupling between complex inhibitory neurons. Although extensive coupling promotes uniform firing in complex cells, there were no detectable improvements in contrast-invariant orientation selectivity unless there were coincident changes in complex-cell firing rates to offset the untuned excitatory component that grows with contrast. Thus, changes in firing rates (with or without coupling), and not synchronization of complex inhibitory neurons alone, could improve the contrast-invariant orientation tuning of simple cells. Abstract Coral polyps contract when electrically stimulated, and a wave of contraction travels from the site of stimulation at a constant speed. Models of coral nerve networks were optimized to match one of three different experimentally observed behaviors. To search for model parameters that reproduce the experimental observations, we applied genetic algorithms to increasingly more complex models of a coral nerve net. In a first stage of optimization, individual neurons responded with spikes to multiple, but not single, pulses of activation.
In a second stage, we used these neurons as the starting point for the optimization of a two-dimensional nerve net. This strategy yielded a network with parameters that reproduced the experimentally observed spread of excitation. Abstract Spike-wave discharges are a distinctive feature of epileptic seizures. So far, they have not been reported in spatially extended neural field models. We study a space-independent version of the Amari neural field model with two competing inhibitory populations. We show that this competition leads to robust spike-wave dynamics if the inhibitory populations operate on different timescales. The spike-wave oscillations present a fold/homoclinic-type bursting. From this result we predict parameters of the extended Amari system where spike-wave oscillations produce a spatially homogeneous pattern. We propose this mechanism as a prototype of macroscopic epileptic spike-wave discharges. To our knowledge this is the first example of robust spike-wave patterns in a spatially extended neural field model. Abstract Cortical gamma-frequency (30–80 Hz) oscillations have been suggested to underlie many aspects of cognitive function. In this paper we compare the f–I curves modulated by a gamma-frequency-modulated stimulus and by Poisson synaptic input at the distal dendrites of a layer V pyramidal neuron model. The results show that gamma-frequency distal input amplifies the sensitivity of the neural response to basal input and enhances the gain modulation of the neuron. Abstract Inward-rectifying potassium (K_IR) currents in medium spiny (MS) neurons of the nucleus accumbens inactivate significantly in ~40% of the neurons but not in the rest, which may lead to differences in input processing by these two groups. Using a 189-compartment computational model of the MS neuron, we investigate the influence of this property using injected current as well as spatiotemporally distributed synaptic inputs.
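The space-independent competition between two inhibitory populations on different timescales, described in the spike-wave abstract above, can be sketched with illustrative rate equations: one excitatory variable u inhibited by a fast population v1 and a slow population v2. This is a schematic sketch, not the authors' exact Amari system; all parameter values and the sigmoid are placeholder assumptions:

```python
import math

def f(x, beta=4.0, theta=0.5):
    """Sigmoidal firing-rate function."""
    return 1.0 / (1.0 + math.exp(-beta * (x - theta)))

def simulate(t_end=200.0, dt=0.01, tau_e=1.0, tau_i1=0.5, tau_i2=20.0,
             w_ee=2.0, w_ei1=1.5, w_ei2=1.5, drive=0.6):
    """Euler integration of a space-independent excitatory population u
    coupled to a fast inhibitory population v1 and a slow one v2."""
    u = v1 = v2 = 0.0
    trace = []
    for _ in range(int(t_end / dt)):
        du = (-u + w_ee * f(u) - w_ei1 * f(v1) - w_ei2 * f(v2) + drive) / tau_e
        dv1 = (-v1 + f(u)) / tau_i1
        dv2 = (-v2 + f(u)) / tau_i2
        u += dt * du
        v1 += dt * dv1
        v2 += dt * dv2
        trace.append(u)
    return trace
```

The separation tau_i2 >> tau_i1 is the ingredient the abstract identifies as necessary for robust spike-wave (fold/homoclinic bursting) dynamics; with matched timescales the competition collapses into a single effective inhibition.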
Our study demonstrates that K_IR current inactivation facilitates depolarization, firing frequency, and firing onset in these neurons. These effects may be attributed to the higher input resistance of the cell as well as to a more depolarized resting/down-state potential induced by the inactivation of this current. In view of reports that dendritic intracellular calcium levels depend closely on burst strength and spike onset time, our findings suggest that inactivation of K_IR currents may offer a means of modulating both excitability and synaptic plasticity in MS neurons. Abstract Epileptic seizures in diabetic hyperglycemia (DH) are not uncommon. This study aimed to determine the acute behavioral, pathological, and electrophysiological effects of status epilepticus (SE) on diabetic animals. Adult male Sprague-Dawley rats were first divided into groups with and without streptozotocin (STZ)-induced diabetes, and then into treatment groups given normal saline (NS) (STZ-only and NS-only) or a lithium-pilocarpine injection to induce status epilepticus (STZ + SE and NS + SE). Seizure susceptibility, severity, and mortality were evaluated. Serial Morris water maze test and hippocampal histopathology results were examined before and 24 h after SE. Tetanic-stimulation-induced long-term potentiation (LTP) in a hippocampal slice was recorded in a multielectrode dish system. We also used a simulation model to evaluate intracellular adenosine triphosphate (ATP) and neuroexcitability. The STZ + SE group had a significantly higher percentage of severe seizures and SE-related death, and worse learning and memory performance, than the other three groups 24 h after SE. The STZ + SE group, followed by the NS + SE group, showed the most severe neuronal loss and mossy fiber sprouting in the hippocampal CA3 area. In addition, LTP was markedly attenuated in the STZ + SE group, followed by the NS + SE group.
In the simulation, increased intracellular ATP concentration promoted action potential firing. This finding that rats with DH had more brain damage after SE than rats without diabetes suggests the importance of intensively treating hyperglycemia and seizures in diabetic patients with epilepsy. Neuroinformatics is a multifaceted field. It is as broad as the field of neuroscience. The various domains of NI may also share common features such as databases, data mining systems, and data modeling tools. NI projects are often coordinated by user groups or research organizations. Large-scale infrastructure supporting NI development is also a vital aspect of the field. Abstract Channelrhodopsin-2 (ChR2) is a class of light-sensitive proteins that offer the ability to use light stimulation to regulate neural activity with millisecond precision. To address the limitations in the efficacy of wild-type ChR2 (ChRwt) in achieving this objective, new variants of ChR2 that exhibit fast monoexponential photocurrent decay characteristics have recently been developed and validated. In this paper, we investigate whether the framework of a four-state transition-rate model, primarily developed to mimic the biexponential photocurrent decay kinetics of ChRwt, as opposed to the lower-complexity three-state model, is warranted to mimic the monoexponential photocurrent decay kinetics of the newly developed fast ChR2 variants: ChETA (Gunaydin et al., Nature Neurosci. 13:387–392, 2010) and ChRET/TC (Berndt et al., Proc. Natl. Acad. Sci. 108:7595–7600, 2011). We begin by estimating the parameters of the three-state and four-state models from experimental data on the photocurrent kinetics of ChRwt, ChETA, and ChRET/TC. We then incorporate these models into a fast-spiking interneuron model (Wang and Buzsaki, J. Neurosci. 16:6402–6413, 1996) and a hippocampal pyramidal cell model (Golomb et al., J. Neurophysiol.
96:1912–1926, 2006) and investigate the extent to which the experimentally observed neural response to various optostimulation protocols can be captured by these models. We demonstrate that for all ChR2 variants investigated, the four-state model implementation is better able to capture neural responses consistent with experiments across a wide range of optostimulation protocols. We conclude by analytically investigating the conditions under which the characteristic specific to the three-state model, namely the monoexponential photocurrent decay of the newly developed variants of ChR2, can occur in the framework of the four-state model. Abstract In cerebellar Purkinje cells, the β4 subunit of voltage-dependent Na+ channels has been proposed to serve as an open-channel blocker giving rise to a “resurgent” Na+ current (I_NaR) upon membrane repolarization. Notably, the β4 subunit was recently identified as a novel substrate of the β-secretase BACE1, a key enzyme of the amyloidogenic pathway in Alzheimer's disease. Here, we asked whether BACE1-mediated cleavage of the β4 subunit has an impact on I_NaR and, consequently, on the firing properties of Purkinje cells. In cerebellar tissue of BACE1−/− mice, mRNA levels of Na+ channel α-subunits 1.1, 1.2, and 1.6 and of β-subunits 1–4 remained unchanged, but processing of the β4 peptide was profoundly altered. Patch-clamp recordings from acutely isolated Purkinje cells of BACE1−/− and WT mice did not reveal any differences in steady-state properties or in current densities of transient, persistent, and resurgent Na+ currents. However, I_NaR was found to decay significantly faster in BACE1-deficient Purkinje cells than in WT cells. In modeling studies, the altered time course of I_NaR decay could be replicated when we decreased the efficiency of open-channel block. In current-clamp recordings, BACE1−/− Purkinje cells displayed a lower spontaneous firing rate than normal cells.
Computer simulations supported the hypothesis that the accelerated decay kinetics of I_NaR are responsible for the slower firing rate. Our study elucidates a novel function of BACE1 in the regulation of neuronal excitability that serves to tune the firing pattern of Purkinje cells and presumably of other neurons endowed with I_NaR. Abstract The role of cortical feedback in the thalamocortical processing loop has been extensively investigated over the last decades. With a few exceptions, these studies have focused on the cortical feedback exerted onto thalamocortical relay (TC) cells of the dorsal lateral geniculate nucleus (LGN). In a previous, physiological study in the cat visual system, we showed that cessation of cortical input, despite decreasing the spontaneous activity of TC cells, increased the spontaneous firing of their recurrent inhibitory interneurons located in the perigeniculate nucleus (PGN). To identify the mechanisms underlying such functional changes, we conducted a modeling study in NEURON on several networks of point neurons with varied model parameters, such as membrane properties, synaptic weights, and axonal delays. We considered six network topologies of the retino-geniculo-cortical system. All models were robust against changes of axonal delays except for the delay between the LGN feedforward interneuron and the TC cell. The best representation of the physiological results was obtained with models containing reciprocally connected PGN cells driven by the cortex and with a relatively slow decay of intracellular calcium. This strongly indicates that the thalamic reticular nucleus plays an essential role in the cortical influence over thalamocortical relay cells, while the thalamic feedforward interneurons are not essential in this process. Further, we suggest that the dependence of the activity of PGN cells on the rate of calcium removal can be one of the key factors determining an individual cell's response to the elimination of cortical input.
Abstract The nucleus accumbens (NAc), a critical structure of the brain reward circuit, is implicated in normal goal-directed behaviour and learning as well as in pathological conditions like schizophrenia and addiction. Its major cellular substrates, the medium spiny (MS) neurons, possess a wide variety of dendritic active conductances that may modulate excitatory postsynaptic potentials (EPSPs) and cell excitability. We examine this issue using a biophysically detailed 189-compartment stylized model of the NAc MS neuron, incorporating all the known active conductances. We find that, of all the active channels, inward-rectifying K+ (K_IR) channels play the primary role in modulating the resting membrane potential (RMP) and EPSPs in the down-state of the neuron. Reduction in the conductance of K_IR channels evokes facilitatory effects on EPSPs, accompanied by rises in local input resistance and membrane time constant. At depolarized membrane potentials closer to up-state levels, the slowly inactivating A-type potassium channel (K_As) conductance also plays a strong role in determining synaptic potential parameters and cell excitability. We discuss the implications of our results for the regulation of accumbal MS neuron biophysics and synaptic integration by intrinsic factors and by extrinsic agents such as dopamine. Abstract The computer-assisted three-dimensional reconstruction of neuronal morphology is becoming an increasingly popular technique to quantify the arborization patterns of dendrites and axons. The resulting digital files are suitable for comprehensive morphometric analyses as well as for building anatomically realistic compartmental models of membrane biophysics and neuronal electrophysiology. The digital tracings acquired in a lab for a specific purpose can often be reused by a different research group to address a completely unrelated scientific question, if the original investigators are willing to share the data.
Since reconstructing neuronal morphology is a labor-intensive process, data sharing and reanalysis is particularly advantageous for the neuroscience and biomedical communities. Here we present numerous cases of “success stories” in which digital reconstructions of neuronal morphology were shared and reused, leading to additional, independent discoveries and publications, and thus amplifying the impact of the “source” study for which the data set was first collected. In particular, we overview four main applications of this kind of data: comparative morphometric analyses, statistical estimation of potential synaptic connectivity, morphologically accurate electrophysiological simulations, and computational models of neuronal shape and development. Abstract This chapter describes a novel computational approach to modeling cortex dynamics that integrates gene–protein regulatory networks with a neural network model. The interaction of genes and proteins in neurons affects the dynamics of the whole neural network. We have adopted an exploratory approach of investigating many randomly generated gene regulatory matrices, out of which we kept those that generated interesting dynamics. This naïve brute-force approach served to explore the potential application of computational neurogenetic models in relation to gene-knockout neurogenetics experiments. The knockout of a hypothetical gene for fast inhibition in our artificial genome led to interesting neural activity. In spite of the fact that the artificial gene/protein network had been altered by the knockout of one gene, the spiking dynamics of the spiking neural network (SNN) was most of the time very similar to the result obtained with the complete gene/protein network. However, from time to time the neurons spontaneously and temporarily synchronized their spiking into coherent global oscillations.
In our model, fluctuations in the values of neuronal parameters lead to the spontaneous development of seizure-like global synchronizations. These very same fluctuations also lead to the termination of the seizure-like neural activity and to the maintenance of the interictal normal periods of activity. Based on our model, we would like to suggest the hypothesis that parameter changes due to gene–protein dynamics should also be included as a serious factor determining transitions in neural dynamics, especially when the cause of disease is known to be genetic. Abstract The local field potential (LFP) is among the most important experimental measures when probing neural population activity, but a proper understanding of the link between the underlying neural activity and the LFP signal is still missing. Here we investigate this link by mathematical modeling of contributions to the LFP from a single layer-5 pyramidal neuron and a single layer-4 stellate neuron receiving synaptic input. An intrinsic dendritic low-pass filtering effect of the LFP signal, previously demonstrated for extracellular signatures of action potentials, is seen to strongly affect the LFP power spectra, even for frequencies as low as 10 Hz for the example pyramidal neuron. Further, the LFP signal is found to depend sensitively on both the recording position and the position of the synaptic input: the LFP power spectra recorded close to the active synapse are typically found to be less low-pass filtered than spectra recorded further away. Some recording positions display striking band-pass characteristics of the LFP. The frequency dependence of the properties of the current dipole moment set up by the synaptic input current is found to qualitatively account for several salient features of the observed LFP.
Two approximate schemes for calculating the LFP, the dipole approximation and the two-monopole approximation, are tested and found to be potentially useful for translating results from large-scale neural network models into predictions for results from electroencephalographic (EEG) or electrocorticographic (ECoG) recordings. Abstract Dopaminergic (DA) neurons of the mammalian midbrain exhibit unusually low firing frequencies in vitro. Furthermore, injection of depolarizing current induces depolarization block before high frequencies are achieved. The maximum steady and transient rates are about 10 and 20 Hz, respectively, despite the ability of these neurons to generate bursts at higher frequencies in vivo. We use a three-compartment model calibrated to reproduce DA neuron responses to several pharmacological manipulations to uncover the mechanisms of frequency limitation. The model exhibits a slow oscillatory potential (SOP) dependent on the interplay between the L-type Ca2+ current and the small-conductance K+ (SK) current that is unmasked by fast Na+ current block. Contrary to previous theoretical work, the SOP does not pace the steady spiking frequency in our model. The main currents that determine the spontaneous firing frequency are the subthreshold L-type Ca2+ and A-type K+ currents. The model identifies the channel densities for the fast Na+ and delayed-rectifier K+ currents as critical parameters limiting the maximal steady frequency evoked by a depolarizing pulse. We hypothesize that the low maximal steady frequencies result from a low safety factor for action potential generation. In the model, the rate of Ca2+ accumulation in the distal dendrites controls the transient initial frequency in response to a depolarizing pulse.
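The two LFP approximation schemes named above can be compared numerically for a synaptic current source/sink pair in a homogeneous volume conductor: the two-monopole scheme sums the potentials of the source and sink, while the dipole approximation uses φ = (p·r̂)/(4πσr²) and becomes accurate when the observation distance is large compared to the source-sink separation. The geometry and the conductivity value below are illustrative assumptions:

```python
import math

def phi_monopoles(i_A, pos_source, pos_sink, r_obs, sigma=0.3):
    """Two-monopole scheme: potential of a current source (+I) and sink (-I)
    in a homogeneous medium of conductivity sigma (S/m); positions in m."""
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))
    return i_A / (4.0 * math.pi * sigma) * (1.0 / dist(r_obs, pos_source)
                                            - 1.0 / dist(r_obs, pos_sink))

def phi_dipole(i_A, pos_source, pos_sink, r_obs, sigma=0.3):
    """Far-field dipole approximation: phi = p.r / (4 pi sigma |r|^3),
    with dipole moment p = I * d, d directed from sink to source."""
    mid = [(a + b) / 2.0 for a, b in zip(pos_source, pos_sink)]
    d = [a - b for a, b in zip(pos_source, pos_sink)]   # sink -> source
    r = [a - b for a, b in zip(r_obs, mid)]
    r_len = math.sqrt(sum(x * x for x in r))
    p_dot_r = i_A * sum(di * ri for di, ri in zip(d, r))
    return p_dot_r / (4.0 * math.pi * sigma * r_len ** 3)
```

At 1 cm from a 0.2 mm source-sink pair the two schemes agree to well under a percent, which is why the dipole form is attractive for EEG-scale predictions from network models.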
Similar results are obtained when the same model parameters are used in a multicompartmental model with a realistic reconstructed morphology, indicating that the salient contributions of the dendritic architecture have been captured by the simpler model. Abstract Background As interest in adopting the Semantic Web in the biomedical domain continues to grow, Semantic Web technology has been evolving and maturing. A variety of technological approaches including triplestore technologies, SPARQL endpoints, Linked Data, and Vocabulary of Interlinked Datasets have emerged in recent years. In addition to the data warehouse construction, these technological approaches can be used to support dynamic query federation. As a community effort, the BioRDF task force, within the Semantic Web for Health Care and Life Sciences Interest Group, is exploring how these emerging approaches can be utilized to execute distributed queries across different neuroscience data sources. Methods and results We have created two health care and life science knowledge bases. We have explored a variety of Semantic Web approaches to describe, map, and dynamically query multiple datasets. We have demonstrated several federation approaches that integrate diverse types of information about neurons and receptors that play an important role in basic, clinical, and translational neuroscience research. Particularly, we have created a prototype receptor explorer which uses OWL mappings to provide an integrated list of receptors and executes individual queries against different SPARQL endpoints. We have also employed the AIDA Toolkit, which is directed at groups of knowledge workers who cooperatively search, annotate, interpret, and enrich large collections of heterogeneous documents from diverse locations. We have explored a tool called "FeDeRate", which enables a global SPARQL query to be decomposed into subqueries against the remote databases offering either SPARQL or SQL query interfaces. 
Finally, we have explored how to use the Vocabulary of Interlinked Datasets (voiD) to create metadata for describing datasets exposed as Linked Data URIs or SPARQL endpoints. Conclusion We have demonstrated the use of a set of novel and state-of-the-art Semantic Web technologies in support of a neuroscience query federation scenario. We have identified both the strengths and weaknesses of these technologies. While the Semantic Web offers a global data model including the use of Uniform Resource Identifiers (URIs), the proliferation of semantically equivalent URIs hinders large-scale data integration. Our work helps direct research and tool development, which will be of benefit to this community. Abstract Injury to neural tissue renders voltage-gated Na+ (Nav) channels leaky. Even mild axonal trauma initiates Na+ loading, leading to secondary Ca2+ loading and white matter degeneration. The nodal isoform is Nav1.6, and for Nav1.6-expressing HEK cells, traumatic whole-cell stretch causes an immediate tetrodotoxin-sensitive Na+ leak. In stretch-damaged oocyte patches, Nav1.6 current undergoes damage-intensity-dependent hyperpolarizing (left) shifts, but whether left-shift underlies injured-axon Nav leak is uncertain. Nav1.6 inactivation (availability) is kinetically limited by (coupled to) Nav activation, yielding coupled left-shift (CLS) of the two processes: CLS should move the steady-state Nav1.6 “window conductance” closer to typical firing thresholds. Here we simulated excitability and ion homeostasis in free-running nodes of Ranvier to assess whether hallmark injured-axon behaviors, namely Na+ loading, ectopic excitation, and propagation block, would occur with Nav-CLS. Intact/traumatized axolemma ratios were varied, and for some simulations Na/K pumps were included, with varied inside/outside volumes. We simulated saltatory propagation with one mid-axon node variously traumatized.
While dissipating the [Na+] gradient and hyperactivating the Na/K pump, Nav-CLS generated neuropathic-pain-like ectopic bursts. Depending on the CLS magnitude, the fraction of Nav channels affected, and the pump intensity, tonic firing, burst firing, or nodal inexcitability occurred, with [Na+] and [K+] fluctuating. Severe CLS-induced inexcitability did not preclude Na+ loading; in fact, the steady-state Na+ leaks elicited large pump currents. At a mid-axon node, mild CLS perturbed normal anterograde propagation, and severe CLS blocked saltatory propagation. These results suggest that in damaged excitable cells, Nav-CLS could initiate cellular deterioration with attendant hyper- or hypoexcitability. Healthy-cell versions of Nav-CLS, however, could contribute to physiological rhythmic firing. Abstract Lateral inhibition of cells surrounding an excited area is a key property of sensory systems, sharpening the preferential tuning of individual cells in the presence of closely related input signals. In the olfactory pathway, a dendrodendritic synaptic microcircuit between mitral and granule cells in the olfactory bulb has been proposed to mediate this type of interaction through granule cell inhibition of surrounding mitral cells. However, it is becoming evident that odor inputs result in broad activation of the olfactory bulb with interactions that go beyond neighboring cells. Using a realistic modeling approach, we show how backpropagating action potentials in the long lateral dendrites of mitral cells, together with granule cell actions on mitral cells within narrow columns forming glomerular units, can provide a mechanism to activate strong local inhibition between arbitrarily distant mitral cells. The simulations predict a new role for the dendrodendritic synapses in the multicolumnar organization of the granule cells. This new paradigm gives insight into the functional significance of the patterns of connectivity revealed by recent viral tracing studies.
Together, they suggest a functional wiring of the olfactory bulb that could greatly expand the computational roles of the mitral–granule cell network. Abstract Spinal motor neurons have voltage-gated ion channels localized in their dendrites that generate plateau potentials. The physical separation of the ion channels for spiking from the plateau-generating channels can result in nonlinear bistable firing patterns. The physical separation and geometry of the dendrites result in asymmetric coupling between dendrites and soma that has not been addressed in reduced models of nonlinear phenomena in motor neurons. We measured the voltage attenuation properties of six anatomically reconstructed and type-identified cat spinal motor neurons to characterize the asymmetric coupling between the dendrites and soma. We showed that the voltage attenuation at any distance from the soma was direction-dependent and could be described as a function of the input resistance at the soma. An analytical solution for the lumped cable parameters in a two-compartment model was derived based on this finding. This is the first two-compartment modeling approach that directly derives lumped cable parameters from the geometrical and passive electrical properties of anatomically reconstructed neurons. Abstract Models for temporary information storage in neuronal populations are dominated by mechanisms directly dependent on synaptic plasticity. There are nevertheless other mechanisms available that are well suited for creating short-term memories. Here we present a model for working memory which relies on the modulation of the intrinsic excitability properties of neurons, instead of synaptic plasticity, to retain novel information for periods of seconds to minutes. We show that it is possible to effectively use this mechanism to store the serial order in a sequence of patterns of activity.
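The direction-dependent attenuation in the two-compartment reduction described above has a simple passive origin: steady-state voltage transferred through the coupling conductance into a compartment is divided by that compartment's leak, so unequal somatic and dendritic leaks make the two directions differ. A minimal sketch of that voltage-divider argument (not the authors' analytical solution; all conductance values are placeholders):

```python
def attenuation(g_c, g_leak_target):
    """Steady-state attenuation V_target / V_source in a passive
    two-compartment model: coupling conductance g_c feeding a compartment
    with leak conductance g_leak_target (voltage divider)."""
    return g_c / (g_c + g_leak_target)

def attenuations(g_c, g_soma, g_dend):
    """Return (soma->dendrite, dendrite->soma) attenuations; they differ
    whenever the somatic and dendritic leaks differ."""
    return attenuation(g_c, g_dend), attenuation(g_c, g_soma)
```

Because the somatic leak sets the somatic input resistance, the dendrite-to-soma attenuation can be written as a function of that input resistance, consistent with the measurement reported in the abstract.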
For this we introduce a functional class of neurons, named gate interneurons, which can store information in their membrane dynamics and can literally act as gates routing the flow of activations in the principal neuron population. The presented model exhibits properties which are in close agreement with experimental results in working memory. Namely, the recall process plays an important role in stabilizing and prolonging the memory trace. This means that the stored information is correctly maintained as long as it is being used. Moreover, the working memory model is adequate for storing completely new information, in time windows compatible with the notion of “one-shot” learning (hundreds of milliseconds). Abstract For the analysis of neuronal cooperativity, simultaneously recorded extracellular signals from neighboring neurons need to be sorted reliably by a spike sorting method. Many algorithms have been developed to this end; however, to date, none of them manages to fulfill a set of demanding requirements. In particular, it is desirable to have an algorithm that operates online, detects and classifies overlapping spikes in real time, and that adapts to nonstationary data. Here, we present a combined spike detection and classification algorithm which explicitly addresses these issues. Our approach makes use of linear filters to find a new representation of the data and to optimally enhance the signal-to-noise ratio. We introduce a method called “Deconfusion” which decorrelates the filter outputs and provides source separation. Finally, a set of well-defined thresholds is applied and leads to simultaneous spike detection and spike classification. By incorporating a direct feedback, the algorithm adapts to nonstationary data and is, therefore, well suited for acute recordings. 
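A toy version of the filter-then-threshold idea can be sketched as below: a crude matched filter (sliding correlation with a spike template) followed by threshold crossings with a refractory span. The template, threshold, refractory value, and data are hypothetical illustrations only, and the Deconfusion decorrelation step is not reproduced:

```python
# Toy filter-and-threshold spike detection: correlate the signal with a
# spike template (a crude matched filter) and report threshold crossings.
# Template, threshold, and data are hypothetical illustrations.

def matched_filter(signal, template):
    """Sliding dot product of the signal with the template."""
    n, m = len(signal), len(template)
    return [sum(signal[i + j] * template[j] for j in range(m))
            for i in range(n - m + 1)]

def detect(filtered, threshold, refractory):
    """Indices of threshold crossings, at most one per refractory span."""
    events, last = [], -refractory
    for i, x in enumerate(filtered):
        if x > threshold and i - last >= refractory:
            events.append(i)
            last = i
    return events

template = [0.0, 1.0, -0.5]            # crude biphasic spike shape
signal = [0.0] * 50
for t in (10, 30):                     # embed two "spikes"
    for j, v in enumerate(template):
        signal[t + j] += v

events = detect(matched_filter(signal, template), threshold=1.0, refractory=3)
```

On this noiseless toy signal the detector recovers both embedded spike times; a real pipeline of this kind adds noise whitening and multi-channel decorrelation before thresholding.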
We evaluate our method on simulated and experimental data, including simultaneous intra-/extracellular recordings made in slices of rat cortex and recordings from the prefrontal cortex of awake behaving macaques. We compare the results to existing spike detection as well as spike sorting methods. We conclude that our algorithm meets all of the mentioned requirements and outperforms other methods under realistic signal-to-noise ratios and in the presence of overlapping spikes. Abstract Avian nucleus isthmi pars parvocellularis (Ipc) neurons are reciprocally connected with the layer 10 (L10) neurons in the optic tectum and respond with oscillatory bursts to visual stimulation. Our in vitro experiments show that both neuron types respond with regular spiking to somatic current injection and that the feedforward and feedback synaptic connections are excitatory, but of different strength and time course. To elucidate mechanisms of oscillatory bursting in this network of regularly spiking neurons, we investigated an experimentally constrained model of coupled leaky integrate-and-fire neurons with spike-rate adaptation. The model reproduces the observed Ipc oscillatory bursting in response to simulated visual stimulation. A scan through the model parameter volume reveals that Ipc oscillatory burst generation can be caused by strong and brief feedforward synaptic conductance changes. The mechanism is sensitive to the parameter values of spike-rate adaptation. In conclusion, we show that a network of regular-spiking neurons with feedforward excitation and spike-rate adaptation can generate oscillatory bursting in response to a constant input. Abstract Electrical stimulation of the central nervous system creates both orthodromically propagating action potentials, by stimulation of local cells and passing axons, and antidromically propagating action potentials, by stimulation of presynaptic axons and terminals. 
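A single leaky integrate-and-fire unit with spike-rate adaptation, the building block of the Ipc network model described above, can be sketched as follows. All parameter values are hypothetical rather than the experimentally constrained ones; the sketch only shows the adaptation mechanism (interspike intervals lengthen under constant drive):

```python
# Leaky integrate-and-fire neuron with spike-rate adaptation driven by a
# constant input. The adaptation variable w grows at each spike and decays
# between spikes, so successive interspike intervals lengthen.
# All parameter values are hypothetical.

def simulate(i_ext=2.0, v_th=1.0, v_reset=0.0, tau_v=10.0, tau_w=100.0,
             dw=0.5, dt=0.1, t_max=500.0):
    v, w, spikes, t = 0.0, 0.0, [], 0.0
    while t < t_max:
        v += dt * (-v / tau_v + (i_ext - w))   # membrane with adapted drive
        w += dt * (-w / tau_w)                 # adaptation decays slowly
        if v >= v_th:
            v = v_reset
            w += dw                            # spike-triggered increment
            spikes.append(t)
        t += dt
    return spikes

spikes = simulate()
isis = [b - a for a, b in zip(spikes, spikes[1:])]
```

The first interspike intervals are short and later ones markedly longer, the signature of spike-frequency adaptation that the burst mechanism in the abstract is sensitive to.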
Our aim was to understand how antidromic action potentials navigate through complex arborizations, such as those of thalamic and basal ganglia afferents, which are sites of electrical activation during deep brain stimulation. We developed computational models to study the propagation of antidromic action potentials past the bifurcation in branched axons. In both unmyelinated and myelinated branched axons, when the diameters of each axon branch remained under a specific threshold (set by the antidromic geometric ratio), antidromic propagation occurred robustly; action potentials traveled both antidromically into the primary segment as well as “re-orthodromically” into the terminal secondary segment. Propagation occurred across a broad range of stimulation frequencies, axon segment geometries, and concentrations of extracellular potassium, but was strongly dependent on the geometry of the node of Ranvier at the axonal bifurcation. Thus, antidromic activation of axon terminals can, through axon collaterals, lead to widespread activation or inhibition of targets remote from the site of stimulation. These effects should be included when interpreting the results of functional imaging or evoked potential studies on the mechanisms of action of DBS. Abstract The response of an oscillator to perturbations is described by its phase-response curve (PRC), which is related to the type of bifurcation leading from rest to tonic spiking. In a recent experimental study, we have shown that the type of PRC in cortical pyramidal neurons can be switched by cholinergic neuromodulation from type II (biphasic) to type I (monophasic). We explored how intrinsic mechanisms affected by acetylcholine influence the PRC using three different types of neuronal models: a theta neuron, single-compartment neurons and a multi-compartment neuron. 
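A PRC can be measured numerically for the simplest of the model classes mentioned above, the theta neuron: perturb a tonically spiking cell with a brief pulse at different phases and record the advance of the next spike. The pulse protocol and parameter values below are generic illustrations, not the study's exact protocol; for this model the measured PRC comes out purely positive (type I):

```python
import math

# Numerically measured phase-response curve (PRC) of a theta neuron:
#   dtheta/dt = (1 - cos theta) + (1 + cos theta) * I
# A brief depolarizing pulse is delivered at different phases of the cycle
# and the advance of the next spike is recorded. Parameters are illustrative.

def time_to_spike(i_ext=1.0, t_pulse=None, i_pulse=2.0, dur=0.05, dt=1e-3):
    """Integrate from theta = -pi (just after a spike) until theta = pi."""
    theta, t = -math.pi, 0.0
    while theta < math.pi:
        i = i_ext
        if t_pulse is not None and t_pulse <= t < t_pulse + dur:
            i += i_pulse                      # brief depolarizing pulse
        theta += dt * ((1 - math.cos(theta)) + (1 + math.cos(theta)) * i)
        t += dt
    return t

t0 = time_to_spike()                          # unperturbed period
phases = [k / 10 for k in range(1, 9)]
prc = [(t0 - time_to_spike(t_pulse=p * t0)) / t0 for p in phases]
```

Every entry of `prc` is non-negative, with the largest advance near mid-cycle, which is the monophasic type I shape; a type II neuron would show a phase delay (negative lobe) for early-phase pulses.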
In all of these models a decrease in the amount of a spike-frequency adaptation current was a necessary and sufficient condition for the shape of the PRC to change from biphasic (type II) to purely positive (type I). Learning mechanism for column formation in the olfactory bulb. Frontiers in integrative neuroscience Sensory discrimination requires distributed arrays of processing units. In the olfactory bulb, the processing units for odor discrimination are believed to involve dendrodendritic synaptic interactions between mitral and granule cells. There is increasing anatomical evidence that these cells are organized in columns, and that the columns processing a given odor are arranged in widely distributed arrays. Experimental evidence is lacking on the underlying learning mechanisms for how these columns and arrays are formed. To gain insight into these mechanisms, we have used a simplified realistic circuit model to test the hypothesis that distributed connectivity can self-organize through an activity-dependent dendrodendritic synaptic mechanism. The results point to action potentials propagating in the mitral cell lateral dendrites as playing a critical role in this mechanism. The model predicts that columns emerge from the interaction between the local temporal dynamics of the action potentials and the synapses that they activate during dendritic propagation. The results suggest a novel and robust learning mechanism for the development of distributed processing units in a cortical structure. Multiple spike initiation zones in a neuron implicated in learning in the leech: a computational model Invertebrate Neuroscience Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. 
Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on the knowledge of the three-dimensional structure of the protein of interest. However it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. 
Programming language: Perl/BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI Blast package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. 
The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. 
However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell includes organelles and intracellular solutions, and it is separated from the outer environment of extracellular fluid surrounding the cell by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. The differences in ion charges and concentrations give rise to electrical and chemical potentials, respectively, driving the transport of materials across the membrane. Here we look at the core elements of mathematical modeling associated with the dynamic behaviors of single cells, as well as the bases of numerical simulations. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 
2008, 2009) is a software program on which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command-line interface, based on Python, a powerful scripting language, for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the information necessary and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. 
We describe a method and make available computer scripts which automate the process of citation entry. We use an Open Citation Project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. For, in order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain ( ${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. 
These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of the available data resources, the large number of heterogeneous query methods and front ends demands considerable bioinformatics skills. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. 
Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or not) existing empirical results and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib ( The Handbook of Brain Theory and Neural Networks , pp. 
741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu . Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and build on them for further research. Its use can help the field of computational neuroscience enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. 
PPI reflects GABA A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA A reversal potential (E GABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide a high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have been typically correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. 
In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic type (pyramidal cell dendrites and olivary axonal projection terminals), and labeling method (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of this study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions coming to the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductivities of their dendritic membrane. 
This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by their spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in Biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. 
Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multi-compartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multi-compartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multi-compartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS–US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive biologically plausible framework, without recourse to delay lines or other special-purpose circuits. 
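The temporal-difference core on which such models build can be illustrated with a tabular TD(0) sketch on a chain of states standing in for the interval between cue and reward. The task encoding and parameters below are entirely hypothetical, and no LSTM timing network is included; the sketch only shows how reward predictions propagate backward toward the cue:

```python
# Tabular TD(0) value learning on a chain of states: a cue at state 0,
# reward delivered on entering the final state. With training, value
# estimates (reward predictions) propagate backward from the reward
# toward the cue. Task encoding and parameters are hypothetical.

N_STATES, ALPHA, GAMMA = 5, 0.1, 0.9

def train(episodes=500):
    v = [0.0] * N_STATES
    for _ in range(episodes):
        for s in range(N_STATES - 1):          # deterministic chain s -> s+1
            reward = 1.0 if s + 1 == N_STATES - 1 else 0.0
            target = reward + GAMMA * v[s + 1]
            v[s] += ALPHA * (target - v[s])    # TD error drives learning
    return v

values = train()
```

After training, the values approach 1, 0.9, 0.81, 0.729 going backward from the reward; the TD error itself, large at the reward early in training and shifting to the cue later, is what the dopamine recordings in the abstract are compared against.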
Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning, and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. On mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborization possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses uniformly distributed along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and efficacy of somatopetal current transfer (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization in general and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron upon different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns.
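The conventional TD account of phasic dopamine that the LSTM model above builds on (and whose delay-line time representation it seeks to replace) can be sketched in a few lines: a tapped-delay-line (complete serial compound) state representation, TD(0) value learning, and the TD error as the "dopamine" signal. Early in training the error peaks at reward time; after training it shifts to the CS. All parameters here are illustrative assumptions.

```python
# Classic TD(0) model of phasic dopamine with a tapped-delay-line time
# representation (the representation criticized in the abstract above).
T = 20              # time steps per trial
cs_t, us_t = 5, 15  # CS onset and reward (US) times
alpha, gamma = 0.1, 1.0
w = [0.0] * T       # one weight per post-CS time step

def value(t):
    # complete-serial-compound state: "time since CS onset"
    return w[t - cs_t] if cs_t <= t < T else 0.0

def run_trial():
    """One conditioning trial; returns the TD errors (dopamine signal)."""
    deltas = []
    for t in range(T - 1):
        r = 1.0 if t == us_t else 0.0
        delta = r + gamma * value(t + 1) - value(t)
        if t >= cs_t:
            w[t - cs_t] += alpha * delta
        deltas.append(delta)
    return deltas

first = run_trial()
for _ in range(500):
    last = run_trial()
# first trial: error peaks at the US; after training: at CS onset
```

The transition of the prediction-error peak from US to CS across training is the signature result this family of models reproduces.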
The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals decreasing with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of patterns of output discharges of different complexities, including stochastic ones. A relatively short length of the apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in the formation of the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis.
Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model's functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta-modulated and space-modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using nonhomogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC's firing rate, and this modulates the PC's firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent from rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of any involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed-rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell's firing properties.
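The model ingredients named in the phase-precession abstract (an integrate-and-fire neuron with a conductance synapse, driven by a theta-modulated nonhomogeneous Poisson input) can be sketched directly. This is a generic single-cell sketch under assumed parameters, not a reconstruction of the published PC/IC circuit.

```python
# Leaky integrate-and-fire neuron with an excitatory conductance synapse,
# driven by a nonhomogeneous (theta-modulated) Poisson spike train.
# All parameters are illustrative assumptions. Units: mV, ms.
import math, random

def simulate(t_stop=1000.0, dt=0.1, seed=1):
    random.seed(seed)
    v, v_rest, v_thresh, v_reset = -65.0, -65.0, -50.0, -65.0
    tau_m, tau_syn = 20.0, 5.0      # membrane / synaptic time constants (ms)
    e_syn, w_syn = 0.0, 0.4         # excitatory reversal (mV), weight (dimensionless g)
    g_syn = 0.0
    theta_f = 8.0                   # theta frequency (Hz)
    spikes = []
    for step in range(int(t_stop / dt)):
        t = step * dt
        # nonhomogeneous Poisson input: rate modulated by theta phase
        rate = 200.0 * (1.0 + math.cos(2 * math.pi * theta_f * t / 1000.0))  # Hz
        if random.random() < rate * dt / 1000.0:
            g_syn += w_syn
        g_syn -= dt * g_syn / tau_syn
        v += dt * (-(v - v_rest) - g_syn * (v - e_syn)) / tau_m
        if v >= v_thresh:
            spikes.append(t)
            v = v_reset
    return spikes

spikes = simulate()
```

Because the input rate rises and falls with the theta cycle, the output spikes arrive in theta-locked bursts, which is the kind of rhythmic gating the model above exploits.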
The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, a database of neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in computational neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org. In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed data of morphologies when generating those cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators, such as the NEURON simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new and available data can be included into NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today.
I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation, with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from genomics, transcriptomics, proteomics, and metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data requires both standard data formats and public repositories. Standardization and a public storage system are also important for modeling, owing to the possibility of sharing models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks like numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed, and proposals for the application of such tools are given.
Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques.
We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. However, the range of current strength needed for a depolarization block could easily be reached with the random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block.
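The core idea of the inverse CSD method described above, i.e. specify a forward model mapping grid sources to electrode potentials, then invert it, can be shown in one dimension for brevity (the paper itself works on 2D grids). The geometry, conductivity, and "source radius" regularizing the self-term are illustrative assumptions, not the published forward model.

```python
# 1D sketch of forward-model-based inverse CSD: build the matrix F with
# phi = F @ csd, then recover csd from phi by solving the linear system.
import math

def gauss_solve(a, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    a = [row[:] for row in a]
    b = b[:]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            for c in range(col, n):
                a[r][c] -= f * a[col][c]
            b[r] -= f * b[col]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(a[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (b[r] - s) / a[r][r]
    return x

n, h, sigma = 6, 0.1, 0.3                 # grid size, spacing (mm), S/m
positions = [i * h for i in range(n)]

def forward_matrix():
    # point-source forward model: potential falls off as 1/distance,
    # with an assumed finite source radius regularizing the self-term
    r0 = h / 4.0
    return [[1.0 / (4 * math.pi * sigma *
                    max(abs(positions[i] - positions[j]), r0))
             for j in range(n)] for i in range(n)]

true_csd = [0.0, 1.0, -1.0, 0.0, 0.5, -0.5]   # balanced test sources
F = forward_matrix()
phi = [sum(F[i][j] * true_csd[j] for j in range(n)) for i in range(n)]
est_csd = gauss_solve(F, phi)                  # inverse CSD estimate
```

Because the same forward matrix generated and inverted the potentials, the recovery here is exact; the paper's contribution lies in making that forward model explicit and system-specific for real 2D recordings.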
We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, impeding conventional analysis methods from keeping pace. Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed, and analyzed. This process is relatively simple and easy to learn because it takes place entirely in MATLAB, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable. PANDORA is free and open-source ( http://software.incf.org/software/pandora/home ). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then we examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop.
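The relational-database workflow described in the PANDORA abstract (extract compact features from raw spike trains, store only the features, then query them with SQL) can be sketched with Python's built-in sqlite3 module rather than the MATLAB toolbox itself. The cell names, spike times, and the single "firing rate" feature are synthetic stand-ins.

```python
# Feature-extraction-into-database sketch: store firing rates, not raw
# traces, then query with SQL. Data are synthetic; PANDORA itself is a
# MATLAB toolbox, so this only illustrates the approach, not its API.
import sqlite3

def firing_rate(spike_times_ms, duration_ms):
    return len(spike_times_ms) / (duration_ms / 1000.0)   # spikes/s

recordings = {                      # hypothetical raw data: cell -> spike times (ms)
    "cell_a": [12.0, 250.0, 480.0, 700.0, 910.0],
    "cell_b": [100.0, 900.0],
}
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE features (cell TEXT, rate_hz REAL)")
for cell, spikes in recordings.items():
    db.execute("INSERT INTO features VALUES (?, ?)",
               (cell, firing_rate(spikes, 1000.0)))
fast = db.execute(
    "SELECT cell FROM features WHERE rate_hz > 3 ORDER BY cell").fetchall()
```

The storage saving is exactly the point made above: one floating-point feature per cell replaces the full voltage trace, and the query layer (here SQL) operates on features alone.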
Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics. First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test the sufficiency of multiple, independent spike initiation zones in the C-interneuron to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. The influence of sodium and potassium dynamics on excitability, seizures, and the stability of persistent states: I. Single neuron dynamics Journal of Computational Neuroscience Summary This chapter constitutes mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was a part of the Annual Computational Neuroscience Meeting CNS*2007, which took place in July 2007 in Toronto, Canada (http://www.cnsorg.org). The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Selected topics discussed there, including a review of some current and potential applications of Computational Intelligence (CI) in electrophysiology; database and electrophysiological data exchange platforms, languages, and formats; as well as exemplary analysis problems, are presented in this chapter.
The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers, whose interest will be sparked by its contents. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. We are particularly interested in the slower timescale dynamics that determine overall excitability and set the stage for transient episodes of persistent oscillations, working memory, or seizures. In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley-type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics on the stability of persistent activity. We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions. Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in neonatal brain or spinal cord, up states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potential (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma.
Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from EAP data sampled by an extracellular probe at multiple independent recording locations. We used a dipole for the model source because there is extensive evidence that it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell–probe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole moment optimization where the dipole location is fixed, and a second, nonlinear, global optimization of the source location. We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis. We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among the various model error contributions to the residual, we address especially the error in probe geometry, and the extent to which it biases estimates of dipole parameters. This dipole characterization method can be applied to any recording technique capable of taking multiple independent measurements of the same single units.
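The linear half of the separable fit described above can be written out directly: with the dipole location fixed, the moment m that best explains the measured potentials phi is the Tikhonov-regularized least-squares solution m = (LᵀL + λI)⁻¹Lᵀphi, where L is the probe's lead-field matrix. The 5×3 lead field and the "true" moment below are synthetic stand-ins, not a real probe model.

```python
# Tikhonov-regularized dipole-moment fit at a fixed source location.
# L (lead field) and m_true are illustrative synthetic values.

def solve3(a, b):
    """Solve a 3x3 linear system by Gaussian elimination (no pivoting;
    acceptable here because the regularized normal matrix is SPD)."""
    a = [row[:] for row in a]
    b = b[:]
    for col in range(3):
        for r in range(col + 1, 3):
            f = a[r][col] / a[col][col]
            for c in range(col, 3):
                a[r][c] -= f * a[col][c]
            b[r] -= f * b[col]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        s = sum(a[r][c] * x[c] for c in range(r + 1, 3))
        x[r] = (b[r] - s) / a[r][r]
    return x

def fit_moment(L, phi, lam=1e-6):
    # normal equations with Tikhonov term: (L^T L + lam I) m = L^T phi
    n = len(L)
    ata = [[sum(L[k][i] * L[k][j] for k in range(n)) + (lam if i == j else 0.0)
            for j in range(3)] for i in range(3)]
    atb = [sum(L[k][i] * phi[k] for k in range(3 - 3 + n)) for i in range(3)]
    return solve3(ata, atb)

L = [[1.0, 0.2, 0.1],      # synthetic lead field: 5 sites x 3 moment axes
     [0.3, 1.1, 0.2],
     [0.2, 0.4, 0.9],
     [0.5, 0.1, 0.3],
     [0.1, 0.6, 0.2]]
m_true = [2.0, -1.0, 0.5]
phi = [sum(L[k][i] * m_true[i] for i in range(3)) for k in range(5)]
m_est = fit_moment(L, phi)
```

The nonlinear half of the method (searching over candidate source locations, each with its own lead field) simply repeats this linear fit per grid point and keeps the location with the smallest regularized residual.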
Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics. We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodium–potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. These models yield several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy. NEURON and Python. Frontiers in neuroinformatics The NEURON simulation program now allows Python to be used, alone or in combination with NEURON's traditional Hoc interpreter. Adding Python to NEURON has the immediate benefit of making available a very extensive suite of analysis tools written for engineering and science. It also catalyzes NEURON software development by offering users a modern programming tool that is recognized for its flexibility and power to create and maintain complex programs. At the same time, nothing is lost, because all existing models written in Hoc, including graphical user interface tools, continue to work without change and are also available within the Python context.
An example of the benefits of Python availability is the use of the xml module in implementing NEURON's Import3D and CellBuild tools to read MorphML and NeuroML model specifications. Diabetic Hyperglycemia Aggravates Seizures and Status Epilepticus-induced Hippocampal Damage Neurotoxicity Research Abstract This paper introduces dyadic brain modeling – the simultaneous, computational modeling of the brains of two interacting agents – to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the ‘dyad’). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists, and modelers that will help speed the emergence of what may be called comparative neuroprimatology. Abstract The phase response curve (PRC) reflects the dynamics of the interplay between diverse intrinsic conductances that lead to spike generation. PRCs measure the spike-time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddle-node) dynamics, while type II (subcritical Hopf) dynamics yield a biphasic PRC with both negative and positive lobes.
Previous computational work hypothesized that cholinergic modulation of the M-type potassium current can switch a neuron with type II dynamics to type I dynamics. We recorded from layer 2/3 pyramidal neurons in cortical slices and found that cholinergic action, consistent with downregulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential specific K-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to downregulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent I_AHP is downregulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified at low ACh levels, while the PRC became type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may shift neuronal responses between a “resonator” and an “integrator” mode. Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double-precision values by each subtree at each time step. Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For compute-bound simulations, load balance results in almost ideal runtime scaling. Application of the cell-splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing.
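The PRC measurement protocol the abstracts above rely on can be sketched with a leaky integrate-and-fire neuron under constant drive as a stand-in for the recorded pyramidal cells: find the unperturbed period, then deliver a small depolarizing kick at a chosen phase and record the resulting spike advance. For this type-I-like model the PRC comes out purely positive and grows toward late phases; all parameters are illustrative assumptions.

```python
# Measuring a phase response curve (PRC) on a leaky integrate-and-fire
# neuron with constant suprathreshold drive. Illustrative parameters.
import math

def spike_time(kick_phase=None, kick=0.5, dt=0.001):
    """Time of the next spike; optionally a small voltage kick is
    delivered at the given phase of the (analytic) unperturbed period."""
    tau, v_rest, v_th, v_reset, i_drive = 10.0, -65.0, -50.0, -65.0, 20.0
    # analytic unperturbed period of the LIF with constant input
    t_period = tau * math.log(i_drive / (i_drive - (v_th - v_rest)))
    v, t, kicked = v_reset, 0.0, False
    while v < v_th:
        if kick_phase is not None and not kicked and t >= kick_phase * t_period:
            v += kick            # depolarizing perturbation
            kicked = True
        v += dt * (-(v - v_rest) + i_drive) / tau
        t += dt
    return t

t0 = spike_time()                                  # unperturbed period
prc = [(t0 - spike_time(kick_phase=p)) / t0        # normalized phase advance
       for p in (0.1, 0.3, 0.5, 0.7, 0.9)]
```

A type II cell probed the same way would show phase delays (negative PRC values) for early-phase kicks, which is exactly the lobe structure the cholinergic switch above removes.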
Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts. It has recently been reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. Ca2+-activated K+ currents (IK(Ca)) of cultured human cardiac fibroblasts were characterized in this study. In whole-cell configuration, depolarizing pulses evoked IK(Ca) with outward rectification in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance Ca2+-activated K+ (BKCa) channel with a single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BKCa channels. The dynamic Luo-Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; however, they were electrically inert with no gap-junctional coupling. The simulation predicts that changes in gap-junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. IK(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes. This study demonstrates that a BKCa channel is functionally expressed in human cardiac fibroblasts. The activity of these BKCa channels present in human cardiac fibroblasts may contribute to the functional activities of heart cells through transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of their solutions and the transitions between them as parameters are varied.
To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multicompartment models. We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation—tonic spiking, intermediate bursting and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multicompartment conductance-based model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli. Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb.
While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement. Results We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. Conclusion Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement, and underlies the concentration-independence of odor quality representations. Abstract Mathematical neuronal models are normally expressed using differential equations. The Parker-Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models.
Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker-Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin-Huxley type neuron, comparing the results with those obtained using the Runge-Kutta and Bulirsch-Stoer methods. Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC-granule cell (GC) circuit with and without gap junctions between GoCs. Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change the oscillation frequency or the mean firing rate of either GoCs or GCs. Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs.
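The Parker-Sochacki scheme discussed in the integration abstract above can be illustrated on a scalar polynomial ODE. This sketch (not the authors' implementation) builds the Maclaurin coefficients of the local solution of y' = y^2 term by term via the Cauchy product, so the series order, and hence the local error, can be chosen freely at each step without changing the step size.

```python
def ps_step(a0, h, order):
    """One Parker-Sochacki step for the polynomial ODE y' = y**2:
    generate Maclaurin coefficients a[n] from the recurrence
    a[n+1] = (1/(n+1)) * sum_k a[k]*a[n-k] (the Cauchy product giving
    the series of y^2), then evaluate the truncated series at t = h."""
    a = [a0]
    for n in range(order):
        conv = sum(a[k] * a[n - k] for k in range(n + 1))  # (y^2)_n
        a.append(conv / (n + 1))
    y = 0.0
    for c in reversed(a):      # Horner evaluation of the series at h
        y = y * h + c
    return y

# Integrate y' = y^2, y(0) = 1 up to t = 0.5; the exact solution is
# y(t) = 1/(1 - t), so the answer should be 2.
y, h = 1.0, 0.05
for _ in range(10):
    y = ps_step(y, h, order=12)
print(abs(y - 2.0) < 1e-9)
```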
The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours or days of supercomputing time. Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. In consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics and intrinsic noise. We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data. Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes from development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models.
Recent studies show that both the conductance and the kinetics of these receptors change in a voltage-dependent manner in some parts of the brain. Therefore, several models have been introduced to simulate their current. However, on the one hand, kinetic models—which are able to simulate these voltage-dependent phenomena—are computationally expensive for the modeling of large neural networks. On the other hand, classic exponential models, which are computationally less expensive, are not able to accurately simulate the voltage-dependence of these receptors. In this study, we have modified these classic models to endow them with voltage-dependent conductance and time constants. Temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of the NMDA receptor’s behavior using only three to four differential equations, significantly fewer than in previous kinetic models. Consequently, our model is both fast and physiologically plausible, and is therefore a suitable candidate for the modeling of large neural networks. Abstract Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons. This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal cell firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that there is an obvious rhythmic gating effect of the γ-oscillating interneuron networks on the pyramidal neuron’s signal transmission.
This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of coherence and oscillation frequency of network activities. Abstract Background Propagation of simulated action potentials (APs) was previously studied in short single chains and in two-dimensional sheets of myocardial cells [1-3]. The present study was undertaken to examine propagation in a long single chain of cells of various lengths, and with varying numbers of gap-junction (gj) channels, and to compare propagation velocity with cable properties such as the length constant (λ). Methods and Results Simulations were carried out using the PSpice program as previously described. When the electric field (EF) mechanism was dominant (0, 1, and 10 gj channels), the longer the chain length, the faster the overall velocity (θov). There seems to be no simple explanation for this phenomenon. In contrast, when the local-circuit current mechanism was dominant (100 gj channels or more), θov was slightly slowed with lengthening of the chain. Increasing the number of gj channels produced an increase in θov and caused the firing order to become more uniform. The end effect was more pronounced at longer chain lengths and at greater numbers of gj channels. When there were no or only a few gj channels (namely 0, 10, or 30), the voltage change (ΔVm) in the two cells (#50 & #52) contiguous to the cell injected with current (#51) was nearly zero, i.e., there was a sharp discontinuity in voltage between the adjacent cells.
When there were many gj channels (e.g., 300, 1000, 3000), there was an exponential decay of voltage on either side of the injected cell, with the length constant (λ) increasing at higher numbers of gj channels. The effect of increasing the number of gj channels on increasing λ was relatively small compared to the larger effect on θov. θov became very non-physiological at 300 gj channels or higher. Conclusion Thus, when there were only 0, 1, or 10 gj channels, θov increased with increase in chain length, whereas at 100 gj channels or higher, θov did not increase with chain length. When there were only 0, 10, or 30 gj channels, there was a very sharp decrease in ΔVm in the two contiguous cells on either side of the injected cell, whereas at 300, 1000, or 3000 gj channels, the voltage decay was exponential along the length of the chain. The effect of increasing the number of gj channels on the spread of current was relatively small compared to the large effect on θov. Abstract This article provides a demonstration of an analytical technique that can be used to investigate the causes of perceptual phenomena. The technique is based on the concept of the ideal observer, an optimal signal classifier that makes decisions that maximize the probability of a correct response. To demonstrate the technique, an analysis was conducted to investigate the role of the auditory periphery in the production of temporal masking effects. The ideal observer classified output from four models of the periphery. Since the ideal observer is the best of all possible observers, if it demonstrates masking effects, then all other observers must as well. If it does not demonstrate masking effects, then nothing about the periphery requires masking to occur, and therefore masking would occur somewhere else. The ideal observer exhibited several forward masking effects but did not exhibit backward masking, implying that the periphery has a causal role in forward but not backward masking.
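The ideal-observer decision rule underlying the masking analysis above can be sketched for the simplest case: two equal-variance Gaussian observation distributions, where maximum-likelihood classification reduces to a nearest-mean rule and the attainable percent correct is Phi(d'/2). All numbers here are illustrative and unrelated to the auditory models in the paper.

```python
import math
import random

def ideal_observer_pc(mu0, mu1, sigma, n_trials=20000, seed=0):
    """Monte-Carlo percent correct of the ideal observer for two
    equal-variance Gaussian observation distributions: classify each
    sample by maximum likelihood, which here is the nearest mean.
    The analytic optimum is Phi(d'/2) with d' = |mu1 - mu0| / sigma."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(n_trials):
        cls = rng.randrange(2)                       # true class, equal priors
        x = rng.gauss((mu0, mu1)[cls], sigma)        # noisy observation
        decision = 0 if abs(x - mu0) < abs(x - mu1) else 1
        correct += (decision == cls)
    return correct / n_trials

pc = ideal_observer_pc(mu0=0.0, mu1=1.0, sigma=1.0)
analytic = 0.5 * (1 + math.erf(0.5 / 2 ** 0.5))      # Phi(d'/2), d' = 1
print(abs(pc - analytic) < 0.02)
```

No observer working from the same observations can exceed this percent correct, which is what licenses the "if the ideal observer shows masking, all observers must" argument above.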
A general discussion of the strengths of the technique and supplementary equations are also included. Abstract Understanding the human brain and its function in health and disease represents one of the greatest scientific challenges of our time. In the post-genomic era, an overwhelming accumulation of new data, at all levels of exploration from DNA to human brain imaging, has been acquired. This accumulation of facts has not given rise to a corresponding increase in the understanding of integrated functions in this vast area of research involving a large number of fields extending from genetics to psychology. Neuroinformatics is uniquely placed at the intersection between neuroscience and information technology, and emerges as an area of critical importance to facilitate the future conceptual development in neuroscience by creating databases that transcend different organizational levels and allow for the development of different computational models from the subcellular to the global brain level. Abstract This paper studied synaptic and dendritic integration with different spatial distributions of synapses on the dendrites of a biophysically detailed layer 5 pyramidal neuron model. It has been observed that temporally synchronous and spatially clustered synaptic inputs make dendrites perform a highly nonlinear integration. The effect of the clustering degree of the synaptic distribution on neuronal responsiveness is investigated by changing the number of top apical dendrites where active synapses are allocated. The neuron shows maximum responsiveness to synaptic inputs which have an intermediate clustering degree of spatial distribution, indicating complex interactions among dendrites with the existence of nonlinear synaptic and dendritic integration.
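The clustered-versus-distributed contrast described above can be caricatured with a standard two-layer "sigmoidal subunit" view of dendritic integration (toy parameters, not the paper's detailed layer 5 model): each branch applies a saturating nonlinearity to its summed input, so co-located synapses can cross a branch threshold that the same synapses spread across branches cannot.

```python
import math

def dendritic_response(inputs_per_branch, gain=1.0, theta=4.0):
    """Two-layer sketch of nonlinear dendritic integration: each branch
    applies a sigmoidal nonlinearity to its summed synaptic input, and
    the soma sums the branch outputs.  Gain and threshold are toy values."""
    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-gain * (x - theta)))
    return sum(sigmoid(sum(branch)) for branch in inputs_per_branch)

n_syn, n_branch = 8, 8
clustered = [[1.0] * n_syn] + [[] for _ in range(n_branch - 1)]
distributed = [[1.0] for _ in range(n_branch)]
# Eight synapses on one branch cross that branch's threshold; the same
# eight spread one per branch stay subthreshold on every branch.
print(dendritic_response(clustered) > dendritic_response(distributed))
```

This toy only shows why clustering matters at all; the paper's finding of a responsiveness peak at an intermediate clustering degree arises from interactions in the full biophysical model that this sketch does not capture.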
Abstract This paper describes a pilot query interface that has been constructed to help us explore a “concept-based” approach for searching the Neuroscience Information Framework (NIF). The query interface is concept-based in the sense that the search terms submitted through the interface are selected from a standardized vocabulary of terms (concepts) that are structured in the form of an ontology. The NIF contains three primary resources: the NIF Resource Registry, the NIF Document Archive, and the NIF Database Mediator. These NIF resources are very different in their nature and therefore pose challenges when designing a single interface from which searches can be automatically launched against all three resources simultaneously. The paper first discusses briefly several background issues involving the use of standardized biomedical vocabularies in biomedical information retrieval, and then presents a detailed example that illustrates how the pilot concept-based query interface operates. The paper concludes by discussing certain lessons learned in the development of the current version of the interface. Abstract Simulations of orientation selectivity in visual cortex have shown that layer 4 complex cells lacking orientation tuning are ideal for providing global inhibition that scales with contrast in order to produce simple cells with contrast-invariant orientation tuning (Lauritzen and Miller in J Neurosci 23:10201–10213, 2003). Inhibitory cortical cells have been shown to be electrically coupled by gap junctions (Fukuda and Kosaka in J Neurosci 120:5–20, 2003). Such coupling promotes, among other effects, spike synchronization and coordination of postsynaptic IPSPs (Beierlein et al. in Nat Neurosci 3:904–910, 2000; Galarreta and Hestrin in Nat Rev Neurosci 2:425–433, 2001).
Consequently, it was expected (Miller in Cereb Cortex 13:73–82, 2003) that electrical coupling would promote nonspecific functional responses consistent with the complex inhibitory cells seen in layer 4 which provide broad inhibition in response to stimuli of all orientations (Miller et al. in Curr Opin Neurobiol 11:488–497, 2001). This was tested using a mechanistic modeling approach. The orientation selectivity model of Lauritzen and Miller (J Neurosci 23:10201–10213, 2003) was reproduced with and without electrical coupling between complex inhibitory neurons. Although extensive coupling promotes uniform firing in complex cells, there were no detectable improvements in contrast-invariant orientation selectivity unless there were coincident changes in complex cell firing rates to offset the untuned excitatory component that grows with contrast. Thus, changes in firing rates alone (with or without coupling) could improve contrast-invariant orientation tuning of simple cells, whereas synchronization of complex inhibitory neurons alone could not. Abstract Coral polyps contract when electrically stimulated, and a wave of contraction travels from the site of stimulation at a constant speed. Models of coral nerve networks were optimized to match one of three different experimentally observed behaviors. To search for model parameters that reproduce the experimental observations, we applied genetic algorithms to increasingly more complex models of a coral nerve net. In a first stage of optimization, individual neurons responded with spikes to multiple, but not single, pulses of activation. In a second stage, we used these neurons as the starting point for the optimization of a two-dimensional nerve net. This strategy yielded a network with parameters that reproduced the experimentally observed spread of excitation. Abstract Spike-wave discharges are a distinctive feature of epileptic seizures. So far, they have not been reported in spatially extended neural field models.
We study a space-independent version of the Amari neural field model with two competing inhibitory populations. We show that this competition leads to robust spike-wave dynamics if the inhibitory populations operate on different timescales. The spike-wave oscillations present fold/homoclinic-type bursting. From this result we predict parameters of the extended Amari system where spike-wave oscillations produce a spatially homogeneous pattern. We propose this mechanism as a prototype of macroscopic epileptic spike-wave discharges. To our knowledge this is the first example of robust spike-wave patterns in a spatially extended neural field model. Abstract Cortical gamma frequency (30–80 Hz) oscillations have been suggested to underlie many aspects of cognitive function. In this paper we compare the f–I curves modulated by a gamma-frequency-modulated stimulus and Poisson synaptic input at the distal dendrites of a layer V pyramidal neuron model. The results show that gamma-frequency distal input amplifies the sensitivity of the neural response to basal input, and enhances gain modulation of the neuron. Abstract Inwardly rectifying potassium (KIR) currents in medium spiny (MS) neurons of the nucleus accumbens inactivate significantly in ~40% of the neurons but not in the rest, which may lead to differences in input processing by these two groups. Using a 189-compartment computational model of the MS neuron, we investigate the influence of this property using injected current as well as spatiotemporally distributed synaptic inputs. Our study demonstrates that KIR current inactivation facilitates depolarization, firing frequency and firing onset in these neurons. These effects may be attributed to the higher input resistance of the cell as well as a more depolarized resting/down-state potential induced by the inactivation of this current.
In view of the reports that dendritic intracellular calcium levels depend closely on burst strength and spike onset time, our findings suggest that inactivation of KIR currents may offer a means of modulating both excitability and synaptic plasticity in MS neurons. Abstract Epileptic seizures in diabetic hyperglycemia (DH) are not uncommon. This study aimed to determine the acute behavioral, pathological, and electrophysiological effects of status epilepticus (SE) on diabetic animals. Adult male Sprague-Dawley rats were first divided into groups with and without streptozotocin (STZ)-induced diabetes, and then into treatment groups given normal saline (NS) (STZ-only and NS-only) or a lithium-pilocarpine injection to induce status epilepticus (STZ + SE and NS + SE). Seizure susceptibility, severity, and mortality were evaluated. Serial Morris water maze test and hippocampal histopathology results were examined before and 24 h after SE. Tetanic stimulation-induced long-term potentiation (LTP) in a hippocampal slice was recorded in a multi-electrode dish system. We also used a simulation model to evaluate intracellular adenosine triphosphate (ATP) and neuroexcitability. The STZ + SE group had a significantly higher percentage of severe seizures and SE-related death and worse learning and memory performances than the other three groups 24 h after SE. The STZ + SE group, followed by the NS + SE group, showed the most severe neuronal loss and mossy fiber sprouting in the hippocampal CA3 area. In addition, LTP was markedly attenuated in the STZ + SE group, followed by the NS + SE group. In the simulation, increased intracellular ATP concentration promoted action potential firing. This finding that rats with DH had more brain damage after SE than rats without diabetes suggests the importance of intensively treating hyperglycemia and seizures in diabetic patients with epilepsy.
Generating oscillatory bursts from a network of regular spiking neurons without inhibition. Journal of Computational Neuroscience. Summary This chapter constitutes mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was a part of the Annual Computational Neuroscience Meeting CNS*2007 that took place in July 2007 in Toronto, Canada (http://www.cnsorg.org). The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Selected discussed topics, including a review of some current and potential applications of Computational Intelligence (CI) in electrophysiology, database and electrophysiological data exchange platforms, languages, and formats, as well as exemplary analysis problems, are presented in this chapter. The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers, whose interest will be sparked by its contents. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. We are particularly interested in the slower timescale dynamics that determine overall excitability, and set the stage for transient episodes of persistent oscillations, working memory, or seizures. In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics on the stability of persistent activity.
We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions. Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in the neonatal brain or spinal cord, up states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potential (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from EAP data sampled by an extracellular probe at multiple independent recording locations. We used a dipole for the model source because there is extensive evidence that it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell-probe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole moment optimization where the dipole location is fixed, and a second, nonlinear, global optimization of the source location. We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size.
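The linear half of the dipole characterization above, estimating the dipole moment for a fixed candidate location, has a closed-form Tikhonov-regularized solution. The sketch below uses a random matrix as a stand-in for the lead fields, which in the paper come from a finite element model of the probe and brain; all numbers are illustrative.

```python
import numpy as np

def tikhonov_moment(G, d, lam):
    """Regularized linear estimate of the dipole moment at a fixed source
    location: m = argmin ||G m - d||^2 + lam * ||m||^2, solved in closed
    form as (G^T G + lam*I)^-1 G^T d.  G is the (n_sensors x 3) lead-field
    matrix and d is the vector of measured EAP amplitudes."""
    A = G.T @ G + lam * np.eye(G.shape[1])
    return np.linalg.solve(A, G.T @ d)

rng = np.random.default_rng(1)
G = rng.standard_normal((12, 3))             # hypothetical lead fields
m_true = np.array([1.0, -0.5, 0.25])         # hypothetical dipole moment
d = G @ m_true + 0.01 * rng.standard_normal(12)   # measurements + noise
m_hat = tikhonov_moment(G, d, lam=1e-3)
print(np.allclose(m_hat, m_true, atol=0.05))
```

In the paper's scheme this linear solve would be repeated on a grid of candidate locations, with a nonlinear outer search selecting the location that jointly minimizes residual error and dipole size.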
The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis. We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among the various model error contributions to the residual, we address especially the error in probe geometry, and the extent to which it biases estimates of the dipole parameters. This dipole characterization method can be applied to any recording technique that has the capability of taking multiple independent measurements of the same single units. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics. We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodium-potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. These models reveal several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy.
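The competition between clearance processes described in this abstract can be caricatured by a one-variable relaxation model of extracellular potassium. The rate constants below are illustrative only, and the sketch deliberately omits the activity-dependent neuronal release that, in the full model, competes with clearance to produce the slow large-amplitude oscillations.

```python
def ko_dynamics(ko0=8.0, t_end=50.0, dt=0.01):
    """Toy extracellular potassium dynamics combining the three clearance
    processes named in the abstract: a Na+/K+ pump, glial buffering, and
    diffusion to the bath.  All rate constants are illustrative, not taken
    from the paper; concentrations are in mM, time in arbitrary units."""
    k_bath, pump_rate, glia_rate, diff_rate = 4.0, 0.2, 0.1, 0.05
    ko, t, trace = ko0, 0.0, []
    while t < t_end:
        pump = pump_rate * (ko - k_bath)    # active reuptake by the pump
        glia = glia_rate * (ko - k_bath)    # glial buffering
        diff = diff_rate * (ko - k_bath)    # diffusion to the bath
        ko += dt * (-(pump + glia + diff))  # forward-Euler update
        t += dt
        trace.append(ko)
    return trace

trace = ko_dynamics()
# Elevated [K+]o relaxes back toward the 4 mM bath level; in the full
# model, neuronal release opposing this clearance produces oscillations.
print(trace[0] > trace[-1] and abs(trace[-1] - 4.0) < 0.1)
```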
Abstract This paper introduces dyadic brain modeling – the simultaneous, computational modeling of the brains of two interacting agents – to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization , a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the ‘dyad’). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists and modelers that will help speed the emergence of what may be called comparative neuroprimatology . Abstract The phase response curve (PRC) reflects the dynamics of the interplay between diverse intrinsic conductances that lead to spike generation. PRCs measure the spike time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddlenode) dynamics while type II (subcritical Hopf dynamics) yield a biphasic PRC with both negative and positive lobes. Previous computational work hypothesized that cholinergic modulation of Mtype potassium current can switch a neuron with type II dynamics to type I dynamics. 
We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with downregulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential specific K-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to downregulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent I_AHP is downregulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified at low ACh levels, while the PRC became type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may shift neuronal responses between “resonator” and “integrator” modes. Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double-precision values by each subtree at each time step. Splitting cells is useful for attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For compute-bound simulations, load balance results in almost ideal runtime scaling. Application of the cell-splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing. Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts. 
It has recently been reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. Ca2+-activated K+ currents (I_K(Ca)) of cultured human cardiac fibroblasts were characterized in this study. In the whole-cell configuration, depolarizing pulses evoked outwardly rectifying I_K(Ca) in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance Ca2+-activated K+ (BK_Ca) channel with a single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BK_Ca channels. The dynamic Luo-Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; however, they were electrically inert in the absence of gap-junctional coupling. The simulation predicts that changes in gap-junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. I_K(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes. This study demonstrates that a BK_Ca channel is functionally expressed in human cardiac fibroblasts. The activity of these BK_Ca channels may contribute to the functional activities of heart cells through the transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of their solutions and the transitions between them as parameters are varied. To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multicompartment models. 
We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation—tonic spiking, intermediate bursting and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multicompartment conductance-based model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli. Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement. 
Results We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. Conclusion Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement, and underlies the concentration-independence of odor quality representations. Abstract Mathematical neuronal models are normally expressed using differential equations. The Parker-Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker-Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin-Huxley type neuron, comparing the results with those obtained using the Runge-Kutta and Bulirsch-Stoer methods. 
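The Parker-Sochacki idea described above, adapting the series order at each step rather than the step size, is easiest to see on a scalar polynomial ODE. A toy sketch for dy/dt = -y**2 (not one of the neuronal models in the text); the recursion for the Maclaurin coefficients follows from matching powers of t in the series of y' and -y**2:

```python
def ps_step(y0, h, tol=1e-12, max_order=30):
    """One Parker-Sochacki step for dy/dt = -y**2.

    Writing y(t) = sum(a[k] * t**k) gives the coefficient recursion
    a[k+1] = -sum(a[j] * a[k-j], j=0..k) / (k+1). Terms are appended
    until the next contribution |a[k+1] * h**(k+1)| falls below tol,
    so the order adapts locally while the step size h stays fixed.
    """
    a = [y0]
    for k in range(max_order):
        conv = sum(a[j] * a[k - j] for j in range(k + 1))  # coeff of y**2
        a.append(-conv / (k + 1))
        if abs(a[-1] * h ** (k + 1)) < tol:
            break
    y = 0.0
    for c in reversed(a):   # evaluate the truncated series at t = h (Horner)
        y = y * h + c
    return y

y, t, h = 1.0, 0.0, 0.1
for _ in range(10):
    y = ps_step(y, h)
    t += h
exact = 1.0 / (1.0 + t)     # closed-form solution for y(0) = 1
```

The same pattern extends to systems such as the Izhikevich model, where each right-hand side is polynomial in the state variables; the division and power operations mentioned in the abstract widen the class of usable right-hand sides.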
Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC-granule cell (GC) circuit with and without gap junctions between GoCs. Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change the oscillation frequency or the mean firing rate of either GoCs or GCs. Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs. The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours or days of supercomputing time. 
Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. As a consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics, and intrinsic noise. We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated for by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data. Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes due to development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models. Recent studies show that both the conductance and kinetics of these receptors change voltage-dependently in some parts of the brain. Therefore, several models have been introduced to simulate their current. However, on the one hand, kinetic models—which are able to simulate these voltage-dependent phenomena—are computationally expensive for the modeling of large neural networks. On the other hand, classic exponential models, which are computationally less expensive, are not able to simulate the voltage dependency of these receptors accurately. 
In this study, we have modified these classic models to endow them with voltage-dependent conductance and time constants. Temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of NMDA receptor behavior using only three to four differential equations, significantly fewer than in previous kinetic models. Consequently, our model is both fast and physiologically plausible, and is therefore a suitable candidate for the modeling of large neural networks. Abstract Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons. This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal cell firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that there is a clear rhythmic gating effect of the γ-oscillating interneuron networks on the pyramidal neuron’s signal transmission. This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and appropriate inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of the coherence and oscillation frequency of network activities. 
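A common way to endow a classic exponential-style synapse model with the voltage-dependent NMDA conductance discussed above is to scale it by a sigmoidal Mg2+ unblock factor of the Jahr-Stevens form. This is a generic sketch with illustrative constants, not the modified model proposed in the text:

```python
import math

def nmda_current(v, g_max=1.0, s=1.0, e_rev=0.0, mg=1.0):
    """NMDA-type synaptic current with voltage-dependent Mg2+ block.

    b(v) is the sigmoidal unblock factor (Jahr-Stevens form); g_max,
    the gating variable s, and the constants 3.57 and 0.062 are
    illustrative values, not parameters of the model in the text.
    """
    b = 1.0 / (1.0 + (mg / 3.57) * math.exp(-0.062 * v))
    return g_max * s * b * (v - e_rev)

# the block makes the effective conductance itself voltage dependent:
unblock_rest = 1.0 / (1.0 + (1.0 / 3.57) * math.exp(-0.062 * -65.0))
unblock_dep = 1.0 / (1.0 + (1.0 / 3.57) * math.exp(-0.062 * -20.0))
```

Near rest the channel is mostly blocked, while depolarization relieves the block; capturing this multiplicative factor, rather than the full state diagram of a kinetic scheme, is what keeps such models cheap enough for large networks.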
Abstract Background Propagation of simulated action potentials (APs) was previously studied in short single chains and in two-dimensional sheets of myocardial cells [1-3]. The present study was undertaken to examine propagation in a long single chain of cells of various lengths, with varying numbers of gap-junction (gj) channels, and to compare propagation velocity with cable properties such as the length constant (λ). Methods and Results Simulations were carried out using the PSpice program as previously described. When the electric field (EF) mechanism was dominant (0, 1, and 10 gj channels), the longer the chain length, the faster the overall velocity (θ_ov). There seems to be no simple explanation for this phenomenon. In contrast, when the local-circuit current mechanism was dominant (100 gj channels or more), θ_ov was slightly slowed by lengthening of the chain. Increasing the number of gj channels increased θ_ov and caused the firing order to become more uniform. The end effect was more pronounced at longer chain lengths and at greater numbers of gj channels. When there were no or only a few gj channels (namely 0, 10, or 30), the voltage change (ΔV_m) in the two cells (#50 and #52) contiguous to the cell injected with current (#51) was nearly zero, i.e., there was a sharp discontinuity in voltage between the adjacent cells. When there were many gj channels (e.g., 300, 1000, 3000), there was an exponential decay of voltage on either side of the injected cell, with the length constant (λ) increasing at higher numbers of gj channels. The effect of increasing the number of gj channels on increasing λ was relatively small compared to the larger effect on θ_ov. θ_ov became very non-physiological at 300 gj channels or higher. Conclusion Thus, when there were only 0, 1, or 10 gj channels, θ_ov increased with chain length, whereas at 100 gj channels or higher, θ_ov did not increase with chain length. 
When there were only 0, 10, or 30 gj channels, there was a very sharp decrease in ΔV_m in the two cells contiguous to the injected cell, whereas at 300, 1000, or 3000 gj channels, the voltage decay was exponential along the length of the chain. The effect of increasing the number of gj channels on the spread of current was relatively small compared to the large effect on θ_ov. Abstract This article provides a demonstration of an analytical technique that can be used to investigate the causes of perceptual phenomena. The technique is based on the concept of the ideal observer, an optimal signal classifier that makes decisions that maximize the probability of a correct response. To demonstrate the technique, an analysis was conducted to investigate the role of the auditory periphery in the production of temporal masking effects. The ideal observer classified output from four models of the periphery. Since the ideal observer is the best of all possible observers, if it demonstrates masking effects, then all other observers must as well. If it does not demonstrate masking effects, then nothing about the periphery requires masking to occur, and therefore masking must arise somewhere else. The ideal observer exhibited several forward masking effects but did not exhibit backward masking, implying that the periphery has a causal role in forward but not backward masking. A general discussion of the strengths of the technique and supplementary equations are also included. Abstract Understanding the human brain and its function in health and disease represents one of the greatest scientific challenges of our time. In the post-genomic era, an overwhelming accumulation of new data, at all levels of exploration from DNA to human brain imaging, has been acquired. 
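The exponential voltage decay described for the many-gj-channel regime, V(x) = V(0)·exp(-x/λ), makes the length constant λ recoverable from a simulated voltage profile by log-linear regression. A sketch with hypothetical numbers (distances in cell lengths), unrelated to the PSpice simulations themselves:

```python
import math

def length_constant(distances, voltages):
    """Estimate the cable length constant from V(x) = V0 * exp(-x/L)
    by ordinary least squares of ln(V) against distance x; the slope
    of the regression line is -1/L."""
    xs = list(distances)
    ys = [math.log(v) for v in voltages]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -1.0 / slope

# synthetic decay with a length constant of 2.5 cell lengths
xs = [0, 1, 2, 3, 4, 5]
vs = [10.0 * math.exp(-x / 2.5) for x in xs]
lam = length_constant(xs, vs)
```

In the few-channel regime described above this fit would fail, since the voltage drops discontinuously at the cell borders rather than decaying exponentially.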
This accumulation of facts has not given rise to a corresponding increase in the understanding of integrated functions in this vast area of research, which involves a large number of fields extending from genetics to psychology. Neuroinformatics (NI) is uniquely placed at the intersection between neuroscience and information technology, and emerges as an area of critical importance for facilitating future conceptual development in neuroscience by creating databases that transcend different organizational levels and allow for the development of computational models from the subcellular to the global brain level. Abstract This paper studies synaptic and dendritic integration under different spatial distributions of synapses on the dendrites of a biophysically detailed layer 5 pyramidal neuron model. It has been observed that temporally synchronous and spatially clustered synaptic inputs make dendrites perform a highly nonlinear integration. The effect of the clustering degree of the synaptic distribution on neuronal responsiveness is investigated by changing the number of top apical dendrites to which active synapses are allocated. The neuron shows maximum responsiveness to synaptic inputs with an intermediate clustering degree of spatial distribution, indicating complex interactions among dendrites in the presence of nonlinear synaptic and dendritic integration. Abstract This paper describes a pilot query interface that has been constructed to help us explore a “concept-based” approach for searching the Neuroscience Information Framework (NIF). The query interface is concept-based in the sense that the search terms submitted through the interface are selected from a standardized vocabulary of terms (concepts) that are structured in the form of an ontology. The NIF contains three primary resources: the NIF Resource Registry, the NIF Document Archive, and the NIF Database Mediator. 
These NIF resources are very different in nature and therefore pose challenges when designing a single interface from which searches can be automatically launched against all three resources simultaneously. The paper first discusses briefly several background issues involving the use of standardized biomedical vocabularies in biomedical information retrieval, and then presents a detailed example that illustrates how the pilot concept-based query interface operates. The paper concludes by discussing certain lessons learned in the development of the current version of the interface. Abstract Simulations of orientation selectivity in visual cortex have shown that layer 4 complex cells lacking orientation tuning are ideal for providing global inhibition that scales with contrast in order to produce simple cells with contrast-invariant orientation tuning (Lauritzen and Miller in J Neurosci 23:10201–10213, 2003). Inhibitory cortical cells have been shown to be electrically coupled by gap junctions (Fukuda and Kosaka in J Neurosci 120:5–20, 2003). Such coupling promotes, among other effects, spike synchronization and coordination of postsynaptic IPSPs (Beierlein et al. in Nat Neurosci 3:904–910, 2000; Galarreta and Hestrin in Nat Rev Neurosci 2:425–433, 2001). Consequently, it was expected (Miller in Cereb Cortex 13:73–82, 2003) that electrical coupling would promote nonspecific functional responses consistent with the complex inhibitory cells seen in layer 4, which provide broad inhibition in response to stimuli of all orientations (Miller et al. in Curr Opin Neurobiol 11:488–497, 2001). This was tested using a mechanistic modeling approach. The orientation selectivity model of Lauritzen and Miller (J Neurosci 23:10201–10213, 2003) was reproduced with and without electrical coupling between complex inhibitory neurons. 
Although extensive coupling promotes uniform firing in complex cells, there were no detectable improvements in contrast-invariant orientation selectivity unless there were coincident changes in complex cell firing rates to offset the untuned excitatory component that grows with contrast. Thus, changes in firing rates alone (with or without coupling) could improve contrast-invariant orientation tuning of simple cells, but synchronization of complex inhibitory neurons alone could not. Abstract Coral polyps contract when electrically stimulated, and a wave of contraction travels from the site of stimulation at a constant speed. Models of coral nerve networks were optimized to match one of three different experimentally observed behaviors. To search for model parameters that reproduce the experimental observations, we applied genetic algorithms to increasingly complex models of a coral nerve net. In a first stage of optimization, individual neurons responded with spikes to multiple, but not single, pulses of activation. In a second stage, we used these neurons as the starting point for the optimization of a two-dimensional nerve net. This strategy yielded a network with parameters that reproduced the experimentally observed spread of excitation. Abstract Spike-wave discharges are a distinctive feature of epileptic seizures. So far, they have not been reported in spatially extended neural field models. We study a space-independent version of the Amari neural field model with two competing inhibitory populations. We show that this competition leads to robust spike-wave dynamics if the inhibitory populations operate on different timescales. The spike-wave oscillations present fold/homoclinic-type bursting. From this result we predict parameters of the extended Amari system where spike-wave oscillations produce a spatially homogeneous pattern. We propose this mechanism as a prototype of macroscopic epileptic spike-wave discharges. 
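The staged genetic-algorithm search described for the coral nerve net can be illustrated with a minimal real-valued GA: truncation selection plus Gaussian mutation with a decaying step size. The toy fitness function below stands in for the match-to-behavior error; none of the names or settings mirror the study's actual optimizer:

```python
import random

def evolve(fitness, n_params, pop_size=40, n_gen=60, sigma=0.3, seed=1):
    """Minimal genetic algorithm: keep the best quarter of the population
    each generation and refill it with Gaussian-mutated copies, with the
    mutation scale shrinking over generations (an illustrative sketch)."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(n_params)]
           for _ in range(pop_size)]
    for g in range(n_gen):
        pop.sort(key=fitness)                 # lower fitness value = better
        parents = pop[: pop_size // 4]        # truncation selection (elitist)
        s = sigma * 0.95 ** g                 # decaying mutation scale
        pop = parents + [
            [p + rng.gauss(0, s) for p in rng.choice(parents)]
            for _ in range(pop_size - len(parents))
        ]
    return min(pop, key=fitness)

# stand-in for a match-to-behavior error with a known optimum at (0.7, -0.3)
target = (0.7, -0.3)
err = lambda p: (p[0] - target[0]) ** 2 + (p[1] - target[1]) ** 2
best = evolve(err, 2)
```

The two-stage strategy in the abstract corresponds to running such a search first on single-neuron parameters and then reusing the winners as the starting population for the network-level search.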
To our knowledge, this is the first example of robust spike-wave patterns in a spatially extended neural field model. Abstract Cortical gamma frequency (30–80 Hz) oscillations have been suggested to underlie many aspects of cognitive function. In this paper we compare the f-I curves produced by gamma-frequency-modulated stimuli and Poisson synaptic input at the distal dendrites of a layer V pyramidal neuron model. The results show that gamma-frequency distal input amplifies the sensitivity of the neural response to basal input, and enhances the gain modulation of the neuron. Abstract Inward-rectifying potassium (KIR) currents in medium spiny (MS) neurons of the nucleus accumbens inactivate significantly in ~40% of the neurons but not in the rest, which may lead to differences in input processing between these two groups. Using a 189-compartment computational model of the MS neuron, we investigate the influence of this property using injected current as well as spatiotemporally distributed synaptic inputs. Our study demonstrates that KIR current inactivation facilitates depolarization, firing frequency, and firing onset in these neurons. These effects may be attributed to the higher input resistance of the cell as well as the more depolarized resting/downstate potential induced by the inactivation of this current. In view of reports that dendritic intracellular calcium levels depend closely on burst strength and spike onset time, our findings suggest that inactivation of KIR currents may offer a means of modulating both excitability and synaptic plasticity in MS neurons. Abstract Epileptic seizures in diabetic hyperglycemia (DH) are not uncommon. This study aimed to determine the acute behavioral, pathological, and electrophysiological effects of status epilepticus (SE) on diabetic animals. 
Adult male Sprague-Dawley rats were first divided into groups with and without streptozotocin (STZ)-induced diabetes, and then into treatment groups given either normal saline (NS) (STZ-only and NS-only) or a lithium-pilocarpine injection to induce status epilepticus (STZ + SE and NS + SE). Seizure susceptibility, severity, and mortality were evaluated. Serial Morris water maze tests and hippocampal histopathology were examined before and 24 h after SE. Tetanic stimulation-induced long-term potentiation (LTP) in hippocampal slices was recorded in a multielectrode dish system. We also used a simulation model to evaluate intracellular adenosine triphosphate (ATP) and neuroexcitability. The STZ + SE group had a significantly higher percentage of severe seizures and SE-related death, and worse learning and memory performance, than the other three groups 24 h after SE. The STZ + SE group, followed by the NS + SE group, showed the most severe neuronal loss and mossy fiber sprouting in the hippocampal CA3 area. In addition, LTP was markedly attenuated in the STZ + SE group, followed by the NS + SE group. In the simulation, increased intracellular ATP concentration promoted action potential firing. The finding that rats with DH had more brain damage after SE than rats without diabetes suggests the importance of intensively treating hyperglycemia and seizures in diabetic patients with epilepsy. Neuroinformatics is a multifaceted field, as broad as the field of neuroscience itself. The various domains of NI share common features such as databases, data mining systems, and data modeling tools. NI projects are often coordinated by user groups or research organizations. Large-scale infrastructure supporting NI development is also a vital aspect of the field. Abstract Channelrhodopsin-2 (ChR2) is a class of light-sensitive proteins that offer the ability to use light stimulation to regulate neural activity with millisecond precision. 
To address the limitations in the efficacy of wild-type ChR2 (ChRwt) in achieving this objective, new variants of ChR2 that exhibit fast monoexponential photocurrent decay characteristics have recently been developed and validated. In this paper, we investigate whether the framework of a four-state transition rate model, primarily developed to mimic the biexponential photocurrent decay kinetics of ChRwt, as opposed to the lower-complexity three-state model, is warranted to mimic the monoexponential photocurrent decay kinetics of the newly developed fast ChR2 variants: ChETA (Gunaydin et al., Nature Neurosci. 13:387–392, 2010) and ChRET/TC (Berndt et al., Proc. Natl. Acad. Sci. 108:7595–7600, 2011). We begin by estimating the parameters of the three-state and four-state models from experimental data on the photocurrent kinetics of ChRwt, ChETA, and ChRET/TC. We then incorporate these models into a fast-spiking interneuron model (Wang and Buzsaki, J. Neurosci. 16:6402–6413, 1996) and a hippocampal pyramidal cell model (Golomb et al., J. Neurophysiol. 96:1912–1926, 2006) and investigate the extent to which the experimentally observed neural response to various optostimulation protocols can be captured by these models. We demonstrate that, for all ChR2 variants investigated, the four-state model implementation is better able to capture neural responses consistent with experiments across a wide range of optostimulation protocols. We conclude by analytically investigating the conditions under which the characteristic specific to the three-state model, namely the monoexponential photocurrent decay of the newly developed ChR2 variants, can occur in the framework of the four-state model. Abstract In cerebellar Purkinje cells, the β4-subunit of voltage-dependent Na+ channels has been proposed to serve as an open-channel blocker that gives rise to a “resurgent” Na+ current (I_NaR) upon membrane repolarization. 
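A three-state transition-rate scheme of the kind contrasted with the four-state model above can be written as closed to open to desensitized and back to closed, with light driving only the closed-to-open transition; with the light off, the open fraction (and hence the photocurrent) decays monoexponentially at the open-to-desensitized rate. The rates below are illustrative, not the fitted values from the paper:

```python
def chr2_three_state(t_on, t_total, dt=1e-5, eps=5.0, g_d=100.0, g_r=0.2):
    """Forward-Euler integration of a generic three-state ChR2 scheme:
    closed (c) -> open (o) at rate eps while the light is on,
    open -> desensitized (d) at rate g_d, desensitized -> closed at g_r.
    With the light off, do/dt = -g_d * o, so the open fraction decays
    monoexponentially, the behavior the text attributes to the fast
    ChR2 variants. All rate constants here are illustrative."""
    c, o, d = 1.0, 0.0, 0.0
    trace = []
    for i in range(int(t_total / dt)):
        light = eps if i * dt < t_on else 0.0
        dc = g_r * d - light * c
        do = light * c - g_d * o
        dd = g_d * o - g_r * d
        c, o, d = c + dt * dc, o + dt * do, d + dt * dd
        trace.append(o)       # photocurrent is proportional to o
    return trace

trace = chr2_three_state(t_on=0.05, t_total=0.1)
o_off = trace[int(0.05 / 1e-5)]          # open fraction around light-off
o_later = trace[int(0.06 / 1e-5) - 1]    # roughly 10 ms (one 1/g_d) later
```

Reproducing the biexponential decay of ChRwt requires the extra open (or desensitized) state of the four-state model, which is the modeling question the abstract poses.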
Notably, the β4-subunit was recently identified as a novel substrate of the β-secretase BACE1, a key enzyme of the amyloidogenic pathway in Alzheimer's disease. Here, we asked whether BACE1-mediated cleavage of the β4-subunit has an impact on I_NaR and, consequently, on the firing properties of Purkinje cells. In cerebellar tissue of BACE1−/− mice, mRNA levels of Na+ channel α-subunits 1.1, 1.2, and 1.6 and of β-subunits 1–4 remained unchanged, but processing of the β4 peptide was profoundly altered. Patch-clamp recordings from acutely isolated Purkinje cells of BACE1−/− and WT mice did not reveal any differences in the steady-state properties and current densities of transient, persistent, and resurgent Na+ currents. However, I_NaR was found to decay significantly faster in BACE1-deficient Purkinje cells than in WT cells. In modeling studies, the altered time course of I_NaR decay could be replicated when we decreased the efficiency of open-channel block. In current-clamp recordings, BACE1−/− Purkinje cells displayed a lower spontaneous firing rate than normal cells. Computer simulations supported the hypothesis that the accelerated decay kinetics of I_NaR are responsible for the slower firing rate. Our study elucidates a novel function of BACE1 in the regulation of neuronal excitability that serves to tune the firing pattern of Purkinje cells and presumably other neurons endowed with I_NaR. Abstract The role of cortical feedback in the thalamocortical processing loop has been extensively investigated over the last decades. With the exception of several cases, these studies focused on the cortical feedback exerted onto thalamocortical relay (TC) cells of the dorsal lateral geniculate nucleus (LGN). In a previous physiological study in the cat visual system, we showed that cessation of cortical input, despite decreasing the spontaneous activity of TC cells, increased the spontaneous firing of their recurrent inhibitory interneurons located in the perigeniculate nucleus (PGN). 
To identify the mechanisms underlying such functional changes, we conducted a modeling study in NEURON on several networks of point neurons with varied model parameters, such as membrane properties, synaptic weights, and axonal delays. We considered six network topologies of the retino-geniculo-cortical system. All models were robust against changes of axonal delays except for the delay between the LGN feedforward interneuron and the TC cell. The best representation of the physiological results was obtained with models containing reciprocally connected PGN cells driven by the cortex and with a relatively slow decay of intracellular calcium. This strongly indicates that the thalamic reticular nucleus plays an essential role in the cortical influence over thalamocortical relay cells, while the thalamic feedforward interneurons are not essential in this process. Further, we suggest that the dependence of PGN cell activity on the rate of calcium removal can be one of the key factors determining individual cell responses to the elimination of cortical input. Abstract The nucleus accumbens (NAc), a critical structure of the brain reward circuit, is implicated in normal goal-directed behaviour and learning as well as in pathological conditions like schizophrenia and addiction. Its major cellular substrates, the medium spiny (MS) neurons, possess a wide variety of dendritic active conductances that may modulate excitatory postsynaptic potentials (EPSPs) and cell excitability. We examine this issue using a biophysically detailed 189-compartment stylized model of the NAc MS neuron, incorporating all the known active conductances. We find that, of all the active channels, inward-rectifying K+ (KIR) channels play the primary role in modulating the resting membrane potential (RMP) and EPSPs in the downstate of the neuron. Reduction in the conductance of KIR channels evokes facilitatory effects on EPSPs, accompanied by rises in local input resistance and membrane time constant. 
At depolarized membrane potentials closer to upstate levels, the slowly inactivating A-type potassium (K_As) conductance also plays a strong role in determining synaptic potential parameters and cell excitability. We discuss the implications of our results for the regulation of accumbal MS neuron biophysics and synaptic integration by intrinsic factors and by extrinsic agents such as dopamine. Abstract The computer-assisted three-dimensional reconstruction of neuronal morphology is becoming an increasingly popular technique for quantifying the arborization patterns of dendrites and axons. The resulting digital files are suitable for comprehensive morphometric analyses as well as for building anatomically realistic compartmental models of membrane biophysics and neuronal electrophysiology. The digital tracings acquired in a lab for a specific purpose can often be reused by a different research group to address a completely unrelated scientific question, if the original investigators are willing to share the data. Since reconstructing neuronal morphology is a labor-intensive process, data sharing and reanalysis is particularly advantageous for the neuroscience and biomedical communities. Here we present numerous cases of “success stories” in which digital reconstructions of neuronal morphology were shared and reused, leading to additional, independent discoveries and publications, and thus amplifying the impact of the “source” study for which the data set was first collected. In particular, we overview four main applications of this kind of data: comparative morphometric analyses, statistical estimation of potential synaptic connectivity, morphologically accurate electrophysiological simulations, and computational models of neuronal shape and development. Abstract This chapter describes a novel computational approach to modeling cortex dynamics that integrates gene–protein regulatory networks with a neural network model.
The interaction of genes and proteins in neurons affects the dynamics of the whole neural network. We have adopted an exploratory approach of investigating many randomly generated gene regulatory matrices, out of which we kept those that generated interesting dynamics. This naïve brute-force approach served to explore the potential application of computational neurogenetic models in relation to gene-knockout neurogenetics experiments. The knockout of a hypothetical gene for fast inhibition in our artificial genome led to interesting neural activity. In spite of the fact that the artificial gene/protein network was altered by the single gene knockout, the dynamics of the spiking neural network (SNN), in terms of spiking activity, was most of the time very similar to the result obtained with the complete gene/protein network. However, from time to time the neurons spontaneously and temporarily synchronized their spiking into coherent global oscillations. In our model, the fluctuations in the values of neuronal parameters lead to the spontaneous development of seizure-like global synchronizations. These very same fluctuations also lead to termination of the seizure-like neural activity and maintenance of the interictal normal periods of activity. Based on our model, we would like to suggest the hypothesis that parameter changes due to gene–protein dynamics should also be included as a serious factor determining transitions in neural dynamics, especially when the cause of a disease is known to be genetic. Abstract The local field potential (LFP) is among the most important experimental measures when probing neural population activity, but a proper understanding of the link between the underlying neural activity and the LFP signal is still missing. Here we investigate this link by mathematical modeling of contributions to the LFP from a single layer-5 pyramidal neuron and a single layer-4 stellate neuron receiving synaptic input.
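The forward-modeling scheme underlying such LFP studies can be sketched with the standard volume-conductor formulas: the potential of a transmembrane point current in an infinite homogeneous medium, and the dipole approximation that becomes valid far from the neuron. Values below (conductivity, current, distances) are illustrative only.

```python
import math

SIGMA = 0.3  # extracellular conductivity, S/m (typical literature value)

def phi_point(i_amp, r):
    """Potential (V) of a point current source: phi = I / (4*pi*sigma*r)."""
    return i_amp / (4.0 * math.pi * SIGMA * r)

def phi_dipole(p, r, cos_theta):
    """Dipole approximation: phi = p*cos(theta) / (4*pi*sigma*r^2)."""
    return p * cos_theta / (4.0 * math.pi * SIGMA * r ** 2)

# A synaptic input current and its return current form a current dipole:
# a sink at the synapse and a source 0.5 mm away along the dendrite.
i_syn, d = 1e-9, 0.5e-3           # 1 nA, 0.5 mm separation
p = i_syn * d                     # dipole moment, A*m

# Far away and on-axis, the two-monopole sum approaches the dipole formula.
r = 20e-3                         # 20 mm from the dipole midpoint
two_monopole = phi_point(i_syn, r - d / 2) - phi_point(i_syn, r + d / 2)
dipole = phi_dipole(p, r, 1.0)
assert abs(two_monopole - dipole) / abs(dipole) < 0.01
```

Close to the cell neither approximation holds, which is why the full compartmental sum is needed near the synapse while the dipole form is useful for the distant EEG/ECoG predictions mentioned below.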
An intrinsic dendritic low-pass filtering effect on the LFP signal, previously demonstrated for extracellular signatures of action potentials, is seen to strongly affect the LFP power spectra, even for frequencies as low as 10 Hz for the example pyramidal neuron. Further, the LFP signal is found to depend sensitively on both the recording position and the position of the synaptic input: the LFP power spectra recorded close to the active synapse are typically found to be less low-pass filtered than spectra recorded further away. Some recording positions display striking band-pass characteristics of the LFP. The frequency dependence of the properties of the current dipole moment set up by the synaptic input current is found to qualitatively account for several salient features of the observed LFP. Two approximate schemes for calculating the LFP, the dipole approximation and the two-monopole approximation, are tested and found to be potentially useful for translating results from large-scale neural network models into predictions for results from electroencephalographic (EEG) or electrocorticographic (ECoG) recordings. Abstract Dopaminergic (DA) neurons of the mammalian midbrain exhibit unusually low firing frequencies in vitro. Furthermore, injection of depolarizing current induces depolarization block before high frequencies are achieved. The maximum steady and transient rates are about 10 and 20 Hz, respectively, despite the ability of these neurons to generate bursts at higher frequencies in vivo. We use a three-compartment model, calibrated to reproduce DA neuron responses to several pharmacological manipulations, to uncover mechanisms of frequency limitation. The model exhibits a slow oscillatory potential (SOP) dependent on the interplay between the L-type Ca2+ current and the small-conductance K+ (SK) current that is unmasked by fast Na+ current block. Contrary to previous theoretical work, the SOP does not pace the steady spiking frequency in our model.
The main currents that determine the spontaneous firing frequency are the subthreshold L-type Ca2+ and A-type K+ currents. The model identifies the channel densities of the fast Na+ and delayed rectifier K+ currents as the critical parameters limiting the maximal steady frequency evoked by a depolarizing pulse. We hypothesize that the low maximal steady frequencies result from a low safety factor for action potential generation. In the model, the rate of Ca2+ accumulation in the distal dendrites controls the transient initial frequency in response to a depolarizing pulse. Similar results are obtained when the same model parameters are used in a multicompartmental model with a realistic reconstructed morphology, indicating that the salient contributions of the dendritic architecture have been captured by the simpler model. Abstract Background As interest in adopting the Semantic Web in the biomedical domain continues to grow, Semantic Web technology has been evolving and maturing. A variety of technological approaches, including triple-store technologies, SPARQL endpoints, Linked Data, and the Vocabulary of Interlinked Datasets, have emerged in recent years. In addition to data warehouse construction, these technological approaches can be used to support dynamic query federation. As a community effort, the BioRDF task force, within the Semantic Web for Health Care and Life Sciences Interest Group, is exploring how these emerging approaches can be utilized to execute distributed queries across different neuroscience data sources. Methods and results We have created two health care and life science knowledge bases. We have explored a variety of Semantic Web approaches to describe, map, and dynamically query multiple datasets. We have demonstrated several federation approaches that integrate diverse types of information about neurons and receptors that play an important role in basic, clinical, and translational neuroscience research.
In particular, we have created a prototype receptor explorer which uses OWL mappings to provide an integrated list of receptors and executes individual queries against different SPARQL endpoints. We have also employed the AIDA Toolkit, which is directed at groups of knowledge workers who cooperatively search, annotate, interpret, and enrich large collections of heterogeneous documents from diverse locations. We have explored a tool called "FeDeRate", which enables a global SPARQL query to be decomposed into subqueries against remote databases offering either SPARQL or SQL query interfaces. Finally, we have explored how to use the Vocabulary of Interlinked Datasets (voiD) to create metadata for describing datasets exposed as Linked Data URIs or SPARQL endpoints. Conclusion We have demonstrated the use of a set of novel and state-of-the-art Semantic Web technologies in support of a neuroscience query federation scenario. We have identified both the strengths and weaknesses of these technologies. While the Semantic Web offers a global data model, including the use of Uniform Resource Identifiers (URIs), the proliferation of semantically equivalent URIs hinders large-scale data integration. Our work helps direct research and tool development, which will be of benefit to this community. Abstract Injury to neural tissue renders voltage-gated Na+ (Nav) channels leaky. Even mild axonal trauma initiates Na+ loading, leading to secondary Ca2+ loading and white matter degeneration. The nodal isoform is Nav1.6, and for Nav1.6-expressing HEK cells, traumatic whole-cell stretch causes an immediate tetrodotoxin-sensitive Na+ leak. In stretch-damaged oocyte patches, Nav1.6 current undergoes damage-intensity-dependent hyperpolarizing (left) shifts, but whether left-shift underlies injured-axon Nav leak is uncertain.
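The idea that a hyperpolarizing coupled shift moves the Nav "window conductance" toward resting potential can be sketched with Boltzmann activation/availability curves. The half-activation voltages, slopes, and shift magnitude below are illustrative placeholders, not values from the study.

```python
import math

def boltzmann(v, v_half, k):
    """Steady-state Boltzmann curve; k > 0 gives a rising curve, k < 0 falling."""
    return 1.0 / (1.0 + math.exp(-(v - v_half) / k))

def window(v, shift=0.0):
    """Window-conductance fraction: activation * availability, with both
    curves left-shifted together (coupled left-shift, CLS)."""
    m_inf = boltzmann(v, -38.0 + shift, 6.0)    # activation (rising)
    h_inf = boltzmann(v, -67.0 + shift, -7.0)   # availability (falling)
    return m_inf * h_inf

v_rest = -65.0  # mV, a typical resting potential
intact = window(v_rest, shift=0.0)
damaged = window(v_rest, shift=-10.0)  # 10 mV coupled left-shift

# With the window shifted toward rest, more Nav channels conduct at rest:
# a standing Na+ leak that loads the cell and drives the Na/K pump.
assert damaged > intact
```

This is the qualitative mechanism the abstract invokes: the overlap of the two curves (the window) sits below threshold in the intact node and creeps toward resting potential as CLS grows.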
Nav1.6 inactivation (availability) is kinetically limited by (coupled to) Nav activation, yielding a coupled left-shift (CLS) of the two processes: CLS should move the steady-state Nav1.6 “window conductance” closer to typical firing thresholds. Here we simulated excitability and ion homeostasis in free-running nodes of Ranvier to assess whether hallmark injured-axon behaviors—Na+ loading, ectopic excitation, propagation block—would occur with Nav-CLS. Intact/traumatized axolemma ratios were varied, and for some simulations Na/K pumps were included, with varied inside/outside volumes. We simulated saltatory propagation with one mid-axon node variously traumatized. While dissipating the [Na+] gradient and hyperactivating the Na/K pump, Nav-CLS generated neuropathic pain-like ectopic bursts. Depending on the CLS magnitude, the fraction of Nav channels affected, and the pump intensity, tonic firing, burst firing, or nodal inexcitability occurred, with [Na+] and [K+] fluctuating. Severe CLS-induced inexcitability did not preclude Na+ loading; in fact, the steady-state Na+ leaks elicited large pump currents. At a mid-axon node, mild CLS perturbed normal anterograde propagation, and severe CLS blocked saltatory propagation. These results suggest that in damaged excitable cells, Nav-CLS could initiate cellular deterioration with attendant hyper- or hypoexcitability. Healthy-cell versions of Nav-CLS, however, could contribute to physiological rhythmic firing. Abstract Lateral inhibition of cells surrounding an excited area is a key property of sensory systems, sharpening the preferential tuning of individual cells in the presence of closely related input signals. In the olfactory pathway, a dendrodendritic synaptic microcircuit between mitral and granule cells in the olfactory bulb has been proposed to mediate this type of interaction through granule cell inhibition of surrounding mitral cells.
However, it is becoming evident that odor inputs result in broad activation of the olfactory bulb, with interactions that go beyond neighboring cells. Using a realistic modeling approach, we show how backpropagating action potentials in the long lateral dendrites of mitral cells, together with granule cell actions on mitral cells within narrow columns forming glomerular units, can provide a mechanism to activate strong local inhibition between arbitrarily distant mitral cells. The simulations predict a new role for the dendrodendritic synapses in the multicolumnar organization of the granule cells. This new paradigm gives insight into the functional significance of the patterns of connectivity revealed by recent viral tracing studies. Together they suggest a functional wiring of the olfactory bulb that could greatly expand the computational roles of the mitral–granule cell network. Abstract Spinal motor neurons have voltage-gated ion channels localized in their dendrites that generate plateau potentials. The physical separation of the ion channels for spiking from the plateau-generating channels can result in nonlinear bistable firing patterns. The physical separation and geometry of the dendrites result in asymmetric coupling between dendrites and soma that has not been addressed in reduced models of nonlinear phenomena in motor neurons. We measured the voltage attenuation properties of six anatomically reconstructed and type-identified cat spinal motor neurons to characterize the asymmetric coupling between the dendrites and soma. We showed that the voltage attenuation at any distance from the soma was direction-dependent and could be described as a function of the input resistance at the soma. An analytical solution for the lumped cable parameters in a two-compartment model was derived based on this finding.
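The direction-dependent attenuation described here already appears in the simplest possible setting: a passive two-compartment (soma + lumped dendrite) model with unequal leak conductances. The conductance values below are illustrative; the study derives its lumped parameters from reconstructed morphologies.

```python
# Minimal sketch: steady-state voltage attenuation in a passive
# two-compartment model is direction-dependent when the compartments
# have unequal leak conductances. All conductances are illustrative (nS).

def attenuation(g_couple, g_leak_far):
    """V_far / V_near for steady current injected in the near compartment:
    the far compartment sits on the divider g_c / (g_c + g_leak_far)."""
    return g_couple / (g_couple + g_leak_far)

g_c = 10.0      # coupling conductance between soma and dendrite
g_soma = 20.0   # somatic leak (low input resistance at the soma)
g_dend = 4.0    # lumped dendritic leak

soma_to_dend = attenuation(g_c, g_dend)   # inject at soma, record dendrite
dend_to_soma = attenuation(g_c, g_soma)   # inject at dendrite, record soma

# Attenuation toward the leaky soma is stronger than toward the dendrite.
assert dend_to_soma < soma_to_dend
```

The same divider logic is why the measured attenuation can be written as a function of the somatic input resistance, as the abstract states.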
This is the first two-compartment modeling approach to derive lumped cable parameters directly from the geometrical and passive electrical properties of anatomically reconstructed neurons. Abstract Models for temporary information storage in neuronal populations are dominated by mechanisms directly dependent on synaptic plasticity. There are nevertheless other mechanisms available that are well suited for creating short-term memories. Here we present a model for working memory which relies on the modulation of the intrinsic excitability properties of neurons, instead of synaptic plasticity, to retain novel information for periods of seconds to minutes. We show that it is possible to use this mechanism effectively to store the serial order in a sequence of patterns of activity. For this we introduce a functional class of neurons, named gate interneurons, which can store information in their membrane dynamics and can literally act as gates routing the flow of activations in the principal neuron population. The presented model exhibits properties which are in close agreement with experimental results on working memory. Namely, the recall process plays an important role in stabilizing and prolonging the memory trace. This means that the stored information is correctly maintained as long as it is being used. Moreover, the working memory model is adequate for storing completely new information, in time windows compatible with the notion of “one-shot” learning (hundreds of milliseconds). Abstract For the analysis of neuronal cooperativity, simultaneously recorded extracellular signals from neighboring neurons need to be sorted reliably by a spike sorting method. Many algorithms have been developed to this end; however, to date, none of them manages to fulfill a set of demanding requirements. In particular, it is desirable to have an algorithm that operates online, detects and classifies overlapping spikes in real time, and adapts to nonstationary data.
Here, we present a combined spike detection and classification algorithm which explicitly addresses these issues. Our approach makes use of linear filters to find a new representation of the data and to optimally enhance the signal-to-noise ratio. We introduce a method called “Deconfusion” which decorrelates the filter outputs and provides source separation. Finally, a set of well-defined thresholds is applied, leading to simultaneous spike detection and spike classification. By incorporating direct feedback, the algorithm adapts to nonstationary data and is therefore well suited for acute recordings. We evaluate our method on simulated and experimental data, including simultaneous intra-/extracellular recordings made in slices of rat cortex and recordings from the prefrontal cortex of awake behaving macaques. We compare the results to existing spike detection as well as spike sorting methods. We conclude that our algorithm meets all of the mentioned requirements and outperforms other methods under realistic signal-to-noise ratios and in the presence of overlapping spikes. Abstract Avian nucleus isthmi pars parvocellularis (Ipc) neurons are reciprocally connected with the layer 10 (L10) neurons in the optic tectum and respond with oscillatory bursts to visual stimulation. Our in vitro experiments show that both neuron types respond with regular spiking to somatic current injection and that the feedforward and feedback synaptic connections are excitatory, but of different strength and time course. To elucidate the mechanisms of oscillatory bursting in this network of regularly spiking neurons, we investigated an experimentally constrained model of coupled leaky integrate-and-fire neurons with spike-rate adaptation. The model reproduces the observed Ipc oscillatory bursting in response to simulated visual stimulation.
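A generic leaky integrate-and-fire neuron with spike-rate adaptation, the building block named here, can be sketched in a few lines. Parameters below are hypothetical, not those fitted to the Ipc/L10 data: a constant input drives an initially fast spike train whose rate declines as the adaptation variable builds up.

```python
# Leaky integrate-and-fire neuron with spike-rate adaptation.
# Forward-Euler integration; all parameters are illustrative.

def simulate(i_ext, t_max=1.0, dt=1e-4):
    tau_m, tau_w = 0.02, 0.2                     # membrane / adaptation (s)
    v_rest, v_th, v_reset = -70.0, -50.0, -65.0  # mV
    b = 0.5                                      # adaptation jump per spike
    v, w, t, spikes = v_rest, 0.0, 0.0, []
    while t < t_max:
        v += (-(v - v_rest) - w + i_ext) / tau_m * dt
        w += -w / tau_w * dt
        if v >= v_th:
            v = v_reset
            w += b          # spike-triggered adaptation current (mV-like)
            spikes.append(t)
        t += dt
    return spikes

spikes = simulate(i_ext=30.0)
isi_first = spikes[1] - spikes[0]
isi_last = spikes[-1] - spikes[-2]
assert isi_first < isi_last   # the inter-spike interval lengthens: adaptation
```

Coupling two such units with excitatory feedforward/feedback synapses, as the abstract describes, is what turns regular spiking into network-level oscillatory bursting.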
A scan through the model parameter volume reveals that Ipc oscillatory burst generation can be caused by strong and brief feedforward synaptic conductance changes. The mechanism is sensitive to the parameter values of spike-rate adaptation. In conclusion, we show that a network of regular-spiking neurons with feedforward excitation and spike-rate adaptation can generate oscillatory bursting in response to a constant input. Alternative time representation in dopamine models Journal of Computational Neuroscience Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means of facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy.
The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a nonredundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. Programming language: Perl, BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI BLAST package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience.
We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena.
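The sensitivity of simulations to numerical details noted above is visible at the smallest possible scale: floating-point addition is not associative, so two simulators that accumulate the same synaptic currents in a different order can produce slightly different membrane potentials, and in a spiking system such differences can grow into diverging spike times. A generic illustration (not from the study):

```python
# Floating-point addition is not associative: summing identical numbers
# in a different order gives a different result.

values = [1e16, 1.0, -1e16, 1.0]

left_to_right = 0.0
for v in values:
    left_to_right += v           # (1e16 + 1.0) loses the 1.0 to rounding

reordered = 0.0
for v in sorted(values, key=abs):
    reordered += v               # small terms first: both 1.0s survive

assert left_to_right == 1.0      # one 1.0 was absorbed by rounding
assert reordered == 2.0
assert left_to_right != reordered
```

This is exactly why the abstract argues for defining "successful reproduction" in terms other than bitwise identity.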
As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages, or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function; they make up tissues and our bodies. A single cell includes organelles and intracellular solutions, and it is separated from the extracellular liquid surrounding it by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charge and concentration give rise, respectively, to electrical and chemical potentials, which drive the transport of materials across the membrane. Here we look at the core mathematical modeling of the dynamic behavior of single cells, as well as the basis of numerical simulations.
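The electrochemical potentials mentioned above are commonly quantified with the Nernst equation, E = (RT/zF)·ln([ion]_out/[ion]_in). A small sketch with typical mammalian textbook concentrations (illustrative values):

```python
import math

R = 8.314     # gas constant, J/(mol*K)
F = 96485.0   # Faraday constant, C/mol
T = 310.0     # body temperature, K

def nernst(z, conc_out, conc_in):
    """Equilibrium (Nernst) potential in mV for an ion of valence z."""
    return 1000.0 * (R * T) / (z * F) * math.log(conc_out / conc_in)

# Typical mammalian concentrations (mM) -- illustrative textbook values.
e_k = nernst(+1, conc_out=5.0, conc_in=140.0)    # potassium, ~ -89 mV
e_na = nernst(+1, conc_out=145.0, conc_in=12.0)  # sodium,    ~ +67 mV

assert e_k < -80.0   # K+ equilibrium potential is strongly negative
assert e_na > 60.0   # Na+ equilibrium potential is positive
```

The membrane potential sits between these equilibrium values, weighted by the open conductances, which is the starting point of the single-cell models the chapter goes on to describe.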
Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e., insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML for describing mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program with which users can simulate and/or create a model using graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command-line interface, based on Python, a powerful scripting language, for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e., a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results be reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information.
In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases have started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method, and make available computer scripts, which automate the process of citation entry. We use an Open Citation project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. In order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension, if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation.
The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain (${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. Biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of the available data resources, the high number of heterogeneous query methods and front ends demands considerable bioinformatic skill. Consequently, the manual inspection of database entries and citations is a time-consuming task for which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians.
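The time-domain (Balanced Truncation) reduction named above can be sketched on a toy "cable-like" linear system: a chain of leaky, coupled compartments whose state dimension is cut down while its input-output behavior is approximately preserved. This is a generic square-root balanced-truncation sketch (assuming NumPy and SciPy), not the paper's code, and the chain model and dimensions are illustrative.

```python
# Balanced-truncation reduction of a linear state-space model.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

n, k = 10, 3  # full and reduced state dimensions

# Chain of compartments: leak plus nearest-neighbour coupling (stable A).
A = -2.0 * np.eye(n) + np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
B = np.zeros((n, 1)); B[0, 0] = 1.0    # current injected at one end
C = np.zeros((1, n)); C[0, -1] = 1.0   # voltage read out at the other end

# Controllability/observability Gramians: A P + P A' + B B' = 0, etc.
P = solve_continuous_lyapunov(A, -B @ B.T)
Q = solve_continuous_lyapunov(A.T, -C.T @ C)

# Square-root balancing: SVD of the product of Gramian Cholesky factors.
Lp = cholesky(P, lower=True)
Lq = cholesky(Q, lower=True)
U, s, Vt = svd(Lq.T @ Lp)              # s = Hankel singular values
S = np.diag(s[:k] ** -0.5)
T = Lp @ Vt[:k].T @ S                  # reduction map  (n -> k)
Ti = S @ U[:, :k].T @ Lq.T             # projection map (k -> n)

Ar, Br, Cr = Ti @ A @ T, Ti @ B, C @ T

# The reduced model closely approximates the full model's DC gain.
dc_full = (C @ np.linalg.solve(-A, B)).item()
dc_red = (Cr @ np.linalg.solve(-Ar, Br)).item()
assert abs(dc_full - dc_red) < 0.05 * abs(dc_full)
```

The rapid decay of the Hankel singular values `s` is what makes the four-orders-of-magnitude reductions reported in the abstract possible for quasi-active dendritic trees.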
Abstract NeuroML is an XML-based language for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells, positioned and synaptically connected in 3D, can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments, and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or fail to explain) existing empirical results, and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as for experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data.
We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding, separately or together, neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp. 741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks.
SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu . Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to deepen their understanding of published models by allowing them to run the models as published and to build on them for further research. Its use can help the field of computational neuroscience enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA-A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA-A reversal potential (E-GABA) and by alterations in the intrinsic excitability of granule cells.
In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux).
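Reconstructions of this kind are commonly exchanged as tables of connected samples, for example in the widely used SWC convention (one sample per line: id, type, x, y, z, radius, parent id). A minimal sketch of deriving one morphometric quantity, total cable length, from such a table; the three-sample toy morphology is invented:

```python
import math

def total_length(swc_rows):
    """Sum of Euclidean distances from each sample to its parent sample.

    swc_rows: list of (id, type, x, y, z, radius, parent) tuples,
    with parent == -1 marking the root.
    """
    pos = {r[0]: (r[2], r[3], r[4]) for r in swc_rows}
    length = 0.0
    for r in swc_rows:
        if r[6] != -1:                      # the root has no parent segment
            length += math.dist((r[2], r[3], r[4]), pos[r[6]])
    return length

# toy three-sample dendrite: root -> (3,4,0) -> (3,4,12)
rows = [
    (1, 1, 0.0, 0.0, 0.0, 1.0, -1),
    (2, 3, 3.0, 4.0, 0.0, 0.5, 1),
    (3, 3, 3.0, 4.0, 12.0, 0.4, 2),
]
```

Other measures used in comparisons like the one described (surface area, branch counts, bifurcation angles) follow the same pattern of walking the child-to-parent links.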
We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic types (pyramidal cell dendrites and olivary axonal projection terminals), and labeling methods (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions arriving at the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductivities of their dendritic membrane. This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs.
In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models.
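A minimal sketch of the compartmental approach just described: a passive soma coupled to a single dendritic compartment through an axial conductance, integrated with forward Euler. All parameter values are arbitrary illustrative numbers, not taken from any published cell model:

```python
# passive two-compartment model: C dV/dt = -gL*(V - EL) + gc*(V_other - V) + I
C, gL, EL, gc = 1.0, 0.1, -65.0, 0.05   # capacitance, leak, rest potential, coupling
I_soma, dt, steps = 1.0, 0.1, 20000     # constant current into the soma only
v_soma = v_dend = EL
for _ in range(steps):
    dvs = (-gL * (v_soma - EL) + gc * (v_dend - v_soma) + I_soma) / C
    dvd = (-gL * (v_dend - EL) + gc * (v_soma - v_dend)) / C
    v_soma += dt * dvs
    v_dend += dt * dvd
# solving the 2x2 linear steady state by hand gives v_soma = -57.5, v_dend = -62.5
```

Injected current depolarizes both compartments, the dendrite less so because part of the axially transferred current leaks out along the way; real multicompartment models simply extend this pattern to hundreds or thousands of coupled compartments with voltage-dependent conductances.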
Finally, the compartmental modeling framework is also well suited for embedding models of molecular signaling pathways, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS-US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive, biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training.
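The TD update at the core of such models is compact. Below is a sketch of tabular TD(0) on a deterministic trace-conditioning-like trial, using exactly the kind of delay-line time representation (one state per time step after CS onset) that the abstract argues can be replaced by learned network dynamics; all parameters are illustrative:

```python
# tabular TD(0): V[s] += alpha * (r + gamma * V[s'] - V[s])
n_states, alpha, gamma = 10, 0.1, 1.0   # one state per time step after CS onset
V = [0.0] * n_states                    # learned value (reward prediction)
for episode in range(500):
    for s in range(n_states):
        r = 1.0 if s == n_states - 1 else 0.0           # US at the final step
        v_next = V[s + 1] if s + 1 < n_states else 0.0  # trial ends after the US
        V[s] += alpha * (r + gamma * v_next - V[s])
```

After training, the value prediction spans the whole CS-US interval and the prediction error at the expected reward time vanishes, the classic TD account of dopamine responses; the point of the LSTM model above is that the state index `s` need not be supplied by a dedicated delay line.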
Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. Parameters for a model of an oscillating neuronal network in the cochlear nucleus defined by genetic algorithms. Biological Cybernetics. Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means of facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. Comparing the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis.
Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. Programming language: Perl/BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI BLAST package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. In reality, however, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have encountered problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies.
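The sensitivity to numerical detail noted here is easy to reproduce: integrating the same chaotic system with two step sizes gives trajectories that start together and end up completely decorrelated. The Lorenz system is used below only as a convenient stand-in for a chaotic network model; nothing here is specific to any neural simulator:

```python
import math

def lorenz_euler(dt, steps):
    """Forward-Euler trajectory of the Lorenz system from a fixed initial state."""
    sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0   # standard chaotic parameters
    x, y, z = 1.0, 1.0, 1.0
    out = []
    for _ in range(steps):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        out.append((x, y, z))
    return out

dt = 0.005
coarse = lorenz_euler(dt, 8000)               # t = 0 .. 40
fine = lorenz_euler(dt / 2, 16000)[1::2]      # same time points, half the step size
dist = [math.dist(a, b) for a, b in zip(coarse, fine)]
```

Both runs use the same equations and the same initial state; the only difference is the integration step, yet after the chaotic divergence time the two "replications" disagree by the full diameter of the attractor, which is exactly why bitwise replicability across simulators is an unrealistic criterion.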
Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can link out directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to NCBI Entrez) to link out to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to duplication of effort and hinders model reuse.
Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell contains organelles and intracellular solutions, and it is separated from the surrounding extracellular fluid by its cell membrane (plasma membrane), maintaining differences in the concentrations of ions and molecules, including enzymes. These differences in ionic charge and concentration give rise to electrical and chemical potentials, respectively, driving the transport of materials across the membrane. Here we look at the core mathematical modeling associated with the dynamic behavior of single cells, as well as the basis of numerical simulations. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms.
This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e., insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is an XML-based language specification for describing mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program with which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command-line interface, based on Python, a powerful scripting language, for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is part of ISIDE, is a simulator for models written in ISML. Insilico DB comprises three databases: a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results be reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup needed to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases have started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results.
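A minimal form of this practice can be scripted directly: store the complete simulation setup (parameters, seed, solver settings) alongside a digest of the results, and verify the digest on re-run. The field names and the toy stochastic "model" below are invented for illustration, not a standard exchange format:

```python
import hashlib
import json
import random

def run_model(setup):
    """Toy stochastic simulation fully determined by the stored setup."""
    rng = random.Random(setup["seed"])          # seeded RNG makes the run repeatable
    x = setup["x0"]
    trace = []
    for _ in range(setup["steps"]):
        x += setup["dt"] * (-setup["k"] * x) + rng.gauss(0.0, setup["noise"])
        trace.append(round(x, 12))
    return trace

def digest(obj):
    """Canonical JSON digest of a setup or a result trace."""
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()

setup = {"x0": 1.0, "k": 0.5, "dt": 0.01, "steps": 1000, "noise": 0.001, "seed": 42}
first = digest(run_model(setup))
second = digest(run_model(setup))   # re-run from the stored setup
```

Matching digests across re-runs is the weakest useful notion of reproduction (same code, same machine); the standards discussed in the chapter aim at the stronger one, reproducing results from the published description alone.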
In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle together with state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method and make available computer scripts which automate the process of citation entry. We use an Open Citation project Perl module (PARSER) for parsing citation data, which is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. For, in order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (balanced truncation) and frequency-domain (${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance.
We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasiintegrate and fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and frontends requires high bioinformatic skills. Consequently, the manual inspection of database entries and citations is a timeconsuming task for which methods from computer science should be applied.Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques become increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in life sciences. Enriched by a high number of selected references to pursuing literature, the following sections will successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. 
We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or not) existing empirical results and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. 
Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib ( The Handbook of Brain Theory and Neural Networks , pp. 741–745, 2003); Arbib and Grethe ( Computing the Brain: A Guide to Neuroinformatics , 2001); Ascoli ( Computational Neuroanatomy: Principles and Methods , 2002); Bower and Bolouri ( Computational Modeling of Genetic and Biochemical Networks , 2001); Hines et al. ( J. Comput. Neurosci. 17 , 7–11, 2004); Shepherd et al. ( Trends Neurosci. 21 , 460–468, 1998); Sivakumaran et al. ( Bioinformatics 19 , 408–415, 2003); Smolen et al. ( Neuron 26 , 567–580, 2000); Vadigepalli et al. ( OMICS 7 , 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is simulator for neural networks and action potentials (SNNAP) [Ziv ( J. Neurophysiol. 71 , 294–308, 1994)]. SNNAP is a versatile and userfriendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu . 
Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them o run the models as published and build on them for further research. Its use can aid the field of computational neuroscience to enter a new era of expedited numerical experimentation. Abstract Pairedpulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA A receptormediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in GABA A reversal potential (E GABA ) and by alterations in intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in pairedpulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. 
Ideally, such interfaces should provide a high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal to elicit/detect the electrical excitation at the singlecell level and to observe the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have been typically correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semimanually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). Such form of the reconstruction files is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of the two preparations representing different brain areas (hippocampus and cerebellum), neuritic type (pyramidal cell dendrites and olivar axonal projection terminals), and labeling method (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. 
None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstruction systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions coming to the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductances of their dendritic membrane. This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high- and low-depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli.
These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in Biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives.
One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS–US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive, biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. Using mathematical models of pyramidal neurons located in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of dendritic morphology on the passive electrical transfer characteristics of these cells and also on the formation of spike discharge patterns at the output of the cell under conditions of tonic activation via excitatory synapses distributed uniformly along the dendrites.
For this purpose, we calculated morphometric characteristics of the size, complexity, and metric asymmetry of the reconstructed dendritic arborization as a whole, and of its apical and basal subtrees, together with the effectiveness of somatopetal current transfer (estimating the sensitivity of this efficacy to changes in the uniform membrane conductance). Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and with interburst intervals that decreased as the intensity of activation rose. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two patterns of activity, periodic doublets and continuous discharges) is considerably more limited than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of output discharge patterns of different complexities, including stochastic ones. The relatively short apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets.
As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites, and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promote the predominance of network interactions in shaping the activity of layer 2/3 pyramidal neurons and, thereby, a higher efficiency of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using nonhomogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of any involvement in spatial tasks.
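The core mechanism of such a model — an integrate-and-fire cell with conductance synapses driven by a theta-modulated inhomogeneous Poisson input — can be sketched as follows. This is a minimal illustration; all parameter values are invented for the example and are not those of the published model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not the published model's configuration)
dt = 1e-4                    # time step (s)
T = 1.0                      # simulated duration (s)
tau_m = 20e-3                # membrane time constant (s)
E_L, E_e = -65e-3, 0.0       # leak and excitatory reversal potentials (V)
v_th, v_reset = -50e-3, -65e-3
tau_s = 5e-3                 # synaptic conductance decay time (s)
g_jump = 0.2                 # conductance increment per input spike (relative to leak)

n = int(T / dt)
t = np.arange(n) * dt
# Theta-modulated (8 Hz) inhomogeneous Poisson input, thinned per time bin
rate = 400.0 * (1.0 + 0.8 * np.cos(2 * np.pi * 8.0 * t))   # spikes/s
spikes_in = rng.random(n) < rate * dt

v = np.full(n, E_L)
g = 0.0
out_spikes = []
for i in range(1, n):
    g *= np.exp(-dt / tau_s)             # conductance decays between spikes
    if spikes_in[i]:
        g += g_jump                      # and jumps on each presynaptic spike
    dv = (-(v[i - 1] - E_L) - g * (v[i - 1] - E_e)) * dt / tau_m
    v[i] = v[i - 1] + dv
    if v[i] >= v_th:                     # threshold crossing: spike and reset
        out_spikes.append(t[i])
        v[i] = v_reset
```

With these values the cell fires preferentially near the peaks of the theta-modulated drive, which is the ingredient the model above builds on to produce precession via the PC/IC interaction.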
Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in Computational Neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org. In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed morphological data when generating cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to incorporate known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators such as the NEURON simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any newly available data can be included in NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks.
Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionality to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation, with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows sharing not only of models but also of research tools. Other promising developments of the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems.
However, sharing such large amounts of frequently heterogeneous and distributed experimental data requires both standard data formats and public repositories. Standardization and a public storage system are also important for modeling, because they make it possible to share models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks such as the numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available.
However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility of including system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models.
However, the range of current strength needed for a depolarization block could easily be reached with the random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block. We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, impeding conventional analysis methods from keeping pace. Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed, and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable.
PANDORA is free to use and modify because it is open-source ( http://software.incf.org/software/pandora/home ). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then examine how well we can use L = 1, 2, … of the time series for the state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop. Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics. First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental computational model was developed to test the sufficiency of multiple, independent spike initiation zones in the C-interneuron to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground.
Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made in understanding cerebellar cortical processing, but neural coding at the output stage of the cerebellum, in the deep cerebellar nuclei (DCN), remains poorly understood. The DCN are unlikely to be just a relay nucleus, because Purkinje cell inhibition has to be turned into an excitatory output signal, and DCN neurons exhibit complex intrinsic properties. In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals. Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. Our simulations produced a good match with DCN current-clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow deinactivation of rebound currents.
Fast rebound bursts due to the T-type calcium current and slow rebounds due to the persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by the HCN current. Our findings suggest that the active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior. This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation of channel gating, mechanisms for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing, with the ultimate goal of creating a modular framework that will help researchers across different disciplines to collaborate effectively using a modern neural simulation platform.
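As a toy illustration of the multicompartmental approach that simulators of this kind build on, the sketch below integrates two passive compartments (a soma and a dendrite) coupled by an axial conductance, using forward Euler. All parameter values are invented for the example and do not come from any of the models described above:

```python
import numpy as np

# Two passive compartments coupled by an axial conductance (illustrative values).
dt = 0.05e-3              # time step (s)
C = 200e-12               # membrane capacitance per compartment (F)
g_L = 10e-9               # leak conductance per compartment (S)
E_L = -70e-3              # leak reversal potential (V)
g_axial = 20e-9           # coupling conductance between compartments (S)

n = 4000
v = np.full((n, 2), E_L)  # columns: [soma, dendrite]
I_inj = np.zeros((n, 2))
I_inj[1000:3000, 1] = 50e-12   # 100 ms, 50 pA current step into the dendrite

for i in range(1, n):
    vs, vd = v[i - 1]
    # leak + axial coupling + injected current, forward-Euler update
    dvs = (-g_L * (vs - E_L) + g_axial * (vd - vs) + I_inj[i, 0]) * dt / C
    dvd = (-g_L * (vd - E_L) + g_axial * (vs - vd) + I_inj[i, 1]) * dt / C
    v[i] = vs + dvs, vd + dvd
```

During the step the dendrite settles about 3 mV above rest and the soma about 2 mV, the attenuation reflecting the ratio of axial to leak conductance; a real simulator solves the same coupled equations for thousands of compartments with active channels added.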
Abstract No Abstract Available Abstract We have developed a simulation tool within the NEURON simulator to assist in the organization, verification, and analysis of simulations. This tool, named the Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single-cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB. Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems. This is similar to how single cells function as systems of protein networks, where, for example, the ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from how it behaves alone. The dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. Interactions are made through different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts to produce heartbeats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics representing neural information.
In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to the coordinated contractions of a number of skeletal muscles so that a desired motion of the limbs can be performed. This chapter illustrates several mathematical techniques through examples from the modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data. Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation. For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice those of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits as expected in mature animals.
Predictions were made for passive-cell current-clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting becomes unnecessary. Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron. Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials.
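The central modeling step above — treating the subthreshold membrane potential as presynaptic population activity convolved with a fixed leaky-integrator kernel — can be sketched like this. The population statistics here are invented for illustration, and the CuBIC inference step itself is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(1)

dt = 1e-3            # bin width (s)
n_pre = 1000         # presynaptic population size
T = 2.0              # duration (s)
n = int(T / dt)

# A common, slowly varying rate shared by all inputs induces correlations
# in the summed population spike count (values are illustrative).
common = 5.0 + np.cumsum(rng.standard_normal(n)) * np.sqrt(dt)
rate = np.clip(common, 0.1, None)              # per-neuron rate (Hz)
pop_count = rng.poisson(rate * dt * n_pre)     # summed spikes per bin

# Filter the population count with a fixed exponential (PSP-like) kernel,
# as for a leaky integrator: the result stands in for the subthreshold Vm.
tau = 10e-3                                    # kernel time constant (s)
t_k = np.arange(0, 5 * tau, dt)
kernel = np.exp(-t_k / tau)
v_fluct = np.convolve(pop_count, kernel)[:n] * dt
```

In the actual method this forward model is inverted: cumulants of the recorded fluctuations are used to bound the correlation order of the unobserved presynaptic population.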
Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet the precise input-output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites. We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature. To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns and the corresponding firing-frequency input-output function was logarithmic. The regular/irregular nature of the input synaptic intervals was only reflected in the regularity of output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input-output relationships.
Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input-output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats. In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database (http://www.ebi.ac.uk/biomodels/) is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. 
BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems, and to study the clustering of models based upon their annotations. Model deposition to the database today is advised by several publishers of scientific journals. The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge (https://sourceforge.net/projects/biomodels/) under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive? Schema theory is a computational framework that allows the simulation of perceptuomotor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We present first its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. 
We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performances of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world-knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performances of agrammatic aphasics measured using sentence-picture matching tasks. This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Abstract Background The "inverse" problem concerns the determination of unknown causes on the basis of the observation of their effects. This is the opposite of the corresponding "direct" problem, which relates to the prediction of the effects generated by a complete description of some agencies. The solution of an inverse problem entails the construction of a mathematical model and starts from a set of experimental data. In this respect, inverse problems are often ill-conditioned, as the experimental data available are often insufficient to unambiguously solve the mathematical model. Several approaches to solving inverse problems are possible, both computational and experimental, some of which are mentioned in this article. In this work, we describe in detail an attempt to solve an inverse problem which arose in the study of an intracellular signaling pathway. 
Results Using a genetic algorithm to find a suboptimal solution to the optimization problem, we have estimated a set of unknown parameters describing a kinetic model of a signaling pathway in the neuronal cell. The model is composed of mass-action ordinary differential equations, where the kinetic parameters describe protein-protein interactions, protein synthesis and degradation. The algorithm has been implemented on a parallel platform. Several potential solutions of the problem have been computed, each solution being a set of model parameters. A subset of parameters has been selected on the basis of their small coefficient of variation across the ensemble of solutions. Conclusion Despite the lack of sufficiently reliable and homogeneous experimental data, the genetic algorithm approach has allowed us to estimate the approximate values of a number of model parameters in a kinetic model of a signaling pathway; these parameters have been assessed to be relevant for the reproduction of the available experimental data. Abstract Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown. Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms. 
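The genetic-algorithm workflow of the inverse-problem study above (compute an ensemble of fitted solutions, then retain the parameters whose coefficient of variation across the ensemble is small) can be sketched as follows. The two-parameter mass-action pathway, the GA settings, and the 0.2 CV cutoff are all illustrative assumptions, not the model or settings of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(k, y0=1.0, dt=0.05, steps=100):
    """Euler integration of a toy mass-action pathway:
    dA/dt = -k1*A, dB/dt = k1*A - k2*B, with k = [k1, k2]."""
    a, b = y0, 0.0
    traj = []
    for _ in range(steps):
        a += dt * (-k[0] * a)
        b += dt * (k[0] * a - k[1] * b)
        traj.append(b)
    return np.array(traj)

true_k = np.array([0.8, 0.3])
data = simulate(true_k) + rng.normal(0, 0.002, 100)  # noisy "experimental" data

def fitness(k):
    return -np.sum((simulate(k) - data) ** 2)  # negative sum of squared errors

def ga_run(pop_size=40, gens=60):
    """One GA run: truncation selection plus Gaussian mutation."""
    pop = rng.uniform(0.01, 2.0, size=(pop_size, 2))
    for _ in range(gens):
        scores = np.array([fitness(k) for k in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]
        children = np.abs(parents + rng.normal(0, 0.05, parents.shape))
        pop = np.vstack([parents, children])
    scores = np.array([fitness(k) for k in pop])
    return pop[np.argmax(scores)]

# Ensemble of solutions from independent runs, as in the paper's strategy.
solutions = np.array([ga_run() for _ in range(5)])
cv = solutions.std(axis=0) / solutions.mean(axis=0)
reliable = cv < 0.2   # keep only parameters consistent across the ensemble
print(solutions.mean(axis=0), cv, reliable)
```

The CV filter is the key idea: parameters that the data genuinely constrain converge to similar values in every run, while poorly constrained ones scatter and are flagged as unreliable.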
First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axodendritic feedback on pyramidal cells is only effective in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase, the theta/gamma ratio. Moreover, electric fields differentially affect average theta frequency depending on specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. Abstract The brain is extraordinarily complex, containing 10^11 neurons linked by 10^14 connections. We can improve our understanding of individual neurons and neuronal networks by describing their behavior in mathematical and computational models. This chapter provides an introduction to neural modeling, laying the foundation for several basic models and surveying key topics. After some discussion of the motivations of modelers and the uses of neural models, we explore the properties of electrically excitable membranes. We describe in some detail the Hodgkin–Huxley model, the first neural model to describe biophysically the behavior of biological membranes. We explore how this model can be extended to describe a variety of excitable membrane behaviors, including axonal propagation, dendritic processing, and synaptic communication. This chapter also covers mathematical models that replicate basic neural behaviors through more abstract mechanisms. 
We briefly explore efforts to extend single-neuron models to the network level and provide several examples of insights gained through this process. Finally, we list common resources, including modeling environments and repositories, that provide the guidance and parameter sets necessary to begin building neural models. Abstract We have developed a program, NeuroText, to populate the neuroscience databases in SenseLab (http://senselab.med.yale.edu/senselab) by mining the natural language text of neuroscience articles. NeuroText uses a two-step approach to identify relevant articles. The first step (preprocessing), aimed at 100% sensitivity, identifies abstracts containing database keywords. In the second step, potentially relevant abstracts identified in the first step are processed for specificity dictated by database architecture, and neuroscience, lexical and semantic contexts. NeuroText results were presented to the experts for validation using a dynamically generated interface that also allows expert-validated articles to be automatically deposited into the databases. Of the test set of 912 articles, 735 were rejected at the preprocessing step. For the remaining articles, the accuracy of predicting database-relevant articles was 85%. Twenty-two articles were erroneously identified. NeuroText deferred decisions on 29 articles to the expert. A comparison of NeuroText results versus the experts’ analyses revealed that the program failed to correctly identify articles’ relevance due to concepts that did not yet exist in the knowledge base or due to vaguely presented information in the abstracts. NeuroText uses two “evolution” techniques (supervised and unsupervised) that play an important role in the continual improvement of the retrieval results. Software that uses the NeuroText approach can facilitate the creation of curated, special-interest bibliography databases. Abstract Dendrites play an important role in neuronal function and connectivity. 
This chapter introduces the first section of the book, focusing on the morphological features of dendritic tree structures and the role of dendritic trees in the circuit. We provide an overview of quantitative procedures for data collection, analysis, and modeling of dendrite shape. Our main focus lies on the description of morphological complexity and how one can use this description to unravel neuronal function in dendritic trees and neural circuits. Abstract The chapter is organised in two parts: In the first part, the focus is on a combined power spectral and nonlinear behavioural analysis of a neural mass model of the thalamocortical circuitry. The objective is to study the effectiveness of such “multimodal” analytical techniques in model-based studies investigating the neural correlates of abnormal brain oscillations in Alzheimer’s disease (AD). The power spectral analysis presented here is a study of the “slowing” (decreasing dominant frequency of oscillation) within the alpha frequency band (8–13 Hz), a hallmark of electroencephalogram (EEG) dynamics in AD. Analysis of the nonlinear dynamical behaviour focuses on the bifurcating property of the model. The results show that the alpha rhythmic content is maximal in close proximity to the bifurcation point, an observation made possible by the “multimodal” approach adopted herein. Furthermore, a slowing in alpha rhythm is observed for increasing inhibitory connectivity, a consistent feature of our research into neuropathological oscillations associated with AD. In the second part, we present power spectral analysis of a model that implements multiple feedforward and feedback connectivities in the thalamo-cortico-thalamic circuitry, and is thus more advanced in terms of biological plausibility. This study looks at the effects of synaptic connectivity variation on the power spectra within the delta (1–3 Hz), theta (4–7 Hz), alpha (8–13 Hz) and beta (14–30 Hz) bands. 
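The "slowing" metric used throughout this analysis, the dominant frequency within a band of the model output's power spectrum, can be computed as in the following sketch. A synthetic 9 Hz signal stands in for the neural mass model output; the sampling rate and noise level are illustrative assumptions:

```python
import numpy as np

fs = 250.0                      # sampling rate in Hz (assumed)
t = np.arange(0, 20, 1 / fs)    # 20 s of simulated output
rng = np.random.default_rng(2)

# Synthetic stand-in for model output: a 9 Hz alpha component plus noise.
signal = np.sin(2 * np.pi * 9.0 * t) + 0.5 * rng.standard_normal(t.size)

# Periodogram via the real FFT.
freqs = np.fft.rfftfreq(signal.size, 1 / fs)
power = np.abs(np.fft.rfft(signal)) ** 2

# Dominant frequency within the alpha band (8-13 Hz): the "slowing" metric.
alpha = (freqs >= 8) & (freqs <= 13)
dominant_alpha = freqs[alpha][np.argmax(power[alpha])]
print(dominant_alpha)   # ~9 Hz
```

Tracking this dominant-frequency value as a connectivity parameter is varied is what reveals a downward drift ("slowing") of the alpha rhythm.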
An overall slowing of the EEG with decreasing synaptic connectivity is observed, indicated by a decrease of power within the alpha and beta bands and an increase in power within the theta and delta bands. Thus, the model behaviour conforms to longitudinal studies in AD indicating an overall slowing of the EEG. Abstract Neuronal processes grow under a variety of constraints, both immediate and evolutionary. Their pattern of growth provides insight into their function. This chapter begins by reviewing morphological metrics used in analyses and computational models. Molecular mechanisms underlying growth and plasticity are then discussed, followed by several types of modeling approaches. Computer simulation of morphology can be used to describe and reproduce the statistics of neuronal types or to evaluate growth and functional hypotheses. For instance, models in which branching is probabilistically determined by diameter produce realistic virtual dendrites of most neuronal types, though more complicated statistical models are required for other types. Virtual dendrites grown under environmental and/or functional constraints are also discussed, offering a broad perspective on dendritic morphology. Abstract Chopper neurons in the cochlear nucleus are characterized by intrinsic oscillations with short average interspike intervals (ISIs) and relative level independence of their response (Pfeiffer, Exp Brain Res 1:220–235, 1966; Blackburn and Sachs, J Neurophysiol 62:1303–1329, 1989), properties which are not attained by models of single chopper neurons (e.g., Rothman and Manis, J Neurophysiol 89:3070–3082, 2003a). In order to achieve short ISIs, we optimized the time constants of the Rothman and Manis single-neuron model with genetic algorithms. Some parameters in the optimization, such as the temperature and the capacitance of the cell, turned out to be crucial for the required acceleration of their response. 
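The diameter-dependent probabilistic branching rule mentioned in the dendritic growth chapter above can be sketched with a small recursive generator. The specific branching probability, taper factor, and termination thresholds are assumed for illustration, not taken from any published growth model:

```python
import random

random.seed(3)

def grow(diameter, depth=0, taper=0.8, min_diam=0.2):
    """Recursively grow a virtual dendrite. At each segment the branching
    probability increases with diameter (an assumed illustrative rule),
    and daughter diameters shrink by a fixed taper factor."""
    if diameter < min_diam or depth > 12:
        return {"diam": diameter, "children": []}      # terminal tip
    p_branch = min(1.0, 0.5 * diameter)                # assumed rule
    if random.random() < p_branch:
        # bifurcation: two tapered daughter branches
        children = [grow(diameter * taper, depth + 1),
                    grow(diameter * taper, depth + 1)]
    else:
        # no branch: continue as a single tapered segment
        children = [grow(diameter * taper, depth + 1)]
    return {"diam": diameter, "children": children}

def count_tips(node):
    if not node["children"]:
        return 1
    return sum(count_tips(c) for c in node["children"])

tree = grow(1.5)
print(count_tips(tree))
```

Running the generator many times and comparing tip counts, branch-order distributions, and diameter profiles against reconstructed neurons is the usual way such statistical growth rules are validated.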
In order to achieve the relative level independence, we simulated an interconnected network consisting of Rothman and Manis neurons. The results indicate that by stabilization of intrinsic oscillations, it is possible to simulate the physiologically observed level independence of ISIs. As previously reviewed and demonstrated (Bahmer and Langner, Biol Cybern 95:371–379, 2006a), chopper neurons show a preference for ISIs which are multiples of 0.4 ms. It was also demonstrated that a network consisting of two optimized Rothman and Manis neurons which activate each other with synaptic delays of 0.4 ms shows a preference for ISIs of 0.8 ms. Oscillations with various multiples of 0.4 ms as ISIs may be derived from neurons in a more complex network that is activated by simultaneous input of an onset neuron and several auditory nerve fibers. Regulation of firing frequency in a computational model of a midbrain dopaminergic neuron. Journal of Computational Neuroscience. Summary This chapter constitutes mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was a part of the Annual Computational Neuroscience Meeting CNS*2007 that took place in July 2007 in Toronto, Canada (http://www.cnsorg.org). The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Selected discussed topics, including a review of some current and potential applications of Computational Intelligence (CI) in electrophysiology, database and electrophysiological data exchange platforms, languages, and formats, as well as exemplary analysis problems, are presented in this chapter. The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers, whose interest will be sparked by its contents. 
Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. We are particularly interested in the slower timescale dynamics that determine overall excitability, and set the stage for transient episodes of persistent oscillations, working memory, or seizures. In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics on the stability of persistent activity. We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions. Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in the neonatal brain or spinal cord, up states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potential (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from the EAP data sampled by an extracellular probe at multiple independent recording locations. 
We used a dipole for the model source because there is extensive evidence that it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell-probe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole moment optimization, where the dipole location is fixed, and a second, nonlinear, global optimization of the source location. We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis. We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among various model error contributions to the residual, we address especially the error in probe geometry, and the extent to which it biases estimates of dipole parameters. This dipole characterization method can be applied to any recording technique that has the capability of taking multiple independent measurements of the same single units. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics. 
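The separable dipole-fitting strategy described above (a linear, regularized moment fit at each fixed candidate location, nested inside an outer search over locations) can be sketched as follows. Random matrices stand in for the finite-element lead fields, the outer global search is reduced to exhaustive evaluation over three candidate locations, and all sizes and noise levels are assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic stand-ins: 12 recording sites and 3 candidate source locations,
# each with a 12x3 lead-field matrix L mapping dipole moment -> potentials.
n_sites, n_locs = 12, 3
lead_fields = [rng.standard_normal((n_sites, 3)) for _ in range(n_locs)]

# "Measured" EAP amplitudes generated from location 1 with a known moment.
true_moment = np.array([1.0, -0.5, 0.25])
v = lead_fields[1] @ true_moment + 0.01 * rng.standard_normal(n_sites)

def fit_moment(L, v, lam=1e-3):
    """Tikhonov-regularized linear fit of the dipole moment p at a fixed
    location: minimize ||L p - v||^2 + lam * ||p||^2."""
    p = np.linalg.solve(L.T @ L + lam * np.eye(3), L.T @ v)
    cost = np.sum((L @ p - v) ** 2) + lam * np.sum(p ** 2)
    return p, cost

# Outer (here: exhaustive) search over candidate source locations.
fits = [fit_moment(L, v) for L in lead_fields]
best = int(np.argmin([cost for _, cost in fits]))
print(best, fits[best][0])   # recovers location 1 and a moment near true_moment
```

The regularization term is what jointly penalizes model error and dipole size; because the inner problem is linear, each candidate location is evaluated with a single small solve, leaving all the expense in the outer location search.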
We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodium-potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. These models reveal several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy. Abstract This paper introduces dyadic brain modeling – the simultaneous, computational modeling of the brains of two interacting agents – to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the ‘dyad’). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists and modelers that will help speed the emergence of what may be called comparative neuroprimatology. 
Abstract The phase response curve (PRC) reflects the dynamics of the interplay between diverse intrinsic conductances that lead to spike generation. PRCs measure the spike time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddle-node) dynamics, while type II (subcritical Hopf) dynamics yield a biphasic PRC with both negative and positive lobes. Previous computational work hypothesized that cholinergic modulation of the M-type potassium current can switch a neuron with type II dynamics to type I dynamics. We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with downregulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential specific K-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to downregulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent I_AHP is downregulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified by low ACh levels, while the PRC became type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may shift neuronal responses from “resonator” to “integrator.” Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double precision values by each subtree at each time step. 
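The PRC measurement procedure described above (perturb the membrane potential at a given phase of the spike cycle, record the resulting spike-time shift) can be illustrated with a toy leaky integrate-and-fire neuron. This is a minimal sketch, not the pyramidal-cell recordings or the conductance-based models of the study; being an integrator-like model, it yields a purely positive, type I-like PRC:

```python
import numpy as np

def lif_spike_times(i_ext=1.5, tau=10.0, v_th=1.0, dt=0.01, t_max=200.0,
                    perturb_t=None, perturb_dv=0.05):
    """Leaky integrate-and-fire neuron driven to tonic firing; an optional
    brief depolarization of size perturb_dv is applied at perturb_t (ms)."""
    v, t, spikes = 0.0, 0.0, []
    while t < t_max:
        v += dt * (i_ext - v) / tau
        if perturb_t is not None and abs(t - perturb_t) < dt / 2:
            v += perturb_dv
        if v >= v_th:
            spikes.append(t)
            v = 0.0
        t += dt
    return np.array(spikes)

base = lif_spike_times()
period = np.diff(base).mean()

# PRC: normalized spike advance caused by a perturbation delivered at
# phase phi of the cycle following a reference spike.
ref = base[2]
phases = np.linspace(0.05, 0.95, 10)
prc = []
for phi in phases:
    pert = lif_spike_times(perturb_t=ref + phi * period)
    next_base = base[base > ref][0]
    next_pert = pert[pert > ref][0]
    prc.append((next_base - next_pert) / period)  # positive = spike advanced

print(np.round(prc, 3))   # all positive: type I-like
```

Repeating the same protocol on a neuron with a resonant (type II) mechanism would instead produce a negative lobe at early phases, which is the signature whose disappearance under cholinergic modulation the study tracks.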
Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For compute-bound simulations, load balance results in almost ideal runtime scaling. Application of the cell splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing. Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts. It has been recently reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. Ca2+-activated K+ currents (I_K(Ca)) of cultured human cardiac fibroblasts were characterized in this study. In whole-cell configuration, depolarizing pulses evoked outwardly rectifying I_K(Ca) in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance, Ca2+-activated K+ (BK_Ca) channel with a single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BK_Ca channels. The dynamic Luo-Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; however, they were electrically inert with no gap-junctional coupling. The simulation predicts that changes in gap junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. I_K(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes. 
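The coupling scenario just described can be caricatured in a two-cell sketch: an excitable cell and a passive cell exchanging a gap-junction current I_gap = g_gap * (V_m - V_f). This uses FitzHugh-Nagumo-style dimensionless dynamics as a stand-in, not the dynamic Luo-Rudy model of the study, and every parameter is an assumption chosen only to make the effect visible:

```python
import numpy as np

def simulate(g_gap, t_max=200.0, dt=0.01):
    """Conceptual two-cell sketch: an excitable 'myocyte' (FitzHugh-Nagumo
    units) coupled to a passive 'fibroblast' through a gap-junction
    current i_gap = g_gap * (v_m - v_f). Returns the fibroblast trace."""
    v_m, w, v_f = -1.2, -0.6, -1.2
    trace_f = []
    for i in range(int(t_max / dt)):
        i_stim = 0.8 if 10 <= i * dt < 12 else 0.0   # brief stimulus pulse
        i_gap = g_gap * (v_m - v_f)
        v_m += dt * (v_m - v_m ** 3 / 3 - w + i_stim - i_gap)
        w += dt * 0.08 * (v_m + 0.7 - 0.8 * w)
        v_f += dt * (-(v_f + 1.2) / 5.0 + i_gap)     # passive fibroblast
        trace_f.append(v_f)
    return np.array(trace_f)

# With g_gap = 0 the fibroblast sits at rest; with coupling it is
# depolarized by the myocyte's action potential, while the coupling
# current simultaneously loads the myocyte.
print(simulate(0.0).max(), simulate(0.2).max())
```

Even this caricature shows the two qualitative points of the simulation study: an electrically inert passive cell only exhibits an action-potential-like deflection when coupled, and the same i_gap term that drives it acts back on the excitable cell, so the coupling conductance shapes both waveforms.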
This study demonstrates that a BK_Ca channel is functionally expressed in human cardiac fibroblasts. The activity of the BK_Ca channels present in human cardiac fibroblasts may contribute to the functional activities of heart cells through transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of their solutions and the transitions between them as parameters are varied. To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multicompartment models. We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation: tonic spiking, intermediate bursting and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multicompartment conductance-based model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli. 
Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement. Results We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. Conclusion Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement, and underlies the concentration independence of odor quality representations. 
Abstract Mathematical neuronal models are normally expressed using differential equations. The Parker-Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker-Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin-Huxley type neuron, comparing the results with those obtained using the Runge-Kutta and Bulirsch-Stoer methods. Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC-granule cell (GC) circuit with and without gap junctions between GoCs. Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change the oscillation frequency or the mean firing rate of either GoCs or GCs. 
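The Parker-Sochacki scheme described above builds the Maclaurin-series coefficients of the solution order by order, so the order at a step can be raised for error control without shrinking the step size. Because the Izhikevich 'simple' model is polynomial, the recurrence needs only Cauchy products. A minimal sketch, assuming the standard regular-spiking parameters (a=0.02, b=0.2, c=-65, d=8) rather than any values from the paper:

```python
import numpy as np

def ps_step(v0, u0, i_ext, h, order=12, a=0.02, b=0.2):
    """One Parker-Sochacki step for the Izhikevich 'simple' model
        v' = 0.04 v^2 + 5 v + 140 - u + I,   u' = a (b v - u).
    Series coefficients v_k, u_k are built via Cauchy products."""
    v = np.zeros(order + 1); u = np.zeros(order + 1)
    v[0], u[0] = v0, u0
    for k in range(order):
        vv_k = np.dot(v[:k + 1], v[k::-1])   # Cauchy product (v*v)_k
        dv_k = 0.04 * vv_k + 5 * v[k] - u[k] + ((140.0 + i_ext) if k == 0 else 0.0)
        du_k = a * (b * v[k] - u[k])
        v[k + 1] = dv_k / (k + 1)            # from (k+1) v_{k+1} = (v')_k
        u[k + 1] = du_k / (k + 1)
    powers = h ** np.arange(order + 1)
    return np.dot(v, powers), np.dot(u, powers)   # evaluate series at t = h

def run(i_ext, t_max=500.0, h=0.1, c=-65.0, d=8.0):
    v, u, spikes = -70.0, -14.0, 0          # start at the resting point
    for _ in range(int(t_max / h)):
        v, u = ps_step(v, u, i_ext, h)
        if v >= 30.0:                       # spike: reset as in the model
            v, u, spikes = c, u + d, spikes + 1
    return v, spikes

print(run(0.0))    # no input: stays at the fixed point near v = -70, 0 spikes
print(run(10.0))   # constant input: tonic spiking
```

Raising `order` at a given step tightens the truncation error without touching `h`, which is the adaptive-order property the method is valued for; the division and power operations mentioned in the abstract are what extend the same coefficient recurrences to non-polynomial models such as Hodgkin-Huxley.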
Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs. The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours or days of supercomputing time. Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. In consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics, and intrinsic noise. We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data.
Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes arising from development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models. Recent studies show that both the conductance and the kinetics of these receptors change voltage-dependently in some parts of the brain. Therefore, several models have been introduced to simulate their current. On the one hand, kinetic models, which are able to simulate these voltage-dependent phenomena, are computationally expensive for the modeling of large neural networks. On the other hand, classic exponential models, which are computationally less expensive, are not able to simulate the voltage-dependency of these receptors accurately. In this study, we have modified these classic models to endow them with voltage-dependent conductance and time constants. Temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of NMDA receptor behavior using only three to four differential equations, significantly fewer than in previous kinetic models. Consequently, our model is both fast and physiologically plausible, and is therefore a suitable candidate for the modeling of large neural networks. Abstract Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons.
This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal cell firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that the γ-oscillating interneuron network exerts a clear rhythmic gating effect on the pyramidal neuron’s signal transmission. This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and appropriate inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of coherence and oscillation frequency of network activities. Abstract Background Propagation of simulated action potentials (APs) was previously studied in short single chains and in two-dimensional sheets of myocardial cells. The present study was undertaken to examine propagation in a long single chain of cells of various lengths, and with varying numbers of gap-junction (gj) channels, and to compare propagation velocity with cable properties such as the length constant (λ). Methods and Results Simulations were carried out using the PSpice program as previously described. When the electric field (EF) mechanism was dominant (0, 1, and 10 gj channels), the longer the chain length, the faster the overall velocity (θ ov). There seems to be no simple explanation for this phenomenon. In contrast, when the local-circuit current mechanism was dominant (100 gj channels or more), θ ov was slightly slowed with lengthening of the chain. Increasing the number of gj channels produced an increase in θ ov and caused the firing order to become more uniform.
The end-effect was more pronounced at longer chain lengths and at greater numbers of gj channels. When there were no or only a few gj channels (namely 0, 10, or 30), the voltage change (ΔV m) in the two cells (#50 and #52) contiguous to the cell injected with current (#51) was nearly zero, i.e., there was a sharp discontinuity in voltage between the adjacent cells. When there were many gj channels (e.g., 300, 1000, or 3000), there was an exponential decay of voltage on either side of the injected cell, with the length constant (λ) increasing at higher numbers of gj channels. The effect of increasing the number of gj channels on increasing λ was relatively small compared to the larger effect on θ ov. θ ov became very non-physiological at 300 gj channels or higher. Conclusion Thus, when there were only 0, 1, or 10 gj channels, θ ov increased with increase in chain length, whereas at 100 gj channels or higher, θ ov did not increase with chain length. When there were only 0, 10, or 30 gj channels, there was a very sharp decrease in ΔV m in the two contiguous cells on either side of the injected cell, whereas at 300, 1000, or 3000 gj channels, the voltage decay was exponential along the length of the chain. The effect of increasing the number of gj channels on the spread of current was relatively small compared to the large effect on θ ov. Abstract This article provides a demonstration of an analytical technique that can be used to investigate the causes of perceptual phenomena. The technique is based on the concept of the ideal observer, an optimal signal classifier that makes decisions that maximize the probability of a correct response. To demonstrate the technique, an analysis was conducted to investigate the role of the auditory periphery in the production of temporal masking effects. The ideal observer classified output from four models of the periphery.
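The exponential voltage decay described in the chain-propagation abstract above follows the passive-cable relation ΔV(x) = ΔV(0)·e^(−x/λ), from which λ can be read off from any two measured cells. A small sketch with purely illustrative numbers (not the PSpice values from the study):

```python
import math

# Sketch of the passive-cable relation referenced above: with many gap-junction
# channels, voltage decays exponentially away from the injected cell,
# V(x) = V0 * exp(-x / lam).  All numbers below are illustrative.

def voltage_profile(v0, lam, cells):
    """Steady-state voltage in cells 0, 1, 2, ... cell lengths from the injection."""
    return [v0 * math.exp(-x / lam) for x in range(cells)]

def length_constant(v_a, v_b, dx=1.0):
    """Estimate lambda from voltages measured dx cell lengths apart."""
    return dx / math.log(v_a / v_b)

profile = voltage_profile(20.0, 2.5, 6)              # 20 mV at the injected cell
print(length_constant(profile[0], profile[1]))       # recovers lambda = 2.5
```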
Since the ideal observer is the best of all possible observers, if it demonstrates masking effects, then all other observers must as well. If it does not demonstrate masking effects, then nothing about the periphery requires masking to occur, and masking must therefore arise somewhere else. The ideal observer exhibited several forward masking effects but did not exhibit backward masking, implying that the periphery has a causal role in forward but not backward masking. A general discussion of the strengths of the technique and supplementary equations are also included. Abstract Understanding the human brain and its function in health and disease represents one of the greatest scientific challenges of our time. In the post-genomic era, an overwhelming accumulation of new data, at all levels of exploration from DNA to human brain imaging, has been acquired. This accumulation of facts has not given rise to a corresponding increase in the understanding of integrated functions in this vast area of research, which involves a large number of fields extending from genetics to psychology. Neuroinformatics (NI) is uniquely placed at the intersection between neuroscience and information technology, and emerges as an area of critical importance for facilitating future conceptual development in neuroscience by creating databases that transcend different organizational levels and allow for the development of different computational models from the subcellular to the global brain level. Abstract This paper studied synaptic and dendritic integration with different spatial distributions of synapses on the dendrites of a biophysically detailed layer 5 pyramidal neuron model. It has been observed that temporally synchronous and spatially clustered synaptic inputs make dendrites perform a highly nonlinear integration.
The effect of the clustering degree of the synaptic distribution on neuronal responsiveness is investigated by changing the number of top apical dendrites to which active synapses are allocated. The neuron shows maximum responsiveness to synaptic inputs with an intermediate clustering degree of spatial distribution, indicating complex interactions among dendrites in the presence of nonlinear synaptic and dendritic integration. Abstract This paper describes a pilot query interface that has been constructed to help us explore a “concept-based” approach for searching the Neuroscience Information Framework (NIF). The query interface is concept-based in the sense that the search terms submitted through the interface are selected from a standardized vocabulary of terms (concepts) that are structured in the form of an ontology. The NIF contains three primary resources: the NIF Resource Registry, the NIF Document Archive, and the NIF Database Mediator. These NIF resources are very different in nature and therefore pose challenges when designing a single interface from which searches can be automatically launched against all three resources simultaneously. The paper first briefly discusses several background issues involving the use of standardized biomedical vocabularies in biomedical information retrieval, and then presents a detailed example that illustrates how the pilot concept-based query interface operates. The paper concludes by discussing lessons learned in the development of the current version of the interface. Abstract Simulations of orientation selectivity in visual cortex have shown that layer 4 complex cells lacking orientation tuning are ideal for providing global inhibition that scales with contrast in order to produce simple cells with contrast-invariant orientation tuning (Lauritzen and Miller in J Neurosci 23:10201–10213, 2003).
Inhibitory cortical cells have been shown to be electrically coupled by gap junctions (Fukuda and Kosaka in J Neurosci 120:5–20, 2003). Such coupling promotes, among other effects, spike synchronization and coordination of postsynaptic IPSPs (Beierlein et al. in Nat Neurosci 3:904–910, 2000; Galarreta and Hestrin in Nat Rev Neurosci 2:425–433, 2001). Consequently, it was expected (Miller in Cereb Cortex 13:73–82, 2003) that electrical coupling would promote nonspecific functional responses consistent with the complex inhibitory cells seen in layer 4, which provide broad inhibition in response to stimuli of all orientations (Miller et al. in Curr Opin Neurobiol 11:488–497, 2001). This was tested using a mechanistic modeling approach. The orientation selectivity model of Lauritzen and Miller (J Neurosci 23:10201–10213, 2003) was reproduced with and without electrical coupling between complex inhibitory neurons. Although extensive coupling promotes uniform firing in complex cells, there were no detectable improvements in contrast-invariant orientation selectivity unless there were coincident changes in complex cell firing rates to offset the untuned excitatory component that grows with contrast. Thus, changes in firing rates alone (with or without coupling) could improve contrast-invariant orientation tuning of simple cells, but synchronization of complex inhibitory neurons alone could not. Abstract Coral polyps contract when electrically stimulated, and a wave of contraction travels from the site of stimulation at a constant speed. Models of coral nerve networks were optimized to match one of three different experimentally observed behaviors. To search for model parameters that reproduce the experimental observations, we applied genetic algorithms to increasingly more complex models of a coral nerve net. In a first stage of optimization, individual neurons responded with spikes to multiple, but not single, pulses of activation.
In a second stage, we used these neurons as the starting point for the optimization of a two-dimensional nerve net. This strategy yielded a network with parameters that reproduced the experimentally observed spread of excitation. Abstract Spike-wave discharges are a distinctive feature of epileptic seizures. So far, they have not been reported in spatially extended neural field models. We study a space-independent version of the Amari neural field model with two competing inhibitory populations. We show that this competition leads to robust spike-wave dynamics if the inhibitory populations operate on different timescales. The spike-wave oscillations present a fold/homoclinic-type bursting. From this result we predict parameters of the extended Amari system where spike-wave oscillations produce a spatially homogeneous pattern. We propose this mechanism as a prototype of macroscopic epileptic spike-wave discharges. To our knowledge this is the first example of robust spike-wave patterns in a spatially extended neural field model. Abstract Cortical gamma frequency (30–80 Hz) oscillations have been suggested to underlie many aspects of cognitive function. In this paper we compare the f–I curves modulated by a gamma-frequency-modulated stimulus and by Poisson synaptic input at the distal dendrites of a layer V pyramidal neuron model. The results show that gamma-frequency distal input amplifies the sensitivity of the neural response to basal input and enhances the gain modulation of the neuron. Abstract Inward rectifying potassium (K IR ) currents in medium spiny (MS) neurons of the nucleus accumbens inactivate significantly in ~40% of the neurons but not in the rest, which may lead to differences in input processing by these two groups. Using a 189-compartment computational model of the MS neuron, we investigate the influence of this property using injected current as well as spatiotemporally distributed synaptic inputs.
Our study demonstrates that K IR current inactivation facilitates depolarization, firing frequency, and firing onset in these neurons. These effects may be attributed to the higher input resistance of the cell as well as the more depolarized resting/downstate potential induced by the inactivation of this current. In view of reports that dendritic intracellular calcium levels depend closely on burst strength and spike onset time, our findings suggest that inactivation of K IR currents may offer a means of modulating both excitability and synaptic plasticity in MS neurons. Abstract Epileptic seizures in diabetic hyperglycemia (DH) are not uncommon. This study aimed to determine the acute behavioral, pathological, and electrophysiological effects of status epilepticus (SE) on diabetic animals. Adult male Sprague-Dawley rats were first divided into groups with and without streptozotocin (STZ)-induced diabetes, and then into treatment groups given normal saline (NS) (STZ-only and NS-only) or a lithium-pilocarpine injection to induce status epilepticus (STZ + SE and NS + SE). Seizure susceptibility, severity, and mortality were evaluated. Serial Morris water maze tests and hippocampal histopathology were examined before and 24 h after SE. Tetanic stimulation-induced long-term potentiation (LTP) in a hippocampal slice was recorded in a multi-electrode dish system. We also used a simulation model to evaluate intracellular adenosine triphosphate (ATP) and neuroexcitability. The STZ + SE group had a significantly higher percentage of severe seizures and SE-related death and worse learning and memory performance than the other three groups 24 h after SE. The STZ + SE group, followed by the NS + SE group, showed the most severe neuronal loss and mossy fiber sprouting in the hippocampal CA3 area. In addition, LTP was markedly attenuated in the STZ + SE group, followed by the NS + SE group.
In the simulation, increased intracellular ATP concentration promoted action potential firing. The finding that rats with DH had more brain damage after SE than rats without diabetes suggests the importance of intensively treating hyperglycemia and seizures in diabetic patients with epilepsy. Neuroinformatics is a multifaceted field. It is as broad as the field of neuroscience. The various domains of NI may also share some common features such as databases, data mining systems, and data modeling tools. NI projects are often coordinated by user groups or research organizations. Large-scale infrastructure supporting NI development is also a vital aspect of the field. Abstract Channelrhodopsin-2 (ChR2) refers to a class of light-sensitive proteins that offer the ability to use light stimulation to regulate neural activity with millisecond precision. To address the limitations in the efficacy of wild-type ChR2 (ChRwt) in achieving this objective, new variants of ChR2 that exhibit fast monoexponential photocurrent decay characteristics have recently been developed and validated. In this paper, we investigate whether the framework of a transition rate model with 4 states, primarily developed to mimic the biexponential photocurrent decay kinetics of ChRwt, as opposed to the lower-complexity 3-state model, is warranted to mimic the monoexponential photocurrent decay kinetics of the newly developed fast ChR2 variants: ChETA (Gunaydin et al., Nature Neurosci. 13:387–392, 2010) and ChR-ET/TC (Berndt et al., Proc. Natl. Acad. Sci. 108:7595–7600, 2011). We begin by estimating the parameters of the 3-state and 4-state models from experimental data on the photocurrent kinetics of ChRwt, ChETA, and ChR-ET/TC. We then incorporate these models into a fast-spiking interneuron model (Wang and Buzsaki, J. Neurosci. 16:6402–6413, 1996) and a hippocampal pyramidal cell model (Golomb et al., J. Neurophysiol.
96:1912–1926, 2006) and investigate the extent to which the experimentally observed neural response to various optostimulation protocols can be captured by these models. We demonstrate that for all ChR2 variants investigated, the 4-state model implementation is better able to capture neural responses consistent with experiments across a wide range of optostimulation protocols. We conclude by analytically investigating the conditions under which the characteristic specific to the 3-state model, namely the monoexponential photocurrent decay of the newly developed ChR2 variants, can occur in the framework of the 4-state model. Abstract In cerebellar Purkinje cells, the β4-subunit of voltage-dependent Na+ channels has been proposed to serve as an open-channel blocker giving rise to a “resurgent” Na+ current (I NaR) upon membrane repolarization. Notably, the β4-subunit was recently identified as a novel substrate of the β-secretase BACE1, a key enzyme of the amyloidogenic pathway in Alzheimer's disease. Here, we asked whether BACE1-mediated cleavage of the β4-subunit has an impact on I NaR and, consequently, on the firing properties of Purkinje cells. In cerebellar tissue of BACE1−/− mice, mRNA levels of Na+ channel α-subunits 1.1, 1.2, and 1.6 and of β-subunits 1–4 remained unchanged, but processing of the β4 peptide was profoundly altered. Patch-clamp recordings from acutely isolated Purkinje cells of BACE1−/− and WT mice did not reveal any differences in steady-state properties and current densities of transient, persistent, and resurgent Na+ currents. However, I NaR was found to decay significantly faster in BACE1-deficient Purkinje cells than in WT cells. In modeling studies, the altered time course of I NaR decay could be replicated when we decreased the efficiency of open-channel block. In current-clamp recordings, BACE1−/− Purkinje cells displayed a lower spontaneous firing rate than normal cells.
Computer simulations supported the hypothesis that the accelerated decay kinetics of I NaR are responsible for the slower firing rate. Our study elucidates a novel function of BACE1 in the regulation of neuronal excitability that serves to tune the firing pattern of Purkinje cells and presumably other neurons endowed with I NaR. Abstract The role of cortical feedback in the thalamocortical processing loop has been extensively investigated over the last decades. With the exception of several cases, these studies focused on the cortical feedback exerted onto thalamocortical relay (TC) cells of the dorsal lateral geniculate nucleus (LGN). In a previous physiological study, we showed in the cat visual system that cessation of cortical input, despite decreasing the spontaneous activity of TC cells, increased the spontaneous firing of their recurrent inhibitory interneurons located in the perigeniculate nucleus (PGN). To identify the mechanisms underlying such functional changes, we conducted a modeling study in NEURON on several networks of point neurons with varied model parameters, such as membrane properties, synaptic weights, and axonal delays. We considered six network topologies of the retino-geniculo-cortical system. All models were robust against changes of axonal delays except for the delay between the LGN feedforward interneuron and the TC cell. The best representation of physiological results was obtained with models containing reciprocally connected PGN cells driven by the cortex and with relatively slow decay of intracellular calcium. This strongly indicates that the thalamic reticular nucleus plays an essential role in the cortical influence over thalamocortical relay cells, while the thalamic feedforward interneurons are not essential in this process. Further, we suggest that the dependence of the activity of PGN cells on the rate of calcium removal may be one of the key factors determining individual cell responses to elimination of cortical input.
Abstract The nucleus accumbens (NAc), a critical structure of the brain reward circuit, is implicated in normal goal-directed behaviour and learning as well as in pathological conditions like schizophrenia and addiction. Its major cellular substrates, the medium spiny (MS) neurons, possess a wide variety of dendritic active conductances that may modulate excitatory postsynaptic potentials (EPSPs) and cell excitability. We examine this issue using a biophysically detailed 189-compartment stylized model of the NAc MS neuron, incorporating all the known active conductances. We find that, of all the active channels, inward rectifying K+ (K IR ) channels play the primary role in modulating the resting membrane potential (RMP) and EPSPs in the downstate of the neuron. Reduction in the conductance of K IR channels evokes facilitatory effects on EPSPs, accompanied by rises in local input resistance and membrane time constant. At depolarized membrane potentials closer to upstate levels, the slowly inactivating A-type potassium channel (K As ) conductance also plays a strong role in determining synaptic potential parameters and cell excitability. We discuss the implications of our results for the regulation of accumbal MS neuron biophysics and synaptic integration by intrinsic factors and extrinsic agents such as dopamine. Abstract The computer-assisted three-dimensional reconstruction of neuronal morphology is becoming an increasingly popular technique for quantifying the arborization patterns of dendrites and axons. The resulting digital files are suitable for comprehensive morphometric analyses as well as for building anatomically realistic compartmental models of membrane biophysics and neuronal electrophysiology. The digital tracings acquired in a lab for a specific purpose can often be reused by a different research group to address a completely unrelated scientific question, if the original investigators are willing to share the data.
Since reconstructing neuronal morphology is a labor-intensive process, data sharing and reanalysis is particularly advantageous for the neuroscience and biomedical communities. Here we present numerous cases of “success stories” in which digital reconstructions of neuronal morphology were shared and reused, leading to additional, independent discoveries and publications, and thus amplifying the impact of the “source” study for which the data set was first collected. In particular, we overview four main applications of this kind of data: comparative morphometric analyses, statistical estimation of potential synaptic connectivity, morphologically accurate electrophysiological simulations, and computational models of neuronal shape and development. Abstract The chapter describes a novel computational approach to modeling cortex dynamics that integrates gene–protein regulatory networks with a neural network model. Interaction of genes and proteins in neurons affects the dynamics of the whole neural network. We have adopted an exploratory approach of investigating many randomly generated gene regulatory matrices, out of which we kept those that generated interesting dynamics. This naïve brute-force approach served to explore the potential application of computational neurogenetic models in relation to gene-knockout neurogenetics experiments. The knockout of a hypothetical gene for fast inhibition in our artificial genome led to interesting neural activity. In spite of the fact that the artificial gene/protein network had been altered by the knockout of one gene, the dynamics of the spiking neural network (SNN) in terms of spiking activity was most of the time very similar to that obtained with the complete gene/protein network. However, from time to time the neurons spontaneously and temporarily synchronized their spiking into coherent global oscillations.
In our model, fluctuations in the values of neuronal parameters lead to the spontaneous development of seizure-like global synchronizations. These very same fluctuations also lead to the termination of the seizure-like neural activity and the maintenance of normal interictal periods of activity. Based on our model, we would like to suggest the hypothesis that parameter changes due to gene–protein dynamics should also be included as a serious factor determining transitions in neural dynamics, especially when the cause of disease is known to be genetic. Abstract The local field potential (LFP) is among the most important experimental measures when probing neural population activity, but a proper understanding of the link between the underlying neural activity and the LFP signal is still missing. Here we investigate this link by mathematical modeling of contributions to the LFP from a single layer-5 pyramidal neuron and a single layer-4 stellate neuron receiving synaptic input. An intrinsic dendritic low-pass filtering effect of the LFP signal, previously demonstrated for extracellular signatures of action potentials, is seen to strongly affect the LFP power spectra, even for frequencies as low as 10 Hz for the example pyramidal neuron. Further, the LFP signal is found to depend sensitively on both the recording position and the position of the synaptic input: the LFP power spectra recorded close to the active synapse are typically found to be less low-pass filtered than spectra recorded further away. Some recording positions display striking band-pass characteristics of the LFP. The frequency dependence of the properties of the current dipole moment set up by the synaptic input current is found to qualitatively account for several salient features of the observed LFP.
Two approximate schemes for calculating the LFP, the dipole approximation and the two-monopole approximation, are tested and found to be potentially useful for translating results from large-scale neural network models into predictions for results from electroencephalographic (EEG) or electrocorticographic (ECoG) recordings. Abstract Dopaminergic (DA) neurons of the mammalian midbrain exhibit unusually low firing frequencies in vitro. Furthermore, injection of depolarizing current induces depolarization block before high frequencies are achieved. The maximum steady and transient rates are about 10 and 20 Hz, respectively, despite the ability of these neurons to generate bursts at higher frequencies in vivo. We use a three-compartment model calibrated to reproduce DA neuron responses to several pharmacological manipulations to uncover mechanisms of frequency limitation. The model exhibits a slow oscillatory potential (SOP) dependent on the interplay between the L-type Ca2+ current and the small-conductance K+ (SK) current that is unmasked by fast Na+ current block. Contrary to previous theoretical work, the SOP does not pace the steady spiking frequency in our model. The main currents that determine the spontaneous firing frequency are the subthreshold L-type Ca2+ and the A-type K+ currents. The model identifies the channel densities for the fast Na+ and the delayed-rectifier K+ currents as critical parameters limiting the maximal steady frequency evoked by a depolarizing pulse. We hypothesize that the low maximal steady frequencies result from a low safety factor for action potential generation. In the model, the rate of Ca2+ accumulation in the distal dendrites controls the transient initial frequency in response to a depolarizing pulse.
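The dipole and two-monopole approximations named in the LFP abstract above can be compared in a few lines. This is a generic point-source sketch under stated assumptions (homogeneous conductivity; geometry and current values are illustrative), not the modeled neurons of the study: a current sink/source pair is evaluated exactly as two monopoles, and against the far-field dipole formula φ = p·cosθ / (4πσr²).

```python
import math

# Two-monopole potential of a current sink (-I0 at z=0) and source (+I0 at z=D)
# in a homogeneous medium, versus its dipole approximation.  Values are
# illustrative assumptions, not parameters from the study.

SIGMA = 0.3      # extracellular conductivity, S/m
I0 = 1e-9        # current amplitude, A
D = 200e-6       # sink-source separation, m

def phi_monopoles(x, z):
    """Exact potential of the sink/source pair at point (x, z)."""
    r_source = math.hypot(x, z - D)
    r_sink = math.hypot(x, z)
    return I0 / (4 * math.pi * SIGMA) * (1 / r_source - 1 / r_sink)

def phi_dipole(x, z):
    """Far-field dipole approximation: phi = p*cos(theta)/(4*pi*sigma*r^2)."""
    p = I0 * D                    # dipole moment, A*m, pointing along +z
    zc = z - D / 2                # measure from the dipole midpoint
    r = math.hypot(x, zc)
    return p * (zc / r) / (4 * math.pi * SIGMA * r ** 2)

# Far from the pair (r >> D) the two schemes agree closely
ratio = phi_dipole(0.0, 5e-3) / phi_monopoles(0.0, 5e-3)
print(ratio)   # close to 1
```

Close to the pair the dipole formula degrades, which is why the study tests both schemes before recommending them for EEG/ECoG-scale predictions.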
Similar results are obtained when the same model parameters are used in a multicompartmental model with a realistic reconstructed morphology, indicating that the salient contributions of the dendritic architecture have been captured by the simpler model. Impact of dendritic size and dendritic topology on burst firing in pyramidal cells. PLoS computational biology Neurons display a wide range of intrinsic firing patterns. A particularly relevant pattern for neuronal signaling and synaptic plasticity is burst firing, the generation of clusters of action potentials with short interspike intervals. Besides ion-channel composition, dendritic morphology appears to be an important factor modulating firing pattern. However, the underlying mechanisms are poorly understood, and the impact of morphology on burst firing remains insufficiently known. Dendritic morphology is not fixed but can undergo significant changes in many pathological conditions. Using computational models of neocortical pyramidal cells, we here show that not only the total length of the apical dendrite but also the topological structure of its branching pattern markedly influences inter- and intraburst spike intervals and even determines whether or not a cell exhibits burst firing. We found that there is only a range of dendritic sizes that supports burst firing, and that this range is modulated by dendritic topology. Either reducing or enlarging the dendritic tree, or merely modifying its topological structure without changing total dendritic length, can transform a cell's firing pattern from bursting to tonic firing. Interestingly, the results are largely independent of whether the cells are stimulated by current injection at the soma or by synapses distributed over the dendritic tree. 
By means of a novel measure called mean electrotonic path length, we show that the influence of dendritic morphology on burst firing is attributable to the effect that both dendritic size and dendritic topology have, not on somatic input conductance, but on the average spatial extent of the dendritic tree and the spatiotemporal dynamics of the dendritic membrane potential. Our results suggest that alterations in size or topology of pyramidal cell morphology, such as observed in Alzheimer's disease, mental retardation, epilepsy, and chronic stress, could change neuronal burst firing and thus ultimately affect information processing and cognition. Synaptic information transfer in computer models of neocortical columns Journal of Computational Neuroscience Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on the knowledge of the three-dimensional structure of the protein of interest. However it is often the case that an experimental structure is not available and that models of different quality are used instead. 
On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess which specific threshold of accuracy is required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. Programming language: Perl/BioPerl (program); mySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI Blast package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. 
However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that often it is futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past 2 decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to create dynamically its own Web page of LinkOut links for each NCBI identifier in the NLB database. 
This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and functions. They make up tissues and our bodies. 
A single cell includes organelles and intracellular solutions, and it is separated from the extracellular fluid surrounding it by its cell membrane (plasma membrane), generating differences in concentrations of ions and molecules, including enzymes. The differences in ionic charge and concentration give rise to electrical and chemical potentials, respectively, driving transport of materials across the membrane. Here we look at the core mathematical modeling of the dynamic behavior of single cells, as well as the foundations of numerical simulation. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program on which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command-line interface, based on the Python scripting language, for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. 
database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases have started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the Cell Cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method and make available computer scripts which automate the process of citation entry. We use an open citation project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. 
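The citation-parsing step of such an automated entry workflow can be illustrated with a minimal, hypothetical Python analogue. The published scripts use the Perl PARSER module and validate fields against PubMed; the regular expression and field names below are illustrative assumptions, not the original implementation.

```python
import re

# Hypothetical parser for simple "Authors. Title. Journal Year;Vol:Pages"
# citation strings. Illustrative only -- the published workflow uses the
# Perl PARSER module and then validates the fields against PubMed records.
CITATION_RE = re.compile(
    r"^(?P<authors>[^.]+)\.\s+"        # author list, up to the first period
    r"(?P<title>[^.]+)\.\s+"           # title
    r"(?P<journal>[^0-9]+?)\s+"        # journal (lazy: stops before the year)
    r"(?P<year>\d{4});(?P<volume>\d+):(?P<pages>[\d-]+)$"
)

def parse_citation(text):
    """Return a dict of citation fields, or None if the string doesn't match."""
    m = CITATION_RE.match(text.strip())
    return m.groupdict() if m else None

ref = parse_citation("Ziv I. Simulator for neural networks. J Neurophysiol 1994;71:294-308")
```

A real pipeline would then query PubMed with these fields to retrieve the validated record rather than trusting the parsed string.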
To understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain (H2 approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front-ends requires high bioinformatic skills. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. 
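The balanced-truncation step mentioned above can be sketched on a toy stable linear system. This is the standard square-root algorithm applied to a random system, not the quasi-active dendrite models of the paper; the dimensions and matrices are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

# Balanced truncation of a stable linear system  x' = A x + B u,  y = C x.
# The system here is a random stable toy, not a real quasi-active dendrite.
rng = np.random.default_rng(0)
n, k = 6, 2                           # full and reduced dimensions
A = rng.standard_normal((n, n))
A = -(A @ A.T) - n * np.eye(n)        # symmetric negative definite -> stable
B = rng.standard_normal((n, 1))
C = rng.standard_normal((1, n))

# Gramians: A Wc + Wc A^T + B B^T = 0 and A^T Wo + Wo A + C^T C = 0.
Wc = solve_continuous_lyapunov(A, -B @ B.T)
Wo = solve_continuous_lyapunov(A.T, -C.T @ C)

# Square-root balancing: SVD of the Cholesky-factor product gives the
# Hankel singular values s and the balancing transformation T, Ti.
Lc = cholesky(Wc, lower=True)
Lo = cholesky(Wo, lower=True)
U, s, Vt = svd(Lo.T @ Lc)
T = Lc @ Vt.T @ np.diag(s ** -0.5)
Ti = np.diag(s ** -0.5) @ U.T @ Lo.T

# Keep only the k most energetic balanced states.
Ar, Br, Cr = (Ti @ A @ T)[:k, :k], (Ti @ B)[:k], (C @ T)[:, :k]

# DC gains of the full and reduced models.
g_full = float(C @ np.linalg.solve(-A, B))
g_red = float(Cr @ np.linalg.solve(-Ar, Br))
```

Truncating states with small Hankel singular values is what lets a detailed input structure survive a drastic cut in state dimension, the idea the abstract applies to dendritic trees.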
While originally developed to manage and query less structured data, information retrieval techniques have become increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by numerous selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is an XML-based language for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded on empirical evidence, explain (or not) existing empirical results, and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. 
We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib ( The Handbook of Brain Theory and Neural Networks , pp. 741–745, 2003); Arbib and Grethe ( Computing the Brain: A Guide to Neuroinformatics , 2001); Ascoli ( Computational Neuroanatomy: Principles and Methods , 2002); Bower and Bolouri ( Computational Modeling of Genetic and Biochemical Networks , 2001); Hines et al. ( J. Comput. Neurosci. 17 , 7–11, 2004); Shepherd et al. ( Trends Neurosci. 21 , 460–468, 1998); Sivakumaran et al. ( Bioinformatics 19 , 408–415, 2003); Smolen et al. ( Neuron 26 , 567–580, 2000); Vadigepalli et al. ( OMICS 7 , 235–252, 2003)]. 
Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv ( J. Neurophysiol. 71 , 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu . Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and to build on them for further research. Its use can help the field of computational neuroscience enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA-A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. 
Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA-A reversal potential (EGABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide a high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal to elicit/detect the electrical excitation at the single-cell level and to observe the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have been typically correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). Such a form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. 
Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and the code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic type (pyramidal cell dendrites and olivary axonal projection terminals), and labeling method (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions coming to the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductivities of their dendritic membrane. This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand. 
This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in Biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. 
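The compartmental idea can be illustrated with a minimal passive two-compartment sketch: a soma and a dendrite, each an RC circuit, coupled by an axial conductance. All parameters are hypothetical round numbers, not values from this chapter.

```python
# Passive two-compartment model: soma (Vs) and dendrite (Vd) coupled by an
# axial conductance g_c. Hypothetical parameters, for illustration only.
#   C dVs/dt = -gL (Vs - EL) + g_c (Vd - Vs) + I_inj
#   C dVd/dt = -gL (Vd - EL) + g_c (Vs - Vd)
C, gL, EL, g_c, I_inj = 1.0, 0.1, -65.0, 0.05, 1.0
dt, T = 0.01, 500.0                 # ms; T >> membrane time constant C/gL

Vs = Vd = EL
for _ in range(int(T / dt)):        # forward-Euler integration
    dVs = (-gL * (Vs - EL) + g_c * (Vd - Vs) + I_inj) / C
    dVd = (-gL * (Vd - EL) + g_c * (Vs - Vd)) / C
    Vs += dt * dVs
    Vd += dt * dVd
# Somatic injection depolarizes both compartments, the soma more:
# analytically, Vs -> -57.5 and Vd -> -62.5 at steady state.
```

Detailed multicompartment models are this same construction scaled up to hundreds of coupled compartments, with active conductances added to each.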
Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS–US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. 
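The temporal-difference rule underlying such dopamine models can be shown in its simplest tabular form: states are time steps from CS onset, reward arrives at the last step, and the TD error (the putative dopamine signal) propagates backward over trials until reward is fully predicted. This is a toy TD(0) sketch, not the LSTM-based model described above.

```python
# Minimal tabular TD(0) sketch of reward-prediction learning.
# States 0..4 are successive time steps from CS onset; reward arrives at
# the last step. Toy illustration, not the LSTM-based model of the abstract.
alpha, gamma, n_states = 0.1, 1.0, 5
V = [0.0] * n_states                         # learned state values

for _ in range(2000):                        # repeated conditioning trials
    for s in range(n_states):
        r = 1.0 if s == n_states - 1 else 0.0
        v_next = V[s + 1] if s + 1 < n_states else 0.0
        delta = r + gamma * v_next - V[s]    # TD error ("dopamine signal")
        V[s] += alpha * delta
```

After learning, every state fully predicts the reward (all values approach 1 with gamma = 1), so the TD error at reward delivery vanishes, mirroring the classic shift of phasic dopamine responses from the US to the CS.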
The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. On mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses uniformly distributed along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and function of effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization in general and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals decreasing with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. 
Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of output discharge patterns of different complexities, including stochastic ones. A relatively short length of the apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites and a reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in the processes of formation of the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated. 
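Theta-modulated inputs like those just described are commonly generated as inhomogeneous Poisson spike trains. A minimal sketch using the thinning (rejection) method follows; the 8 Hz theta frequency, 20 Hz mean rate, and modulation depth are illustrative assumptions, not parameters from the model.

```python
import math
import random

# Inhomogeneous Poisson spike train with a theta-modulated rate, generated
# by thinning: draw candidates at the peak rate r_max, then accept each with
# probability r(t)/r_max. Parameters are illustrative assumptions.
def theta_poisson(T=10.0, r0=20.0, depth=0.8, f_theta=8.0, seed=1):
    """Spike times in [0, T) s with rate r(t) = r0*(1 + depth*cos(2*pi*f*t))."""
    rng = random.Random(seed)
    r_max = r0 * (1.0 + depth)
    spikes, t = [], 0.0
    while True:
        t += rng.expovariate(r_max)          # candidate from homogeneous process
        if t >= T:
            return spikes
        rate = r0 * (1.0 + depth * math.cos(2.0 * math.pi * f_theta * t))
        if rng.random() < rate / r_max:      # accept with probability r(t)/r_max
            spikes.append(t)

train = theta_poisson()
```

The mean rate of the resulting train is r0 (here 20 Hz), with spikes clustered near the theta peaks; feeding such trains into PC/IC pairs is one standard way to drive models of this kind.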
The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using non-homogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent from rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of the involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in Computational Neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases, such as NeuroMorpho.org . 
In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed data of morphologies when generating those cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators, such as the NEURON Simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new and available data can be included into NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation.
New simulator architectures such as GENESIS 3 allow the use of standard well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising recent developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data needs both standard data formats and public repositories. Standardization and a public storage system are also important for modeling due to the possibility of sharing models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks like numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor.
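On an unbranched cable, the direct Gaussian elimination mentioned above reduces to the O(n) Thomas algorithm for tridiagonal systems; branched trees add only a few off-tridiagonal terms. A generic numerical sketch, not NEURON's actual implementation:

```python
def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system: a = sub-diagonal, b = main diagonal,
    c = super-diagonal, d = right-hand side. This is plain Gaussian
    elimination specialised to tridiagonal structure, so it runs in O(n)."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]      # eliminate the sub-diagonal entry
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):       # back-substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# 4-compartment example: a diagonally dominant system like those produced
# by implicit time-stepping of the cable equation (coefficients illustrative)
x = thomas_solve([0, -1, -1, -1], [4, 4, 4, 4], [-1, -1, -1, 0], [1, 2, 2, 1])
```

Splitting a tree into subtrees, as the abstract describes, lets each processor run this elimination on its own piece, with only the few coupling unknowns at the connection points solved jointly.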
It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources, and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum.
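The forward-modeling step behind inverse CSD can be illustrated in one dimension: assume point sources at known depths, build the matrix mapping source amplitudes to electrode potentials, then invert it to recover the sources. The geometry, regularising offset, and amplitudes below are all illustrative, not the paper's actual 2D formulation:

```python
def solve(A, y):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for col in range(n):
        p = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[p] = M[p], M[col]          # pivot for numerical stability
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for k in range(col, n + 1):
                M[r][k] -= f * M[col][k]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):           # back-substitution
        x[i] = (M[i][n] - sum(M[i][k] * x[k] for k in range(i + 1, n))) / M[i][i]
    return x

depths = [0.1, 0.2, 0.3, 0.4]     # electrode/source depths (mm, illustrative)
h = 0.05                          # offset avoiding the 1/0 singularity
# Forward matrix: potential at electrode i per unit source at depth j
F = [[1.0 / (abs(zi - zj) + h) for zj in depths] for zi in depths]
true_csd = [1.0, -2.0, 2.0, -1.0]            # made-up source amplitudes
phi = [sum(F[i][j] * true_csd[j] for j in range(4)) for i in range(4)]
est = solve(F, phi)   # inverting the forward model recovers the sources
```

The traditional CSD estimate differentiates the potentials numerically; the forward-model approach instead states the source-to-potential mapping explicitly, which is what allows boundary estimates and system-specific assumptions, as noted in the abstract.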
The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. However, the range of current strength needed for a depolarization block could be easily reached with a random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block. We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, impeding conventional analysis methods from keeping pace.
Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable. PANDORA is available to be freely used and modified because it is open-source (http://software.incf.org/software/pandora/home). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then we examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop. Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics.
First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test the sufficiency of multiple, independent spike initiation zones in the C-interneuron to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made to understand cerebellar cortical processing, but neural coding at the output stage of the cerebellum in the deep cerebellar nuclei (DCN) remains poorly understood. The DCN are unlikely to be just a relay nucleus, because Purkinje cell inhibition has to be turned into an excitatory output signal, and DCN neurons exhibit complex intrinsic properties. In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals.
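The “twin experiment” idea mentioned above — generate data from a model with known parameters, then test whether those parameters can be recovered from the data alone — can be sketched in miniature. The toy model and grid search below are illustrative, not the book's path-integral assimilation procedure:

```python
def run(k, x0=1.0, dt=0.1, steps=50):
    """Forward-Euler time series of the toy model x' = -k * x."""
    xs, x = [x0], x0
    for _ in range(steps):
        x += dt * (-k * x)
        xs.append(x)
    return xs

data = run(k=0.7)               # "truth": generated with a known parameter

def cost(k):
    """Squared error between a candidate model run and the twin data."""
    return sum((m - d) ** 2 for m, d in zip(run(k), data))

grid = [i / 100 for i in range(1, 200)]   # candidate parameter values
k_est = min(grid, key=cost)               # exhaustive search over the grid
```

Because the data were generated by the model itself, the estimate lands exactly on the true parameter; with noisy or partial observations, as in realistic assimilation, the cost surface flattens and the estimate degrades, which is precisely what twin experiments are used to probe.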
Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. We generated a good match of our simulations with DCN current clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow deinactivation of rebound currents. Fast rebound bursts due to T-type calcium current and slow rebounds due to persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by HCN current. Our findings suggest that active properties of DCN neurons could play a crucial role for signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior.
This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation channel gating, the mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local-field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform. Abstract No Abstract Available Abstract We have developed a simulation tool within the NEURON simulator to assist in organization, verification, and analysis of simulations. This tool, denominated Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB. Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems.
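The relational pattern behind NQS above — simulation parameters and extracted measures kept in tables and retrieved with SELECT queries — can be sketched with Python's standard sqlite3 module. The schema, column names, and values here are hypothetical:

```python
import sqlite3

# Hypothetical schema: one row per simulation run, storing a model parameter
# and an extracted characteristic (firing rate) rather than the raw trace.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE runs (run_id INTEGER, g_na REAL, firing_rate REAL)")
db.executemany("INSERT INTO runs VALUES (?, ?, ?)",
               [(1, 100.0, 12.5), (2, 120.0, 18.0), (3, 140.0, 25.5)])

# Query: which runs fire above 15 spikes/s, and what parameter produced them?
rows = db.execute("SELECT run_id, g_na FROM runs "
                  "WHERE firing_rate > 15 ORDER BY g_na").fetchall()
```

Relating outputs back to the parameters that produced them then becomes a single query instead of a rescan of raw data files, which is the efficiency argument made for both NQS and PANDORA.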
This is similar to how single cells function as systems of protein networks, where, for example, ion channel currents of a single cell are integrated to produce a whole cell membrane potential. A cell in a network may behave differently from what it does alone. Dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. Interactions are made by different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts mechanically to produce heart beats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics, representing neural information. In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of limbs can be performed. This chapter illustrates several mathematical techniques through examples from modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data. Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation.
For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice that of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits as expected in mature animals. Predictions were made for passive-cell current clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for the application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron, and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting is obsolete. Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron.
Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials. Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet, the precise input-output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites. We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature.
To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns and the corresponding firing frequency input-output function was logarithmic. The regular/irregular nature of the input synaptic intervals was only reflected in the regularity of output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input-output relationships. Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input-output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats.
In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database (http://www.ebi.ac.uk/biomodels/) is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems, and to study the clustering of models based upon their annotations. Model deposition to the database today is advised by several publishers of scientific journals. The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge (https://sourceforge.net/projects/biomodels/) under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive?
Schema theory is a computational framework that allows the simulation of perceptuomotor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We first present its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performances of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world-knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performances of agrammatic aphasics measured using sentence-picture matching tasks. This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Abstract Background The "inverse" problem is related to the determination of unknown causes on the basis of the observation of their effects. This is the opposite of the corresponding "direct" problem, which relates to the prediction of the effects generated by a complete description of some agencies.
The solution of an inverse problem entails the construction of a mathematical model and starts from a number of experimental data. In this respect, inverse problems are often ill-conditioned, as the amount of experimental data available is often insufficient to unambiguously solve the mathematical model. Several approaches to solving inverse problems are possible, both computational and experimental, some of which are mentioned in this article. In this work, we will describe in detail an attempt to solve an inverse problem which arose in the study of an intracellular signaling pathway. Results Using a genetic algorithm to find a suboptimal solution to the optimization problem, we have estimated a set of unknown parameters describing a kinetic model of a signaling pathway in the neuronal cell. The model is composed of mass action ordinary differential equations, where the kinetic parameters describe protein-protein interactions, protein synthesis and degradation. The algorithm has been implemented on a parallel platform. Several potential solutions of the problem have been computed, each solution being a set of model parameters. A subset of parameters has been selected on the basis of their small coefficient of variation across the ensemble of solutions. Conclusion Despite the lack of sufficiently reliable and homogeneous experimental data, the genetic algorithm approach has allowed us to estimate the approximate value of a number of model parameters in a kinetic model of a signaling pathway: these parameters have been assessed to be relevant for the reproduction of the available experimental data. Abstract Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown.
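A genetic-algorithm parameter search of the kind used above can be sketched on a toy one-parameter objective. The objective function and all settings below are illustrative, not the paper's signaling-pathway model:

```python
import random

def fitness(k):
    """Toy objective, maximised at the 'true' parameter value 0.42."""
    return -(k - 0.42) ** 2

def evolve(pop_size=40, generations=60, seed=7):
    """Elitist genetic algorithm: each individual is one candidate parameter;
    truncation selection keeps the fittest half, and offspring are mutated
    (Gaussian-perturbed) copies of the survivors."""
    rng = random.Random(seed)
    pop = [rng.uniform(0.0, 1.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                        # selection
        children = [p + rng.gauss(0.0, 0.05) for p in parents]  # mutation
        pop = parents + children                              # elitism
    return max(pop, key=fitness)

k_best = evolve()   # converges close to 0.42
```

Running such a search many times from different seeds yields the ensemble of candidate solutions the abstract describes; parameters with a small coefficient of variation across that ensemble are the well-constrained ones.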
Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms. First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axodendritic feedback on pyramidal cells is only effective in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase, the theta/gamma ratio. Moreover, electric fields differentially affect average theta frequency depending on specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. Abstract The brain is extraordinarily complex, containing 10^11 neurons linked with 10^14 connections. We can improve our understanding of individual neurons and neuronal networks by describing their behavior in mathematical and computational models.
This chapter provides an introduction to neural modeling, laying the foundation for several basic models and surveying key topics. After some discussion on the motivations of modelers and the uses of neural models, we explore the properties of electrically excitable membranes. We describe in some detail the Hodgkin–Huxley model, the first neural model to describe biophysically the behavior of biological membranes. We explore how this model can be extended to describe a variety of excitable membrane behaviors, including axonal propagation, dendritic processing, and synaptic communication. This chapter also covers mathematical models that replicate basic neural behaviors through more abstract mechanisms. We briefly explore efforts to extend single-neuron models to the network level and provide several examples of insights gained through this process. Finally, we list common resources, including modeling environments and repositories, that provide the guidance and parameter sets necessary to begin building neural models. Abstract We have developed a program NeuroText to populate the neuroscience databases in SenseLab (http://senselab.med.yale.edu/senselab) by mining the natural language text of neuroscience articles. NeuroText uses a two-step approach to identify relevant articles. The first step (preprocessing), aimed at 100% sensitivity, identifies abstracts containing database keywords. In the second step, potentially relevant abstracts identified in the first step are processed for specificity dictated by database architecture, and neuroscience, lexical and semantic contexts. NeuroText results were presented to the experts for validation using a dynamically generated interface that also allows expert-validated articles to be automatically deposited into the databases. Of the test set of 912 articles, 735 were rejected at the preprocessing step. For the remaining articles, the accuracy of predicting database-relevant articles was 85%.
Twenty-two articles were erroneously identified. NeuroText deferred decisions on 29 articles to the expert. A comparison of NeuroText results versus the experts' analyses revealed that the program failed to correctly identify articles' relevance due to concepts that did not yet exist in the knowledgebase or due to vaguely presented information in the abstracts. NeuroText uses two "evolution" techniques (supervised and unsupervised) that play an important role in the continual improvement of the retrieval results. Software that uses the NeuroText approach can facilitate the creation of curated, special-interest bibliography databases. Abstract Dendrites play an important role in neuronal function and connectivity. This chapter introduces the first section of the book, focusing on the morphological features of dendritic tree structures and the role of dendritic trees in the circuit. We provide an overview of quantitative procedures for data collection, analysis, and modeling of dendrite shape. Our main focus lies on the description of morphological complexity and how one can use this description to unravel neuronal function in dendritic trees and neural circuits. Abstract The chapter is organised in two parts: In the first part, the focus is on a combined power spectral and nonlinear behavioural analysis of a neural mass model of the thalamocortical circuitry. The objective is to study the effectiveness of such "multimodal" analytical techniques in model-based studies investigating the neural correlates of abnormal brain oscillations in Alzheimer's disease (AD). The power spectral analysis presented here is a study of the "slowing" (decreasing dominant frequency of oscillation) within the alpha frequency band (8–13 Hz), a hallmark of electroencephalogram (EEG) dynamics in AD. Analysis of the nonlinear dynamical behaviour focuses on the bifurcating property of the model.
The results show that the alpha rhythmic content is maximal in close proximity to the bifurcation point, an observation made possible by the "multimodal" approach adopted herein. Furthermore, a slowing in alpha rhythm is observed for increasing inhibitory connectivity, a consistent feature of our research into neuropathological oscillations associated with AD. In the second part, we present power spectral analysis of a model that implements multiple feedforward and feedback connectivities in the thalamo-cortico-thalamic circuitry, and is thus more advanced in terms of biological plausibility. This study looks at the effects of synaptic connectivity variation on the power spectra within the delta (1–3 Hz), theta (4–7 Hz), alpha (8–13 Hz) and beta (14–30 Hz) bands. An overall slowing of the EEG with decreasing synaptic connectivity is observed, indicated by a decrease of power within the alpha and beta bands and an increase in power within the theta and delta bands. Thus, the model behaviour conforms to longitudinal studies in AD indicating an overall slowing of the EEG. Abstract Neuronal processes grow under a variety of constraints, both immediate and evolutionary. Their pattern of growth provides insight into their function. This chapter begins by reviewing morphological metrics used in analyses and computational models. Molecular mechanisms underlying growth and plasticity are then discussed, followed by several types of modeling approaches. Computer simulation of morphology can be used to describe and reproduce the statistics of neuronal types or to evaluate growth and functional hypotheses. For instance, models in which branching is probabilistically determined by diameter produce realistic virtual dendrites of most neuronal types, though more complicated statistical models are required for other types. Virtual dendrites grown under environmental and/or functional constraints are also discussed, offering a broad perspective on dendritic morphology.
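Diameter-dependent probabilistic branching of the kind mentioned in the preceding abstract can be sketched as a toy recursive growth model. The branching rule, constants, and taper factor below are invented for illustration and are far simpler than the statistical models the chapter reviews:

```python
import random

def grow(diameter, depth=0, p_scale=0.25, taper=0.7, min_diam=0.2, rng=None):
    """Toy virtual dendrite: at each step a branch either bifurcates, with
    probability proportional to its current diameter, or terminates.
    Returns the number of terminal tips produced."""
    rng = rng or random.Random(42)   # fixed seed for a reproducible tree
    if diameter < min_diam or depth > 12:
        return 1
    if rng.random() < p_scale * diameter:
        # bifurcate: daughter branches are thinner than the parent
        return (grow(diameter * taper, depth + 1, p_scale, taper, min_diam, rng)
                + grow(diameter * taper, depth + 1, p_scale, taper, min_diam, rng))
    return 1

tips = grow(diameter=2.0)
print(tips)
```

Because thicker branches are more likely to bifurcate, trees started from larger stem diameters tend to produce more tips, which is the qualitative signature such diameter-driven models aim to reproduce.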
Abstract Chopper neurons in the cochlear nucleus are characterized by intrinsic oscillations with short average interspike intervals (ISIs) and relative level independence of their response (Pfeiffer, Exp Brain Res 1:220–235, 1966; Blackburn and Sachs, J Neurophysiol 62:1303–1329, 1989), properties which are unattained by models of single chopper neurons (e.g., Rothman and Manis, J Neurophysiol 89:3070–3082, 2003a). In order to achieve short ISIs, we optimized the time constants of the Rothman and Manis single-neuron model with genetic algorithms. Some parameters in the optimization, such as the temperature and the capacitance of the cell, turned out to be crucial for the required acceleration of the response. In order to achieve the relative level independence, we simulated an interconnected network consisting of Rothman and Manis neurons. The results indicate that by stabilization of intrinsic oscillations, it is possible to simulate the physiologically observed level independence of ISIs. As previously reviewed and demonstrated (Bahmer and Langner, Biol Cybern 95:371–379, 2006a), chopper neurons show a preference for ISIs which are multiples of 0.4 ms. It was also demonstrated that a network consisting of two optimized Rothman and Manis neurons which activate each other with synaptic delays of 0.4 ms shows a preference for ISIs of 0.8 ms. Oscillations with various multiples of 0.4 ms as ISIs may be derived from neurons in a more complex network that is activated by simultaneous input from an onset neuron and several auditory nerve fibers. Abstract Recently, a class of two-dimensional integrate-and-fire models has been used to faithfully model spiking neurons. This class includes the Izhikevich model, the adaptive exponential integrate-and-fire model, and the quartic integrate-and-fire model. The bifurcation types for the individual neurons have been thoroughly analyzed by Touboul (SIAM J Appl Math 68(4):1045–1079, 2008).
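The class of two-dimensional integrate-and-fire models named above can be illustrated with the Izhikevich form: a fast voltage variable coupled to a slow adaptation variable, plus a discontinuous reset. The parameter values below are the standard regular-spiking set from the literature, not values from this paper:

```python
def izhikevich(I=10.0, a=0.02, b=0.2, c=-65.0, d=8.0, t_max=200.0, dt=0.25):
    """Two-dimensional integrate-and-fire neuron (Izhikevich form):
    membrane potential v, adaptation variable u, spike-and-reset rule.
    Returns the spike times (ms) under constant input current I."""
    v, u = -65.0, b * -65.0
    spikes = []
    for step in range(int(t_max / dt)):
        v += dt * (0.04 * v * v + 5 * v + 140 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                 # spike cutoff
            spikes.append(step * dt)
            v, u = c, u + d           # reset, with adaptation jump d
    return spikes

print(len(izhikevich()))
```

The adaptation jump d is what lets coupled populations of such units switch between tonic and burst firing, which is the kind of network-level bifurcation the mean-field reduction described here targets.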
However, when the models are coupled together to form networks, the networks can display bifurcations that an uncoupled oscillator cannot. For example, the networks can transition from firing with a constant rate to burst firing. This paper introduces a technique to reduce a full network of this class of neurons to a mean field model, in the form of a system of switching ordinary differential equations. The reduction uses population density methods and a quasi-steady-state approximation to arrive at the mean field system. Reduced models are derived for networks with different topologies and different model neurons with biologically derived parameters. The mean field equations are able to qualitatively and quantitatively describe the bifurcations that the full networks display. Extensions and higher-order approximations are discussed. Conclusions Our proposed database schema for managing heterogeneous data is a significant departure from conventional approaches. It is suitable only when the following conditions hold: • The number of classes of entity is numerous, while the number of actual instances in most classes is expected to be very modest. • The number (and nature) of the axes describing an arbitrary fact (as an N-ary association) varies greatly. We believe that nervous system data is an appropriate problem domain to test such an approach. Abstract Stereotactic human brain atlases, either in print or electronic form, are useful not only in functional neurosurgery, but also in neuroradiology, human brain mapping, and neuroscience education. The existing atlases represent structures on 2D plates taken at variable, often large intervals, which limits their applications. To overcome this problem, we propose a hybrid interpolation approach to build high-resolution brain atlases from the existing ones. In this approach, all section regions of each object are grouped into two types of components: simple and complex.
A NURBS-based method is designed for interpolation of the simple components, and a distance-map-based method for the complex components. Once all individual objects in the atlas are interpolated, the results are combined hierarchically in a bottom-up manner to produce the interpolation of the entire atlas. In the procedure, different knowledge-based and heuristic strategies are used to preserve various topological relationships. The proposed approach has been validated quantitatively and used for interpolation of two stereotactic brain atlases: the Talairach-Tournoux atlas and the Schaltenbrand-Wahren atlas. The interpolations produced are of high resolution and feature high accuracy, 3D consistency, smooth surfaces, and preserved topology. They potentially open new applications for electronic stereotactic brain atlases, such as atlas reformatting, accurate 3D display, and 3D nonlinear warping against normal and pathological scans. The proposed approach is also potentially useful in other applications which require interpolation and 3D modeling from sparse data and/or data with variable inter-section intervals. An example of 3D modeling of an infarct from MR diffusion images is presented. Abstract Quantitative neuroanatomical data are important for the study of many areas of neuroscience, and the complexity of problems associated with neuronal structure requires that research from multiple groups across many disciplines be combined. However, existing neuron-tracing systems, simulation environments, and tools for the visualization and analysis of neuronal morphology data use a variety of data formats, making it difficult to exchange data in a readily usable way. The NeuroML project was initiated to address these issues, and here we describe an extensible markup language standard, MorphML, which defines a common data format for neuronal morphology data and associated metadata to facilitate data and model exchange, database creation, model publication, and data archiving.
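The idea of a common XML format for morphology data can be sketched with a toy document. The element and attribute names below are illustrative only and do not reproduce the actual MorphML schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified morphology document: a soma plus one dendritic
# segment, each described by 3D endpoints and diameters (names illustrative).
root = ET.Element("morphology", cell="pyramidal_example")
soma = ET.SubElement(root, "segment", id="0", name="soma")
ET.SubElement(soma, "proximal", x="0", y="0", z="0", diameter="20")
ET.SubElement(soma, "distal", x="0", y="20", z="0", diameter="20")
dend = ET.SubElement(root, "segment", id="1", name="dend0", parent="0")
ET.SubElement(dend, "proximal", x="0", y="20", z="0", diameter="2")
ET.SubElement(dend, "distal", x="0", y="120", z="0", diameter="1")

xml_text = ET.tostring(root, encoding="unicode")

# Any consumer (simulator, visualizer, database loader) can recover the
# same tree structure from the serialized text:
parsed = ET.fromstring(xml_text)
names = [seg.get("name") for seg in parsed.findall("segment")]
print(names)  # ['soma', 'dend0']
```

The value of such a standard is exactly this round trip: a tracing tool writes the document once and every downstream application parses the same structure, rather than each pair of tools needing a bespoke converter.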
We describe the elements of the standard in detail and outline the mappings between this format and those used by a number of popular applications for reconstruction, simulation, and visualization of neuronal morphology. Abstract A major part of biology has become a class of physical and mathematical sciences. We have started to feel, though perhaps still with some suspicion, that it will become possible to predict biological events in the future of one's life, and to control some of them if so desired, based upon the understanding of the genomic information of individuals and the physical and chemical principles governing the physiological functions of living organisms at multiple scales and levels, from molecules to cells and organs. Abstract A half-center oscillator (HCO) is a common circuit building block of central pattern generator networks that produce rhythmic motor patterns in animals. Here we constructed an efficient relational database table with the resulting characteristics of the simple conductance-based HCO model of Hill et al. (J Comput Neurosci 10:281–302, 2001). The model consists of two reciprocally inhibitory neurons and replicates the electrical activity of the oscillator interneurons of the leech heartbeat central pattern generator under a variety of experimental conditions. Our long-range goal is to understand how this basic circuit building block produces functional activity under a variety of parameter regimes and how different parameter regimes influence stability and modulatability. By using the latest developments in computer technology, we simulated and stored large amounts of data (on the order of terabytes). We systematically explored the parameter space of the HCO and corresponding isolated neuron models using a brute-force approach. We varied a set of selected parameters (maximal conductances of intrinsic and synaptic currents) in all combinations, resulting in about 10 million simulations.
We classified these HCO and isolated neuron model simulations by their activity characteristics into identifiable groups and quantified their prevalence. By querying the database, we compared the activity characteristics of the identified groups of our simulated HCO models with those of our simulated isolated neuron models and found that regularly bursting neurons compose only a small minority of functional HCO models; the vast majority was composed of spiking neurons. Abstract This paper describes how an emerging standard neural network modelling language can be used to configure a general-purpose neural multichip system, by describing the process of writing and loading neural network models on the SpiNNaker neuromimetic hardware. It focuses on the implementation of a SpiNNaker module for PyNN, a simulator-independent language for neural network modelling. We successfully extend PyNN to deal with different non-standard (e.g., Izhikevich) cell types, rapidly switch between them, and load applications onto the parallel hardware by orchestrating the software layers below it, so that they are abstracted from the final user. Finally, we run some simulations in PyNN and compare them against other simulators, successfully reproducing single-neuron and network dynamics and validating the implementation. Abstract The present study examines the biophysical properties and functional implications of I_h in hippocampal area CA3 interneurons with somata in strata radiatum and lacunosum-moleculare. Characterization studies showed a small maximum h-conductance (2.6 ± 0.3 nS, n = 11), shallow voltage dependence with a hyperpolarized half-maximal activation (V1/2 = −91 mV), and kinetics characterized by double-exponential functions. The functional consequences of I_h were examined with regard to temporal summation and impedance measurements. For temporal summation experiments, 5-pulse mossy fiber input trains were activated.
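Temporal summation of a 5-pulse input train can be illustrated on a passive membrane driven by alpha-function current pulses. Shortening the effective membrane time constant, one simplified way to mimic an I_h-like shunt, reduces summation; all numbers below are illustrative, not measurements from this study:

```python
import math

def epsp_train(tau_m, n_pulses=5, isi=20.0, tau_syn=2.0, dt=0.1, t_max=200.0):
    """Passive-membrane response to a train of alpha-function current
    pulses; returns the peak depolarization (arbitrary units)."""
    onsets = [i * isi for i in range(n_pulses)]
    v, peak = 0.0, 0.0
    for s in range(int(t_max / dt)):
        t = s * dt
        # summed alpha-function input current from all pulses so far
        I = sum((t - t0) / tau_syn * math.exp(1 - (t - t0) / tau_syn)
                for t0 in onsets if t >= t0)
        v += dt * (-v / tau_m + I)   # leaky integration
        peak = max(peak, v)
    return peak

slow = epsp_train(tau_m=30.0)   # long time constant: strong summation
fast = epsp_train(tau_m=10.0)   # shorter effective tau (I_h-like shunt)
print(slow > fast)  # True: faster decay between pulses reduces summation
```

This captures only the passive side of the argument; the study's point is that blocking the real conductance with ZD7288 lengthens the effective time constant and thereby increases summation, consistent with the slow/fast comparison here.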
Blocking I_h with 50 μM ZD7288 resulted in an increase in temporal summation, suggesting that I_h supports sensitivity of response amplitude to relative input timing. Impedance was assessed by applying sinusoidal current commands. From impedance measurements, we found that I_h did not confer theta-band resonance, but flattened the impedance–frequency relations instead. Double immunolabeling for hyperpolarization-activated cyclic nucleotide-gated proteins and glutamate decarboxylase 67 suggests that all four subunits are present in GABAergic interneurons from the strata considered for electrophysiological studies. Finally, a model of I_h was employed in computational analyses to confirm and elaborate upon the contributions of I_h to impedance and temporal summation. Abstract Modelling and simulation methods gain increasing importance for the understanding of biological systems. The growing number of available computational models makes support in the maintenance and retrieval of those models essential to the community. This article discusses which model information is helpful for efficient retrieval and how existing similarity measures and ranking techniques can be used to enhance the retrieval process, i.e., model reuse. With the development of new tools and modelling formalisms, there is also an increasing demand for performing search independent of the models' encoding. Therefore, the presented approach is not restricted to certain model storage formats. Instead, the model meta-information is used for retrieval and ranking of the search results. Meta-information includes general information about the model and its encoded species and reactions, but also information about the model behaviour and related simulation experiment descriptions. Abstract To understand the details of brain function, a large-scale system model that reflects anatomical and neurophysiological characteristics needs to be implemented.
Though numerous computational models of different brain areas have been proposed, their integration into a large-scale model has not yet been accomplished, because these models were described in different programming languages and, above all, used different data formats. This paper introduces a platform for collaborative brain system modeling (PLATO) where one can construct computational models using several programming languages and connect them at the I/O level with a common data format. As an example, a whole visual system model including eye movement, eye optics, retinal network and visual cortex is being developed. Preliminary results demonstrate that the integrated model successfully simulates the signal processing flow at the different stages of the visual system. Abstract Brain rhythms are the most prominent signal measured noninvasively in humans with magneto/electroencephalography (MEG/EEG). MEG/EEG-measured rhythms have been shown to be functionally relevant, and signature changes are used as markers of disease states. Despite the importance of understanding the underlying neural mechanisms creating these rhythms, relatively little is known about their in vivo origin in humans. There are obvious challenges in linking the extracranially measured signals directly to neural activity with invasive studies in humans, and although animal models are well suited for such studies, the connection to human brain function under cognitively relevant tasks is often lacking. Biophysically principled computational neural modeling provides an attractive means to bridge this critical gap. Here, we describe a method for creating a computational neural model capturing the laminar structure of cortical columns and how this model can be used to make predictions on the cellular- and circuit-level mechanisms of brain oscillations measured with MEG/EEG.
Specifically, we describe how the model can be used to simulate current dipole activity, the common macroscopic signal inferred from MEG/EEG data. We detail the development and application of the model to study the spontaneous somatosensory mu-rhythm, containing mu-alpha (7–14 Hz) and mu-beta (15–29 Hz) components. We describe a novel prediction on the neural origin of the mu-rhythm that accurately reproduces many characteristic features of MEG data and accounts for changes in the rhythm with attention, detection, and healthy aging. While the details of the model are specific to the somatosensory system, the model design and application are based on general principles of cortical circuitry and MEG/EEG physics, and are thus amenable to the study of rhythms in other frequency bands and sensory systems. Abstract GABAergic interneurons in cortical circuits control the activation of principal cells and orchestrate network activity patterns, including oscillations at different frequency ranges. Recruitment of interneurons depends on the integration of convergent synaptic inputs along the dendrosomatic axis; however, dendritic processing in these cells is still poorly understood. In this chapter, we summarise our results on the cable properties, electrotonic structure and dendritic processing of "basket cells" (BCs; Nörenberg et al. 2010), one of the most prevalent types of cortical interneurons mediating perisomatic inhibition. In order to investigate integrative properties, we have performed two-electrode whole-cell patch clamp recordings, visualised and reconstructed the recorded interneurons, and created passive single-cell models with biophysical properties derived from the experiments. Our results indicate that membrane properties, in particular membrane resistivity, are inhomogeneous along the somatodendritic axis of the cell. The derived values and gradient of membrane resistivity are different from those obtained for excitatory principal cells.
The divergent passive membrane properties of BCs facilitate rapid signalling from proximal basal dendritic inputs but at the same time increase synapse-to-soma transfer for slow signals from the distal apical dendrites. Our results demonstrate that BCs possess distinct integrative properties. Future computational models investigating the diverse functions of neuronal circuits need to consider this diversity and incorporate realistic dendritic properties not only of excitatory principal cells but also of the various types of inhibitory interneurons. Abstract New surgical and localization techniques allow for precise and personalized evaluation and treatment of intractable epilepsies. These techniques include the use of subdural and depth electrodes for localization, and the potential use of cell-targeted stimulation using optogenetics as part of treatment. Computer modeling of seizures, also individualized to the patient, will be important in order to make full use of the potential of these new techniques. This is because epilepsy is a complex dynamical disease involving multiple scales across both time and space. These complex dynamics make prediction extremely difficult. Cause and effect are not cleanly separable, as multiple embedded causal loops allow for many scales of unintended consequence. We demonstrate here a small model of sensory neocortex which can be used to look at the effects of microablations or microstimulation. We show that ablations in this network can either prevent spread or prevent occurrence of the seizure. In this example, focal electrical stimulation was not able to terminate a seizure, but selective stimulation of inhibitory cells, a future possibility through the use of optogenetics, was efficacious. Abstract The basal ganglia form a complex network of nuclei often assumed to perform selection, yet their individual roles and how they influence each other are still largely unclear.
In particular, the ties between the external and internal parts of the globus pallidus are paradoxical, as anatomical data suggest a potent inhibitory projection between them while electrophysiological recordings indicate that they have similar activities. Here we introduce a theoretical study that reconciles both views of the intrapallidal projection, by providing a plausible characterization of the relationship between the external and internal globus pallidus. Specifically, we developed a mean-field model of the whole basal ganglia, whose parameterization is optimized to best respect a collection of numerous anatomical and electrophysiological data. We first obtained models respecting all our constraints; hence the anatomical and electrophysiological data on the intrapallidal projection are globally consistent. The model furthermore predicts that both aforementioned views of the intrapallidal projection may be reconciled when this projection is weakly inhibitory, thus making it possible to support similar neural activity in both nuclei and for the entire basal ganglia to select between actions. Second, we predict that afferent projections are substantially unbalanced towards the external segment, as it receives the strongest excitation from the STN and the weakest inhibition from the striatum. Finally, our study strongly suggests that the intrapallidal connection pattern is not focused but diffuse, as the latter pattern is more efficient for the overall selection performed in the basal ganglia. Abstract Background The information coming from biomedical ontologies and computational pathway models is expanding continuously: research communities keep this process up, and their advances are generally shared by means of dedicated resources published on the web. In fact, such models are shared to provide the characterization of molecular processes, while biomedical ontologies detail a semantic context for the majority of those pathways.
Recent advances in both fields pave the way for a scalable information integration based on aggregate knowledge repositories, but the lack of overall standard formats impedes this progress. Indeed, having different objectives and different abstraction levels, most of these resources "speak" different languages. Semantic web technologies are here explored as a means to address some of these problems. Methods Employing an extensible collection of interpreters, we developed OREMP (Ontology Reasoning Engine for Molecular Pathways), a system that abstracts the information from different resources and combines them together into a coherent ontology. Continuing this effort we present OREMPdb; once different pathways are fed into OREMP, species are linked to the external ontologies referred to and to the reactions in which they participate. Exploiting these links, the system builds species sets, which encapsulate species that operate together. Composing all of the reactions together, the system computes all of the reaction paths from and to all of the species sets. Results OREMP has been applied to the curated branch of BioModels (2011/04/15 release), which overall contains 326 models, 9244 reactions, and 5636 species. OREMPdb is the semantic dictionary created as a result, comprising 7360 species sets. For each of these sets, OREMPdb links the original pathway and the original paper where this information first appeared. Abstract Conductance-based neuron models are frequently employed to study the dynamics of biological neural networks. For speed and ease of use, these models are often reduced in morphological complexity. Simplified dendritic branching structures may process inputs differently than full branching structures, however, and could thereby fail to reproduce important aspects of biological neural processing. It is not yet well understood which processing capabilities require detailed branching structures.
Therefore, we analyzed the processing capabilities of fully or partially branched reduced models. These models were created by collapsing the dendritic tree of a full morphological model of a globus pallidus (GP) neuron while preserving its total surface area and electrotonic length, as well as its passive and active parameters. Dendritic trees were either collapsed into single cables (unbranched models) or the full complement of branch points was preserved (branched models). Both reduction strategies allowed us to compare dynamics between all models using the same channel density settings. Full model responses to somatic inputs were generally preserved by both types of reduced model, while dendritic input responses could be more closely preserved by branched than unbranched reduced models. However, features strongly influenced by local dendritic input resistance, such as active dendritic sodium spike generation and propagation, could not be accurately reproduced by any reduced model. Based on our analyses, we suggest that there are intrinsic differences in processing capabilities between unbranched and branched models. We also indicate suitable applications for different levels of reduction, including fast searches of full model parameter space. Summary Processing text from the scientific literature has become a necessity due to the burgeoning amounts of information that are fast becoming available, stemming from advances in electronic information technology. We created a program, NeuroText (http://senselab.med.yale.edu/textmine/neurotext.pl), designed specifically to extract information relevant to neuroscience-specific databases, NeuronDB and CellPropDB (http://senselab.med.yale.edu/senselab/), housed at the Yale University School of Medicine. NeuroText extracts relevant information from the neuroscience literature in a two-step process: each step parses text at a different level of granularity.
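The two-step retrieval strategy (a high-sensitivity keyword screen followed by a higher-specificity contextual pass) can be sketched as follows. The keyword and context sets and the toy co-occurrence rule are invented for illustration and are far cruder than NeuroText's semantic parsing:

```python
# Step 1: high-sensitivity screen -- keep any abstract that mentions a
# database keyword at all. Step 2: higher-specificity contextual check.
KEYWORDS = {"receptor", "channel", "current"}          # hypothetical
NEURO_CONTEXT = {"neuron", "hippocampal", "cortical"}  # hypothetical

def step1(abstract: str) -> bool:
    """Keyword screen, tuned for sensitivity: never miss a candidate."""
    return bool(set(abstract.lower().split()) & KEYWORDS)

def step2(abstract: str) -> bool:
    """Contextual pass, tuned for specificity: require a neuroscience
    context word to co-occur with the keyword (simplistic proxy)."""
    return bool(set(abstract.lower().split()) & NEURO_CONTEXT)

abstracts = [
    "Sodium channel kinetics in hippocampal neuron dendrites.",
    "A new ion channel blocker improves crop yield.",
    "Weather patterns over the Atlantic.",
]
kept = [a for a in abstracts if step1(a) and step2(a)]
print(len(kept))  # 1: only the neuroscience abstract survives both passes
```

The first abstract passes both filters, the agricultural one passes step 1 but is rejected by the context pass, and the last never enters step 2; the real system replaces the co-occurrence rule with lexical and semantic parsing driven by its knowledgebase.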
NeuroText uses an expert-mediated knowledgebase and combines the techniques of indexing, contextual parsing, semantic and lexical parsing, and supervised and unsupervised learning to extract information. The constraints, metadata elements, and rules for information extraction are stored in the knowledgebase. NeuroText was created as a pilot project to process 3 years of publications in the Journal of Neuroscience and was subsequently tested on 40,000 PubMed abstracts. We also present here a template to create a domain-nonspecific knowledgebase that, when linked to a text-processing tool like NeuroText, can be used to extract knowledge in other fields of research. Abstract Background We present a software tool called SENB, which allows the geometric and biophysical neuronal properties in a simple computational model of a Hodgkin-Huxley (HH) axon to be changed. The aim of this work is to develop a didactic and easy-to-use computational tool in the NEURON simulation environment, which allows graphical visualization of both the passive and active conduction parameters and the geometric characteristics of a cylindrical axon with HH properties. Results The SENB software offers several advantages for teaching and learning electrophysiology. First, SENB offers ease and flexibility in determining the number of stimuli. Second, SENB allows immediate and simultaneous visualization, in the same window and time frame, of the evolution of the electrophysiological variables. Third, SENB calculates parameters such as time and space constants, stimulus frequency, cellular area and volume, sodium and potassium equilibrium potentials, and propagation velocity of the action potentials. Furthermore, it allows the user to see all this information immediately in the main window. Finally, with just one click SENB can save an image of the main window as evidence.
Conclusions The SENB software is didactic and versatile, and can be used to improve and facilitate the teaching and learning of the underlying mechanisms in the electrical activity of an axon using the biophysical properties of the squid giant axon. Abstract Grid cells (GCs) in the medial entorhinal cortex (mEC) have the property of having their firing activity spatially tuned to a regular triangular lattice. Several theoretical models for grid field formation have been proposed, but most assume that place cells (PCs) are a product of the grid cell system. There is, however, an alternative possibility that is supported by various strands of experimental data. Here we present a novel model for the emergence of grid-like firing patterns that stands on two key hypotheses: (1) spatial information in GCs is provided by PC activity, and (2) grid fields result from a combined synaptic plasticity mechanism involving inhibitory and excitatory neurons mediating the connections between PCs and GCs. Depending on the spatial location, each PC can contribute excitatory or inhibitory inputs to GC activity. The nature and magnitude of the PC input is a function of the distance to the place field center, which is inferred from rate decoding. A biologically plausible learning rule drives the evolution of the connection strengths from PCs to a GC. In this model, PCs compete for GC activation, and the plasticity rule favors efficient packing of the space representation. This leads to grid-like firing patterns. In a new environment, GCs continuously recruit new PCs to cover the entire space. The model described here makes important predictions and can represent the feedforward connections from hippocampal CA1 to the deeper mEC layers. Abstract Because of its highly branched dendrite, the Purkinje neuron requires significant computational resources if coupled electrical and biochemical activity are to be simulated.
To address this challenge, we developed a scheme for reducing the geometric complexity while preserving the essential features of activity in both the soma and a remote dendritic spine. We merged our previously published biochemical model of calcium dynamics and lipid signaling in the Purkinje neuron, developed in the Virtual Cell modeling and simulation environment, with an electrophysiological model based on a Purkinje neuron model available in NEURON. A novel reduction method was applied to the Purkinje neuron geometry to obtain a model with fewer compartments that is tractable in Virtual Cell. Most of the dendritic tree was subject to reduction, but we retained the neuron's explicit electrical and geometric features along a specified path from spine to soma. Further, unlike previous simplification methods, the dendrites that branch off along the preserved explicit path are retained as reduced branches. We conserved axial resistivity and adjusted passive properties and active channel conductances for the reduction in surface area, and cytosolic calcium for the reduction in volume. Rallpacks are used to validate the reduction algorithm and show that it can be generalized to other complex neuronal geometries. For the Purkinje cell, we found that current injections at the soma were able to produce similar trains of action potentials and membrane potential propagation in the full and reduced models in NEURON; the reduced model produces identical spiking patterns in NEURON and Virtual Cell. Importantly, our reduced model can simulate communication between the soma and a distal spine; an alpha function applied at the spine to represent synaptic stimulation gave similar results in the full and reduced models for potential changes associated with both the spine and the soma. Finally, we combined phosphoinositol signaling and electrophysiology in the reduced model in Virtual Cell.
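The conductance bookkeeping in such reductions, rescaling channel densities to compensate for lost surface area so that total membrane conductance is conserved, can be sketched in a few lines. All geometries and densities below are hypothetical, not values from the Purkinje model:

```python
import math

def cylinder_area(length_um, diam_um):
    """Lateral surface area of a cylindrical compartment (um^2)."""
    return math.pi * diam_um * length_um

# Hypothetical full tree: several small branches to be collapsed.
full_branches = [(100.0, 2.0), (80.0, 1.5), (120.0, 1.0)]  # (length, diameter)
full_area = sum(cylinder_area(L, d) for L, d in full_branches)

# Reduced model: one equivalent cable with a smaller surface area.
reduced_area = cylinder_area(150.0, 2.0)

# Scale channel density so that total conductance (density * area) is
# preserved across the reduction.
g_density_full = 0.001                      # S/cm^2, hypothetical
scale = full_area / reduced_area
g_density_reduced = g_density_full * scale

total_full = g_density_full * full_area
total_reduced = g_density_reduced * reduced_area
print(abs(total_full - total_reduced) < 1e-9)  # True: conductance conserved
```

The same scale factor is applied to passive leak and membrane capacitance, while cytosolic quantities such as calcium are rescaled by the volume ratio instead, which is the distinction the reduction above draws between surface-area and volume corrections.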
Thus, a strategy has been developed to combine electrophysiology and biochemistry as a step toward merging neuronal and systems biology modeling. Abstract The advent of techniques with the ability to scan massive changes in cellular makeup (genomics, proteomics, etc.) has revealed the compelling need for analytical methods to interpret and make sense of those changes. Computational models built on a sound physicochemical, mechanistic basis are indispensable for integrating, interpreting, and simulating high-throughput experimental data. Another powerful role of computational models is predicting new behavior, provided they are adequately validated. Mitochondrial energy transduction has traditionally been studied with thermodynamic models. More recently, kinetic or thermo-kinetic models have been proposed, leading the path toward an understanding of the control and regulation of mitochondrial energy metabolism and its interaction with cytoplasmic and other compartments. In this work, we outline the methods, step by step, that should be followed to build a computational model of mitochondrial energetics in isolation or integrated into a network of cellular processes. Depending on the question addressed by the modeler, the methodology explained herein can be applied with different levels of detail, from the mitochondrial energy-producing machinery in a network of cellular processes to the dynamics of a single enzyme during its catalytic cycle. Abstract The voltage and time dependence of ion channels can be regulated, notably by phosphorylation, interaction with phospholipids, and binding to auxiliary subunits. Many parameter variation studies have set conductance densities free while leaving kinetic channel properties fixed, as the experimental constraints on the latter are usually better than on the former. 
Because individual cells can tightly regulate their ion channel properties, we suggest that kinetic parameters may be profitably set free during model optimization in order to both improve matches to data and refine kinetic parameters. To this end, we analyzed the parameter optimization of reduced models of three electrophysiologically characterized and morphologically reconstructed globus pallidus neurons. We performed two automated searches with different types of free parameters. First, conductance density parameters were set free. Even the best resulting models exhibited unavoidable problems which were due to limitations in our channel kinetics. We next set channel kinetics free for the optimized density matches and obtained significantly improved model performance. Some kinetic parameters consistently shifted to similar new values in multiple runs across three models, suggesting the possibility for tailored improvements to channel models. These results suggest that optimized channel kinetics can improve model matches to experimental voltage traces, particularly for channels characterized under different experimental conditions than recorded data to be matched by a model. The resulting shifts in channel kinetics from the original template provide valuable guidance for future experimental efforts to determine the detailed kinetics of channel isoforms and possible modulated states in particular types of neurons. Abstract Electrical synapses continuously transfer signals bidirectionally from one cell to another, directly or indirectly via intermediate cells. Electrical synapses are common in many brain structures such as the inferior olive, the subcoeruleus nucleus and the neocortex, between neurons and between glial cells. In the cortex, interneurons have been shown to be electrically coupled and proposed to participate in large, continuous cortical syncytia, as opposed to smaller spatial domains of electrically coupled cells. 
However, to explore the significance of these findings it is imperative to map the electrical synaptic microcircuits, in analogy with in vitro studies on monosynaptic and disynaptic chemical coupling. Since “walking” from cell to cell over large distances with a glass pipette is challenging, microinjection of (fluorescent) dyes diffusing through gap junctions remains so far the only method available to decipher such microcircuits, even though technical limitations exist. Based on circuit theory, we derive analytical descriptions of the AC electrical coupling in networks of isopotential cells. We then suggest an operative electrophysiological protocol to distinguish between direct electrical connections and connections involving one or more intermediate cells. This method allows inferring the number of intermediate cells, generalizing the conventional coupling coefficient, which provides limited information. We validate our method through computer simulations, theoretical and numerical analysis, and electrophysiological paired recordings. Abstract Because electrical coupling among the neurons of the brain is much faster than chemical synaptic coupling, it is natural to hypothesize that gap junctions may play a crucial role in mechanisms underlying very fast oscillations (VFOs), i.e., oscillations at more than 80 Hz. There is now a substantial body of experimental and modeling literature supporting this hypothesis. A series of modeling papers, starting with work by Roger Traub and collaborators, have suggested that VFOs may arise from expanding waves propagating through an “axonal plexus”, a large random network of electrically coupled axons. Traub et al. also proposed a cellular automaton (CA) model to study the mechanisms of VFOs in the axonal plexus. In this model, the expanding waves take the appearance of topologically circular “target patterns”. Random external stimuli initiate each wave. We therefore call this kind of VFO “externally driven”. 
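The conventional coupling coefficient mentioned in the electrical-coupling study above can be computed for a small resistive network. The sketch below (illustrative conductance values; steady-state DC case only, whereas the paper treats AC coupling) shows why a path through one intermediate cell yields a much weaker coefficient than a direct connection:

```python
import numpy as np

def steady_state_voltages(g_m, g_j_pairs, n, i_inj):
    """Solve G @ v = i for a network of isopotential cells.
    g_m: leak conductance of each cell (nS); g_j_pairs: {(a, b): g}
    gap-junction conductances (nS); i_inj: injected currents (pA)."""
    G = np.diag(np.full(n, g_m, dtype=float))
    for (a, b), g in g_j_pairs.items():
        G[a, a] += g; G[b, b] += g
        G[a, b] -= g; G[b, a] -= g
    return np.linalg.solve(G, i_inj)

# Direct pair (0-1) versus coupling via one intermediate cell (0-1-2)
i = np.array([100.0, 0.0, 0.0])
v = steady_state_voltages(g_m=10.0, g_j_pairs={(0, 1): 5.0, (1, 2): 5.0},
                          n=3, i_inj=i)
cc_01 = v[1] / v[0]   # conventional coupling coefficient, one junction
cc_02 = v[2] / v[0]   # two junctions in series: markedly smaller
```

Note that `cc_02` is not simply `cc_01` squared, because the intermediate cell loads the circuit; this is the kind of ambiguity the paper's generalized protocol is designed to resolve.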
Using a computational model, we show that an axonal plexus can also exhibit a second, distinctly different kind of VFO in a wide parameter range. These VFOs arise from activity propagating around cycles in the network. Once triggered, they persist without any source of excitation. With idealized, regular connectivity, they take the appearance of spiral waves. We call these VFOs “reentrant”. The behavior of the axonal plexus depends on the reliability with which action potentials propagate from one axon to the next, which, in turn, depends on the somatic membrane potential V_s and the gap junction conductance g_gj. To study these dependencies, we impose a fixed value of V_s, then study the effects of varying V_s and g_gj. Not surprisingly, propagation becomes more reliable with rising V_s and g_gj. Externally driven VFOs occur when V_s and g_gj are so high that propagation never fails. For lower V_s or g_gj, propagation is nearly reliable, but fails in rare circumstances. Surprisingly, the parameter regime where this occurs is fairly large. Even a single propagation failure can trigger reentrant VFOs in this regime. Lowering V_s and g_gj further, one finds a third parameter regime in which propagation is unreliable, and no VFOs arise. We analyze these three parameter regimes by means of computations using model networks adapted from Traub et al., as well as much smaller model networks. Abstract Research with barn owls suggested that sound source location is represented topographically in the brain by an array of neurons each tuned to a narrow range of locations. However, research with small-headed mammals has offered an alternative view in which location is represented by the balance of activity in two opponent channels broadly tuned to the left and right auditory space. Both channels may be present in each auditory cortex, although the channel representing contralateral space may be dominant. 
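A toy version of the cellular-automaton plexus discussed above can be sketched as follows (all parameters and the random-graph construction are illustrative; edges are directed here for brevity, unlike real gap junctions):

```python
import random

def simulate_plexus(n, degree, p_fail, steps, seed=0, refractory=2):
    """Toy cellular automaton of an axonal plexus: nodes are axons,
    edges are junctions, and an active node activates each resting
    neighbour with probability (1 - p_fail).  Returns the number of
    active nodes at each step after stimulating node 0."""
    rng = random.Random(seed)
    nbrs = [rng.sample([j for j in range(n) if j != i], degree)
            for i in range(n)]
    state = [0] * n          # 0 = resting, >0 = refractory steps left
    state[0] = refractory    # external stimulus at node 0
    active = {0}
    history = [len(active)]
    for _ in range(steps):
        nxt = set()
        for i in active:
            for j in nbrs[i]:
                if state[j] == 0 and rng.random() >= p_fail:
                    nxt.add(j)
        for i in range(n):
            if state[i] > 0:
                state[i] -= 1
        for j in nxt:
            state[j] = refractory
        active = nxt
        history.append(len(active))
    return history

wave = simulate_plexus(n=200, degree=4, p_fail=0.0, steps=10)
```

With `p_fail = 0` the activity expands like an externally driven “target pattern”; the regime of rare failures, where reentrant VFOs emerge in the full model, corresponds to small nonzero `p_fail`.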
Recent studies have suggested that opponent channel coding of space may also apply in humans, although these studies have used a restricted set of spatial cues or probed a restricted set of spatial locations, and there have been contradictory reports as to the relative dominance of the ipsilateral and contralateral channels in each cortex. The current study used electroencephalography (EEG) in conjunction with sound field stimulus presentation to address these issues and to inform the development of an explicit computational model of human sound source localization. Neural responses were compatible with the opponent channel account of sound source localization and with contralateral channel dominance in the left, but not the right, auditory cortex. A computational opponent channel model reproduced every important aspect of the EEG data and allowed inferences about the width of tuning in the spatial channels. Moreover, the model predicted the oft-reported decrease in spatial acuity measured psychophysically with increasing reference azimuth. Predictions of spatial acuity closely matched those measured psychophysically by previous authors. Abstract Calretinin is thought to be the main endogenous calcium buffer in cerebellar granule cells (GrCs). However, little is known about the impact of cooperative Ca2+ binding to calretinin on highly localized and more global (regional) Ca2+ signals in these cells. Using numerical simulations, we show that an essential property of calretinin is a delayed equilibration with Ca2+. Therefore, the amount of Ca2+ that calretinin can accumulate with respect to equilibrium levels depends on stimulus conditions. 
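The opponent-channel account above can be caricatured with two broadly tuned channels whose balance of activity encodes azimuth (the sigmoidal tuning shape and width here are illustrative stand-ins, not the fitted model of the study):

```python
import math

def channel_rates(azimuth_deg, sigma=30.0):
    """Left/right opponent channels with broad sigmoidal tuning over
    azimuth (negative = left).  Parameter values are illustrative."""
    right = 1.0 / (1.0 + math.exp(-azimuth_deg / sigma))
    left = 1.0 - right
    return left, right

def opponent_signal(azimuth_deg):
    left, right = channel_rates(azimuth_deg)
    return right - left   # the balance of activity encodes location

# Spatial acuity ~ slope of the opponent signal: steep at the midline,
# shallow at lateral azimuths, reproducing the oft-reported acuity loss
d_mid = opponent_signal(5.0) - opponent_signal(0.0)
d_lat = opponent_signal(65.0) - opponent_signal(60.0)
```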
Based on our simulations of buffered Ca2+ diffusion near a single Ca2+ channel or a large cluster of Ca2+ channels, and previous experimental findings that 150 μM 1,2-bis(o-aminophenoxy)ethane-N,N,N′,N′-tetraacetic acid (BAPTA) and endogenous calretinin have similar effects on GrC excitability, we estimated the concentration of mobile calretinin in GrCs to be in the range of 0.7–1.2 mM. Our results suggest that this estimate can provide a starting point for further analysis. We find that calretinin prominently reduces the action-potential-associated increase in cytosolic free Ca2+ concentration ([Ca2+]i) even at a distance of 30 nm from a single Ca2+ channel. In spite of a buildup of residual Ca2+, it maintains almost constant maximal [Ca2+]i levels during repetitive channel openings at frequencies below 80 Hz. This occurs because of accelerated Ca2+ binding as calretinin binds more Ca2+. Unlike the buffering of high Ca2+ levels within Ca2+ nano/microdomains sensed by large-conductance Ca2+-activated K+ channels, the buffering of regional Ca2+ signals by calretinin cannot be mimicked by any single concentration of BAPTA across all experimental conditions. Abstract The field of Computational Systems Neurobiology is maturing quickly. If one wants it to fulfil its central role in the new Integrative Neurobiology, the reuse of quantitative models needs to be facilitated. The community has to develop standards and guidelines in order to maximise the diffusion of its scientific production, but also to render it more trustworthy. In recent years, various projects tackled the problems of the syntax and semantics of quantitative models. More recently, the international initiative BioModels.net launched three projects: (1) MIRIAM is a standard to curate and annotate models, in order to facilitate their reuse. 
(2) The Systems Biology Ontology is a set of controlled vocabularies intended to be used in conjunction with models, in order to characterise their components. (3) BioModels Database is a resource that allows biologists to store, search and retrieve published mathematical models of biological interest. We expect that those resources, together with the use of formal languages such as SBML, will support the fruitful exchange and reuse of quantitative models. Abstract Understanding the direction and quantity of information flowing in neuronal networks is a fundamental problem in neuroscience. Brains and neuronal networks must at the same time store information about the world and react to information in the world. We sought to measure how the activity of the network alters information flow from inputs to output patterns. Using neocortical column neuronal network simulations, we demonstrated that networks with greater internal connectivity reduced input/output correlations from excitatory synapses and decreased negative correlations from inhibitory synapses, measured by Kendall’s τ correlation. Both of these changes were associated with reduction in information flow, measured by normalized transfer entropy (nTE). Information handling by the network reflected the degree of internal connectivity. With no internal connectivity, the feedforward network transformed inputs through nonlinear summation and thresholding. With greater connectivity strength, the recurrent network translated activity and information due to the contribution of activity from intrinsic network dynamics. This dynamic contribution amounts to added information drawn from that stored in the network. At still higher internal synaptic strength, the network corrupted the external information, producing a state where little external information came through. 
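Normalized transfer entropy of the kind used in the study above can be estimated for binary spike trains with a simple plug-in estimator. This sketch (not the authors' implementation) normalizes TE(x→y) by the conditional entropy H(y_t+1 | y_t), so a value near 1 means x fully determines y's next state:

```python
import random
from collections import Counter
from math import log2

def normalized_te(x, y):
    """Plug-in normalized transfer entropy x -> y for binary sequences:
    TE(x->y) / H(y_t+1 | y_t), both estimated from empirical counts."""
    n = len(y) - 1
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_xyz = c / n
        p_y0x0 = sum(v for (a, b, d), v in triples.items() if b == y0 and d == x0) / n
        p_y1y0 = sum(v for (a, b, d), v in triples.items() if a == y1 and b == y0) / n
        p_y0 = sum(v for (a, b, d), v in triples.items() if b == y0) / n
        te += p_xyz * log2(p_xyz * p_y0 / (p_y0x0 * p_y1y0))
    h = 0.0                                  # normalizer H(y_t+1 | y_t)
    pairs = Counter(zip(y[1:], y[:-1]))
    for (y1, y0), c in pairs.items():
        p = c / n
        p_y0 = sum(v for (a, b), v in pairs.items() if b == y0) / n
        h -= p * log2(p / p_y0)
    return te / h if h > 0 else 0.0

rng = random.Random(1)
x = [rng.randint(0, 1) for _ in range(500)]
y_driven = [0] + x[:-1]                      # y copies x with one-step lag
nte_driven = normalized_te(x, y_driven)
y_indep = [rng.randint(0, 1) for _ in range(500)]
nte_indep = normalized_te(x, y_indep)
```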
The association of increased information retrieved from the network with increased gamma power supports the notion of gamma oscillations playing a role in information processing. NeuroML: a language for describing data-driven models of neurons and networks with a high degree of biological detail. PLoS computational biology Biologically detailed single neuron and network models are important for understanding how ion channels, synapses and anatomical connectivity underlie the complex electrical behavior of the brain. While neuronal simulators such as NEURON, GENESIS, MOOSE, NEST, and PSICS facilitate the development of these data-driven neuronal models, the specialized languages they employ are generally not interoperable, limiting model accessibility and preventing reuse of model components and cross-simulator validation. To overcome these problems we have used an Open Source software approach to develop NeuroML, a neuronal model description language based on XML (Extensible Markup Language). This enables these detailed models and their components to be defined in a standalone form, allowing them to be used across multiple simulators and archived in a standardized format. Here we describe the structure of NeuroML and demonstrate its scope by converting into NeuroML models of a number of different voltage- and ligand-gated conductances, models of electrical coupling, synaptic transmission and short-term plasticity, together with morphologically detailed models of individual neurons. We have also used these NeuroML-based components to develop a highly detailed cortical network model. NeuroML-based model descriptions were validated by demonstrating similar model behavior across five independently developed simulators. 
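For orientation, a minimal NeuroML v2 style model description looks like the following (an illustrative fragment, not taken from the paper; consult the NeuroML schema for the definitive element and attribute names):

```xml
<neuroml xmlns="http://www.neuroml.org/schema/neuroml2" id="ExampleNet">
    <!-- A point-neuron component defined once, simulator-independently -->
    <izhikevichCell id="izBurst" v0="-70mV" thresh="30mV"
                    a="0.02" b="0.2" c="-50.0" d="2"/>
    <network id="net1">
        <!-- Populations reference components by id, so the same cell
             definition can be reused across simulators -->
        <population id="pop0" component="izBurst" size="5"/>
    </network>
</neuroml>
```

Because the cell is declared as data rather than simulator code, the same file can be imported by any NeuroML-aware tool, which is what makes the cross-simulator validation described above possible.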
Although our results confirm that simulations run on different simulators converge, they reveal limits to model interoperability, by showing that for some models convergence only occurs at high levels of spatial and temporal discretisation, when the computational overhead is high. Our development of NeuroML as a common description language for biophysically detailed neuronal and network models enables interoperability across multiple simulation environments, thereby improving model transparency, accessibility and reuse in computational neuroscience. CA1 Region, Hippocampal; Cerebral Cortex; Computational Biology; Computer Simulation; Electrical Synapses; Humans; Models, Neurological; Nerve Net; Neurons; Reproducibility of Results; Software; Thalamus Influences of membrane properties on phase response curve and synchronization stability in a model globus pallidus neuron Journal of Computational Neuroscience Summary This chapter constitutes mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was a part of the Annual Computational Neuroscience Meeting CNS*2007, held in July 2007 in Toronto, Canada (http://www.cnsorg.org). The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Selected discussed topics, including the review of some current and potential applications of Computational Intelligence (CI) in electrophysiology, database and electrophysiological data exchange platforms, languages, and formats, as well as exemplary analysis problems, are presented in this chapter. The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers, whose interest will be sparked by its contents. 
Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. We are particularly interested in the slower timescale dynamics that determine overall excitability, and set the stage for transient episodes of persistent oscillations, working memory, or seizures. In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics on the stability of persistent activity. We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions. Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in neonatal brain or spinal cord, up states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potentials (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from the EAP data sampled by an extracellular probe at multiple independent recording locations. 
We used a dipole for the model source because there is extensive evidence that it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell-probe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole moment optimization where the dipole location is fixed, and a second, nonlinear, global optimization of the source location. We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis. We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among various model error contributions to the residual, we address especially the error in probe geometry, and the extent to which it biases estimates of dipole parameters. This dipole characterization method can be applied to any recording technique that has the capabilities of taking multiple independent measurements of the same single units. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics. 
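The separable dipole-fitting strategy described above (a linear Tikhonov-regularized moment solve inside a global search over candidate locations) can be sketched for an idealized infinite homogeneous medium; in the paper the lead fields come from a finite element model of the probe and brain, so everything below is illustrative:

```python
import numpy as np

def lead_field(sensors, r0, sigma=0.3):
    """Lead-field matrix for a current dipole at r0 in an infinite
    homogeneous medium of conductivity sigma: phi = L @ moment."""
    d = sensors - r0
    dist = np.linalg.norm(d, axis=1, keepdims=True)
    return d / (4.0 * np.pi * sigma * dist**3)

def fit_dipole(sensors, phi, candidates, lam=1e-9):
    """For each candidate location, solve the regularized linear moment
    problem, then keep the location with the lowest joint cost."""
    best = None
    for r0 in candidates:
        L = lead_field(sensors, r0)
        m = np.linalg.solve(L.T @ L + lam * np.eye(3), L.T @ phi)
        cost = np.sum((L @ m - phi) ** 2) + lam * (m @ m)
        if best is None or cost < best[0]:
            best = (cost, r0, m)
    return best[1], best[2]

# Synthetic demo: recover a known dipole from ten sensor readings
sensors = np.array([[0.5, 0, 0], [0, 0.5, 0], [-0.5, 0, 0], [0, -0.5, 0],
                    [0.3, 0.3, 0.2], [-0.3, 0.3, -0.2], [0.3, -0.3, 0.4],
                    [-0.2, -0.4, 0.1], [0.4, 0.1, -0.3], [0.1, 0.4, 0.3]])
r_true = np.array([0.0, 0.0, 0.1])
m_true = np.array([2e-9, 1e-9, 0.0])
phi = lead_field(sensors, r_true) @ m_true
candidates = np.array([[0, 0, 0.1], [0, 0, 0.0], [0, 0, 0.2], [0.1, 0, 0.1]])
loc, m_est = fit_dipole(sensors, phi, candidates)
```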
We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodium-potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. These models reveal several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy. Abstract This paper introduces dyadic brain modeling – the simultaneous, computational modeling of the brains of two interacting agents – to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the ‘dyad’). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists and modelers that will help speed the emergence of what may be called comparative neuroprimatology. 
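The balance of neuronal efflux, pump activity, glial uptake, and diffusion that governs the slow ion dynamics in the seizure model above can be caricatured with a single extracellular potassium variable (all rate expressions and constants below are illustrative stand-ins, not the published model):

```python
import math

def ko_dynamics(ko0, steps=20000, dt=0.001):
    """Toy extracellular K+ bath model, loosely in the spirit of
    conductance-based models with ion concentration dynamics: steady
    neuronal K+ release balanced by a Na/K pump, glial buffering, and
    diffusion to the bath.  All parameters are illustrative."""
    k_bath, ko = 4.0, ko0
    for _ in range(steps):
        efflux = 0.2                              # neuronal K+ release
        pump = 1.25 / (1.0 + math.exp(5.5 - ko))  # Na/K pump clearance
        glia = 0.6 * ko / (ko + 10.0)             # glial buffering
        diff = 0.1 * (ko - k_bath)                # diffusion to bath
        ko += dt * (efflux - pump - glia - diff)
    return ko

ko_high = ko_dynamics(12.0)   # relaxation from a seizure-like elevation
ko_base = ko_dynamics(4.0)    # relaxation from the normal bath level
```

Because all three clearance terms increase with the K+ level, the toy system has a single stable fixed point; in the full model, weakening the glial term is what destabilizes this balance and permits the slow seizure-like oscillations.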
Abstract The phase response curve (PRC) reflects the dynamics of the interplay between diverse intrinsic conductances that lead to spike generation. PRCs measure the spike time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddle-node) dynamics, while type II (subcritical Hopf) dynamics yield a biphasic PRC with both negative and positive lobes. Previous computational work hypothesized that cholinergic modulation of the M-type potassium current can switch a neuron with type II dynamics to type I dynamics. We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with downregulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential specific K-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to downregulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent I_AHP is downregulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified by low ACh levels, while the PRC became type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may switch neuronal responses between “resonator” and “integrator” modes. Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double precision values by each subtree at each time step. 
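Measuring a PRC as described above amounts to perturbing the membrane potential at a given phase of the spike cycle and recording the resulting spike advance. A leaky integrate-and-fire sketch (a type-I-like caricature with a purely positive PRC, not a model of the recorded pyramidal cells) makes the procedure concrete:

```python
import math

def lif_period(i_ext=1.5, dt=1e-4, v_th=1.0, tau=1.0,
               perturb_phase=None, eps=0.05):
    """Leaky integrate-and-fire: dv/dt = (i_ext - v)/tau, spike at v_th.
    Optionally add a small depolarizing kick eps at a given phase of
    the (analytically known) unperturbed cycle."""
    T0 = tau * math.log(i_ext / (i_ext - v_th))
    v, t, kicked = 0.0, 0.0, False
    while v < v_th:
        if perturb_phase is not None and not kicked and t >= perturb_phase * T0:
            v += eps
            kicked = True
        v += dt * (i_ext - v) / tau
        t += dt
    return t

def prc(phase):
    """Spike advance caused by the kick, as a fraction of the cycle."""
    T0 = lif_period()
    return (T0 - lif_period(perturb_phase=phase)) / T0

p_early = prc(0.2)
p_late = prc(0.8)
```

For this integrator-like cell the advance is positive at every phase and grows toward the end of the cycle; a resonator (type II) cell would instead show a negative lobe at early phases.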
Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For compute-bound simulations load balance results in almost ideal runtime scaling. Application of the cell splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing. Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts. It has been recently reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. Ca2+-activated K+ currents (I_K[Ca]) of cultured human cardiac fibroblasts were characterized in this study. In whole-cell configuration, depolarizing pulses evoked I_K(Ca) with outward rectification in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance, Ca2+-activated K+ (BKCa) channel with single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BKCa channels. The dynamic Luo-Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; however, they were electrically inert with no gap-junctional coupling. The simulation predicts that changes in gap junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. I_K(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes. 
This study demonstrates that a BKCa channel is functionally expressed in human cardiac fibroblasts. The activity of these BKCa channels present in human cardiac fibroblasts may contribute to the functional activities of heart cells through transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of their solutions and the transitions between them as parameters are varied. To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multi-compartment models. We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation: tonic spiking, intermediate bursting, and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multi-compartment conductance-based model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli. 
Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement. Results We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require a built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. Conclusion Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement, and underlies the concentration-independence of odor quality representations. 
Abstract Mathematical neuronal models are normally expressed using differential equations. The Parker-Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker-Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin-Huxley type neuron, comparing the results with those obtained using the Runge-Kutta and Bulirsch-Stoer methods. Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC-granule cell (GC) circuit with and without gap junctions between GoCs. Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change oscillation frequency or the mean firing rate of either GoCs or GCs. 
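The core of the Parker-Sochacki method discussed above is building the Maclaurin coefficients of the solution term by term, with products handled by Cauchy convolution, then evaluating the truncated series at the step size. A sketch for the toy equation y' = -y^2 (exact solution 1/(1+t), not one of the paper's neuronal models):

```python
def ps_step(y0, h, order=12):
    """One Parker-Sochacki step for y' = -y^2.  The Maclaurin
    coefficients satisfy c[n+1] = -(y*y)_n / (n+1), where (y*y)_n is
    the Cauchy product of the series with itself."""
    c = [y0]
    for n in range(order):
        conv = sum(c[k] * c[n - k] for k in range(n + 1))  # (y*y)_n
        c.append(-conv / (n + 1))
    y = 0.0
    for coef in reversed(c):        # Horner evaluation at t = h
        y = y * h + coef
    return y

y = 1.0
for _ in range(10):                 # integrate y' = -y^2 to t = 1.0
    y = ps_step(y, 0.1)             # exact answer there is 1/(1+1) = 0.5
```

Here the truncation `order` plays the role of adaptive error control: raising it shrinks the local error without touching the step size h, which is the property the paper exploits.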
Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs. The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours or days of supercomputing time. Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. In consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics, and intrinsic noise. We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data.
Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes arising from development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models. Recent studies show that both the conductance and the kinetics of these receptors change voltage-dependently in some parts of the brain. Therefore, several models have been introduced to simulate their current. On the one hand, kinetic models—which are able to simulate these voltage-dependent phenomena—are computationally expensive for the modeling of large neural networks. On the other hand, classic exponential models, which are computationally less expensive, are not able to accurately simulate the voltage-dependency of these receptors. In this study, we have modified these classic models to endow them with voltage-dependent conductance and time constants. Temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of NMDA receptor behavior using only three to four differential equations, which is significantly fewer than in previous kinetic models. Consequently, our model is both fast and physiologically plausible, and is therefore a suitable candidate for the modeling of large neural networks. Abstract Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons.
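The classic exponential synapse model that the NMDA abstract above contrasts with kinetic schemes is commonly written as a dual-exponential conductance gated by the Jahr-Stevens magnesium block. A minimal sketch with illustrative parameter values; the voltage-dependent time constants and desensitization of the modified model are not reproduced here.

```python
import math

def mg_block(v, mg=1.0):
    """Voltage-dependent Mg2+ block of the NMDA receptor
    (Jahr & Stevens form, as commonly used in network models).
    v in mV, mg in mM; returns the unblocked fraction in (0, 1)."""
    return 1.0 / (1.0 + (mg / 3.57) * math.exp(-0.062 * v))

def nmda_current(t, v, g_max=1.0, e_rev=0.0,
                 tau_rise=2.0, tau_decay=100.0, mg=1.0):
    """Dual-exponential NMDA conductance gated by the Mg block.
    t is time in ms after the presynaptic spike; g_max in nS
    and time constants are illustrative, not fitted values."""
    if t < 0:
        return 0.0
    s = math.exp(-t / tau_decay) - math.exp(-t / tau_rise)
    return g_max * s * mg_block(v, mg) * (v - e_rev)
```

At a holding potential of -70 mV the current is inward (negative by the sign convention above) but strongly attenuated by the block, which relieves with depolarization.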
This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal cell firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that there is an obvious rhythmic gating effect of the γ-oscillating interneuron networks on the pyramidal neuron’s signal transmission. This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of coherence and oscillation frequency of network activities. Abstract Background Propagation of simulated action potentials (APs) was previously studied in short single chains and in two-dimensional sheets of myocardial cells [1-3]. The present study was undertaken to examine propagation in a long single chain of cells of various lengths, and with varying numbers of gap-junction (gj) channels, and to compare propagation velocity with cable properties such as the length constant (λ). Methods and Results Simulations were carried out using the PSpice program as previously described. When the electric field (EF) mechanism was dominant (0, 1, and 10 gj-channels), the longer the chain length, the faster the overall velocity (θ_ov). There seems to be no simple explanation for this phenomenon. In contrast, when the local-circuit current mechanism was dominant (100 gj-channels or more), θ_ov was slightly slowed with lengthening of the chain. Increasing the number of gj-channels produced an increase in θ_ov and caused the firing order to become more uniform.
The end-effect was more pronounced at longer chain lengths and at greater numbers of gj-channels. When there were no or only few gj-channels (namely 0, 10, or 30), the voltage change (ΔV_m) in the two cells (#50 and #52) contiguous to the cell injected with current (#51) was nearly zero, i.e., there was a sharp discontinuity in voltage between the adjacent cells. When there were many gj-channels (e.g., 300, 1000, 3000), there was an exponential decay of voltage on either side of the injected cell, with the length constant (λ) increasing at higher numbers of gj-channels. The effect of increasing the number of gj-channels on increasing λ was relatively small compared to the larger effect on θ_ov. θ_ov became very non-physiological at 300 gj-channels or higher. Conclusion Thus, when there were only 0, 1, or 10 gj-channels, θ_ov increased with increase in chain length, whereas at 100 gj-channels or higher, θ_ov did not increase with chain length. When there were only 0, 10, or 30 gj-channels, there was a very sharp decrease in ΔV_m in the two contiguous cells on either side of the injected cell, whereas at 300, 1000, or 3000 gj-channels, the voltage decay was exponential along the length of the chain. The effect of increasing the number of gj-channels on the spread of current was relatively small compared to the large effect on θ_ov. Abstract This article provides a demonstration of an analytical technique that can be used to investigate the causes of perceptual phenomena. The technique is based on the concept of the ideal observer, an optimal signal classifier that makes decisions that maximize the probability of a correct response. To demonstrate the technique, an analysis was conducted to investigate the role of the auditory periphery in the production of temporal masking effects. The ideal observer classified output from four models of the periphery.
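The cable quantities in the chain-propagation study above reduce to two short formulas: the steady-state length constant and the exponential voltage profile around the injected cell. A minimal sketch; the resistance values in the test are illustrative, not taken from the simulations. Note that adding gj-channels lowers the junctional (axial) resistance, which is why λ grows with channel number.

```python
import math

def length_constant(r_m, r_i):
    """Steady-state length constant of a uniform cable,
    lambda = sqrt(r_m / r_i), with r_m the membrane resistance
    (ohm*cm) and r_i the axial/junctional resistance (ohm/cm)."""
    return math.sqrt(r_m / r_i)

def voltage_profile(v0, x, lam):
    """Exponential steady-state decay V(x) = V0 * exp(-|x|/lambda)
    on either side of the current-injected cell."""
    return v0 * math.exp(-abs(x) / lam)
```

With few gj-channels the effective r_i is enormous, λ collapses below one cell length, and the profile degenerates into the sharp discontinuity described above.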
Since the ideal observer is the best of all possible observers, if it demonstrates masking effects, then all other observers must as well. If it does not demonstrate masking effects, then nothing about the periphery requires masking to occur, and therefore masking must arise elsewhere. The ideal observer exhibited several forward masking effects but did not exhibit backward masking, implying that the periphery has a causal role in forward but not backward masking. A general discussion of the strengths of the technique and supplementary equations are also included. Abstract Understanding the human brain and its function in health and disease represents one of the greatest scientific challenges of our time. In the post-genomic era, an overwhelming accumulation of new data, at all levels of exploration from DNA to human brain imaging, has been acquired. This accumulation of facts has not given rise to a corresponding increase in the understanding of integrated functions in this vast area of research, which involves a large number of fields extending from genetics to psychology. Neuroinformatics (NI) is uniquely placed at the intersection between neuroscience and information technology, and emerges as an area of critical importance for facilitating future conceptual development in neuroscience by creating databases which transcend different organizational levels and allow for the development of different computational models from the subcellular to the global brain level. Abstract This paper studied synaptic and dendritic integration with different spatial distributions of synapses on the dendrites of a biophysically detailed layer 5 pyramidal neuron model. It has been observed that temporally synchronous and spatially clustered synaptic inputs make dendrites perform a highly nonlinear integration.
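For known class-conditional distributions, an ideal observer of the kind used in the masking study above is simply a maximum a posteriori classifier. A minimal sketch for two Gaussian alternatives with equal variance; the auditory-periphery models whose outputs were classified are not modeled here.

```python
import math

def gaussian_loglik(x, mu, sigma):
    """Log-likelihood of observation x under N(mu, sigma^2)."""
    return (-0.5 * math.log(2 * math.pi * sigma ** 2)
            - (x - mu) ** 2 / (2 * sigma ** 2))

def ideal_observer(x, mu0, mu1, sigma, prior1=0.5):
    """Return 1 if class 1 is more probable a posteriori, else 0.
    Choosing the larger posterior maximizes the probability of a
    correct response, which is what makes the observer 'ideal'."""
    ll0 = gaussian_loglik(x, mu0, sigma) + math.log(1.0 - prior1)
    ll1 = gaussian_loglik(x, mu1, sigma) + math.log(prior1)
    return 1 if ll1 > ll0 else 0
```

With equal priors and variances the decision boundary sits midway between the two means; skewed priors shift it, exactly as the posterior ratio dictates.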
The effect of the clustering degree of the synaptic distribution on neuronal responsiveness is investigated by changing the number of top apical dendrites to which active synapses are allocated. The neuron shows maximum responsiveness to synaptic inputs with an intermediate clustering degree of spatial distribution, indicating complex interactions among dendrites in the presence of nonlinear synaptic and dendritic integration. Abstract This paper describes a pilot query interface that has been constructed to help us explore a “concept-based” approach for searching the Neuroscience Information Framework (NIF). The query interface is concept-based in the sense that the search terms submitted through the interface are selected from a standardized vocabulary of terms (concepts) that are structured in the form of an ontology. The NIF contains three primary resources: the NIF Resource Registry, the NIF Document Archive, and the NIF Database Mediator. These NIF resources are very different in their nature and therefore pose challenges when designing a single interface from which searches can be automatically launched against all three resources simultaneously. The paper first discusses briefly several background issues involving the use of standardized biomedical vocabularies in biomedical information retrieval, and then presents a detailed example that illustrates how the pilot concept-based query interface operates. The paper concludes by discussing certain lessons learned in the development of the current version of the interface. Abstract Simulations of orientation selectivity in visual cortex have shown that layer 4 complex cells lacking orientation tuning are ideal for providing global inhibition that scales with contrast in order to produce simple cells with contrast-invariant orientation tuning (Lauritzen and Miller in J Neurosci 23:10201–10213, 2003).
Inhibitory cortical cells have been shown to be electrically coupled by gap junctions (Fukuda and Kosaka in J Neurosci 120:5–20, 2003). Such coupling promotes, among other effects, spike synchronization and coordination of postsynaptic IPSPs (Beierlein et al. in Nat Neurosci 3:904–910, 2000; Galarreta and Hestrin in Nat Rev Neurosci 2:425–433, 2001). Consequently, it was expected (Miller in Cereb Cortex 13:73–82, 2003) that electrical coupling would promote nonspecific functional responses consistent with the complex inhibitory cells seen in layer 4, which provide broad inhibition in response to stimuli of all orientations (Miller et al. in Curr Opin Neurobiol 11:488–497, 2001). This was tested using a mechanistic modeling approach. The orientation selectivity model of Lauritzen and Miller (J Neurosci 23:10201–10213, 2003) was reproduced with and without electrical coupling between complex inhibitory neurons. Although extensive coupling promotes uniform firing in complex cells, there were no detectable improvements in contrast-invariant orientation selectivity unless there were coincident changes in complex cell firing rates to offset the untuned excitatory component that grows with contrast. Thus, changes in complex cell firing rates (with or without coupling) could improve the contrast-invariant orientation tuning of simple cells, whereas synchronization of the complex inhibitory neurons alone could not. Abstract Coral polyps contract when electrically stimulated, and a wave of contraction travels from the site of stimulation at a constant speed. Models of coral nerve networks were optimized to match one of three different experimentally observed behaviors. To search for model parameters that reproduce the experimental observations, we applied genetic algorithms to increasingly more complex models of a coral nerve net. In a first stage of optimization, individual neurons responded with spikes to multiple, but not single, pulses of activation.
In a second stage, we used these neurons as the starting point for the optimization of a two-dimensional nerve net. This strategy yielded a network with parameters that reproduced the experimentally observed spread of excitation. Abstract Spike-wave discharges are a distinctive feature of epileptic seizures. So far, they have not been reported in spatially extended neural field models. We study a space-independent version of the Amari neural field model with two competing inhibitory populations. We show that this competition leads to robust spike-wave dynamics if the inhibitory populations operate on different timescales. The spike-wave oscillations present a fold/homoclinic type bursting. From this result we predict parameters of the extended Amari system where spike-wave oscillations produce a spatially homogeneous pattern. We propose this mechanism as a prototype of macroscopic epileptic spike-wave discharges. To our knowledge this is the first example of robust spike-wave patterns in a spatially extended neural field model. Abstract Cortical gamma frequency (30–80 Hz) oscillations have been suggested to underlie many aspects of cognitive function. In this paper we compare the f-I curves under gamma-frequency-modulated stimulation and Poisson synaptic input at the distal dendrites of a layer V pyramidal neuron model. The results show that gamma-frequency distal input amplifies the sensitivity of the neural response to basal input and enhances the gain modulation of the neuron. Abstract Inward rectifying potassium (K_IR) currents in medium spiny (MS) neurons of the nucleus accumbens inactivate significantly in ~40% of the neurons but not in the rest, which may lead to differences in input processing by these two groups. Using a 189-compartment computational model of the MS neuron, we investigate the influence of this property using injected current as well as spatiotemporally distributed synaptic inputs.
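The staged genetic-algorithm search used for the coral nerve-net models above can be illustrated with a generic real-valued GA. This is a minimal sketch (truncation selection, uniform crossover, Gaussian mutation) with arbitrary hyperparameters, not the optimizer actually used in that study.

```python
import random

def evolve(fitness, n_params, pop_size=40, generations=60,
           mut_sigma=0.2, seed=1):
    """Minimal real-valued genetic algorithm that maximizes `fitness`.
    The top half of each generation survives unchanged (elitism);
    children mix two parents gene-by-gene and add Gaussian noise."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1.0, 1.0) for _ in range(n_params)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)      # best first
        parents = pop[:pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            children.append([(x if rng.random() < 0.5 else y)
                             + rng.gauss(0.0, mut_sigma)
                             for x, y in zip(a, b)])
        pop = parents + children
    return max(pop, key=fitness)
```

In the coral study the fitness function scored a full network simulation against the observed contraction wave; here any callable that maps a parameter vector to a score will do.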
Our study demonstrates that K_IR current inactivation facilitates depolarization, firing frequency, and firing onset in these neurons. These effects may be attributed to the higher input resistance of the cell as well as a more depolarized resting/downstate potential induced by the inactivation of this current. In view of reports that dendritic intracellular calcium levels depend closely on burst strength and spike onset time, our findings suggest that inactivation of K_IR currents may offer a means of modulating both excitability and synaptic plasticity in MS neurons. Abstract Epileptic seizures in diabetic hyperglycemia (DH) are not uncommon. This study aimed to determine the acute behavioral, pathological, and electrophysiological effects of status epilepticus (SE) on diabetic animals. Adult male Sprague-Dawley rats were first divided into groups with and without streptozotocin (STZ)-induced diabetes, and then into treatment groups given normal saline (NS) (STZ-only and NS-only) or a lithium-pilocarpine injection to induce status epilepticus (STZ + SE and NS + SE). Seizure susceptibility, severity, and mortality were evaluated. Serial Morris water maze test and hippocampal histopathology results were examined before and 24 h after SE. Tetanic stimulation-induced long-term potentiation (LTP) in a hippocampal slice was recorded in a multielectrode dish system. We also used a simulation model to evaluate intracellular adenosine triphosphate (ATP) and neuroexcitability. The STZ + SE group had a significantly higher percentage of severe seizures and SE-related death and worse learning and memory performance than the other three groups 24 h after SE. The STZ + SE group, followed by the NS + SE group, showed the most severe neuronal loss and mossy fiber sprouting in the hippocampal CA3 area. In addition, LTP was markedly attenuated in the STZ + SE group, followed by the NS + SE group.
In the simulation, increased intracellular ATP concentration promoted action potential firing. The finding that rats with DH had more brain damage after SE than rats without diabetes suggests the importance of intensively treating hyperglycemia and seizures in diabetic patients with epilepsy. Neuroinformatics is a multifaceted field. It is as broad as the field of neuroscience. The various domains of NI may also share some common features such as databases, data mining systems, and data modeling tools. NI projects are often coordinated by user groups or research organizations. Large-scale infrastructure supporting NI development is also a vital aspect of the field. Abstract Channelrhodopsin-2 (ChR2) is a light-sensitive protein that offers the ability to use light stimulation to regulate neural activity with millisecond precision. In order to address the limitations in the efficacy of wild-type ChR2 (ChRwt) in achieving this objective, new variants of ChR2 that exhibit fast mono-exponential photocurrent decay characteristics have recently been developed and validated. In this paper, we investigate whether the framework of the transition rate model with 4 states, primarily developed to mimic the bi-exponential photocurrent decay kinetics of ChRwt, as opposed to the lower-complexity 3-state model, is warranted to mimic the mono-exponential photocurrent decay kinetics of the newly developed fast ChR2 variants: ChETA (Gunaydin et al., Nature Neurosci. 13:387–392, 2010) and ChRET/TC (Berndt et al., Proc. Natl. Acad. Sci. 108:7595–7600, 2011). We begin by estimating the parameters of the 3-state and 4-state models from experimental data on the photocurrent kinetics of ChRwt, ChETA, and ChRET/TC. We then incorporate these models into a fast-spiking interneuron model (Wang and Buzsaki, J. Neurosci. 16:6402–6413, 1996) and a hippocampal pyramidal cell model (Golomb et al., J. Neurophysiol.
96:1912–1926, 2006) and investigate the extent to which the experimentally observed neural response to various optostimulation protocols can be captured by these models. We demonstrate that for all ChR2 variants investigated, the 4-state model implementation is better able to capture neural responses consistent with experiments across a wide range of optostimulation protocols. We conclude by analytically investigating the conditions under which the characteristic specific to the 3-state model, namely the mono-exponential photocurrent decay of the newly developed variants of ChR2, can occur in the framework of the 4-state model. Abstract In cerebellar Purkinje cells, the β4 subunit of voltage-dependent Na + channels has been proposed to serve as an open-channel blocker giving rise to a “resurgent” Na + current (I_NaR) upon membrane repolarization. Notably, the β4 subunit was recently identified as a novel substrate of the β-secretase BACE1, a key enzyme of the amyloidogenic pathway in Alzheimer's disease. Here, we asked whether BACE1-mediated cleavage of the β4 subunit has an impact on I_NaR and, consequently, on the firing properties of Purkinje cells. In cerebellar tissue of BACE1−/− mice, mRNA levels of Na + channel α-subunits 1.1, 1.2, and 1.6 and of β-subunits 1–4 remained unchanged, but processing of the β4 peptide was profoundly altered. Patch-clamp recordings from acutely isolated Purkinje cells of BACE1−/− and WT mice did not reveal any differences in steady-state properties and current densities of transient, persistent, and resurgent Na + currents. However, I_NaR was found to decay significantly faster in BACE1-deficient Purkinje cells than in WT cells. In modeling studies, the altered time course of I_NaR decay could be replicated when we decreased the efficiency of open-channel block. In current-clamp recordings, BACE1−/− Purkinje cells displayed a lower spontaneous firing rate than normal cells.
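The low-complexity 3-state ChR2 scheme discussed above (closed, open, desensitized) can be written as three coupled rate equations; after light offset the open state then decays mono-exponentially, which is exactly the behavior at issue. The rates and step size below are illustrative assumptions, not the fitted values from the paper.

```python
def chr2_3state(light, dt=0.05, eps=0.5, g_d=0.1, g_r=0.004):
    """Three-state ChR2 model: closed C -> open O (light-driven,
    rate eps per ms), O -> desensitized D (rate g_d), D -> C (rate g_r).
    `light` is a per-step sequence of 0/1 illumination values; the
    photocurrent is proportional to the returned open fraction O(t)."""
    C, O, D = 1.0, 0.0, 0.0
    trace = []
    for u in light:          # forward-Euler integration
        dC = g_r * D - eps * u * C
        dO = eps * u * C - g_d * O
        dD = g_d * O - g_r * D
        C += dt * dC
        O += dt * dO
        D += dt * dD
        trace.append(O)
    return trace
```

With the light off, dO/dt = -g_d * O, so successive decay samples fall by a fixed ratio: a single exponential, in contrast to the 4-state model's bi-exponential decay.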
Computer simulations supported the hypothesis that the accelerated decay kinetics of I_NaR are responsible for the slower firing rate. Our study elucidates a novel function of BACE1 in the regulation of neuronal excitability that serves to tune the firing pattern of Purkinje cells and presumably other neurons endowed with I_NaR. Abstract The role of cortical feedback in the thalamocortical processing loop has been extensively investigated over the last decades. With a few exceptions, these studies have focused on the cortical feedback exerted onto thalamocortical relay (TC) cells of the dorsal lateral geniculate nucleus (LGN). In a previous physiological study, we showed in the cat visual system that cessation of cortical input, despite decreasing the spontaneous activity of TC cells, increased the spontaneous firing of their recurrent inhibitory interneurons located in the perigeniculate nucleus (PGN). To identify the mechanisms underlying such functional changes, we conducted a modeling study in NEURON on several networks of point neurons with varied model parameters, such as membrane properties, synaptic weights, and axonal delays. We considered six network topologies of the retino-geniculo-cortical system. All models were robust against changes of axonal delays except for the delay between the LGN feedforward interneuron and the TC cell. The best representation of physiological results was obtained with models containing reciprocally connected PGN cells driven by the cortex and with relatively slow decay of intracellular calcium. This strongly indicates that the thalamic reticular nucleus plays an essential role in the cortical influence over thalamocortical relay cells, while the thalamic feedforward interneurons are not essential in this process. Further, we suggest that the dependence of PGN cell activity on the rate of calcium removal can be one of the key factors determining individual cell responses to elimination of cortical input.
Abstract The nucleus accumbens (NAc), a critical structure of the brain reward circuit, is implicated in normal goal-directed behaviour and learning as well as pathological conditions like schizophrenia and addiction. Its major cellular substrates, the medium spiny (MS) neurons, possess a wide variety of dendritic active conductances that may modulate excitatory postsynaptic potentials (EPSPs) and cell excitability. We examine this issue using a biophysically detailed 189-compartment stylized model of the NAc MS neuron, incorporating all the known active conductances. We find that, of all the active channels, inward rectifying K + (K_IR) channels play the primary role in modulating the resting membrane potential (RMP) and EPSPs in the downstate of the neuron. Reduction in the conductance of K_IR channels evokes facilitatory effects on EPSPs, accompanied by rises in local input resistance and membrane time constant. At depolarized membrane potentials closer to upstate levels, the slowly inactivating A-type potassium channel (K_As) conductance also plays a strong role in determining synaptic potential parameters and cell excitability. We discuss the implications of our results for the regulation of accumbal MS neuron biophysics and synaptic integration by intrinsic factors and extrinsic agents such as dopamine. Abstract The computer-assisted three-dimensional reconstruction of neuronal morphology is becoming an increasingly popular technique to quantify the arborization patterns of dendrites and axons. The resulting digital files are suitable for comprehensive morphometric analyses as well as for building anatomically realistic compartmental models of membrane biophysics and neuronal electrophysiology. The digital tracings acquired in a lab for a specific purpose can often be reused by a different research group to address a completely unrelated scientific question, if the original investigators are willing to share the data.
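The facilitatory effect of reducing the K_IR conductance described in the accumbens abstract above follows directly from its contribution to the resting input conductance. A minimal sketch with an illustrative Boltzmann voltage dependence; the half-activation voltage, slope, and conductance values are assumptions, not fitted MS-neuron data.

```python
import math

def kir_conductance(v, g_max=1.0, v_half=-82.0, k=13.0):
    """Inward-rectifier K+ conductance sketched as a Boltzmann
    function: large at hyperpolarized potentials, shutting off
    with depolarization (parameters illustrative)."""
    return g_max / (1.0 + math.exp((v - v_half) / k))

def input_resistance_proxy(v, g_leak=0.05, g_kir=1.0):
    """Toy input resistance 1 / (g_leak + g_KIR(v)): reducing the
    KIR conductance raises input resistance, which is the EPSP
    facilitation mechanism the abstract describes."""
    return 1.0 / (g_leak + g_kir * kir_conductance(v))
```

The same relation explains the depolarized downstate potential: less resting K+ conductance means the leak equilibrium sits farther from E_K.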
Since reconstructing neuronal morphology is a labor-intensive process, data sharing and reanalysis is particularly advantageous for the neuroscience and biomedical communities. Here we present numerous cases of “success stories” in which digital reconstructions of neuronal morphology were shared and reused, leading to additional, independent discoveries and publications, and thus amplifying the impact of the “source” study for which the data set was first collected. In particular, we overview four main applications of this kind of data: comparative morphometric analyses, statistical estimation of potential synaptic connectivity, morphologically accurate electrophysiological simulations, and computational models of neuronal shape and development. Abstract The chapter describes a novel computational approach to modeling cortex dynamics that integrates gene–protein regulatory networks with a neural network model. Interaction of genes and proteins in neurons affects the dynamics of the whole neural network. We have adopted an exploratory approach of investigating many randomly generated gene regulatory matrices, out of which we kept those that generated interesting dynamics. This naïve brute-force approach served to explore the potential application of computational neurogenetic models in relation to gene knockout neurogenetics experiments. The knockout of a hypothetical gene for fast inhibition in our artificial genome led to interesting neural activity. In spite of the fact that the artificial gene/protein network was altered by this one-gene knockout, the dynamics of the SNN in terms of spiking activity was most of the time very similar to the result obtained with the complete gene/protein network. However, from time to time the neurons spontaneously temporarily synchronized their spiking into coherent global oscillations.
In our model, fluctuations in the values of neuronal parameters lead to the spontaneous development of seizure-like global synchronizations. These very same fluctuations also lead to termination of the seizure-like neural activity and maintenance of the interictal normal periods of activity. Based on our model, we would like to suggest the hypothesis that parameter changes due to gene–protein dynamics should also be included as a serious factor determining transitions in neural dynamics, especially when the cause of a disease is known to be genetic. Abstract The local field potential (LFP) is among the most important experimental measures when probing neural population activity, but a proper understanding of the link between the underlying neural activity and the LFP signal is still missing. Here we investigate this link by mathematical modeling of contributions to the LFP from a single layer-5 pyramidal neuron and a single layer-4 stellate neuron receiving synaptic input. An intrinsic dendritic low-pass filtering effect of the LFP signal, previously demonstrated for extracellular signatures of action potentials, is seen to strongly affect the LFP power spectra, even for frequencies as low as 10 Hz for the example pyramidal neuron. Further, the LFP signal is found to depend sensitively on both the recording position and the position of the synaptic input: the LFP power spectra recorded close to the active synapse are typically found to be less low-pass filtered than spectra recorded farther away. Some recording positions display striking band-pass characteristics of the LFP. The frequency dependence of the properties of the current dipole moment set up by the synaptic input current is found to qualitatively account for several salient features of the observed LFP.
Two approximate schemes for calculating the LFP, the dipole approximation and the two-monopole approximation, are tested and found to be potentially useful for translating results from large-scale neural network models into predictions for results from electroencephalographic (EEG) or electrocorticographic (ECoG) recordings. Abstract Dopaminergic (DA) neurons of the mammalian midbrain exhibit unusually low firing frequencies in vitro. Furthermore, injection of depolarizing current induces depolarization block before high frequencies are achieved. The maximum steady and transient rates are about 10 and 20 Hz, respectively, despite the ability of these neurons to generate bursts at higher frequencies in vivo. We use a three-compartment model calibrated to reproduce DA neuron responses to several pharmacological manipulations to uncover mechanisms of frequency limitation. The model exhibits a slow oscillatory potential (SOP) dependent on the interplay between the L-type Ca 2+ current and the small-conductance K + (SK) current that is unmasked by fast Na + current block. Contrary to previous theoretical work, the SOP does not pace the steady spiking frequency in our model. The main currents that determine the spontaneous firing frequency are the subthreshold L-type Ca 2+ and the A-type K + currents. The model identifies the channel densities for the fast Na + and the delayed rectifier K + currents as critical parameters limiting the maximal steady frequency evoked by a depolarizing pulse. We hypothesize that the low maximal steady frequencies result from a low safety factor for action potential generation. In the model, the rate of Ca 2+ accumulation in the distal dendrites controls the transient initial frequency in response to a depolarizing pulse.
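The two approximate LFP schemes named above can be compared on the dipole axis, where both have closed forms from standard volume-conductor theory. A minimal sketch; the current, separation, and conductivity values in the test are illustrative, not taken from the study.

```python
import math

def phi_monopoles(I, d, r, sigma=0.3):
    """'Two-monopole' scheme: extracellular potential of a current
    source +I and sink -I separated by distance d, measured on the
    dipole axis at distance r from the midpoint (homogeneous medium
    of conductivity sigma, in S/m)."""
    return I / (4.0 * math.pi * sigma) * (1.0 / (r - d / 2.0)
                                          - 1.0 / (r + d / 2.0))

def phi_dipole(I, d, r, sigma=0.3):
    """Dipole approximation on the axis: phi = p / (4*pi*sigma*r^2)
    with dipole moment p = I * d. Accurate only for r >> d."""
    return I * d / (4.0 * math.pi * sigma * r ** 2)
```

On the axis the two-monopole expression reduces to I*d / (4*pi*sigma*(r^2 - d^2/4)), so the dipole form is its r >> d limit, which is why the approximations agree far from the neuron but diverge close to it.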
Similar results are obtained when the same model parameters are used in a multicompartmental model with a realistic reconstructed morphology, indicating that the salient contributions of the dendritic architecture have been captured by the simpler model. Abstract Background As interest in adopting the Semantic Web in the biomedical domain continues to grow, Semantic Web technology has been evolving and maturing. A variety of technological approaches, including triple-store technologies, SPARQL endpoints, Linked Data, and the Vocabulary of Interlinked Datasets, have emerged in recent years. In addition to data warehouse construction, these technological approaches can be used to support dynamic query federation. As a community effort, the BioRDF task force, within the Semantic Web for Health Care and Life Sciences Interest Group, is exploring how these emerging approaches can be utilized to execute distributed queries across different neuroscience data sources. Methods and results We have created two health care and life science knowledge bases. We have explored a variety of Semantic Web approaches to describe, map, and dynamically query multiple datasets. We have demonstrated several federation approaches that integrate diverse types of information about neurons and receptors that play an important role in basic, clinical, and translational neuroscience research. In particular, we have created a prototype receptor explorer which uses OWL mappings to provide an integrated list of receptors and executes individual queries against different SPARQL endpoints. We have also employed the AIDA Toolkit, which is directed at groups of knowledge workers who cooperatively search, annotate, interpret, and enrich large collections of heterogeneous documents from diverse locations. We have explored a tool called "FeDeRate", which enables a global SPARQL query to be decomposed into subqueries against remote databases offering either SPARQL or SQL query interfaces.
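Query federation of the kind FeDeRate performs can also be expressed directly in SPARQL 1.1 with the SERVICE keyword. A minimal sketch that only builds the query string; the endpoint URLs, prefix, and predicate names are placeholders for illustration, not the actual BioRDF vocabularies or endpoints.

```python
def federated_query(endpoint_a, endpoint_b, receptor_class):
    """Build a federated SPARQL 1.1 query joining receptor records
    from endpoint_a with expression records from endpoint_b via the
    SERVICE keyword. All graph-pattern names are hypothetical."""
    return f"""
PREFIX ex: <http://example.org/schema#>
SELECT ?receptor ?label ?neuron
WHERE {{
  SERVICE <{endpoint_a}> {{
    ?receptor a ex:{receptor_class} ;
              ex:label ?label .
  }}
  SERVICE <{endpoint_b}> {{
    ?neuron ex:expresses ?receptor .
  }}
}}
"""
```

A federating engine evaluates each SERVICE block against its remote endpoint and joins the solutions on the shared ?receptor variable, which is the decomposition FeDeRate automates.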
Finally, we have explored how to use the Vocabulary of Interlinked Datasets (voiD) to create metadata for describing datasets exposed as Linked Data URIs or SPARQL endpoints. Conclusion We have demonstrated the use of a set of novel and state-of-the-art Semantic Web technologies in support of a neuroscience query federation scenario. We have identified both the strengths and weaknesses of these technologies. While the Semantic Web offers a global data model including the use of Uniform Resource Identifiers (URIs), the proliferation of semantically equivalent URIs hinders large-scale data integration. Our work helps direct research and tool development, which will be of benefit to this community. Abstract Injury to neural tissue renders voltage-gated Na+ (Nav) channels leaky. Even mild axonal trauma initiates Na+ loading, leading to secondary Ca2+ loading and white matter degeneration. The nodal isoform is Nav1.6 and, for Nav1.6-expressing HEK cells, traumatic whole-cell stretch causes an immediate tetrodotoxin-sensitive Na+ leak. In stretch-damaged oocyte patches, Nav1.6 current undergoes damage-intensity-dependent hyperpolarizing (left) shifts, but whether left-shift underlies injured-axon Nav leak is uncertain. Nav1.6 inactivation (availability) is kinetically limited by (coupled to) Nav activation, yielding coupled left-shift (CLS) of the two processes: CLS should move the steady-state Nav1.6 “window conductance” closer to typical firing thresholds. Here we simulated excitability and ion homeostasis in free-running nodes of Ranvier to assess whether hallmark injured-axon behaviors (Na+ loading, ectopic excitation, propagation block) would occur with Nav-CLS. Intact/traumatized axolemma ratios were varied, and for some simulations Na/K pumps were included, with varied inside/outside volumes. We simulated saltatory propagation with one mid-axon node variously traumatized. 
While dissipating the [Na+] gradient and hyperactivating the Na/K pump, Nav-CLS generated neuropathic pain-like ectopic bursts. Depending on CLS magnitude, fraction of Nav channels affected, and pump intensity, tonic or burst firing or nodal inexcitability occurred, with [Na+] and [K+] fluctuating. Severe CLS-induced inexcitability did not preclude Na+ loading; in fact, the steady-state Na+ leaks elicited large pump currents. At a mid-axon node, mild CLS perturbed normal anterograde propagation, and severe CLS blocked saltatory propagation. These results suggest that in damaged excitable cells, Nav-CLS could initiate cellular deterioration with attendant hyper- or hypoexcitability. Healthy-cell versions of Nav-CLS, however, could contribute to physiological rhythmic firing. Abstract Lateral inhibition of cells surrounding an excited area is a key property of sensory systems, sharpening the preferential tuning of individual cells in the presence of closely related input signals. In the olfactory pathway, a dendrodendritic synaptic microcircuit between mitral and granule cells in the olfactory bulb has been proposed to mediate this type of interaction through granule cell inhibition of surrounding mitral cells. However, it is becoming evident that odor inputs result in broad activation of the olfactory bulb with interactions that go beyond neighboring cells. Using a realistic modeling approach we show how backpropagating action potentials in the long lateral dendrites of mitral cells, together with granule cell actions on mitral cells within narrow columns forming glomerular units, can provide a mechanism to activate strong local inhibition between arbitrarily distant mitral cells. The simulations predict a new role for the dendrodendritic synapses in the multicolumnar organization of the granule cells. This new paradigm gives insight into the functional significance of the patterns of connectivity revealed by recent viral tracing studies. 
Together they suggest a functional wiring of the olfactory bulb that could greatly expand the computational roles of the mitral–granule cell network. Abstract Spinal motor neurons have voltage-gated ion channels localized in their dendrites that generate plateau potentials. The physical separation of ion channels for spiking from plateau generating channels can result in nonlinear bistable firing patterns. The physical separation and geometry of the dendrites results in asymmetric coupling between dendrites and soma that has not been addressed in reduced models of nonlinear phenomena in motor neurons. We measured voltage attenuation properties of six anatomically reconstructed and type-identified cat spinal motor neurons to characterize asymmetric coupling between the dendrites and soma. We showed that the voltage attenuation at any distance from the soma was direction-dependent and could be described as a function of the input resistance at the soma. An analytical solution for the lumped cable parameters in a two-compartment model was derived based on this finding. This is the first two-compartment modeling approach that directly derived lumped cable parameters from the geometrical and passive electrical properties of anatomically reconstructed neurons. Abstract Models for temporary information storage in neuronal populations are dominated by mechanisms directly dependent on synaptic plasticity. There are nevertheless other mechanisms available that are well suited for creating short-term memories. Here we present a model for working memory which relies on the modulation of the intrinsic excitability properties of neurons, instead of synaptic plasticity, to retain novel information for periods of seconds to minutes. We show that it is possible to effectively use this mechanism to store the serial order in a sequence of patterns of activity. 
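The direction-dependent voltage attenuation described above for the two-compartment motor neuron reduction can be illustrated with a minimal passive sketch. The conductance values below are hypothetical; the study itself derived its lumped cable parameters from reconstructed morphologies.

```python
import numpy as np

def steady_state_attenuation(g_soma, g_dend, g_couple):
    """Return (soma->dendrite, dendrite->soma) steady-state voltage
    attenuation ratios for a passive two-compartment model.

    Conductances are in arbitrary but consistent units.  Solves the
    linear steady-state system G v = i for a unit current injected
    into one compartment at a time.
    """
    G = np.array([[g_soma + g_couple, -g_couple],
                  [-g_couple, g_dend + g_couple]])
    # Unit current injected at the soma (compartment 0)
    v = np.linalg.solve(G, np.array([1.0, 0.0]))
    att_soma_to_dend = v[1] / v[0]   # dendritic voltage / somatic voltage
    # Unit current injected at the dendrite (compartment 1)
    v = np.linalg.solve(G, np.array([0.0, 1.0]))
    att_dend_to_soma = v[0] / v[1]
    return att_soma_to_dend, att_dend_to_soma
```

With a leakier soma than dendrite (g_soma > g_dend), attenuation toward the soma is stronger than attenuation away from it, which is the kind of asymmetry the two-compartment model is built to capture.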
For this we introduce a functional class of neurons, named gate interneurons, which can store information in their membrane dynamics and can literally act as gates routing the flow of activations in the principal neuron population. The presented model exhibits properties which are in close agreement with experimental results in working memory. Namely, the recall process plays an important role in stabilizing and prolonging the memory trace. This means that the stored information is correctly maintained as long as it is being used. Moreover, the working memory model is adequate for storing completely new information, in time windows compatible with the notion of “one-shot” learning (hundreds of milliseconds). Abstract For the analysis of neuronal cooperativity, simultaneously recorded extracellular signals from neighboring neurons need to be sorted reliably by a spike sorting method. Many algorithms have been developed to this end; however, to date, none of them manages to fulfill a set of demanding requirements. In particular, it is desirable to have an algorithm that operates online, detects and classifies overlapping spikes in real time, and that adapts to nonstationary data. Here, we present a combined spike detection and classification algorithm, which explicitly addresses these issues. Our approach makes use of linear filters to find a new representation of the data and to optimally enhance the signal-to-noise ratio. We introduce a method called “Deconfusion” which decorrelates the filter outputs and provides source separation. Finally, a set of well-defined thresholds is applied and leads to simultaneous spike detection and spike classification. By incorporating a direct feedback, the algorithm adapts to nonstationary data and is, therefore, well suited for acute recordings. 
We evaluate our method on simulated and experimental data, including simultaneous intra/extracellular recordings made in slices of a rat cortex and recordings from the prefrontal cortex of awake behaving macaques. We compare the results to existing spike detection as well as spike sorting methods. We conclude that our algorithm meets all of the mentioned requirements and outperforms other methods under realistic signal-to-noise ratios and in the presence of overlapping spikes. Abstract Avian nucleus isthmi pars parvocellularis (Ipc) neurons are reciprocally connected with the layer 10 (L10) neurons in the optic tectum and respond with oscillatory bursts to visual stimulation. Our in vitro experiments show that both neuron types respond with regular spiking to somatic current injection and that the feedforward and feedback synaptic connections are excitatory, but of different strength and time course. To elucidate mechanisms of oscillatory bursting in this network of regularly spiking neurons, we investigated an experimentally constrained model of coupled leaky integrate-and-fire neurons with spike-rate adaptation. The model reproduces the observed Ipc oscillatory bursting in response to simulated visual stimulation. A scan through the model parameter volume reveals that Ipc oscillatory burst generation can be caused by strong and brief feedforward synaptic conductance changes. The mechanism is sensitive to the parameter values of spike-rate adaptation. In conclusion, we show that a network of regular-spiking neurons with feedforward excitation and spike-rate adaptation can generate oscillatory bursting in response to a constant input. Abstract Electrical stimulation of the central nervous system creates both orthodromically propagating action potentials, by stimulation of local cells and passing axons, and antidromically propagating action potentials, by stimulation of presynaptic axons and terminals. 
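The leaky integrate-and-fire neuron with spike-rate adaptation at the core of the Ipc network model above can be sketched for a single cell as follows. This is a generic Euler discretization with hypothetical dimensionless parameters, not the experimentally constrained model of the study.

```python
def lif_adapt(i_ext=2.0, t_max=500.0, dt=0.1,
              tau_m=10.0, tau_w=100.0, b=0.3,
              v_th=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron with a spike-rate adaptation
    current w (dimensionless units); returns spike times in ms."""
    v, w, spikes = 0.0, 0.0, []
    for step in range(int(t_max / dt)):
        v += dt * (-v - w + i_ext) / tau_m   # leaky membrane opposed by w
        w += dt * (-w) / tau_w               # adaptation decays between spikes
        if v >= v_th:
            v = v_reset
            w += b                           # adaptation increment per spike
            spikes.append(step * dt)
    return spikes
```

Under constant input the adaptation variable w accumulates with each spike, so inter-spike intervals lengthen toward a slow steady rate; it is this adaptation that the network's burst generation is reported to be sensitive to.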
Our aim was to understand how antidromic action potentials navigate through complex arborizations, such as those of thalamic and basal ganglia afferents—sites of electrical activation during deep brain stimulation. We developed computational models to study the propagation of antidromic action potentials past the bifurcation in branched axons. In both unmyelinated and myelinated branched axons, when the diameters of each axon branch remained under a specific threshold (set by the antidromic geometric ratio), antidromic propagation occurred robustly; action potentials traveled both antidromically into the primary segment as well as “reorthodromically” into the terminal secondary segment. Propagation occurred across a broad range of stimulation frequencies, axon segment geometries, and concentrations of extracellular potassium, but was strongly dependent on the geometry of the node of Ranvier at the axonal bifurcation. Thus, antidromic activation of axon terminals can, through axon collaterals, lead to widespread activation or inhibition of targets remote from the site of stimulation. These effects should be included when interpreting the results of functional imaging or evoked potential studies on the mechanisms of action of DBS. Abstract The response of an oscillator to perturbations is described by its phase-response curve (PRC), which is related to the type of bifurcation leading from rest to tonic spiking. In a recent experimental study, we have shown that the type of PRC in cortical pyramidal neurons can be switched by cholinergic neuromodulation from type II (biphasic) to type I (monophasic). We explored how intrinsic mechanisms affected by acetylcholine influence the PRC using three different types of neuronal models: a theta neuron, single-compartment neurons and a multi-compartment neuron. 
In all of these models a decrease in the amount of a spike-frequency adaptation current was a necessary and sufficient condition for the shape of the PRC to change from biphasic (type II) to purely positive (type I). Abstract Small-conductance (SK) calcium-activated potassium channels are found in many tissues throughout the body and open in response to elevations in intracellular calcium. In hippocampal neurons, SK channels are spatially colocalized with L-type calcium channels. Due to the restriction of calcium transients into microdomains, only a limited number of L-type Ca2+ channels can activate SK and, thus, stochastic gating becomes relevant. Using a stochastic model with calcium microdomains, we predict that intracellular Ca2+ fluctuations resulting from Ca2+ channel gating can increase SK2 subthreshold activity by 1–2 orders of magnitude. This effectively reduces the value of the Hill coefficient. To explain the underlying mechanism, we show how short, high-amplitude calcium pulses associated with stochastic gating of calcium channels are much more effective at activating SK2 channels than the steady calcium signal produced by a deterministic simulation. This stochastic amplification results from two factors: first, a supralinear rise in the SK2 channel’s steady-state activation curve at low calcium levels and, second, a momentary reduction in the channel’s time constant during the calcium pulse, causing the channel to approach its steady-state activation value much faster than it decays. Stochastic amplification can potentially explain subthreshold SK2 activation in unified models of both sub- and suprathreshold regimes. Furthermore, we expect it to be a general phenomenon relevant to many proteins that are activated nonlinearly by stochastic ligand release. Abstract A tonic-clonic seizure transitions from high-frequency asynchronous activity to low-frequency coherent oscillations, yet the mechanism of transition remains unknown. 
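The first of the two amplification factors identified above, the supralinear rise of the steady-state SK activation curve at low calcium, can be illustrated with a short sketch that compares brief high-amplitude calcium pulses against the equivalent steady mean signal. The Hill parameters and pulse statistics below are hypothetical, and the second factor (channel kinetics) is ignored.

```python
def hill(ca, k=0.3, n=4):
    """Steady-state SK activation as a Hill function of [Ca2+] (uM)."""
    return ca**n / (ca**n + k**n)

def stochastic_amplification(ca_base=0.05, ca_pulse=1.0, duty=0.05):
    """Compare time-averaged SK activation under brief, high-amplitude
    Ca pulses (duty cycle `duty`) against the activation produced by a
    steady signal equal to the same time-averaged [Ca2+]."""
    ca_mean = (1 - duty) * ca_base + duty * ca_pulse
    steady = hill(ca_mean)                                   # mean-field signal
    pulsed = (1 - duty) * hill(ca_base) + duty * hill(ca_pulse)
    return pulsed, steady
```

Because the Hill curve is supralinear at low [Ca2+], the time-averaged activation under pulsed input exceeds the activation at the time-averaged input severalfold, in the spirit of the amplification described above.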
We propose a shift in network synchrony due to changes in cellular response. Here we use phase-response curves (PRCs) from Morris-Lecar (ML) model neurons with synaptic depression and gradually decrease input current to cells within a network simulation. This method effectively decreases firing rates, resulting in a shift to greater network synchrony, illustrating a possible mechanism of the transition phenomenon. PRCs are measured from the ML conductance-based model cell with a range of input currents within the limit cycle. A large network of 3000 excitatory neurons is simulated with a network topology generated from second-order statistics which allows a range of population synchrony. The population synchrony of the oscillating cells is measured with the Kuramoto order parameter, which reveals a transition from tonic to clonic phase exhibited by our model network. The cellular response shift mechanism for the tonic-clonic seizure transition reproduces the population behavior closely when compared to EEG data. Abstract We have built a phenomenological spiking model of the cat early visual system comprising the retina, the Lateral Geniculate Nucleus (LGN) and V1’s layer 4, and established four main results: (1) When exposed to videos that reproduce with high fidelity what a cat experiences under natural conditions, adjacent Retinal Ganglion Cells (RGCs) have spike-time correlations at a short timescale (~30 ms), despite neuronal noise and possible jitter accumulation. (2) In accordance with recent experimental findings, the LGN filters out some noise. It thus increases the spike reliability and temporal precision, the sparsity, and, importantly, further decreases adjacent cells’ correlation timescale down to ~15 ms. 
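The Kuramoto order parameter used above to read out population synchrony is simply the magnitude of the population-mean phase vector; a minimal sketch:

```python
import numpy as np

def kuramoto_order(phases):
    """Kuramoto order parameter r in [0, 1]: r ~ 1 means full synchrony,
    r ~ 0 an incoherent population.  `phases` are in radians."""
    return np.abs(np.mean(np.exp(1j * np.asarray(phases))))
```

r rises toward 1 as the oscillating cells synchronize and falls toward 0 for an incoherent population, which is how the tonic-to-clonic transition is quantified in the model network.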
(3) Downstream simple cells in V1’s layer 4, if equipped with Spike Timing-Dependent Plasticity (STDP), may detect these fine-scale cross-correlations, and thus connect principally to ON- and OFF-centre cells with Receptive Fields (RF) aligned in the visual space, and thereby become orientation selective, in accordance with Hubel and Wiesel's classic model (Journal of Physiology 160:106–154, 1962). Up to this point we dealt with continuous vision, and there was no absolute time reference such as a stimulus onset, yet information was encoded and decoded in the relative spike times. (4) We then simulated saccades to a static image and benchmarked relative spike time coding and time-to-first-spike coding with respect to saccade landing in the context of orientation representation. In both the retina and the LGN, relative spike times are more precise, less affected by pre-landing history and global contrast than absolute ones, and lead to robust contrast-invariant orientation representations in V1. Abstract The activity patterns of the globus pallidus (GPe) and subthalamic nucleus (STN) are closely associated with motor function and dysfunction in the basal ganglia. In the pathological state caused by dopamine depletion, the STN–GPe network exhibits rhythmic synchronous activity accompanied by rebound bursts in the STN. Therefore, the mechanism of activity transition is a key to understanding basal ganglia functions. As synchronization in GPe neurons could induce pathological STN rebound bursts, it is important to study how synchrony is generated in the GPe. To clarify this issue, we applied the phase-reduction technique to a conductance-based GPe neuronal model in order to derive the phase response curve (PRC) and interaction function between coupled GPe neurons. Using the PRC and interaction function, we studied how the steady-state activity of the GPe network depends on intrinsic membrane properties, varying ionic conductances on the membrane. 
We noted that a change in persistent sodium current, fast delayed rectifier Kv3 potassium current, M-type potassium current and small-conductance calcium-dependent potassium current influenced the PRC shape and the steady state. The effect of those currents on the PRC shape could be attributed to extension of the firing period and reduction of the phase response immediately after an action potential. In particular, the slow potassium current arising from the M-type potassium and the SK current was responsible for the reduction of the phase response. These results suggest that the membrane property modulation controls synchronization/asynchronization in the GPe and the pathological pattern of STN–GPe activity. Differential effects of Kv7 (M-) channels on synaptic integration in distinct subcellular compartments of rat hippocampal pyramidal neurons. The Journal of physiology The K(V)7/M-current is an important determinant of neuronal excitability and plays a critical role in modulating action potential firing. In this study, using a combination of electrophysiology and computational modelling, we show that these channels selectively influence peri-somatic but not dendritic excitatory post-synaptic potential (EPSP) integration in CA1 pyramidal cells. K(V)7/M-channels are highly concentrated in axons. However, the competing ankyrin G binding peptide (ABP), which disrupts axonal K(V)7/M-channel function, had little effect on somatic EPSP integration, suggesting that this effect was due to local somatic channels only. This interpretation was confirmed using computer simulations. Further, in accordance with the biophysical properties of the K(V)7/M-current, the effect of somatic K(V)7/M-channels on synaptic potential summation was dependent upon the neuronal membrane potential. Somatic K(V)7/M-channels thus affect EPSP-spike coupling by altering EPSP integration. 
Interestingly, disruption of axonal channels enhanced EPSP-spike coupling by lowering the action potential threshold. Hence, somatic and axonal K(V)7/M-channels influence EPSP-spike coupling via different mechanisms. This may be important for their relative contributions to physiological processes such as synaptic plasticity as well as patho-physiological conditions such as epilepsy. Spike-timing dependent plasticity and feed-forward input oscillations produce precise and invariant spike phase-locking. Frontiers in computational neuroscience In the hippocampus and the neocortex, the coupling between local field potential (LFP) oscillations and the spiking of single neurons can be highly precise, across neuronal populations and cell types. Spike phase (i.e., the spike time with respect to a reference oscillation) is known to carry reliable information, both with phase-locking behavior and with more complex phase relationships, such as phase precession. How this precision is achieved by neuronal populations, whose membrane properties and total input may be quite heterogeneous, is nevertheless unknown. In this note, we investigate a simple mechanism for learning precise LFP-to-spike coupling in feed-forward networks - the reliable, periodic modulation of presynaptic firing rates during oscillations, coupled with spike-timing dependent plasticity. When oscillations are within the biological range (2-150 Hz), firing rates of the inputs change on a timescale highly relevant to spike-timing dependent plasticity (STDP). Through analytic and computational methods, we find points of stable phase-locking for a neuron with plastic input synapses. These points correspond to precise phase-locking behavior in the feed-forward network. 
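A standard pair-based exponential STDP rule of the kind discussed above can be sketched as follows; the parameter values are illustrative, not those used in the study.

```python
import math

def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012,
            tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP weight change for one pre/post spike pair.

    delta_t = t_post - t_pre in ms: a positive value (pre leads post)
    gives potentiation, a negative value gives depression.  The balance
    a_plus vs. a_minus is the potentiation/de-potentiation balance that
    the text identifies as critical for the learned phase.
    """
    if delta_t >= 0:
        return a_plus * math.exp(-delta_t / tau_plus)
    return -a_minus * math.exp(delta_t / tau_minus)
```

Over one oscillation cycle, inputs whose spikes tend to precede the postsynaptic spike are potentiated while the rest are depressed, which is what drives the synapses toward a stable phase-locking point; with a_minus > a_plus the rule is depression-dominated, a common choice for stability.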
The location of these points depends on the oscillation frequency of the inputs, the STDP time constants, and the balance of potentiation and de-potentiation in the STDP rule. For a given input oscillation, the balance of potentiation and de-potentiation in the STDP rule is the critical parameter that determines the phase at which an output neuron will learn to spike. These findings are robust to changes in intrinsic post-synaptic properties. Finally, we discuss implications of this mechanism for stable learning of spike-timing in the hippocampus. Conserved properties of dendritic trees in four cortical interneuron subtypes Scientific Reports Dendritic trees influence synaptic integration and neuronal excitability, yet appear to develop in rather arbitrary patterns. Using electron microscopy and serial reconstructions, we analyzed the dendritic trees of four morphologically distinct neocortical interneuron subtypes to reveal two underlying organizational principles common to all. First, cross-sectional areas at any given point within a dendrite were proportional to the summed length of all dendritic segments distal to that point. Consistent with this observation, total cross-sectional area was almost perfectly conserved at bifurcation points. Second, dendritic cross-sections became progressively more elliptical at more proximal, larger diameter, dendritic locations. Finally, computer simulations revealed that these conserved morphological features limit distance dependent filtering of somatic EPSPs and facilitate distribution of somatic depolarization into all dendritic compartments. Because these features were shared by all interneurons studied, they may represent common organizational principles underlying the otherwise diverse morphology of dendritic trees. Shortest Loops are Pacemakers in Random Networks of Electrically Coupled Axons. 
Frontiers in computational neuroscience High-frequency oscillations (HFOs) are an important part of brain activity in health and disease. However, their origins remain obscure and controversial. One possible mechanism depends on the presence of sparsely distributed gap junctions that electrically couple the axons of principal cells. A plexus of electrically coupled axons is modeled as a random network with bi-directional connections between its nodes. Under certain conditions the network can demonstrate one of two types of oscillatory activity. Type I oscillations (100-200 Hz) are predicted to be caused by spontaneously spiking axons in a network with strong (high-conductance) gap junctions. Type II oscillations (200-300 Hz) require no spontaneous spiking and relatively weak (low-conductance) gap junctions, across which spike propagation failures occur. The type II oscillations are reentrant and self-sustained. Here we examine what determines the frequency of type II oscillations. Using simulations we show that the distribution of loop lengths is the key factor for determining frequency in type II network oscillations. We first analyze spike failure between two electrically coupled cells using a model of an anatomically reconstructed CA1 pyramidal neuron. Then network oscillations are studied by a cellular automaton model with random network connectivity, in which we control loop statistics. We show that oscillation periods can be predicted from the network's loop statistics. The shortest loop, around which a spike can travel, is the most likely pacemaker candidate. The principle of one loop as a pacemaker is remarkable, because random networks contain a large number of loops juxtaposed and superimposed, and their number rapidly grows with network size. This principle allows us to predict the frequency of oscillations from network connectivity and vice versa. 
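The shortest-loop pacemaker principle above suggests predicting the oscillation period directly from connectivity; a minimal breadth-first-search sketch over a directed network (the fixed per-hop delay in the usage note is a hypothetical simplification):

```python
from collections import deque

def shortest_cycle_length(adj):
    """Length (in edges) of the shortest directed cycle in a graph given
    as {node: [successors]}; returns None if the graph is acyclic.

    Runs a BFS from each node and records the shortest path that
    returns to that node.
    """
    best = None
    for start in adj:
        dist = {start: 0}
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in adj.get(u, []):
                if v == start:                      # closed a loop back to start
                    cycle = dist[u] + 1
                    if best is None or cycle < best:
                        best = cycle
                elif v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
    return best
```

If each gap-junctional hop costs roughly a constant delay d, the predicted type II period would be approximately shortest_cycle_length(adj) * d, in the spirit of reading the frequency off the network's loop statistics.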
We finally propose that type I oscillations may correspond to ripples, while type II oscillations correspond to so-called fast ripples. Models of grid cell spatial firing published 2005-2011. Frontiers in neural circuits Since the discovery of grid cells in rat entorhinal cortex, many models of their hexagonally arrayed spatial firing fields have been suggested. We review the models and organize them according to the mechanisms they use to encode position, update the positional code, read it out in the spatial grid pattern, and learn any patterned synaptic connections needed. We mention biological implementations of the models, but focus on the models on Marr's algorithmic level, where they are not things to individually prove or disprove, but rather are a valuable collection of metaphors of the grid cell system for guiding research that are all likely true to some degree, with each simply emphasizing different aspects of the system. For the convenience of interested researchers, MATLAB implementations of the discussed grid cell models are provided at ModelDB accession 144006 or http://people.bu.edu/zilli/gridmodels.html. Electrical advantages of dendritic spines. PloS one Many neurons receive excitatory glutamatergic input almost exclusively onto dendritic spines. In the absence of spines, the amplitudes and kinetics of excitatory postsynaptic potentials (EPSPs) at the site of synaptic input are highly variable and depend on dendritic location. We hypothesized that dendritic spines standardize the local geometry at the site of synaptic input, thereby reducing location-dependent variability of local EPSP properties. We tested this hypothesis using computational models of simplified and morphologically realistic spiny neurons that allow direct comparison of EPSPs generated on spine heads with EPSPs generated on dendritic shafts at the same dendritic locations. 
In all morphologies tested, spines greatly reduced location-dependent variability of local EPSP amplitude and kinetics, while having minimal impact on EPSPs measured at the soma. Spine-dependent standardization of local EPSP properties persisted across a range of physiologically relevant spine neck resistances, and in models with variable neck resistances. By reducing the variability of local EPSPs, spines standardized synaptic activation of NMDA receptors and voltage-gated calcium channels. Furthermore, spines enhanced activation of NMDA receptors and facilitated the generation of NMDA spikes and axonal action potentials in response to synaptic input. Finally, we show that dynamic regulation of spine neck geometry can preserve local EPSP properties following plasticity-driven changes in synaptic strength, but is inefficient in modifying the amplitude of EPSPs in other cellular compartments. These observations suggest that one function of dendritic spines is to standardize local EPSP properties throughout the dendritic tree, thereby allowing neurons to use similar voltage-sensitive postsynaptic mechanisms at all dendritic locations. Predictive features of persistent activity emergence in regular spiking and intrinsic bursting model neurons. PLoS computational biology Proper functioning of working memory involves the expression of stimulus-selective persistent activity in pyramidal neurons of the prefrontal cortex (PFC), which refers to neural activity that persists for seconds beyond the end of the stimulus. The mechanisms which PFC pyramidal neurons use to discriminate between preferred vs. neutral inputs at the cellular level are largely unknown. 
Moreover, the presence of pyramidal cell subtypes with different firing patterns, such as regular spiking and intrinsic bursting, raises the question as to what their distinct role might be in persistent firing in the PFC. Here, we use a compartmental modeling approach to search for discriminatory features in the properties of incoming stimuli to a PFC pyramidal neuron and/or its response that signal which of these stimuli will result in persistent activity emergence. Furthermore, we use our modeling approach to study cell-type-specific differences in persistent activity properties, by implementing a regular spiking (RS) and an intrinsic bursting (IB) model neuron. We identify synaptic location within the basal dendrites as a feature of stimulus selectivity. Specifically, persistent activity-inducing stimuli consist of activated synapses that are located more distally from the soma compared to non-inducing stimuli, in both model cells. In addition, the action potential (AP) latency and the first few inter-spike intervals of the neuronal response can be used to reliably detect inducing vs. non-inducing inputs, suggesting a potential mechanism by which downstream neurons can rapidly decode the upcoming emergence of persistent activity. While the two model neurons did not differ in the coding features of persistent activity emergence, the properties of persistent activity, such as the firing pattern and the duration of temporally restricted persistent activity, were distinct. Collectively, our results pinpoint specific features of the neuronal response to a given stimulus that code for its ability to induce persistent activity and predict differential roles of RS and IB neurons in persistent activity expression. 
A fast model of voltage-dependent NMDA receptors Journal of Computational Neuroscience Summary This chapter constitutes mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was part of the Annual Computational Neuroscience Meeting CNS*2007, held in July 2007 in Toronto, Canada (http://www.cnsorg.org). The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Selected discussed topics, including the review of some current and potential applications of Computational Intelligence (CI) in electrophysiology, database and electrophysiological data exchange platforms, languages, and formats, as well as exemplary analysis problems, are presented in this chapter. The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers, whose interest will be sparked by its contents. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. We are particularly interested in the slower timescale dynamics that determine overall excitability, and set the stage for transient episodes of persistent oscillations, working memory, or seizures. 
In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley-type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics on the stability of persistent activity. We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions. Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in neonatal brain or spinal cord, up states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potentials (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from the EAP data sampled by an extracellular probe at multiple independent recording locations. We used a dipole for the model source because there is extensive evidence it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell-probe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole moment optimization where the dipole location is fixed, and a second, nonlinear, global optimization of the source location. 
We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis. We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among various model error contributions to the residual, we address especially the error in probe geometry, and the extent to which it biases estimates of dipole parameters. This dipole characterization method can be applied to any recording technique that is capable of taking multiple independent measurements of the same single units. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics. We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodium–potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. 
These models reveal several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy. Abstract This paper introduces dyadic brain modeling – the simultaneous, computational modeling of the brains of two interacting agents – to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the ‘dyad’). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists and modelers that will help speed the emergence of what may be called comparative neuroprimatology. Abstract The phase response curve (PRC) reflects the dynamics of the interplay between diverse intrinsic conductances that lead to spike generation. PRCs measure the spike-time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddle-node) dynamics, while type II (subcritical Hopf) dynamics yield a biphasic PRC with both negative and positive lobes. 
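The PRC measurement just described is easy to reproduce numerically. The sketch below uses a leaky integrate-and-fire cell with illustrative, assumed parameters (not taken from any of the cited studies): a small depolarizing kick is delivered at each phase of the cycle and the resulting spike advance is recorded. The curve comes out purely positive, the type I signature mentioned above.

```python
def time_to_spike(tau=10.0, drive=1.5, dt=0.001, kick_phase=None,
                  kick=0.05, period=None):
    """Integrate dV/dt = (drive - V)/tau from V=0 to threshold V=1,
    optionally adding a small voltage kick at phase kick_phase of the
    unperturbed period. Returns the time of threshold crossing."""
    v, t = 0.0, 0.0
    kicked = kick_phase is None
    while v < 1.0:
        if not kicked and t >= kick_phase * period:
            v += kick          # depolarizing perturbation
            kicked = True
        v += dt * (drive - v) / tau
        t += dt
    return t

period = time_to_spike()       # unperturbed spike period (~10.99 here)
phases = [i / 10 for i in range(1, 10)]
# PRC(phase) = relative spike advance; positive everywhere => type I
prc = [(period - time_to_spike(kick_phase=p, period=period)) / period
       for p in phases]
```

For this model the advance also grows with phase, since a fixed-size kick saves the most time late in the cycle, when the membrane approaches threshold slowly.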
Previous computational work hypothesized that cholinergic modulation of the M-type potassium current can switch a neuron with type II dynamics to type I dynamics. We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with downregulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential specific K-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to downregulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent IAHP is downregulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified at low ACh levels, while the PRC became type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may shift neuronal responses between “resonator” and “integrator” modes. Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double-precision values by each subtree at each time step. Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For compute-bound simulations, load balance results in almost ideal runtime scaling. Application of the cell-splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing. 
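The split-cell idea above, in which each subtree integrates independently and exchanges only a couple of doubles per time step, can be illustrated with a toy two-compartment passive cell. All parameters are illustrative assumptions, and for brevity each "worker" here passes a single voltage to its peer per step (in NEURON's actual scheme the exchanged quantities are matrix terms, not raw voltages). Because both versions perform the same arithmetic in the same order, the split solve reproduces the whole-cell solve exactly.

```python
def whole_cell(steps=400, dt=0.025, tau=5.0, e=-65.0, g=0.5, inj=2.0):
    """Reference solve: both compartments advanced together (Euler)."""
    v1 = v2 = e
    for _ in range(steps):
        d1 = dt * ((e - v1) / tau + inj + g * (v2 - v1))
        d2 = dt * ((e - v2) / tau + g * (v1 - v2))
        v1, v2 = v1 + d1, v2 + d2
    return v1, v2

def split_cell(steps=400, dt=0.025, tau=5.0, e=-65.0, g=0.5, inj=2.0):
    """Each compartment lives on its own 'processor'; per step the two
    workers exchange one double each (the neighbor's old voltage)."""
    v1 = v2 = e
    for _ in range(steps):
        msg_to_2, msg_to_1 = v1, v2   # the per-step communication
        v1 = v1 + dt * ((e - v1) / tau + inj + g * (msg_to_1 - v1))
        v2 = v2 + dt * ((e - v2) / tau + g * (msg_to_2 - v2))
    return v1, v2
```

Compartment 1 receives the injected current, so it settles at a more depolarized value than compartment 2 in both solves.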
Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts. It has recently been reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. Ca2+-activated K+ currents (IK(Ca)) of cultured human cardiac fibroblasts were characterized in this study. In whole-cell configuration, depolarizing pulses evoked IK(Ca) with outward rectification in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance, Ca2+-activated K+ (BKCa) channel with a single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BKCa channels. The dynamic Luo–Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; however, they were electrically inert with no gap-junctional coupling. The simulation predicts that changes in gap-junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. IK(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes. This study demonstrates that a BKCa channel is functionally expressed in human cardiac fibroblasts. The activity of these BKCa channels in human cardiac fibroblasts may contribute to the functional activities of heart cells through transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of their solutions and the transitions between them as parameters are varied. 
To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multi-compartment models. We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation—tonic spiking, intermediate bursting and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multi-compartment conductance-based model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli. Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. 
While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement. Results We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require a built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. Conclusion Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement, and underlies the concentration-independence of odor quality representations. Abstract Mathematical neuronal models are normally expressed using differential equations. The Parker–Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. 
Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker–Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin–Huxley-type neuron, comparing the results with those obtained using the Runge–Kutta and Bulirsch–Stoer methods. Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC–granule cell (GC) circuit with and without gap junctions between GoCs. Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change the oscillation frequency or the mean firing rate of either GoCs or GCs. Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs. 
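The Parker–Sochacki approach described above can be sketched on a toy polynomial ODE, dy/dt = -y**2 (exact solution y0/(1 + y0*t)), rather than on the neuron models from the cited paper. Maclaurin coefficients are generated iteratively, the y**2 term is handled by a Cauchy product, and the series order grows per step until the next term's contribution drops below tolerance, which is the adaptive error control at a fixed time step that the abstract mentions.

```python
def ps_step(y0, dt, tol=1e-12, max_order=30):
    """One Parker-Sochacki step for dy/dt = -y**2.
    Coefficients: a[k+1] = -(1/(k+1)) * sum_j a[j]*a[k-j]
    (Cauchy product for the y**2 term); the order is raised until
    the next term's contribution at t=dt falls below tol."""
    a = [y0]
    for k in range(max_order):
        cauchy = sum(a[j] * a[k - j] for j in range(k + 1))
        a.append(-cauchy / (k + 1))
        if abs(a[-1] * dt ** (k + 1)) < tol:
            break
    # evaluate the truncated Maclaurin series at t = dt (Horner form)
    y = 0.0
    for c in reversed(a):
        y = y * dt + c
    return y

y = 1.0
for _ in range(10):        # integrate from t=0 to t=1 in steps of 0.1
    y = ps_step(y, 0.1)
# exact value at t=1 is 1/(1+1) = 0.5
```

The same machinery extends to systems of polynomial ODEs; the division and power operations mentioned in the abstract are what let non-polynomial right-hand sides be recast into this form.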
The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley-type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours or days of supercomputing time. Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. In consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics and intrinsic noise. We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data. Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes from development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models. 
Recent studies show that both the conductance and the kinetics of these receptors change voltage-dependently in some parts of the brain. Therefore, several models have been introduced to simulate their current. However, on the one hand, kinetic models—which are able to simulate these voltage-dependent phenomena—are computationally expensive for the modeling of large neural networks. On the other hand, classic exponential models, which are computationally less expensive, are not able to simulate the voltage-dependency of these receptors accurately. In this study, we have modified these classic models to endow them with voltage-dependent conductance and time constants. Temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of the NMDA receptor’s behavior using only three to four differential equations, significantly fewer than previous kinetic models require. Consequently, our model is both fast and physiologically plausible, and is therefore a suitable candidate for the modeling of large neural networks. Fast and accurate low-dimensional reduction of biophysically detailed neuron models Scientific Reports Realistic modeling of neurons is quite successful in complementing traditional experimental techniques. However, networks of such neurons require computational power beyond the capabilities of current supercomputers, and the methods used so far to reduce their complexity take into account neither the key features of the cells nor critical physiological properties. Here we introduce a new, automatic and fast method to map realistic neurons onto equivalent reduced models that run more than 40 times faster while maintaining very high accuracy of the membrane potential dynamics during synaptic inputs, and a direct link with experimental observables. 
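The voltage-dependence discussed in the NMDA-receptor abstract above is commonly captured by combining a double-exponential conductance with the standard Jahr–Stevens magnesium-block factor. The sketch below uses that textbook form with illustrative parameter values; it is not the modified model of the cited study, which additionally makes the time constants themselves voltage-dependent.

```python
import math

MG = 1.0                           # extracellular [Mg2+], mM (illustrative)
E_REV = 0.0                        # NMDA reversal potential, mV
TAU_RISE, TAU_DECAY = 2.0, 100.0   # rise/decay time constants, ms (illustrative)
G_MAX = 1.0                        # conductance scale, nS (illustrative)

def mg_block(v):
    """Jahr-Stevens voltage-dependent Mg2+ unblock factor (0..1)."""
    return 1.0 / (1.0 + (MG / 3.57) * math.exp(-0.062 * v))

def nmda_current(t, v):
    """Double-exponential conductance gated by the Mg2+ block.
    t is time since receptor activation (ms), v is membrane potential
    (mV); negative return values are inward current."""
    g = G_MAX * (math.exp(-t / TAU_DECAY) - math.exp(-t / TAU_RISE))
    return g * mg_block(v) * (v - E_REV)
```

Near rest (about -70 mV) the block passes only a few percent of the conductance available near 0 mV, which is exactly the voltage-dependence that cheap exponential-style models aim to reproduce.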
The mapping of arbitrary sets of synaptic inputs, without additional fine tuning, would also allow the convenient and efficient implementation of a new generation of large-scale simulations of brain regions reproducing the biological variability observed in real neurons, with unprecedented advances to understand higher brain functions. A minimal mechanistic model for temporal signal processing in the lateral geniculate nucleus Cognitive Neurodynamics Summary This chapter constitutes miniproceedings of the Workshop on Physiology Databases and Analysis Software that was a part of the Annual Computational Neuroscience Meeting CNS*2007 that took place in July 2007 in Toronto, Canada (http ://www.cnsorg.org). The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Selected discussed topics, including the review of some current and potential applications of Computational Intelligence (CI) in electrophysiology, database and electrophysiological data exchange platforms, languages, and formats, as well as exemplary analysis problems, are presented in this chapter. The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers, whose interest will be sparked by its contents. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. We are particularly interested in the slower timescale dynamics that determine overall excitability, and set the stage for transient episodes of persistent oscillations, working memory, or seizures. 
In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics on the stability of persistent activity. We show that these networks reproduce seizurelike activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions. Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in neonatal brain or spinal cord, up states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potentials (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a threecomponent realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from the EAP data sampled by an extracellular probe at multiple independent recording locations. We used a dipole for the model source because there is extensive evidence it accurately captures the spatial rolloff of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cellprobe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole moment optimization where the dipole location is fixed, and a second, nonlinear, global optimization of the source location. 
We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis. We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among various model error contributions to the residual, we address especially the error in probe geometry, and the extent to which it biases estimates of dipole parameters. This dipole characterization method can be applied to any recording technique that has the capabilities of taking multiple independent measurements of the same single units. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductancebased neuron together with intra and extracellular ion concentration dynamics. We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodiumpotassium pumps, glia, and diffusion can produce very slow and largeamplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. 
These models reveal several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy. Abstract This paper introduces dyadic brain modeling – the simultaneous, computational modeling of the brains of two interacting agents – to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization , a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the ‘dyad’). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists and modelers that will help speed the emergence of what may be called comparative neuroprimatology . Abstract The phase response curve (PRC) reflects the dynamics of the interplay between diverse intrinsic conductances that lead to spike generation. PRCs measure the spike time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddlenode) dynamics while type II (subcritical Hopf dynamics) yield a biphasic PRC with both negative and positive lobes. 
Previous computational work hypothesized that cholinergic modulation of Mtype potassium current can switch a neuron with type II dynamics to type I dynamics. We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with downregulation of slow voltagedependent potassium currents such as the Mcurrent, indeed changed the PRC from type II to type I. We then explored the potential specific Kcurrentdependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spikefrequency adaptation due to downregulation of the Mcurrent is associated with the switch in PRC type. Interestingly spikedependent IAHP is downregulated at lower Ach concentrations than the Mcurrent. Our simulations showed that type II nature of the PRC is amplified by low Ach level, while the PRC became type I at high Ach concentrations. We further explored the spatial aspects of Ach modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may shape neuronal responding between “resonator” to “integrator.” Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double precision values by each subtree at each time step. Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For computebound simulations load balance results in almost ideal runtime scaling. Application of the cell splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with wholecell balancing. 
Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts. It has been recently reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. Ca 2+ activated K + currents ( I K[Ca] ) of cultured human cardiac fibroblasts were characterized in this study. In wholecell configuration, depolarizing pulses evoked I K(Ca) in an outward rectification in these cells, the amplitude of which was suppressed by paxilline (1 μ M ) or iberiotoxin (200 n M ). A largeconductance, Ca 2+ activated K + (BK Ca ) channel with singlechannel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of αsubunit of BK Ca channels. The dynamic LuoRudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. In the simulation, electrically coupled cardiac fibroblasts also exhibited action potential; however, they were electrically inert with no gapjunctional coupling. The simulation predicts that changes in gap junction coupling conductance can influence the configuration of cardiac action potential and cardiomyocyte excitability. I k(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes. This study demonstrates that a BK Ca channel is functionally expressed in human cardiac fibroblasts. The activity of these BK Ca channels present in human cardiac fibroblasts may contribute to the functional activities of heart cells through transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of its solutions and the transitions between them as parameters are varied. 
To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multicompartment models. We use this method to systematically reduce the dimension of a twocompartment conductancebased model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation—tonic spiking, intermediate bursting and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16 dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. The hybrid model highlights lowdimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid lowdimensional models from any large multicompartment conductancebased model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli. Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed highdimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. 
While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons modeled after those in the retina, the twodimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement. Results We describe a novel neural circuit mechanism, nontopographical contrast enhancement (NTCE), which enables contrast enhancement among highdimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require a builtin foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. Conclusion Nontopographical contrast enhancement demonstrates how intrinsically highdimensional sensory data can be represented and processed within a physically twodimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, highdimensional similarity space defined by the olfactory receptor complement and underlies the concentrationindependence of odor quality representations. Abstract Mathematical neuronal models are normally expressed using differential equations. The ParkerSochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. 
Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker–Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin–Huxley-type neuron, comparing the results with those obtained using the Runge–Kutta and Bulirsch–Stoer methods. Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC–granule cell (GC) circuit with and without gap junctions between GoCs. Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change oscillation frequency or the mean firing rate of either GoCs or GCs. Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs.
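The Parker–Sochacki scheme described above advances the solution as a power series whose order adapts to a tolerance at each step. A minimal sketch on the linear test equation dy/dt = -y (chosen purely for illustration; the paper itself applies the method to the Izhikevich and Hodgkin–Huxley models):

```python
import math

def ps_step(y0, h, tol=1e-12, max_order=30):
    """One Parker-Sochacki step for the test equation dy/dt = -y.

    The Maclaurin coefficients obey a[n+1] = -a[n] / (n + 1); terms are
    summed until they drop below tol, so the order adapts to the local
    conditions without changing the integration time step h.
    """
    a = y0        # current coefficient a[n]
    total = y0    # running power-series sum
    hn = 1.0      # h**n
    for n in range(max_order):
        a = -a / (n + 1)
        hn *= h
        term = a * hn
        total += term
        if abs(term) < tol:
            break
    return total

y = 1.0
for _ in range(10):        # integrate from t = 0 to t = 1 in ten steps
    y = ps_step(y, 0.1)
print(abs(y - math.exp(-1.0)))   # tiny error versus the exact solution e**-1
```

For nonlinear models such as the Izhikevich neuron, the same recurrence idea applies, with coefficient products (Cauchy products) handling the polynomial terms.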
The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley-type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours or days of supercomputing time. Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. In consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics or channel kinetics, and intrinsic noise. We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data. Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes from development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models.
Recent studies show that both the conductance and the kinetics of these receptors change voltage-dependently in some parts of the brain. Therefore, several models have been introduced to simulate their current. However, on the one hand, kinetic models, which are able to simulate these voltage-dependent phenomena, are computationally expensive for the modeling of large neural networks. On the other hand, classic exponential models, which are computationally less expensive, are not able to simulate the voltage dependence of these receptors accurately. In this study, we have modified these classic models to endow them with voltage-dependent conductance and time constants. Temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of NMDA receptor behavior using only three to four differential equations, significantly fewer than in previous kinetic models. Consequently, our model is both fast and physiologically plausible, and is therefore a suitable candidate for the modeling of large neural networks. Abstract Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons. This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal cell firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that the γ-oscillating interneuron networks exert an obvious rhythmic gating effect on the pyramidal neuron’s signal transmission.
This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of coherence and oscillation frequency of network activities. Abstract Background Propagation of simulated action potentials (APs) was previously studied in short single chains and in two-dimensional sheets of myocardial cells [1–3]. The present study was undertaken to examine propagation in a long single chain of cells of various lengths, and with varying numbers of gap-junction (gj) channels, and to compare propagation velocity with cable properties such as the length constant (λ). Methods and Results Simulations were carried out using the PSpice program as previously described. When the electric field (EF) mechanism was dominant (0, 1, and 10 gj channels), the longer the chain length, the faster the overall velocity (θov). There seems to be no simple explanation for this phenomenon. In contrast, when the local-circuit current mechanism was dominant (100 gj channels or more), θov was slightly slowed by lengthening of the chain. Increasing the number of gj channels produced an increase in θov and caused the firing order to become more uniform. The end-effect was more pronounced at longer chain lengths and at greater numbers of gj channels. When there were no or only few gj channels (namely 0, 10, or 30), the voltage change (ΔVm) in the two cells (#50 and #52) contiguous to the cell injected with current (#51) was nearly zero, i.e., there was a sharp discontinuity in voltage between the adjacent cells.
When there were many gj channels (e.g., 300, 1000, 3000), there was an exponential decay of voltage on either side of the injected cell, with the length constant (λ) increasing at higher numbers of gj channels. The effect of increasing the number of gj channels on increasing λ was relatively small compared to the larger effect on θov. θov became very non-physiological at 300 gj channels or higher. Conclusion Thus, when there were only 0, 1, or 10 gj channels, θov increased with increase in chain length, whereas at 100 gj channels or higher, θov did not increase with chain length. When there were only 0, 10, or 30 gj channels, there was a very sharp decrease in ΔVm in the two contiguous cells on either side of the injected cell, whereas at 300, 1000, or 3000 gj channels, the voltage decay was exponential along the length of the chain. The effect of increasing the number of gj channels on the spread of current was relatively small compared to the large effect on θov. Abstract This article provides a demonstration of an analytical technique that can be used to investigate the causes of perceptual phenomena. The technique is based on the concept of the ideal observer, an optimal signal classifier that makes decisions that maximize the probability of a correct response. To demonstrate the technique, an analysis was conducted to investigate the role of the auditory periphery in the production of temporal masking effects. The ideal observer classified output from four models of the periphery. Since the ideal observer is the best of all possible observers, if it demonstrates masking effects, then all other observers must as well. If it does not demonstrate masking effects, then nothing about the periphery requires masking to occur, and therefore masking would occur somewhere else. The ideal observer exhibited several forward masking effects but did not exhibit backward masking, implying that the periphery has a causal role in forward but not backward masking.
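The ideal-observer rule just described (choose the response that maximizes the probability of a correct answer) can be sketched for a toy two-class discrimination. The Gaussian class-conditional outputs, means, variance, and trial count below are illustrative assumptions, not the article's four peripheral models:

```python
import random

random.seed(0)

# Hypothetical peripheral outputs: each stimulus class yields a Gaussian
# observation (the article instead classifies the outputs of four
# auditory-periphery models; this toy setup stands in for them).
mu = {"signal": 1.0, "noise": 0.0}
sigma = 1.0

def loglik(x, m):
    # Gaussian log-likelihood up to a constant shared by both classes
    return -((x - m) ** 2) / (2.0 * sigma ** 2)

def ideal_observer(x):
    # With equal priors, picking the maximum-likelihood class maximizes
    # the probability of a correct response -- the ideal-observer rule.
    return max(mu, key=lambda c: loglik(x, mu[c]))

trials = 20000
correct = 0
for _ in range(trials):
    truth = random.choice(list(mu))
    correct += ideal_observer(random.gauss(mu[truth], sigma)) == truth
print(correct / trials)   # close to the optimum Phi(d'/2) ~ 0.69 for d' = 1
```

Because no other decision rule can beat this classifier's proportion correct, any masking effect it exhibits must be inherited by every real observer, which is the logic the article exploits.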
A general discussion of the strengths of the technique and supplementary equations are also included. Abstract Understanding the human brain and its function in health and disease represents one of the greatest scientific challenges of our time. In the post-genomic era, an overwhelming accumulation of new data, at all levels of exploration from DNA to human brain imaging, has been acquired. This accumulation of facts has not given rise to a corresponding increase in the understanding of integrated functions in this vast area of research, which involves a large number of fields extending from genetics to psychology. Neuroinformatics (NI) is uniquely placed at the intersection between neuroscience and information technology, and emerges as an area of critical importance for facilitating future conceptual development in neuroscience by creating databases which transcend different organizational levels and allow for the development of different computational models from the subcellular to the global brain level. Abstract This paper studied synaptic and dendritic integration with different spatial distributions of synapses on the dendrites of a biophysically detailed layer 5 pyramidal neuron model. It has been observed that temporally synchronous and spatially clustered synaptic inputs make dendrites perform a highly nonlinear integration. The effect of the clustering degree of the synaptic distribution on neuronal responsiveness is investigated by changing the number of top apical dendrites where active synapses are allocated. The neuron shows maximum responsiveness to synaptic inputs with an intermediate clustering degree of spatial distribution, indicating complex interactions among dendrites in the presence of nonlinear synaptic and dendritic integration.
Abstract This paper describes a pilot query interface that has been constructed to help us explore a “concept-based” approach for searching the Neuroscience Information Framework (NIF). The query interface is concept-based in the sense that the search terms submitted through the interface are selected from a standardized vocabulary of terms (concepts) that are structured in the form of an ontology. The NIF contains three primary resources: the NIF Resource Registry, the NIF Document Archive, and the NIF Database Mediator. These NIF resources are very different in their nature and therefore pose challenges when designing a single interface from which searches can be automatically launched against all three resources simultaneously. The paper first discusses briefly several background issues involving the use of standardized biomedical vocabularies in biomedical information retrieval, and then presents a detailed example that illustrates how the pilot concept-based query interface operates. The paper concludes by discussing certain lessons learned in the development of the current version of the interface. Abstract Simulations of orientation selectivity in visual cortex have shown that layer 4 complex cells lacking orientation tuning are ideal for providing global inhibition that scales with contrast in order to produce simple cells with contrast-invariant orientation tuning (Lauritzen and Miller in J Neurosci 23:10201–10213, 2003). Inhibitory cortical cells have been shown to be electrically coupled by gap junctions (Fukuda and Kosaka in J Neurosci 120:5–20, 2003). Such coupling promotes, among other effects, spike synchronization and coordination of postsynaptic IPSPs (Beierlein et al. in Nat Neurosci 3:904–910, 2000; Galarreta and Hestrin in Nat Rev Neurosci 2:425–433, 2001).
Consequently, it was expected (Miller in Cereb Cortex 13:73–82, 2003) that electrical coupling would promote non-specific functional responses consistent with the complex inhibitory cells seen in layer 4, which provide broad inhibition in response to stimuli of all orientations (Miller et al. in Curr Opin Neurobiol 11:488–497, 2001). This was tested using a mechanistic modeling approach. The orientation selectivity model of Lauritzen and Miller (J Neurosci 23:10201–10213, 2003) was reproduced with and without electrical coupling between complex inhibitory neurons. Although extensive coupling promotes uniform firing in complex cells, there were no detectable improvements in contrast-invariant orientation selectivity unless there were coincident changes in complex cell firing rates to offset the untuned excitatory component that grows with contrast. Thus, changes in firing rates (with or without coupling) could improve contrast-invariant orientation tuning of simple cells, but synchronization of complex inhibitory neurons alone could not. Abstract Coral polyps contract when electrically stimulated, and a wave of contraction travels from the site of stimulation at a constant speed. Models of coral nerve networks were optimized to match one of three different experimentally observed behaviors. To search for model parameters that reproduce the experimental observations, we applied genetic algorithms to increasingly more complex models of a coral nerve net. In a first stage of optimization, individual neurons responded with spikes to multiple, but not single, pulses of activation. In a second stage, we used these neurons as the starting point for the optimization of a two-dimensional nerve net. This strategy yielded a network with parameters that reproduced the experimentally observed spread of excitation. Abstract Spike-wave discharges are a distinctive feature of epileptic seizures. So far, they have not been reported in spatially extended neural field models.
We study a space-independent version of the Amari neural field model with two competing inhibitory populations. We show that this competition leads to robust spike-wave dynamics if the inhibitory populations operate on different timescales. The spike-wave oscillations present fold/homoclinic-type bursting. From this result we predict parameters of the extended Amari system where spike-wave oscillations produce a spatially homogeneous pattern. We propose this mechanism as a prototype of macroscopic epileptic spike-wave discharges. To our knowledge this is the first example of robust spike-wave patterns in a spatially extended neural field model. Abstract Cortical gamma frequency (30–80 Hz) oscillations have been suggested to underlie many aspects of cognitive function. In this paper we compare the f–I curves obtained under gamma-frequency-modulated stimulation and Poisson synaptic input at the distal dendrites of a layer V pyramidal neuron model. The results show that gamma-frequency distal input amplifies the sensitivity of the neural response to basal input and enhances gain modulation of the neuron. Abstract Inward-rectifying potassium (KIR) currents in medium spiny (MS) neurons of the nucleus accumbens inactivate significantly in ~40% of the neurons but not in the rest, which may lead to differences in input processing between these two groups. Using a 189-compartment computational model of the MS neuron, we investigate the influence of this property using injected current as well as spatiotemporally distributed synaptic inputs. Our study demonstrates that KIR current inactivation facilitates depolarization, firing frequency, and firing onset in these neurons. These effects may be attributed to the higher input resistance of the cell as well as a more depolarized resting/down-state potential induced by the inactivation of this current.
In view of the reports that dendritic intracellular calcium levels depend closely on burst strength and spike onset time, our findings suggest that inactivation of KIR currents may offer a means of modulating both excitability and synaptic plasticity in MS neurons. Abstract Epileptic seizures in diabetic hyperglycemia (DH) are not uncommon. This study aimed to determine the acute behavioral, pathological, and electrophysiological effects of status epilepticus (SE) on diabetic animals. Adult male Sprague-Dawley rats were first divided into groups with and without streptozotocin (STZ)-induced diabetes, and then into treatment groups given normal saline (NS) (STZ-only and NS-only) or a lithium-pilocarpine injection to induce status epilepticus (STZ + SE and NS + SE). Seizure susceptibility, severity, and mortality were evaluated. Serial Morris water maze test and hippocampal histopathology results were examined before and 24 h after SE. Tetanic stimulation-induced long-term potentiation (LTP) in a hippocampal slice was recorded in a multi-electrode dish system. We also used a simulation model to evaluate intracellular adenosine triphosphate (ATP) and neuroexcitability. The STZ + SE group had a significantly higher percentage of severe seizures and SE-related death, and worse learning and memory performance, than the other three groups 24 h after SE. The STZ + SE group, followed by the NS + SE group, showed the most severe neuronal loss and mossy fiber sprouting in the hippocampal CA3 area. In addition, LTP was markedly attenuated in the STZ + SE group, followed by the NS + SE group. In the simulation, increased intracellular ATP concentration promoted action potential firing. The finding that rats with DH had more brain damage after SE than rats without diabetes suggests the importance of intensively treating hyperglycemia and seizures in diabetic patients with epilepsy. Neuroinformatics is a multifaceted field. It is as broad as the field of neuroscience.
The various domains of NI may also share some common features such as databases, data mining systems, and data modeling tools. NI projects are often coordinated by user groups or research organizations. Large-scale infrastructure supporting NI development is also a vital aspect of the field. Abstract Channelrhodopsin-2 (ChR2) is a class of light-sensitive proteins that offer the ability to use light stimulation to regulate neural activity with millisecond precision. In order to address the limitations in the efficacy of wild-type ChR2 (ChRwt) in achieving this objective, new variants of ChR2 that exhibit fast mono-exponential photocurrent decay characteristics have recently been developed and validated. In this paper, we investigate whether the framework of a four-state transition rate model, primarily developed to mimic the bi-exponential photocurrent decay kinetics of ChRwt, as opposed to the lower-complexity three-state model, is warranted to mimic the mono-exponential photocurrent decay kinetics of the newly developed fast ChR2 variants: ChETA (Gunaydin et al., Nature Neurosci. 13:387–392, 2010) and ChRET/TC (Berndt et al., Proc. Natl. Acad. Sci. 108:7595–7600, 2011). We begin by estimating the parameters of the three-state and four-state models from experimental data on the photocurrent kinetics of ChRwt, ChETA, and ChRET/TC. We then incorporate these models into a fast-spiking interneuron model (Wang and Buzsaki, J. Neurosci. 16:6402–6413, 1996) and a hippocampal pyramidal cell model (Golomb et al., J. Neurophysiol. 96:1912–1926, 2006) and investigate the extent to which the experimentally observed neural response to various optostimulation protocols can be captured by these models. We demonstrate that for all ChR2 variants investigated, the four-state model implementation is better able to capture neural responses consistent with experiments across a wide range of optostimulation protocols.
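The three-state scheme contrasted above (closed, open, desensitized) predicts mono-exponential photocurrent decay once the light is switched off, since in the dark dO/dt = -Gd·O. A toy Euler simulation with assumed rate constants (illustrative values, not the fitted ChETA/ChRET/TC parameters from the paper):

```python
# Minimal three-state ChR2 sketch: states C (closed), O (open), D (desensitized).
# Rate constants (per ms) are assumed demonstration values.
Ka, Gd, Gr = 0.5, 0.1, 0.004   # light-driven C->O, O->D, dark D->C recovery

def step(C, O, D, light, dt=0.01):
    dC = Gr * D - Ka * light * C
    dO = Ka * light * C - Gd * O
    dD = Gd * O - Gr * D
    return C + dC * dt, O + dO * dt, D + dD * dt

C, O, D = 1.0, 0.0, 0.0
O_off = 0.0
for i in range(200000):                 # 2000 ms total; light on for 500 ms
    light = 1.0 if i < 50000 else 0.0
    C, O, D = step(C, O, D, light)
    if i == 50000:
        O_off = O                       # open fraction just after light-off
# In the dark, dO/dt = -Gd*O, so the photocurrent (proportional to O)
# decays mono-exponentially with tau = 1/Gd = 10 ms; after 1500 ms in
# the dark, O has collapsed to essentially zero from a sizable O_off.
print(O_off, O)
```

The four-state model splits the open (and desensitized) state in two, which is what produces the bi-exponential decay seen for wild-type ChR2.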
We conclude by analytically investigating the conditions under which the characteristic specific to the three-state model, namely the mono-exponential photocurrent decay of the newly developed variants of ChR2, can occur in the framework of the four-state model. Abstract In cerebellar Purkinje cells, the β4 subunit of voltage-dependent Na+ channels has been proposed to serve as an open-channel blocker giving rise to a “resurgent” Na+ current (INaR) upon membrane repolarization. Notably, the β4 subunit was recently identified as a novel substrate of the β-secretase BACE1, a key enzyme of the amyloidogenic pathway in Alzheimer's disease. Here, we asked whether BACE1-mediated cleavage of the β4 subunit has an impact on INaR and, consequently, on the firing properties of Purkinje cells. In cerebellar tissue of BACE1−/− mice, mRNA levels of Na+ channel α-subunits 1.1, 1.2, and 1.6 and of β-subunits 1–4 remained unchanged, but processing of the β4 peptide was profoundly altered. Patch-clamp recordings from acutely isolated Purkinje cells of BACE1−/− and WT mice did not reveal any differences in steady-state properties or in current densities of transient, persistent, and resurgent Na+ currents. However, INaR was found to decay significantly faster in BACE1-deficient Purkinje cells than in WT cells. In modeling studies, the altered time course of INaR decay could be replicated when we decreased the efficiency of open-channel block. In current-clamp recordings, BACE1−/− Purkinje cells displayed a lower spontaneous firing rate than normal cells. Computer simulations supported the hypothesis that the accelerated decay kinetics of INaR are responsible for the slower firing rate. Our study elucidates a novel function of BACE1 in the regulation of neuronal excitability that serves to tune the firing pattern of Purkinje cells and presumably other neurons endowed with INaR.
Abstract The role of cortical feedback in the thalamocortical processing loop has been extensively investigated over the last decades. With a few exceptions, these studies have focused on the cortical feedback exerted onto thalamocortical relay (TC) cells of the dorsal lateral geniculate nucleus (LGN). In a previous physiological study, we showed in the cat visual system that cessation of cortical input, despite decreasing the spontaneous activity of TC cells, increased the spontaneous firing of their recurrent inhibitory interneurons located in the perigeniculate nucleus (PGN). To identify mechanisms underlying such functional changes, we conducted a modeling study in NEURON on several networks of point neurons with varied model parameters, such as membrane properties, synaptic weights, and axonal delays. We considered six network topologies of the retino-geniculo-cortical system. All models were robust against changes of axonal delays except for the delay between the LGN feedforward interneuron and the TC cell. The best representation of physiological results was obtained with models containing reciprocally connected PGN cells driven by the cortex and with relatively slow decay of intracellular calcium. This strongly indicates that the thalamic reticular nucleus plays an essential role in the cortical influence over thalamocortical relay cells, while the thalamic feedforward interneurons are not essential in this process. Further, we suggest that the dependence of the activity of PGN cells on the rate of calcium removal can be one of the key factors determining individual cell responses to elimination of cortical input. Abstract The nucleus accumbens (NAc), a critical structure of the brain reward circuit, is implicated in normal goal-directed behaviour and learning as well as in pathological conditions like schizophrenia and addiction.
Its major cellular substrates, the medium spiny (MS) neurons, possess a wide variety of dendritic active conductances that may modulate excitatory postsynaptic potentials (EPSPs) and cell excitability. We examine this issue using a biophysically detailed 189-compartment stylized model of the NAc MS neuron, incorporating all the known active conductances. We find that, of all the active channels, inward-rectifying K+ (KIR) channels play the primary role in modulating the resting membrane potential (RMP) and EPSPs in the down-state of the neuron. Reduction in the conductance of KIR channels evokes facilitatory effects on EPSPs, accompanied by rises in local input resistance and membrane time constant. At depolarized membrane potentials closer to up-state levels, the slowly inactivating A-type potassium channel (KAs) conductance also plays a strong role in determining synaptic potential parameters and cell excitability. We discuss the implications of our results for the regulation of accumbal MS neuron biophysics and synaptic integration by intrinsic factors and extrinsic agents such as dopamine. Abstract The computer-assisted three-dimensional reconstruction of neuronal morphology is becoming an increasingly popular technique to quantify the arborization patterns of dendrites and axons. The resulting digital files are suitable for comprehensive morphometric analyses as well as for building anatomically realistic compartmental models of membrane biophysics and neuronal electrophysiology. The digital tracings acquired in a lab for a specific purpose can often be reused by a different research group to address a completely unrelated scientific question, if the original investigators are willing to share the data. Since reconstructing neuronal morphology is a labor-intensive process, data sharing and reanalysis is particularly advantageous for the neuroscience and biomedical communities.
Here we present numerous cases of “success stories” in which digital reconstructions of neuronal morphology were shared and reused, leading to additional, independent discoveries and publications, and thus amplifying the impact of the “source” study for which the data set was first collected. In particular, we overview four main applications of this kind of data: comparative morphometric analyses, statistical estimation of potential synaptic connectivity, morphologically accurate electrophysiological simulations, and computational models of neuronal shape and development. Abstract The chapter describes a novel computational approach to modeling cortex dynamics that integrates gene–protein regulatory networks with a neural network model. Interaction of genes and proteins in neurons affects the dynamics of the whole neural network. We have adopted an exploratory approach of investigating many randomly generated gene regulatory matrices, out of which we kept those that generated interesting dynamics. This naïve brute-force approach served to explore the potential application of computational neurogenetic models in relation to gene-knockout neurogenetics experiments. The knockout of a hypothetical gene for fast inhibition in our artificial genome led to interesting neural activity. Although the artificial gene/protein network was altered by the single gene knockout, the spiking dynamics of the SNN were most of the time very similar to those obtained with the complete gene/protein network. However, from time to time the neurons spontaneously and temporarily synchronized their spiking into coherent global oscillations. In our model, the fluctuations in the values of neuronal parameters lead to the spontaneous development of seizure-like global synchronizations.
These very same fluctuations also lead to termination of the seizure-like neural activity and maintenance of the interictal normal periods of activity. Based on our model, we suggest the hypothesis that parameter changes due to gene–protein dynamics should also be included as a serious factor determining transitions in neural dynamics, especially when the cause of disease is known to be genetic. Abstract The local field potential (LFP) is among the most important experimental measures when probing neural population activity, but a proper understanding of the link between the underlying neural activity and the LFP signal is still missing. Here we investigate this link by mathematical modeling of contributions to the LFP from a single layer-5 pyramidal neuron and a single layer-4 stellate neuron receiving synaptic input. An intrinsic dendritic low-pass filtering effect of the LFP signal, previously demonstrated for extracellular signatures of action potentials, is seen to strongly affect the LFP power spectra, even for frequencies as low as 10 Hz for the example pyramidal neuron. Further, the LFP signal is found to depend sensitively on both the recording position and the position of the synaptic input: the LFP power spectra recorded close to the active synapse are typically found to be less low-pass filtered than spectra recorded farther away. Some recording positions display striking band-pass characteristics of the LFP. The frequency dependence of the properties of the current dipole moment set up by the synaptic input current is found to qualitatively account for several salient features of the observed LFP. Two approximate schemes for calculating the LFP, the dipole approximation and the two-monopole approximation, are tested and found to be potentially useful for translating results from large-scale neural network models into predictions for results from electroencephalographic (EEG) or electrocorticographic (ECoG) recordings.
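The two approximation schemes just mentioned can be compared directly for a static current source/sink pair in a homogeneous medium: the two-monopole expression is exact for point sources, while the dipole form p·cosθ/(4πσr²) holds only far from the pair. A sketch with assumed geometry and conductivity (illustrative values, not the paper's neuron models):

```python
import math

sigma = 0.3   # extracellular conductivity, S/m (a typical assumed value)
I = 1e-9      # current of the source/sink pair, A
d = 200e-6    # source-sink separation along the z-axis, m

def phi_two_monopole(x, z):
    """Exact potential of a +I/-I point-source pair straddling the origin."""
    rp = math.hypot(x, z - d / 2)
    rm = math.hypot(x, z + d / 2)
    return I / (4 * math.pi * sigma) * (1 / rp - 1 / rm)

def phi_dipole(x, z):
    """Far-field dipole approximation with moment p = I*d."""
    r = math.hypot(x, z)
    return I * d * (z / r) / (4 * math.pi * sigma * r ** 2)

def rel_err(x, z):
    exact = phi_two_monopole(x, z)
    return abs(exact - phi_dipole(x, z)) / abs(exact)

err_near = rel_err(50e-6, 100e-6)   # r comparable to d
err_far = rel_err(2e-3, 3e-3)       # r much larger than d
print(err_near, err_far)  # the dipole form is accurate only in the far field
```

This is why the dipole approximation is attractive for EEG-scale predictions (electrodes far from the sources) but must be checked carefully for nearby LFP electrodes.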
Abstract Dopaminergic (DA) neurons of the mammalian midbrain exhibit unusually low firing frequencies in vitro. Furthermore, injection of depolarizing current induces depolarization block before high frequencies are achieved. The maximum steady and transient rates are about 10 and 20 Hz, respectively, despite the ability of these neurons to generate bursts at higher frequencies in vivo. We use a three-compartment model calibrated to reproduce DA neuron responses to several pharmacological manipulations to uncover mechanisms of frequency limitation. The model exhibits a slow oscillatory potential (SOP) dependent on the interplay between the L-type Ca2+ current and the small-conductance K+ (SK) current that is unmasked by fast Na+ current block. Contrary to previous theoretical work, the SOP does not pace the steady spiking frequency in our model. The main currents that determine the spontaneous firing frequency are the subthreshold L-type Ca2+ and A-type K+ currents. The model identifies the channel densities for the fast Na+ and delayed rectifier K+ currents as critical parameters limiting the maximal steady frequency evoked by a depolarizing pulse. We hypothesize that the low maximal steady frequencies result from a low safety factor for action potential generation. In the model, the rate of Ca2+ accumulation in the distal dendrites controls the transient initial frequency in response to a depolarizing pulse. Similar results are obtained when the same model parameters are used in a multicompartmental model with a realistic reconstructed morphology, indicating that the salient contributions of the dendritic architecture have been captured by the simpler model. Abstract Background As interest in adopting the Semantic Web in the biomedical domain continues to grow, Semantic Web technology has been evolving and maturing.
A variety of technological approaches including triple-store technologies, SPARQL endpoints, Linked Data, and the Vocabulary of Interlinked Datasets have emerged in recent years. In addition to data warehouse construction, these technological approaches can be used to support dynamic query federation. As a community effort, the BioRDF task force, within the Semantic Web for Health Care and Life Sciences Interest Group, is exploring how these emerging approaches can be utilized to execute distributed queries across different neuroscience data sources. Methods and results We have created two health care and life science knowledge bases. We have explored a variety of Semantic Web approaches to describe, map, and dynamically query multiple datasets. We have demonstrated several federation approaches that integrate diverse types of information about neurons and receptors that play an important role in basic, clinical, and translational neuroscience research. In particular, we have created a prototype receptor explorer which uses OWL mappings to provide an integrated list of receptors and executes individual queries against different SPARQL endpoints. We have also employed the AIDA Toolkit, which is directed at groups of knowledge workers who cooperatively search, annotate, interpret, and enrich large collections of heterogeneous documents from diverse locations. We have explored a tool called "FeDeRate", which enables a global SPARQL query to be decomposed into subqueries against remote databases offering either SPARQL or SQL query interfaces. Finally, we have explored how to use the Vocabulary of Interlinked Datasets (voiD) to create metadata describing datasets exposed as Linked Data URIs or SPARQL endpoints. Conclusion We have demonstrated the use of a set of novel and state-of-the-art Semantic Web technologies in support of a neuroscience query federation scenario. We have identified both the strengths and weaknesses of these technologies. 
While the Semantic Web offers a global data model including the use of Uniform Resource Identifiers (URIs), the proliferation of semantically equivalent URIs hinders large-scale data integration. Our work helps direct research and tool development, which will be of benefit to this community. Abstract Injury to neural tissue renders voltage-gated Na+ (Nav) channels leaky. Even mild axonal trauma initiates Na+ loading, leading to secondary Ca2+ loading and white matter degeneration. The nodal isoform is Nav1.6, and for Nav1.6-expressing HEK cells, traumatic whole-cell stretch causes an immediate tetrodotoxin-sensitive Na+ leak. In stretch-damaged oocyte patches, Nav1.6 current undergoes damage-intensity-dependent hyperpolarizing (left) shifts, but whether left-shift underlies injured-axon Nav leak is uncertain. Nav1.6 inactivation (availability) is kinetically limited by (coupled to) Nav activation, yielding coupled left-shift (CLS) of the two processes: CLS should move the steady-state Nav1.6 “window conductance” closer to typical firing thresholds. Here we simulated excitability and ion homeostasis in free-running nodes of Ranvier to assess whether hallmark injured-axon behaviors—Na+ loading, ectopic excitation, propagation block—would occur with Nav-CLS. Intact/traumatized axolemma ratios were varied, and for some simulations Na/K pumps were included, with varied inside/outside volumes. We simulated saltatory propagation with one mid-axon node variously traumatized. While dissipating the [Na+] gradient and hyperactivating the Na/K pump, Nav-CLS generated neuropathic pain-like ectopic bursts. Depending on CLS magnitude, fraction of Nav channels affected, and pump intensity, tonic or burst firing or nodal inexcitability occurred, with [Na+] and [K+] fluctuating. Severe CLS-induced inexcitability did not preclude Na+ loading; in fact, the steady-state Na+ leaks elicited large pump currents. 
At a mid-axon node, mild CLS perturbed normal anterograde propagation, and severe CLS blocked saltatory propagation. These results suggest that in damaged excitable cells, Nav-CLS could initiate cellular deterioration with attendant hyper- or hypoexcitability. Healthy-cell versions of Nav-CLS, however, could contribute to physiological rhythmic firing. Abstract Lateral inhibition of cells surrounding an excited area is a key property of sensory systems, sharpening the preferential tuning of individual cells in the presence of closely related input signals. In the olfactory pathway, a dendrodendritic synaptic microcircuit between mitral and granule cells in the olfactory bulb has been proposed to mediate this type of interaction through granule cell inhibition of surrounding mitral cells. However, it is becoming evident that odor inputs result in broad activation of the olfactory bulb with interactions that go beyond neighboring cells. Using a realistic modeling approach, we show how backpropagating action potentials in the long lateral dendrites of mitral cells, together with granule cell actions on mitral cells within narrow columns forming glomerular units, can provide a mechanism to activate strong local inhibition between arbitrarily distant mitral cells. The simulations predict a new role for the dendrodendritic synapses in the multicolumnar organization of the granule cells. This new paradigm gives insight into the functional significance of the patterns of connectivity revealed by recent viral tracing studies. Together they suggest a functional wiring of the olfactory bulb that could greatly expand the computational roles of the mitral–granule cell network. Abstract Spinal motor neurons have voltage-gated ion channels localized in their dendrites that generate plateau potentials. The physical separation of ion channels for spiking from plateau-generating channels can result in nonlinear bistable firing patterns. 
The physical separation and geometry of the dendrites result in asymmetric coupling between dendrites and soma that has not been addressed in reduced models of nonlinear phenomena in motor neurons. We measured voltage attenuation properties of six anatomically reconstructed and type-identified cat spinal motor neurons to characterize asymmetric coupling between the dendrites and soma. We showed that the voltage attenuation at any distance from the soma was direction-dependent and could be described as a function of the input resistance at the soma. An analytical solution for the lumped cable parameters in a two-compartment model was derived based on this finding. This is the first two-compartment modeling approach that directly derived lumped cable parameters from the geometrical and passive electrical properties of anatomically reconstructed neurons. Abstract Models for temporary information storage in neuronal populations are dominated by mechanisms directly dependent on synaptic plasticity. There are nevertheless other mechanisms available that are well suited for creating short-term memories. Here we present a model for working memory which relies on the modulation of the intrinsic excitability properties of neurons, instead of synaptic plasticity, to retain novel information for periods of seconds to minutes. We show that it is possible to effectively use this mechanism to store the serial order in a sequence of patterns of activity. For this we introduce a functional class of neurons, named gate interneurons, which can store information in their membrane dynamics and can literally act as gates routing the flow of activations in the principal neuron population. The presented model exhibits properties which are in close agreement with experimental results in working memory. Namely, the recall process plays an important role in stabilizing and prolonging the memory trace. This means that the stored information is correctly maintained as long as it is being used. 
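The direction-dependent attenuation that motivates the asymmetric two-compartment coupling described above can be seen already in a passive two-compartment circuit; this minimal sketch uses illustrative conductances, not the lumped parameters derived in the study.

```python
import numpy as np

def steady_state(g_soma, g_dend, g_c, I_soma=0.0, I_dend=0.0):
    """Solve the passive two-compartment steady state:
    g_i*V_i + g_c*(V_i - V_j) = I_i for i in {soma, dend}."""
    A = np.array([[g_soma + g_c, -g_c],
                  [-g_c, g_dend + g_c]])
    return np.linalg.solve(A, np.array([I_soma, I_dend]))  # [V_soma, V_dend]

g_s, g_d, g_c = 20e-9, 5e-9, 10e-9  # leak and coupling conductances (S), illustrative

# attenuation soma -> dendrite (current injected at the soma)
Vs, Vd = steady_state(g_s, g_d, g_c, I_soma=0.1e-9)
att_sd = Vd / Vs          # analytically g_c / (g_d + g_c)

# attenuation dendrite -> soma (current injected at the dendrite)
Vs2, Vd2 = steady_state(g_s, g_d, g_c, I_dend=0.1e-9)
att_ds = Vs2 / Vd2        # analytically g_c / (g_s + g_c)
```

Whenever the two leak conductances differ, the attenuation in the two directions differs as well; matching such measured direction-dependent attenuations is what constrains the lumped cable parameters.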
Moreover, the working memory model is adequate for storing completely new information, in time windows compatible with the notion of “one-shot” learning (hundreds of milliseconds). Abstract For the analysis of neuronal cooperativity, simultaneously recorded extracellular signals from neighboring neurons need to be sorted reliably by a spike sorting method. Many algorithms have been developed to this end; however, to date, none of them manages to fulfill a set of demanding requirements. In particular, it is desirable to have an algorithm that operates online, detects and classifies overlapping spikes in real time, and adapts to nonstationary data. Here, we present a combined spike detection and classification algorithm which explicitly addresses these issues. Our approach makes use of linear filters to find a new representation of the data and to optimally enhance the signal-to-noise ratio. We introduce a method called “Deconfusion” which decorrelates the filter outputs and provides source separation. Finally, a set of well-defined thresholds is applied, leading to simultaneous spike detection and spike classification. By incorporating a direct feedback, the algorithm adapts to nonstationary data and is therefore well suited for acute recordings. We evaluate our method on simulated and experimental data, including simultaneous intra-/extracellular recordings made in slices of rat cortex and recordings from the prefrontal cortex of awake behaving macaques. We compare the results to existing spike detection as well as spike sorting methods. We conclude that our algorithm meets all of the mentioned requirements and outperforms other methods under realistic signal-to-noise ratios and in the presence of overlapping spikes. Abstract Avian nucleus isthmi pars parvocellularis (Ipc) neurons are reciprocally connected with the layer 10 (L10) neurons in the optic tectum and respond with oscillatory bursts to visual stimulation. 
Our in vitro experiments show that both neuron types respond with regular spiking to somatic current injection and that the feedforward and feedback synaptic connections are excitatory, but of different strength and time course. To elucidate mechanisms of oscillatory bursting in this network of regularly spiking neurons, we investigated an experimentally constrained model of coupled leaky integrate-and-fire neurons with spike-rate adaptation. The model reproduces the observed Ipc oscillatory bursting in response to simulated visual stimulation. A scan through the model parameter volume reveals that Ipc oscillatory burst generation can be caused by strong and brief feedforward synaptic conductance changes. The mechanism is sensitive to the parameter values of spike-rate adaptation. In conclusion, we show that a network of regular-spiking neurons with feedforward excitation and spike-rate adaptation can generate oscillatory bursting in response to a constant input. Abstract Electrical stimulation of the central nervous system creates both orthodromically propagating action potentials, by stimulation of local cells and passing axons, and antidromically propagating action potentials, by stimulation of presynaptic axons and terminals. Our aim was to understand how antidromic action potentials navigate through complex arborizations, such as those of thalamic and basal ganglia afferents—sites of electrical activation during deep brain stimulation. We developed computational models to study the propagation of antidromic action potentials past the bifurcation in branched axons. In both unmyelinated and myelinated branched axons, when the diameters of each axon branch remained under a specific threshold (set by the antidromic geometric ratio), antidromic propagation occurred robustly; action potentials traveled both antidromically into the primary segment as well as “re-orthodromically” into the terminal secondary segment. 
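A minimal sketch of the key ingredient of the isthmi network model above, a leaky integrate-and-fire neuron with spike-rate adaptation; all parameters here are illustrative, not the experimentally constrained values.

```python
def lif_adapt(T=1.0, dt=1e-4, I=2.0, tau=0.02, tau_w=0.2, b=0.5,
              v_th=1.0, v_reset=0.0):
    """Euler integration of dv/dt = (-v + I - w)/tau, dw/dt = -w/tau_w;
    on a spike, v resets and the adaptation variable jumps: w += b."""
    v, w = 0.0, 0.0
    spikes = []
    for step in range(int(T / dt)):
        v += dt * (-v + I - w) / tau
        w += dt * (-w / tau_w)
        if v >= v_th:
            v = v_reset
            w += b
            spikes.append(step * dt)
    return spikes

spikes = lif_adapt()
# adaptation lengthens interspike intervals under constant drive
isi_first = spikes[1] - spikes[0]
isi_last = spikes[-1] - spikes[-2]
```

Under constant drive the adaptation variable accumulates with each spike and decays slowly, so firing relaxes from a high initial rate to a lower adapted rate; in the coupled network, this same mechanism shapes the oscillatory bursting.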
Propagation occurred across a broad range of stimulation frequencies, axon segment geometries, and concentrations of extracellular potassium, but was strongly dependent on the geometry of the node of Ranvier at the axonal bifurcation. Thus, antidromic activation of axon terminals can, through axon collaterals, lead to widespread activation or inhibition of targets remote from the site of stimulation. These effects should be included when interpreting the results of functional imaging or evoked potential studies on the mechanisms of action of DBS. Abstract The response of an oscillator to perturbations is described by its phase-response curve (PRC), which is related to the type of bifurcation leading from rest to tonic spiking. In a recent experimental study, we have shown that the type of PRC in cortical pyramidal neurons can be switched by cholinergic neuromodulation from type II (biphasic) to type I (monophasic). We explored how intrinsic mechanisms affected by acetylcholine influence the PRC using three different types of neuronal models: a theta neuron, single-compartment neurons and a multi-compartment neuron. In all of these models a decrease in the amount of a spike-frequency adaptation current was a necessary and sufficient condition for the shape of the PRC to change from biphasic (type II) to purely positive (type I). Abstract Small-conductance (SK) calcium-activated potassium channels are found in many tissues throughout the body and open in response to elevations in intracellular calcium. In hippocampal neurons, SK channels are spatially colocalized with L-type calcium channels. Due to the restriction of calcium transients into microdomains, only a limited number of L-type Ca2+ channels can activate SK and, thus, stochastic gating becomes relevant. Using a stochastic model with calcium microdomains, we predict that intracellular Ca2+ fluctuations resulting from Ca2+ channel gating can increase SK2 subthreshold activity by 1–2 orders of magnitude. 
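The amplification described here can be caricatured without a full channel model: for a supralinear Hill-type activation, brief high-amplitude calcium pulses activate far more channels than a steady signal with the same mean. The Hill parameters below are illustrative, not fitted SK2 values, and the caricature ignores the kinetic (time-constant) contribution discussed in the abstract.

```python
def hill(c, K=0.3, n=4):
    """Steady-state SK-like activation as a Hill function of calcium (uM);
    K and n are illustrative assumptions."""
    return c**n / (c**n + K**n)

mean_ca = 0.1  # uM; both scenarios below have this time-averaged calcium

# deterministic caricature: steady calcium pinned at the mean
act_steady = hill(mean_ca)

# stochastic caricature: 1.0 uM pulses 10% of the time, zero otherwise (same mean)
act_pulsed = 0.1 * hill(1.0) + 0.9 * hill(0.0)

amplification = act_pulsed / act_steady  # roughly an order of magnitude here
```

Because the activation curve rises supralinearly at low calcium, the time-averaged activation under fluctuating input greatly exceeds the activation at the average input, which is the essence of the stochastic amplification.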
This effectively reduces the value of the Hill coefficient. To explain the underlying mechanism, we show how short, high-amplitude calcium pulses associated with stochastic gating of calcium channels are much more effective at activating SK2 channels than the steady calcium signal produced by a deterministic simulation. This stochastic amplification results from two factors: first, a supralinear rise in the SK2 channel’s steady-state activation curve at low calcium levels and, second, a momentary reduction in the channel’s time constant during the calcium pulse, causing the channel to approach its steady-state activation value much faster than it decays. Stochastic amplification can potentially explain subthreshold SK2 activation in unified models of both sub- and suprathreshold regimes. Furthermore, we expect it to be a general phenomenon relevant to many proteins that are activated nonlinearly by stochastic ligand release. Abstract A tonic-clonic seizure transitions from high-frequency asynchronous activity to low-frequency coherent oscillations, yet the mechanism of transition remains unknown. We propose a shift in network synchrony due to changes in cellular response. Here we use phase-response curves (PRCs) from Morris-Lecar (ML) model neurons with synaptic depression and gradually decrease input current to cells within a network simulation. This method effectively decreases firing rates, resulting in a shift to greater network synchrony, illustrating a possible mechanism of the transition phenomenon. PRCs are measured from the ML conductance-based model cell with a range of input currents within the limit cycle. A large network of 3000 excitatory neurons is simulated with a network topology generated from second-order statistics which allows a range of population synchrony. The population synchrony of the oscillating cells is measured with the Kuramoto order parameter, which reveals a transition from tonic to clonic phase exhibited by our model network. 
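The Kuramoto order parameter used above to quantify population synchrony is simply the magnitude of the population's mean phase vector; a minimal computation:

```python
import numpy as np

def kuramoto_R(phases):
    """R = |(1/N) * sum_j exp(i*theta_j)|: R ~ 1 means synchrony, R ~ 0 asynchrony."""
    return np.abs(np.mean(np.exp(1j * np.asarray(phases))))

rng = np.random.default_rng(0)
# 3000 oscillators, matching the network size quoted above (phases are synthetic)
R_async = kuramoto_R(rng.uniform(0, 2 * np.pi, 3000))  # phases spread uniformly
R_sync = kuramoto_R(rng.normal(0.0, 0.1, 3000))        # phases tightly clustered
```

Tracking R over time in such a network makes the tonic-to-clonic shift visible as a rise from a low toward a high order-parameter value.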
The cellular response shift mechanism for the tonic-clonic seizure transition reproduces the population behavior closely when compared to EEG data. Abstract We have built a phenomenological spiking model of the cat early visual system comprising the retina, the Lateral Geniculate Nucleus (LGN) and V1’s layer 4, and established four main results: (1) When exposed to videos that reproduce with high fidelity what a cat experiences under natural conditions, adjacent Retinal Ganglion Cells (RGCs) have spike-time correlations at a short timescale (~30 ms), despite neuronal noise and possible jitter accumulation. (2) In accordance with recent experimental findings, the LGN filters out some noise. It thus increases the spike reliability and temporal precision, the sparsity, and, importantly, further decreases down to ~15 ms adjacent cells’ correlation timescale. (3) Downstream simple cells in V1’s layer 4, if equipped with Spike Timing-Dependent Plasticity (STDP), may detect these fine-scale cross-correlations, and thus connect principally to ON- and OFF-centre cells with Receptive Fields (RFs) aligned in the visual space, and thereby become orientation selective, in accordance with Hubel and Wiesel's (Journal of Physiology 160:106–154, 1962) classic model. Up to this point we dealt with continuous vision, and there was no absolute time reference such as a stimulus onset, yet information was encoded and decoded in the relative spike times. (4) We then simulated saccades to a static image and benchmarked relative spike-time coding and time-to-first-spike coding with respect to saccade landing in the context of orientation representation. In both the retina and the LGN, relative spike times are more precise, less affected by pre-landing history and global contrast than absolute ones, and lead to robust contrast-invariant orientation representations in V1. 
Abstract The activity patterns of the globus pallidus (GPe) and subthalamic nucleus (STN) are closely associated with motor function and dysfunction in the basal ganglia. In the pathological state caused by dopamine depletion, the STN–GPe network exhibits rhythmic synchronous activity accompanied by rebound bursts in the STN. Therefore, the mechanism of this activity transition is key to understanding basal ganglia function. As synchronization in GPe neurons could induce pathological STN rebound bursts, it is important to study how synchrony is generated in the GPe. To clarify this issue, we applied the phase-reduction technique to a conductance-based GPe neuronal model in order to derive the phase response curve (PRC) and interaction function between coupled GPe neurons. Using the PRC and interaction function, we studied how the steady-state activity of the GPe network depends on intrinsic membrane properties, varying ionic conductances on the membrane. We noted that changes in the persistent sodium current, fast delayed rectifier Kv3 potassium current, M-type potassium current and small-conductance calcium-dependent potassium (SK) current influenced the PRC shape and the steady state. The effect of these currents on the PRC shape could be attributed to extension of the firing period and reduction of the phase response immediately after an action potential. In particular, the slow potassium current arising from the M-type potassium and SK currents was responsible for the reduction of the phase response. These results suggest that membrane property modulation controls synchronization/asynchronization in the GPe and the pathological pattern of STN–GPe activity. Abstract The receptive fields of cells in the lateral geniculate nucleus (LGN) are shaped by their diverse set of impinging inputs: feedforward synaptic inputs stemming from the retina, and feedback inputs stemming from the visual cortex and the thalamic reticular nucleus. 
To probe the possible roles of these feedforward and feedback inputs in shaping the temporal receptive-field structure of LGN relay cells, we here present and investigate a minimal mechanistic firing-rate model tailored to elucidate their disparate features. The model for LGN relay ON cells includes feedforward excitation and inhibition (via interneurons) from retinal ON cells, and excitatory and inhibitory (via thalamic reticular nucleus cells and interneurons) feedback from cortical ON and OFF cells. From a general firing-rate model formulated in terms of Volterra integral equations, we derive a single delay differential equation with absolute delay governing the dynamics of the system. A freely available and easy-to-use GUI-based MATLAB version of this minimal mechanistic LGN circuit model is provided. We particularly investigate the LGN relay-cell impulse response and find, through thorough explorations of the model’s parameter space, that both purely feedforward models and feedback models with feedforward excitation only can account quantitatively for previously reported experimental results. We find, however, that the purely feedforward model predicts two impulse response measures, the time to first peak and the biphasic index (measuring the relative weight of the rebound phase), to be anticorrelated. In contrast, the models with feedback predict different correlations between these two measures. This suggests an experimental test assessing the relative importance of feedforward and feedback connections in shaping the impulse response of LGN relay cells. Integration of biochemical and electrical signaling-multiscale model of the medium spiny neuron of the striatum. PloS one Neuron behavior results from the interplay between networks of biochemical processes and electrical signaling. Synaptic plasticity is one of the neuronal properties emerging from such an interaction. 
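A delay differential equation of the kind derived for the LGN circuit above can be integrated numerically with a simple history buffer for the delayed feedback term; the rate equation and all parameters below are illustrative stand-ins, not the paper's model.

```python
import numpy as np

def simulate_dde(T=1.0, dt=1e-3, tau=0.01, delay=0.03, w_fb=-0.8,
                 stim=lambda t: 1.0 if t >= 0.1 else 0.0):
    """Euler integration of tau * dr/dt = -r + relu(stim(t) + w_fb * r(t - delay)).
    A circular buffer supplies the rate `delay` seconds in the past."""
    n_delay = int(round(delay / dt))
    buf = np.zeros(n_delay)  # history buffer, initialized to a silent past
    r, rs = 0.0, []
    for k in range(int(T / dt)):
        r_delayed = buf[k % n_delay]              # value written n_delay steps ago
        drive = stim(k * dt) + w_fb * r_delayed   # delayed inhibitory feedback
        r += dt * (-r + max(drive, 0.0)) / tau
        buf[k % n_delay] = r
        rs.append(r)
    return np.array(rs)

rs = simulate_dde()
```

After stimulus onset the delayed negative feedback produces a damped overshoot-and-rebound transient, qualitatively the kind of biphasic impulse-response shape the model analysis is concerned with, before settling at the fixed point r* = 1/(1 - w_fb).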
One of the current approaches to study plasticity is to model either its electrical aspects or its biochemical components. Among the chief reasons are the different time scales involved: electrical events happen in milliseconds, while biochemical cascades respond in minutes or hours. In order to create multiscale models that take both aspects into consideration simultaneously, one needs to synchronize the two models and exchange relevant variable values. We present a new event-driven algorithm to synchronize different neuronal models, which decreases computational time and avoids superfluous synchronizations. The algorithm is implemented in the TimeScales framework. We demonstrate its use by simulating a new multiscale model of the Medium Spiny Neuron of the Neostriatum. The model comprises over a thousand dendritic spines, where the electrical model interacts with the respective instances of a biochemical model. Our results show that a multiscale model is able to exhibit changes of synaptic plasticity as a result of the interaction between electrical and biochemical signaling. Our synchronization strategy is general enough to be used in simulations of other models with similar synchronization issues, such as networks of neurons. Moreover, the integration between the electrical and the biochemical models opens up the possibility to investigate multiscale processes, like synaptic plasticity, in a more global manner, while taking into account a more realistic description of the underlying mechanisms. Action Potentials;Algorithms;Computer Simulation;Corpus Striatum;Dendritic Spines;Electric Stimulation;Models, Neurological;Neuronal Plasticity;Neurons;Phosphorylation;Receptors, Glutamate;Signal Transduction;Synapses A machine learning method for the prediction of receptor activation in the simulation of synapses. 
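The event-driven idea can be sketched abstractly: rather than exchanging variables at every fixed interval, the fast (electrical) side notifies the slow (biochemical) side only when a shared quantity has changed by more than a tolerance. This toy coordinator illustrates the general strategy only; it is not the TimeScales implementation.

```python
import math

def run_coupled(T=1.0, dt_fast=1e-4, tol=0.05):
    """Fast model: a decaying calcium oscillation; slow model: an accumulator.
    Event-driven coupling: push calcium to the slow model only on significant change."""
    ca, last_pushed = 0.0, 0.0
    slow_state = 0.0
    n_events, n_steps = 0, int(T / dt_fast)
    for k in range(n_steps):
        t = k * dt_fast
        # fast electrical-side variable (synthetic 5 Hz decaying oscillation)
        ca = 0.5 * (1 + math.sin(2 * math.pi * 5 * t)) * math.exp(-t)
        if abs(ca - last_pushed) > tol:       # event: notify the slow model
            slow_state += ca - last_pushed    # slow model integrates pushed changes
            last_pushed = ca
            n_events += 1
    return n_events, n_steps

events, steps = run_coupled()
```

Lock-step coupling would synchronize once per fast time step; the event-driven scheme triggers orders of magnitude fewer exchanges while the slow side still tracks every significant change.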
PloS one Chemical synaptic transmission involves the release of a neurotransmitter that diffuses in the extracellular space and interacts with specific receptors located on the postsynaptic membrane. Computer simulation approaches provide fundamental tools for exploring various aspects of synaptic transmission under different conditions. In particular, Monte Carlo methods can track the stochastic movements of neurotransmitter molecules and their interactions with other discrete molecules, the receptors. However, these methods are computationally expensive, even when used with simplified models, preventing their use in large-scale and multi-scale simulations of complex neuronal systems that may involve large numbers of synaptic connections. We have developed a machine-learning-based method that can accurately predict relevant aspects of the behavior of synapses, such as the percentage of open synaptic receptors as a function of time since the release of the neurotransmitter, at considerably lower computational cost than the conventional Monte Carlo alternative. The method is designed to learn patterns and general principles from a corpus of previously generated Monte Carlo simulations of synapses covering a wide range of structural and functional characteristics. These patterns are later used as a predictive model of the behavior of synapses under different conditions without the need for additional computationally expensive Monte Carlo simulations. This is performed in five stages: data sampling, fold creation, machine learning, validation and curve fitting. The resulting procedure is accurate, automatic, and general enough to predict synapse behavior under experimental conditions different from the ones it has been trained on. 
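The surrogate idea, learning the mapping from synapse parameters to the open-receptor time course from a corpus of precomputed stochastic runs, can be caricatured with an ordinary regression. Everything below is synthetic and illustrative (a noisy exponential standing in for the Monte Carlo corpus); it is not the paper's five-stage pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)

def mc_like_open_fraction(rate, t):
    """Synthetic stand-in for a Monte Carlo result: noisy exponential decay of
    the fraction of open receptors after transmitter release."""
    return np.exp(-rate * t) + rng.normal(0, 0.01, t.shape)

t = np.linspace(0.0, 2.0, 50)
# "corpus": one noisy curve per unbinding rate (the tunable synapse parameter)
rates = np.array([0.5, 0.8, 1.1, 1.4, 1.7])
curves = np.array([mc_like_open_fraction(r, t) for r in rates])

# learn parameter -> decay constant: log-linear fit per curve, then a linear
# map from the synapse parameter to the fitted constant
fitted = np.array([-np.polyfit(t, np.log(np.clip(c, 1e-3, None)), 1)[0]
                   for c in curves])
a, b = np.polyfit(rates, fitted, 1)

# predict the full curve for an unseen parameter without a new stochastic run
r_new = 1.0
pred = np.exp(-(a * r_new + b) * t)
```

Once trained, evaluating the surrogate costs a single function evaluation per curve, which is what makes this style of approach attractive for simulating very large numbers of synapses.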
Since our method efficiently reproduces the results that can be obtained with Monte Carlo simulations at a considerably lower computational cost, it is suitable for the simulation of high numbers of synapses and it is therefore an excellent tool for multi-scale simulations. Aged;Aged, 80 and over;Alzheimer Disease;Brain Mapping;Case-Control Studies;Cell Physiological Phenomena;Female;Fluorodeoxyglucose F18;Follow-Up Studies;Glucose;Humans;Longitudinal Studies;Male;Middle Aged;Mild Cognitive Impairment;Neural Networks (Computer);Neuroimaging;Positron-Emission Tomography Entrez Gene: gene-centered information at NCBI. Nucleic acids research The capabilities and limitations of conductance-based compartmental neuron models with reduced branched or unbranched morphologies and active dendrites Journal of Computational Neuroscience Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on the knowledge of the threedimensional structure of the protein of interest. However it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific application. 
Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a nonredundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. Programming language: Perl/BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI BLAST package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. 
However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that often it is futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past 2 decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to create dynamically its own Web page of LinkOut links for each NCBI identifier in the NLB database. 
This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and functions. They make up tissues and our bodies. 
A single cell includes organelles and intracellular solutions, and it is separated from the outer environment of extracellular fluid surrounding the cell by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charges and concentrations cause, respectively, electrical and chemical potentials, driving the transport of materials across the membrane. Here we look at the core elements of mathematical modeling of the dynamic behaviors of single cells, as well as the bases of numerical simulation. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program with which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command line interface, based on Python, a powerful scripting language, for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e.
database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp. Abstract Science requires that results are reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the Cell Cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method and make available computer scripts which automate the process of citation entry. We use an open citation project PERL module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our PERL scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations.
In order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain (${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front-ends requires considerable bioinformatics skill. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges.
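The dendritic model-reduction abstract above mentions time-domain balanced truncation. As a generic illustration of that technique (a minimal dense-matrix sketch of the square-root algorithm for small linear systems, not the authors' neuron-specific implementation; the toy system below is invented for the example):

```python
import numpy as np

def lyap(A, Q):
    """Solve A X + X A^T + Q = 0 via Kronecker vectorisation (small n only)."""
    n = A.shape[0]
    M = np.kron(np.eye(n), A) + np.kron(A, np.eye(n))
    return np.linalg.solve(M, -Q.flatten(order="F")).reshape((n, n), order="F")

def balanced_truncation(A, B, C, k):
    """Reduce the stable system (A, B, C) to order k by the square-root method."""
    Wc = lyap(A, B @ B.T)                  # controllability Gramian
    Wo = lyap(A.T, C.T @ C)                # observability Gramian
    Lc = np.linalg.cholesky(Wc)
    Lo = np.linalg.cholesky(Wo)
    U, s, Vt = np.linalg.svd(Lo.T @ Lc)    # s = Hankel singular values
    T = Lc @ Vt.T[:, :k] / np.sqrt(s[:k])  # truncated balancing transform
    Ti = (U[:, :k] / np.sqrt(s[:k])).T @ Lo.T
    return Ti @ A @ T, Ti @ B, C @ T, s

# Toy 3-state system with one fast, weakly observable mode
A = np.diag([-1.0, -2.0, -30.0])
B = np.ones((3, 1))
C = np.array([[1.0, 1.0, 0.1]])
Ar, Br, Cr, hsv = balanced_truncation(A, B, C, 2)
dc_full = (-C @ np.linalg.solve(A, B)).item()   # DC gain of the full model
dc_red = (-Cr @ np.linalg.solve(Ar, Br)).item() # DC gain of the reduced model
```

The discarded Hankel singular values bound the approximation error, which is the property the paper exploits to keep the reduced model faithful at chosen output sites.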
While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells, positioned and synaptically connected in 3D, can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded in empirical evidence, explain (or fail to explain) existing empirical results, and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations.
We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp. 741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)].
Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu . Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and to build on them for further research. Its use can help the field of computational neuroscience enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABAA receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties.
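The paired-pulse protocol just described can be caricatured in a few lines: one leaky integrate-and-fire granule cell receives two stimulus pulses, and a feedforward inhibitory variable (standing in for the basket cell) suppresses the response to the second pulse at short intervals. All parameters below are invented for the sketch and are not taken from the dentate gyrus model:

```python
def simulate_ppi(interval_ms, dt=0.05):
    """Count granule-cell spikes in a toy paired-pulse protocol.

    A stimulus pulse gives the cell an EPSP; with a disynaptic delay the
    same pulse also drives a decaying feedforward inhibitory current
    (the basket-cell pathway).  Illustrative units and parameters only.
    """
    tau_m, tau_i = 10.0, 40.0       # membrane / inhibition time constants (ms)
    v_th = 1.0                      # spike threshold (arbitrary units)
    w_exc, w_inh = 1.3, 0.08        # stimulus EPSP jump, inhibition increment
    pulses = [10.0, 10.0 + interval_ms]
    t_end = pulses[-1] + 20.0
    v = g = 0.0
    n_spikes, next_pulse, t = 0, 0, 0.0
    while t <= t_end:
        if next_pulse < len(pulses) and t >= pulses[next_pulse]:
            v += w_exc              # monosynaptic excitation from the stimulus
            if v >= v_th:           # the cell fires before inhibition arrives
                n_spikes += 1
                v = 0.0
            g += w_inh              # disynaptic feedforward inhibition
            next_pulse += 1
        v += dt * (-v / tau_m - g)  # leaky integration with inhibitory current
        g += dt * (-g / tau_i)
        t += dt
    return n_spikes
```

With a short inter-pulse interval the residual inhibition blocks the second spike, while a long interval lets both pulses fire the cell, which is the PPI signature the network model analyzes.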
Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABAA reversal potential (EGABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide a high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). Such a form of the reconstruction files is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology.
Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and the code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic types (pyramidal cell dendrites and olivary axonal projection terminals), and labeling methods (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions coming to the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductances of their dendritic membrane. This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand.
This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for the synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single-neuron models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling.
Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS-US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive, biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network.
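For contrast with the recurrent-network account above, the conventional baseline it argues against, tabular TD learning over a delay-line ("tapped") representation of the CS-US interval, can be sketched in a few lines (a generic TD(0) learner, not the authors' model):

```python
def td_trace_conditioning(n_states=10, n_trials=500, alpha=0.1, gamma=0.98):
    """Tabular TD(0) on a delay-line state representation.

    State 0 is CS onset; the US (reward 1.0) arrives on leaving the last
    state.  Returns the learned values and the prediction errors (deltas)
    of a final non-learning probe trial.
    """
    V = [0.0] * (n_states + 1)        # V[n_states]: terminal post-US state
    def trial(learn):
        deltas = []
        for s in range(n_states):
            r = 1.0 if s == n_states - 1 else 0.0
            delta = r + gamma * V[s + 1] - V[s]   # TD prediction error
            if learn:
                V[s] += alpha * delta
            deltas.append(delta)
        return deltas
    for _ in range(n_trials):
        trial(learn=True)
    return V, trial(learn=False)

V, deltas = td_trace_conditioning()
# After learning, the prediction error at the US vanishes, while the value
# at CS onset approaches gamma**(n_states - 1); the dopamine-like burst has
# migrated from the US to the (unpredicted) CS onset.
```

The abstract's point is precisely that the delay-line states used here are biologically implausible for second-long intervals, which is what the LSTM-based representation replaces.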
The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. On mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via uniformly distributed excitatory synapses along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and function of effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization in general and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals decreasing with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials.
Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of output discharge patterns of different complexities, including stochastic ones. A relatively short apical dendritic subtree in layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in the processes of formation of the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the most well-known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta-modulated and space-modulated.
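The two ingredients of the phase-precession model's input stage, a theta-modulated inhomogeneous Poisson spike train and a conductance-synapse integrate-and-fire cell, can be sketched generically as follows (parameters are illustrative choices of ours, not the paper's):

```python
import math
import random

def poisson_train(rate_fn, r_max, t_end, rng):
    """Nonhomogeneous Poisson spike times (ms) by Lewis-Shedler thinning."""
    t, spikes = 0.0, []
    while True:
        t += rng.expovariate(r_max)          # candidate event at ceiling rate
        if t > t_end:
            return spikes
        if rng.random() < rate_fn(t) / r_max:
            spikes.append(t)                 # accept with probability rate/r_max

def lif_conductance(in_spikes, t_end, dt=0.1):
    """Leaky integrate-and-fire cell with one excitatory conductance synapse.
    Units: mV, ms, nF, uS (so uS * mV = nA and nA / nF = mV/ms)."""
    c_m, g_l, e_l = 1.0, 0.1, -65.0
    e_exc, tau_syn, w = 0.0, 5.0, 0.01
    v_th, v_reset = -50.0, -65.0
    v, g, out, i, t = e_l, 0.0, [], 0, 0.0
    while t < t_end:
        while i < len(in_spikes) and in_spikes[i] <= t:
            g += w                           # each input spike opens conductance
            i += 1
        v += dt * (-g_l * (v - e_l) - g * (v - e_exc)) / c_m
        g -= dt * g / tau_syn
        if v >= v_th:
            out.append(t)
            v = v_reset
        t += dt
    return out

def theta_rate(t):
    """8 Hz theta-modulated input rate (spikes/ms), peak 0.8."""
    return 0.8 * 0.5 * (1.0 + math.cos(2.0 * math.pi * 0.008 * t))

rng = random.Random(1)
ec_input = poisson_train(theta_rate, 0.8, 1000.0, rng)
pc_spikes = lif_conductance(ec_input, 1000.0)
```

With these settings the cell fires in bursts near the theta peaks, the raw material on which place-field-dependent drive then imposes precession.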
The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using nonhomogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of its involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed-rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in computational neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org.
In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed morphological data when generating those cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to incorporate known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators, such as the NEURON simulator of Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any newly available data can be included in NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionality to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation.
New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently of the means of running the model simulation. This allows not only the sharing of models but also the sharing of research tools. Other promising developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data requires both standard data formats and public repositories. Standardization and a public storage system are also important for modeling, since they make it possible to share models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks like the numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor.
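For an unbranched section, the single-processor Gaussian elimination that the splitting method above builds on reduces to the O(n) Thomas algorithm on the tridiagonal cable matrix (the branched, multi-processor bookkeeping of the paper is omitted in this sketch):

```python
def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system by Gaussian elimination without pivoting.

    a: sub-diagonal (a[0] unused), b: main diagonal, c: super-diagonal
    (c[-1] unused), d: right-hand side.  This is the direct solve used per
    time step for the cable equation on an unbranched neurite.
    """
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]       # eliminate the sub-diagonal entry
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):        # back-substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# 3-compartment example: matrix [[2,1,0],[1,2,1],[0,1,2]], solution [1,2,3]
x = thomas_solve([0.0, 1.0, 1.0], [2.0, 2.0, 2.0], [1.0, 1.0, 0.0],
                 [4.0, 8.0, 8.0])
```

Branch points add off-tridiagonal entries, and the paper's observation is that subtrees touching at most two such points can be eliminated independently, which is what makes the multi-processor split exact.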
It is often feasible to divide a 3D-reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balancing in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using rectangular multielectrode arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources, and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum.
The main advantages of the proposed method are the explicit specification of its assumptions, the possibility of including system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. However, the range of current strength needed for depolarization block could easily be reached with the random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to depolarization block. We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, impeding conventional analysis methods from keeping pace.
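The depolarization block discussed in the abstract above is already visible in the classic Hodgkin-Huxley squid model (a generic illustration; the paper's CA1 models use a different channel set): sweeping a sustained current first increases the firing rate and then silences the cell as sodium channels inactivate.

```python
import math

def vtrap(x, y):
    """x / (1 - exp(-x/y)) with the removable singularity at x = 0 handled."""
    return y if abs(x / y) < 1e-7 else x / (1.0 - math.exp(-x / y))

def hh_spike_count(i_amp, t_end=200.0, dt=0.01):
    """Euler-integrated Hodgkin-Huxley point neuron (standard squid-axon
    parameters); returns upward crossings of 0 mV under current i_amp
    in uA/cm^2."""
    g_na, g_k, g_l = 120.0, 36.0, 0.3
    e_na, e_k, e_l = 50.0, -77.0, -54.4
    v, m, h, n = -65.0, 0.053, 0.596, 0.318   # approximate resting state
    count, above, t = 0, False, 0.0
    while t < t_end:
        a_m = 0.1 * vtrap(v + 40.0, 10.0)
        b_m = 4.0 * math.exp(-(v + 65.0) / 18.0)
        a_h = 0.07 * math.exp(-(v + 65.0) / 20.0)
        b_h = 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
        a_n = 0.01 * vtrap(v + 55.0, 10.0)
        b_n = 0.125 * math.exp(-(v + 65.0) / 80.0)
        i_ion = (g_na * m ** 3 * h * (v - e_na)
                 + g_k * n ** 4 * (v - e_k) + g_l * (v - e_l))
        v += dt * (i_amp - i_ion)             # membrane capacitance 1 uF/cm^2
        m += dt * (a_m * (1.0 - m) - b_m * m)
        h += dt * (a_h * (1.0 - h) - b_h * h)
        n += dt * (a_n * (1.0 - n) - b_n * n)
        if v >= 0.0 and not above:
            count += 1                        # one spike per upward crossing
        above = v >= 0.0
        t += dt
    return count
```

At moderate drive the model fires tonically; well above the upper end of its repetitive-firing range it emits at most a brief transient and then sits at a depolarized fixed point, the block behavior the paper characterizes for CA1 cells.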
Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed and analyzed. This process is relatively simple and easy to learn because it takes place entirely in MATLAB, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable. PANDORA is open source and freely available to use and modify (http://software.incf.org/software/pandora/home). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then we examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop. Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics.
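The PANDORA workflow above (extract summary measures from raw traces, store them relationally, then analyze with queries) can be sketched with a plain SQL database. This is an illustration of the idea only, not PANDORA's MATLAB API; the table layout, cell names, and recordings are all hypothetical.

```python
import sqlite3

def firing_rate(spike_times, duration_s):
    # reduce a raw trace to one summary characteristic
    return len(spike_times) / duration_s

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE trials (cell TEXT, current_pA REAL, rate_hz REAL)")

# hypothetical recordings: (cell, injected current, spike times, duration s)
recordings = [
    ("c1", 50.0, [0.1, 0.4, 0.9], 1.0),
    ("c1", 100.0, [0.05, 0.2, 0.35, 0.5, 0.8], 1.0),
    ("c2", 100.0, [0.3, 0.7], 1.0),
]
for cell, i_pa, spikes, dur in recordings:
    db.execute("INSERT INTO trials VALUES (?, ?, ?)",
               (cell, i_pa, firing_rate(spikes, dur)))

# analysis then queries the extracted characteristics, not the raw voltages
rows = db.execute("SELECT cell, AVG(rate_hz) FROM trials "
                  "WHERE current_pA >= 100 GROUP BY cell "
                  "ORDER BY cell").fetchall()
print(rows)  # [('c1', 5.0), ('c2', 2.0)]
```

The storage saving comes from this reduction step: a one-hour trace collapses to a handful of numbers per trial, and downstream transformations become set operations over rows.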
First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test the sufficiency of multiple, independent spike initiation zones in the C-interneuron to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made to understand cerebellar cortical processing, but neural coding at the output stage of the cerebellum in the deep cerebellar nuclei (DCN) remains poorly understood. The DCN are unlikely to be a simple relay nucleus, because Purkinje cell inhibition has to be turned into an excitatory output signal, and DCN neurons exhibit complex intrinsic properties. In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals.
Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. We generated a good match of our simulations with DCN current-clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow deinactivation of rebound currents. Fast rebound bursts due to T-type calcium current and slow rebounds due to persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by HCN current. Our findings suggest that active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Many software packages have tried to resolve this mismatch in granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior.
This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multi-compartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation of channel gating, the mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform. Abstract No Abstract Available Abstract We have developed a simulation tool within the NEURON simulator to assist in organization, verification, and analysis of simulations. This tool, named the Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single-cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB. Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems.
This is similar to how single cells function as systems of protein networks, where, for example, ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from how it behaves alone. The dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. These interactions arise through different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts to produce heartbeats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics, representing neural information. In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of the limbs can be performed. This chapter illustrates several mathematical techniques through examples from the modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data. Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation.
For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice that of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits as expected in mature animals. Predictions were made for passive-cell current-clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron, and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting becomes unnecessary. Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron.
Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials. Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet, the precise input-output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites, and the entorhinal perforant pathway on distal apical dendrites. We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature.
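The fixed-kernel representation behind the CuBIC adaptation above can be sketched numerically. This is a minimal illustration in our own notation, not the paper's code: the subthreshold membrane potential is modeled as the presynaptic population spike count convolved with an exponential (leaky-integrator) kernel; a common Poisson component injects correlation across the hypothetical population, and the rates, time constant, and population size are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
dt, tau = 0.001, 0.01          # 1 ms bins, assumed 10 ms membrane kernel
n_bins = 5000

# hypothetical presynaptic population: independent Poisson activity plus a
# shared component that introduces correlation among the inputs
common = rng.poisson(2.0, n_bins)
independent = rng.poisson(20.0, n_bins)
pop_count = independent + common       # population spike count per bin

t = np.arange(0, 5 * tau, dt)
kernel = np.exp(-t / tau)              # fixed leaky-integrator kernel
v = np.convolve(pop_count, kernel)[:n_bins] * dt  # "membrane potential"

# the cumulants of the filtered trace carry the input correlation structure
print(round(float(v.mean()), 3), round(float(v.var()), 5))
```

CuBIC itself then compares cumulants of this filtered signal against the largest values compatible with correlations up to a given order, rejecting orders until the observed statistics are explained.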
To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns and the corresponding firing frequency input-output function was logarithmic. The regular/irregular nature of the input synaptic intervals was only reflected in the regularity of output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input-output relationships. Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input-output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats.
In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database (http://www.ebi.ac.uk/biomodels/) is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems, and to study the clustering of models based upon their annotations. Several publishers of scientific journals now advise depositing models in the database. The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge (https://sourceforge.net/projects/biomodels/) under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive?
Schema theory is a computational framework that allows the simulation of perceptuomotor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We first present its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performance of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world-knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performance of agrammatic aphasics measured using sentence-picture matching tasks. This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Abstract Background The "inverse" problem is related to the determination of unknown causes on the basis of the observation of their effects. This is the opposite of the corresponding "direct" problem, which relates to the prediction of the effects generated by a complete description of some agencies.
The solution of an inverse problem entails the construction of a mathematical model and starts from a set of experimental data. In this respect, inverse problems are often ill-conditioned, as the experimental data available are often insufficient to unambiguously solve the mathematical model. Several approaches to solving inverse problems are possible, both computational and experimental, some of which are mentioned in this article. In this work, we describe in detail an attempt to solve an inverse problem which arose in the study of an intracellular signaling pathway. Results Using a genetic algorithm to find a suboptimal solution to the optimization problem, we have estimated a set of unknown parameters describing a kinetic model of a signaling pathway in the neuronal cell. The model is composed of mass action ordinary differential equations, where the kinetic parameters describe protein-protein interactions, protein synthesis and degradation. The algorithm has been implemented on a parallel platform. Several potential solutions of the problem have been computed, each solution being a set of model parameters. A subset of parameters has been selected on the basis of their small coefficient of variation across the ensemble of solutions. Conclusion Despite the lack of sufficiently reliable and homogeneous experimental data, the genetic algorithm approach has allowed us to estimate the approximate values of a number of model parameters in a kinetic model of a signaling pathway; these parameters have been assessed to be relevant for the reproduction of the available experimental data. Abstract Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown.
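The genetic-algorithm parameter estimation described above can be sketched on a deliberately tiny stand-in problem. This is a hedged toy, not the paper's signaling pathway model: a single mass-action degradation rate k in dx/dt = -k*x is estimated by evolving a population of candidate parameter values against synthetic data; the population size, mutation scale, and generation count are arbitrary choices.

```python
import random

random.seed(0)

def simulate(k, x0=1.0, dt=0.1, steps=50):
    # forward-Euler integration of the mass-action ODE dx/dt = -k*x
    xs, x = [], x0
    for _ in range(steps):
        x += -k * x * dt
        xs.append(x)
    return xs

k_true = 0.7
data = simulate(k_true)            # synthetic "experimental" data

def fitness(k):
    # negative sum of squared errors: higher is better
    return -sum((m - d) ** 2 for m, d in zip(simulate(k), data))

pop = [random.uniform(0.0, 2.0) for _ in range(40)]
for _ in range(60):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]             # elitist selection of the fittest
    # offspring are mutated (Gaussian-perturbed) copies of the parents
    pop = parents + [random.choice(parents) + random.gauss(0, 0.05)
                     for _ in range(30)]

best = max(pop, key=fitness)
print(round(best, 2))              # close to the true rate 0.7
```

The article's point about ill-conditioning maps directly onto this sketch: running the GA from different seeds yields an ensemble of solutions, and only parameters with a small coefficient of variation across that ensemble are considered well determined.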
Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms. First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axo-dendritic feedback on pyramidal cells is only effective in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase, the theta/gamma ratio. Moreover, electric fields differentially affect average theta frequency depending on specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. Abstract The brain is extraordinarily complex, containing 10^11 neurons linked by 10^14 connections. We can improve our understanding of individual neurons and neuronal networks by describing their behavior in mathematical and computational models.
This chapter provides an introduction to neural modeling, laying the foundation for several basic models and surveying key topics. After some discussion of the motivations of modelers and the uses of neural models, we explore the properties of electrically excitable membranes. We describe in some detail the Hodgkin–Huxley model, the first neural model to describe biophysically the behavior of biological membranes. We explore how this model can be extended to describe a variety of excitable membrane behaviors, including axonal propagation, dendritic processing, and synaptic communication. This chapter also covers mathematical models that replicate basic neural behaviors through more abstract mechanisms. We briefly explore efforts to extend single-neuron models to the network level and provide several examples of insights gained through this process. Finally, we list common resources, including modeling environments and repositories, that provide the guidance and parameter sets necessary to begin building neural models. Abstract We have developed a program, NeuroText, to populate the neuroscience databases in SenseLab (http://senselab.med.yale.edu/senselab) by mining the natural language text of neuroscience articles. NeuroText uses a two-step approach to identify relevant articles. The first step (preprocessing), aimed at 100% sensitivity, identifies abstracts containing database keywords. In the second step, potentially relevant abstracts identified in the first step are processed for specificity dictated by database architecture, and neuroscience, lexical and semantic contexts. NeuroText results were presented to the experts for validation using a dynamically generated interface that also allows expert-validated articles to be automatically deposited into the databases. Of the test set of 912 articles, 735 were rejected at the preprocessing step. For the remaining articles, the accuracy of predicting database-relevant articles was 85%.
Twenty-two articles were erroneously identified. NeuroText deferred decisions on 29 articles to the expert. A comparison of NeuroText results versus the experts’ analyses revealed that the program failed to correctly identify articles’ relevance due to concepts that did not yet exist in the knowledge base or due to vaguely presented information in the abstracts. NeuroText uses two “evolution” techniques (supervised and unsupervised) that play an important role in the continual improvement of the retrieval results. Software that uses the NeuroText approach can facilitate the creation of curated, special-interest bibliography databases. Abstract Dendrites play an important role in neuronal function and connectivity. This chapter introduces the first section of the book, focusing on the morphological features of dendritic tree structures and the role of dendritic trees in the circuit. We provide an overview of quantitative procedures for data collection, analysis, and modeling of dendrite shape. Our main focus lies on the description of morphological complexity and how one can use this description to unravel neuronal function in dendritic trees and neural circuits. Abstract The chapter is organised in two parts: In the first part, the focus is on a combined power spectral and nonlinear behavioural analysis of a neural mass model of the thalamocortical circuitry. The objective is to study the effectiveness of such “multimodal” analytical techniques in model-based studies investigating the neural correlates of abnormal brain oscillations in Alzheimer’s disease (AD). The power spectral analysis presented here is a study of the “slowing” (decreasing dominant frequency of oscillation) within the alpha frequency band (8–13 Hz), a hallmark of electroencephalogram (EEG) dynamics in AD. Analysis of the nonlinear dynamical behaviour focuses on the bifurcating property of the model.
The results show that the alpha rhythmic content is maximal at close proximity to the bifurcation point—an observation made possible by the “multimodal” approach adopted herein. Furthermore, a slowing in alpha rhythm is observed for increasing inhibitory connectivity—a consistent feature of our research into neuropathological oscillations associated with AD. In the second part, we have presented power spectral analysis on a model that implements multiple feedforward and feedback connectivities in the thalamocorticothalamic circuitry, and is thus more advanced in terms of biological plausibility. This study looks at the effects of synaptic connectivity variation on the power spectra within the delta (1–3 Hz), theta (4–7 Hz), alpha (8–13 Hz) and beta (14–30 Hz) bands. An overall slowing of EEG with decreasing synaptic connectivity is observed, indicated by a decrease of power within alpha and beta bands and increase in power within the theta and delta bands. Thus, the model behaviour conforms to longitudinal studies in AD indicating an overall slowing of EEG. Abstract Neuronal processes grow under a variety of constraints, both immediate and evolutionary. Their pattern of growth provides insight into their function. This chapter begins by reviewing morphological metrics used in analyses and computational models. Molecular mechanisms underlying growth and plasticity are then discussed, followed by several types of modeling approaches. Computer simulation of morphology can be used to describe and reproduce the statistics of neuronal types or to evaluate growth and functional hypotheses. For instance, models in which branching is probabilistically determined by diameter produce realistic virtual dendrites of most neuronal types, though more complicated statistical models are required for other types. Virtual dendrites grown under environmental and/or functional constraints are also discussed, offering a broad perspective on dendritic morphology. 
Abstract Chopper neurons in the cochlear nucleus are characterized by intrinsic oscillations with short average interspike intervals (ISIs) and relative level independence of their response (Pfeiffer, Exp Brain Res 1:220–235, 1966; Blackburn and Sachs, J Neurophysiol 62:1303–1329, 1989), properties which are unattained by models of single chopper neurons (e.g., Rothman and Manis, J Neurophysiol 89:3070–3082, 2003a). In order to achieve short ISIs, we optimized the time constants of the Rothman and Manis single-neuron model with genetic algorithms. Some parameters in the optimization, such as the temperature and the capacitance of the cell, turned out to be crucial for the required acceleration of the response. In order to achieve the relative level independence, we simulated an interconnected network consisting of Rothman and Manis neurons. The results indicate that by stabilization of intrinsic oscillations, it is possible to simulate the physiologically observed level independence of ISIs. As previously reviewed and demonstrated (Bahmer and Langner, Biol Cybern 95:371–379, 2006a), chopper neurons show a preference for ISIs which are multiples of 0.4 ms. It was also demonstrated that a network consisting of two optimized Rothman and Manis neurons which activate each other with synaptic delays of 0.4 ms shows a preference for ISIs of 0.8 ms. Oscillations with various multiples of 0.4 ms as ISIs may be derived from neurons in a more complex network that is activated by simultaneous input from an onset neuron and several auditory nerve fibers. Abstract Recently, a class of two-dimensional integrate-and-fire models has been used to faithfully model spiking neurons. This class includes the Izhikevich model, the adaptive exponential integrate-and-fire model, and the quartic integrate-and-fire model. The bifurcation types for the individual neurons have been thoroughly analyzed by Touboul (SIAM J Appl Math 68(4):1045–1079, 2008).
However, when the models are coupled together to form networks, the networks can display bifurcations that an uncoupled oscillator cannot. For example, the networks can transition from firing with a constant rate to burst firing. This paper introduces a technique to reduce a full network of this class of neurons to a mean field model, in the form of a system of switching ordinary differential equations. The reduction uses population density methods and a quasi-steady-state approximation to arrive at the mean field system. Reduced models are derived for networks with different topologies and different model neurons with biologically derived parameters. The mean field equations are able to qualitatively and quantitatively describe the bifurcations that the full networks display. Extensions and higher order approximations are discussed. Conclusions Our proposed database schema for managing heterogeneous data is a significant departure from conventional approaches. It is suitable only when the following conditions hold: • The number of classes of entity is numerous, while the number of actual instances in most classes is expected to be very modest. • The number (and nature) of the axes describing an arbitrary fact (as an N-ary association) varies greatly. We believe that nervous system data is an appropriate problem domain to test such an approach. Abstract Stereotactic human brain atlases, either in print or electronic form, are useful not only in functional neurosurgery, but also in neuroradiology, human brain mapping, and neuroscience education. The existing atlases represent structures on 2D plates taken at variable, often large intervals, which limits their applications. To overcome this problem, we propose a hybrid interpolation approach to build high-resolution brain atlases from the existing ones. In this approach, all section regions of each object are grouped into two types of components: simple and complex.
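A schema meeting the two conditions above (many entity classes, few instances each, and facts with varying axes) is commonly handled with an entity-attribute-value (EAV) layout. The sketch below is our illustration of that general pattern under those assumptions, not the authors' actual schema; every entity, class, and attribute name is hypothetical.

```python
import sqlite3

db = sqlite3.connect(":memory:")
# one generic table: each row is one fact about one entity, so adding a new
# entity class or a new descriptive axis requires no schema change
db.execute("""CREATE TABLE facts (
    entity TEXT, entity_class TEXT, attribute TEXT, value TEXT)""")

facts = [
    ("cell42", "PyramidalCell", "layer", "5"),
    ("cell42", "PyramidalCell", "resting_potential_mV", "-65"),
    ("syn7", "Synapse", "transmitter", "glutamate"),
    ("syn7", "Synapse", "presynaptic_cell", "cell42"),  # axis other classes lack
]
db.executemany("INSERT INTO facts VALUES (?, ?, ?, ?)", facts)

rows = db.execute("SELECT attribute, value FROM facts "
                  "WHERE entity = 'cell42' ORDER BY attribute").fetchall()
print(rows)  # [('layer', '5'), ('resting_potential_mV', '-65')]
```

The trade-off is the conventional one: this layout stays flexible for sparse, heterogeneous data but pushes type checking and joins that a class-per-table schema would give for free into the application or query layer.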
A NURBS-based method is designed for interpolation of the simple components, and a distance-map-based method for the complex components. Once all individual objects in the atlas are interpolated, the results are combined hierarchically in a bottom-up manner to produce the interpolation of the entire atlas. In the procedure, different knowledge-based and heuristic strategies are used to preserve various topological relationships. The proposed approach has been validated quantitatively and used for interpolation of two stereotactic brain atlases: the Talairach-Tournoux atlas and the Schaltenbrand-Wahren atlas. The interpolations produced are of high resolution and feature high accuracy, 3D consistency, smooth surfaces, and preserved topology. They potentially open new applications for electronic stereotactic brain atlases, such as atlas reformatting, accurate 3D display, and 3D nonlinear warping against normal and pathological scans. The proposed approach is also potentially useful in other applications that require interpolation and 3D modeling from sparse data and/or data with variable inter-section intervals. An example of 3D modeling of an infarct from MR diffusion images is presented. Abstract Quantitative neuroanatomical data are important for the study of many areas of neuroscience, and the complexity of problems associated with neuronal structure requires that research from multiple groups across many disciplines be combined. However, existing neuron-tracing systems, simulation environments, and tools for the visualization and analysis of neuronal morphology data use a variety of data formats, making it difficult to exchange data in a readily usable way. The NeuroML project was initiated to address these issues, and here we describe an extensible markup language standard, MorphML, which defines a common data format for neuronal morphology data and associated metadata to facilitate data and model exchange, database creation, model publication, and data archiving.
We describe the elements of the standard in detail and outline the mappings between this format and those used by a number of popular applications for reconstruction, simulation, and visualization of neuronal morphology. Abstract A major part of biology has become a physical and mathematical science. We have started to feel, though still with some caution, that it will become possible to predict biological events that will happen in one's life and to control some of them if so desired, based upon an understanding of the genomic information of individuals and the physical and chemical principles governing the physiological functions of living organisms at multiple scales and levels, from molecules to cells and organs. Abstract A half-center oscillator (HCO) is a common circuit building block of central pattern generator networks that produce rhythmic motor patterns in animals. Here we constructed an efficient relational database table with the resulting characteristics of the simple conductance-based HCO model of Hill et al. (J Comput Neurosci 10:281–302, 2001). The model consists of two reciprocally inhibitory neurons and replicates the electrical activity of the oscillator interneurons of the leech heartbeat central pattern generator under a variety of experimental conditions. Our long-range goal is to understand how this basic circuit building block produces functional activity under a variety of parameter regimes and how different parameter regimes influence stability and modulatability. By using the latest developments in computer technology, we simulated and stored large amounts of data (on the order of terabytes). We systematically explored the parameter space of the HCO and corresponding isolated neuron models using a brute-force approach. We varied a set of selected parameters (maximal conductances of intrinsic and synaptic currents) in all combinations, resulting in about 10 million simulations.
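A brute-force sweep of this kind, with results stored in a relational table for later querying, can be sketched in a few lines. The parameter names, grid values, and the classification rule below are purely illustrative stand-ins, not those of the actual study:

```python
import itertools
import sqlite3

# Hypothetical parameter grid: maximal conductances of a few intrinsic and
# synaptic currents (illustrative names and values, not the study's grid).
grid = {
    "g_h": [0.0, 2.0, 4.0],
    "g_p": [0.0, 3.5, 7.0],
    "g_syn": [0.0, 30.0, 60.0],
}

def classify(params):
    # Placeholder for the real simulation plus activity classification;
    # here a combination is tagged simply by whether synapses are present.
    return "hco" if params["g_syn"] > 0 else "isolated"

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE runs (g_h REAL, g_p REAL, g_syn REAL, activity TEXT)")

names = list(grid)
for values in itertools.product(*(grid[n] for n in names)):
    params = dict(zip(names, values))
    db.execute("INSERT INTO runs VALUES (?, ?, ?, ?)",
               (*values, classify(params)))

# Querying the table mirrors the kind of database queries described above.
n_hco, = db.execute("SELECT COUNT(*) FROM runs WHERE activity = 'hco'").fetchone()
print(n_hco)  # 18 of the 27 combinations have g_syn > 0
```

The same pattern scales to millions of rows; only the grid, the simulation call inside `classify`, and the table schema change.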
We classified these HCO and isolated neuron model simulations by their activity characteristics into identifiable groups and quantified their prevalence. By querying the database, we compared the activity characteristics of the identified groups of our simulated HCO models with those of our simulated isolated neuron models and found that regularly bursting neurons compose only a small minority of functional HCO models; the vast majority was composed of spiking neurons. Abstract This paper describes how an emerging standard neural network modelling language can be used to configure a general-purpose neural multichip system, by describing the process of writing and loading neural network models on the SpiNNaker neuromimetic hardware. It focuses on the implementation of a SpiNNaker module for PyNN, a simulator-independent language for neural network modelling. We successfully extend PyNN to deal with different non-standard (e.g., Izhikevich) cell types, rapidly switch between them, and load applications on parallel hardware by orchestrating the software layers below it, so that they are abstracted away from the final user. Finally, we run some simulations in PyNN and compare them against other simulators, successfully reproducing single neuron and network dynamics and validating the implementation. Abstract The present study examines the biophysical properties and functional implications of Ih in hippocampal area CA3 interneurons with somata in strata radiatum and lacunosum-moleculare. Characterization studies showed a small maximum h-conductance (2.6 ± 0.3 nS, n = 11), shallow voltage dependence with a hyperpolarized half-maximal activation (V1/2 = −91 mV), and kinetics characterized by double-exponential functions. The functional consequences of Ih were examined with regard to temporal summation and impedance measurements. For temporal summation experiments, 5-pulse mossy fiber input trains were activated.
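As a toy illustration of temporal summation in such pulse-train experiments, the response can be modelled as a linear sum of exponentially decaying EPSPs; a slower effective membrane decay (as expected when a resting conductance like Ih is removed) then yields greater summation. All values here are illustrative, not fits to the recordings:

```python
import math

def train_peak(tau_ms, isi_ms=10.0, n_pulses=5, amp=1.0):
    """Peak depolarization at the time of the last pulse of a train of
    identical EPSPs, each decaying exponentially with time constant tau_ms.
    A deliberately simplified linear model, not a fit to the experiments."""
    return sum(amp * math.exp(-(n_pulses - 1 - k) * isi_ms / tau_ms)
               for k in range(n_pulses))

fast = train_peak(tau_ms=15.0)   # with a resting conductance: faster decay
slow = train_peak(tau_ms=40.0)   # conductance blocked: slower decay
print(fast < slow)  # True: slower decay means more temporal summation
```

The comparison only shows the qualitative direction of the effect; quantitative summation in real interneurons also depends on active conductances and dendritic filtering.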
Blocking Ih with 50 μM ZD7288 resulted in an increase in temporal summation, suggesting that Ih supports sensitivity of response amplitude to relative input timing. Impedance was assessed by applying sinusoidal current commands. From impedance measurements, we found that Ih did not confer theta-band resonance, but instead flattened the impedance–frequency relations. Double immunolabeling for hyperpolarization-activated cyclic nucleotide-gated (HCN) channel proteins and glutamate decarboxylase 67 suggests that all four subunits are present in GABAergic interneurons from the strata considered for the electrophysiological studies. Finally, a model of Ih was employed in computational analyses to confirm and elaborate upon the contributions of Ih to impedance and temporal summation. Abstract Modelling and simulation methods are gaining importance for the understanding of biological systems. The growing number of available computational models makes community support for the maintenance and retrieval of those models essential. This article discusses which model information is helpful for efficient retrieval and how existing similarity measures and ranking techniques can be used to enhance the retrieval process, i.e., model reuse. With the development of new tools and modelling formalisms, there is also an increasing demand for performing searches independent of the models' encoding. Therefore, the presented approach is not restricted to certain model storage formats. Instead, the model meta-information is used for retrieval and ranking of the search results. Meta-information includes general information about the model and its encoded species and reactions, but also information about the model behaviour and related simulation experiment descriptions. Abstract To understand the details of brain function, a large-scale system model that reflects anatomical and neurophysiological characteristics needs to be implemented.
Although numerous computational models of different brain areas have been proposed, their integration into a large-scale model has not yet been accomplished, because these models were described in different programming languages and, above all, used different data formats. This paper introduces a platform for collaborative brain system modeling (PLATO) on which one can construct computational models using several programming languages and connect them at the I/O level with a common data format. As an example, a whole visual system model including eye movement, eye optics, the retinal network, and visual cortex is being developed. Preliminary results demonstrate that the integrated model successfully simulates the signal processing flow at the different stages of the visual system. Abstract Brain rhythms are the most prominent signal measured noninvasively in humans with magneto-/electroencephalography (MEG/EEG). MEG/EEG-measured rhythms have been shown to be functionally relevant, and signature changes are used as markers of disease states. Despite the importance of understanding the underlying neural mechanisms creating these rhythms, relatively little is known about their in vivo origin in humans. There are obvious challenges in linking the extracranially measured signals directly to neural activity with invasive studies in humans, and although animal models are well suited for such studies, the connection to human brain function under cognitively relevant tasks is often lacking. Biophysically principled computational neural modeling provides an attractive means to bridge this critical gap. Here, we describe a method for creating a computational neural model capturing the laminar structure of cortical columns, and how this model can be used to make predictions on the cellular- and circuit-level mechanisms of brain oscillations measured with MEG/EEG.
Specifically, we describe how the model can be used to simulate current dipole activity, the common macroscopic signal inferred from MEG/EEG data. We detail the development and application of the model to study the spontaneous somatosensory mu rhythm, containing mu-alpha (7–14 Hz) and mu-beta (15–29 Hz) components. We describe a novel prediction of the neural origin of the mu rhythm that accurately reproduces many characteristic features of MEG data and accounts for changes in the rhythm with attention, detection, and healthy aging. While the details of the model are specific to the somatosensory system, the model design and application are based on general principles of cortical circuitry and MEG/EEG physics, and are thus amenable to the study of rhythms in other frequency bands and sensory systems. Abstract GABAergic interneurons in cortical circuits control the activation of principal cells and orchestrate network activity patterns, including oscillations at different frequency ranges. Recruitment of interneurons depends on the integration of convergent synaptic inputs along the dendrosomatic axis; however, dendritic processing in these cells is still poorly understood. In this chapter, we summarise our results on the cable properties, electrotonic structure and dendritic processing of "basket cells" (BCs; Nörenberg et al. 2010), one of the most prevalent types of cortical interneurons mediating perisomatic inhibition. In order to investigate integrative properties, we performed two-electrode whole-cell patch-clamp recordings, visualised and reconstructed the recorded interneurons, and created passive single-cell models with biophysical properties derived from the experiments. Our results indicate that membrane properties, in particular membrane resistivity, are inhomogeneous along the somatodendritic axis of the cell. The derived values and gradient of membrane resistivity are different from those obtained for excitatory principal cells.
The divergent passive membrane properties of BCs facilitate rapid signalling from proximal basal dendritic inputs but at the same time increase synapse-to-soma transfer for slow signals from the distal apical dendrites. Our results demonstrate that BCs possess distinct integrative properties. Future computational models investigating the diverse functions of neuronal circuits need to consider this diversity and incorporate realistic dendritic properties not only of excitatory principal cells but also of the various types of inhibitory interneurons. Abstract New surgical and localization techniques allow for precise and personalized evaluation and treatment of intractable epilepsies. These techniques include the use of subdural and depth electrodes for localization, and the potential use of cell-targeted stimulation using optogenetics as part of treatment. Computer modeling of seizures, also individualized to the patient, will be important in order to make full use of the potential of these new techniques. This is because epilepsy is a complex dynamical disease involving multiple scales across both time and space. These complex dynamics make prediction extremely difficult. Cause and effect are not cleanly separable, as multiple embedded causal loops allow for many scales of unintended consequence. We demonstrate here a small model of sensory neocortex which can be used to look at the effects of microablations or microstimulation. We show that ablations in this network can either prevent spread or prevent occurrence of the seizure. In this example, focal electrical stimulation was not able to terminate a seizure, but selective stimulation of inhibitory cells, a future possibility through the use of optogenetics, was efficacious. Abstract The basal ganglia nuclei form a complex network of nuclei often assumed to perform selection, yet their individual roles and how they influence each other are still largely unclear.
In particular, the ties between the external and internal parts of the globus pallidus are paradoxical, as anatomical data suggest a potent inhibitory projection between them while electrophysiological recordings indicate that they have similar activities. Here we introduce a theoretical study that reconciles both views of the intrapallidal projection, by providing a plausible characterization of the relationship between the external and internal globus pallidus. Specifically, we developed a mean-field model of the whole basal ganglia, whose parameterization is optimized to best respect a collection of numerous anatomical and electrophysiological data. We first obtained models respecting all our constraints, hence the anatomical and electrophysiological data on the intrapallidal projection are globally consistent. The model furthermore predicts that both aforementioned views of the intrapallidal projection may be reconciled when this projection is weakly inhibitory, making it possible to support similar neural activity in both nuclei and for the entire basal ganglia to select between actions. Second, it predicts that afferent projections are substantially unbalanced towards the external segment, as it receives the strongest excitation from the STN and the weakest inhibition from the striatum. Finally, our study strongly suggests that the intrapallidal connection pattern is not focused but diffuse, as the latter pattern is more efficient for the overall selection performed in the basal ganglia. Abstract Background The information coming from biomedical ontologies and computational pathway models is expanding continuously: research communities keep this process up, and their advances are generally shared by means of dedicated resources published on the web. In fact, such models are shared to provide the characterization of molecular processes, while biomedical ontologies provide a semantic context for the majority of those pathways.
Recent advances in both fields pave the way for a scalable information integration based on aggregate knowledge repositories, but the lack of overall standard formats impedes this progress. Indeed, having different objectives and different abstraction levels, most of these resources "speak" different languages. Semantic web technologies are explored here as a means to address some of these problems. Methods Employing an extensible collection of interpreters, we developed OREMP (Ontology Reasoning Engine for Molecular Pathways), a system that abstracts the information from different resources and combines them together into a coherent ontology. Continuing this effort, we present OREMPdb; once different pathways are fed into OREMP, species are linked to the referenced external ontologies and to the reactions in which they participate. Exploiting these links, the system builds species-sets, which encapsulate species that operate together. Composing all of the reactions together, the system computes all of the reaction paths from and to all of the species-sets. Results OREMP has been applied to the curated branch of BioModels (2011/04/15 release), which overall contains 326 models, 9244 reactions, and 5636 species. OREMPdb is the semantic dictionary created as a result, which is made of 7360 species-sets. For each of these sets, OREMPdb links to the original pathway and to the original paper where this information first appeared. Abstract Conductance-based neuron models are frequently employed to study the dynamics of biological neural networks. For speed and ease of use, these models are often reduced in morphological complexity. Simplified dendritic branching structures may process inputs differently than full branching structures, however, and could thereby fail to reproduce important aspects of biological neural processing. It is not yet well understood which processing capabilities require detailed branching structures.
Therefore, we analyzed the processing capabilities of full or partially branched reduced models. These models were created by collapsing the dendritic tree of a full morphological model of a globus pallidus (GP) neuron while preserving its total surface area and electrotonic length, as well as its passive and active parameters. Dendritic trees were either collapsed into single cables (unbranched models) or the full complement of branch points was preserved (branched models). Both reduction strategies allowed us to compare dynamics between all models using the same channel density settings. Full model responses to somatic inputs were generally preserved by both types of reduced model while dendritic input responses could be more closely preserved by branched than unbranched reduced models. However, features strongly influenced by local dendritic input resistance, such as active dendritic sodium spike generation and propagation, could not be accurately reproduced by any reduced model. Based on our analyses, we suggest that there are intrinsic differences in processing capabilities between unbranched and branched models. We also indicate suitable applications for different levels of reduction, including fast searches of full model parameter space. Rapid desynchronization of an electrically coupled interneuron network with sparse excitatory synaptic input. Neuron Electrical synapses between interneurons contribute to synchronized firing and network oscillations in the brain. However, little is known about how such networks respond to excitatory synaptic input. To investigate this, we studied electrically coupled Golgi cells (GoC) in the cerebellar input layer. We show with immunohistochemistry, electron microscopy, and electrophysiology that Connexin-36 is necessary for functional gap junctions (GJs) between GoC dendrites. In the absence of coincident synaptic input, GoCs synchronize their firing. 
In contrast, sparse, coincident mossy fiber input triggered a mixture of excitation and inhibition of GoC firing and spike desynchronization. Inhibition is caused by propagation of the spike afterhyperpolarization through GJs. This triggers network desynchronization because heterogeneous coupling to surrounding cells causes spike-phase dispersion. Detailed network models predict that desynchronization is robust, local, and dependent on synaptic input properties. Our results show that GJ coupling can be inhibitory and either promote network synchronization or trigger rapid network desynchronization depending on the synaptic input. Abnormal Excitability of Oblique Dendrites Implicated in Early Alzheimer's: A Computational Study. Frontiers in neural circuits The integrative properties of cortical pyramidal dendrites are essential to the neural basis of cognitive function, but the impact of amyloid beta protein (Aβ) on these properties in early Alzheimer's is poorly understood. In animal models, electrophysiological studies of proximal dendrites have shown that Aβ induces hyperexcitability by blocking A-type K+ currents (I(A)), disrupting signal integration. The present study uses a computational approach to analyze the hyperexcitability induced in distal dendrites beyond the experimental recording sites. The results show that back-propagating action potentials in the dendrites induce hyperexcitability and excessive calcium concentrations not only in the main apical trunk of pyramidal cell dendrites, but also in their oblique dendrites. Evidence is provided that these thin branches are particularly sensitive to local reductions in I(A).
The results suggest the hypothesis that the oblique branches may be most vulnerable to disruptions of I(A) by early exposure to Aβ, and point the way to further experimental analysis of these actions as factors in the neural basis of the early decline of cognitive function in Alzheimer's. A computer model of unitary responses from associational/commissural and perforant path synapses in hippocampal CA3 pyramidal cells. Journal of Computational Neuroscience Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy.
The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a nonredundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php. Operating system(s): Platform independent. Programming language: Perl/BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI BLAST package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have encountered problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience.
We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that often it is futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past 2 decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to create dynamically its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. 
As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages, or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the "NeuroIT Interoperability of Simulators" workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell contains organelles and intracellular solutions, and it is separated from the extracellular liquid surrounding it by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charges and concentrations give rise, respectively, to electrical and chemical potentials, which drive the transport of materials across the membrane. Here we look at the core mathematical models of the dynamic behavior of single cells, as well as the basics of numerical simulation.
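The link between transmembrane concentration differences and electrical potential described above is captured by the Nernst equation. A minimal sketch, using illustrative textbook concentrations rather than values from any particular model:

```python
import math

def nernst_mV(conc_out_mM, conc_in_mM, z=1, temp_C=37.0):
    """Equilibrium (Nernst) potential across a membrane, in mV:
    E = (RT/zF) * ln([ion]_out / [ion]_in)."""
    R, F = 8.314, 96485.0          # J/(mol*K), C/mol
    T = temp_C + 273.15            # convert to kelvin
    return 1000.0 * (R * T / (z * F)) * math.log(conc_out_mM / conc_in_mM)

# Typical mammalian concentrations (illustrative textbook values).
E_K = nernst_mV(5.0, 140.0)       # potassium: roughly -89 mV
E_Na = nernst_mV(145.0, 12.0)     # sodium: roughly +67 mV
print(round(E_K), round(E_Na))
```

These single-ion equilibrium potentials are the building blocks that conductance-based single-cell models combine, weighted by the open ionic conductances, to obtain the membrane potential dynamics.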
Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program with which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command line interface, based on Python, a powerful scripting language, for manipulating large-scale models. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp. Abstract Science requires that results be reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information.
In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases have started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results. In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method, and make available computer scripts, which automate the process of citation entry. We use an Open Citation Project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron's ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. For, in order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension, if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation.
The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (balanced truncation) and frequency-domain (${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance. We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front ends demands considerable bioinformatic skill. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing these challenges. While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by numerous selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians.
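As a minimal illustration of the kind of IR technique the chapter surveys, the sketch below ranks toy database-entry summaries against a keyword query using TF-IDF weighting. The corpus, query, scoring details, and function name are all invented for illustration; they are not taken from the chapter.

```python
import math
from collections import Counter

def tfidf_rank(corpus, query):
    """Rank documents by summed TF-IDF weight of the query terms.

    corpus: dict mapping doc_id -> text; query: whitespace-separated terms.
    Uses smoothed idf = log((1 + N) / (1 + df)) + 1 and length-normalized tf.
    """
    docs = {d: t.lower().split() for d, t in corpus.items()}
    n = len(docs)
    df = Counter()                      # document frequency per term
    for toks in docs.values():
        df.update(set(toks))

    def idf(term):
        return math.log((1 + n) / (1 + df[term])) + 1.0

    scores = {}
    for d, toks in docs.items():
        tf = Counter(toks)
        scores[d] = sum((tf[w] / len(toks)) * idf(w) for w in query.lower().split())
    return sorted(scores, key=scores.get, reverse=True)

# toy corpus of database-entry summaries (invented)
corpus = {
    "A": "neuron model database entry",
    "B": "protein structure repository",
    "C": "neuron morphology database",
}
print(tfidf_rank(corpus, "neuron database"))  # → ['C', 'A', 'B']
```

Document C ranks first because both query terms make up a larger fraction of its (shorter) text; B contains neither term and falls to the bottom. Production IR systems layer stemming, stop-word removal, and cosine normalization on top of this core idea.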
Abstract NeuroML is a language based on XML for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells, positioned and synaptically connected in 3D, can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field. We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded in empirical evidence, explain (or fail to explain) existing empirical results, and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of the model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data.
We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience. Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to representing and integrating complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp. 741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks.
SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu. Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and build on them for further research. Its use can help the field of computational neuroscience enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA A reversal potential (E GABA ) and by alterations in the intrinsic excitability of granule cells.
In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity. Ideally, such interfaces should provide high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and the code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux).
We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic types (pyramidal cell dendrites and olivary axonal projection terminals), and labeling methods (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions coming to the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductivities of their dendritic membrane. This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high- and low-depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs.
In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli. These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single-neuron models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models.
Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives. One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS–US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive, biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training.
Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. On mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses uniformly distributed along the dendrites. For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and function of effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization as a whole and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals that decreased with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two patterns of activity, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5.
Under analogous conditions of activation, we observed in the latter cells a variety of patterns of output discharges of different complexities, including stochastic ones. The relatively short length of the apical dendritic subtree of layer 2/3 neurons and, correspondingly, the smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets. As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promote the predominance of network interaction in the processes of formation of the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta-modulated and space-modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using nonhomogeneous Poisson processes.
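The basic ingredients of the model class just described (an integrate-and-fire neuron with a conductance synapse driven by a theta-modulated, nonhomogeneous Poisson input) can be sketched in a few lines. This is a minimal, hypothetical illustration: every parameter value, rate function, and name below is invented for the example and is not taken from the paper.

```python
import math
import random

def simulate_lif(t_max=1.0, dt=1e-4, seed=0):
    """Leaky integrate-and-fire neuron with one excitatory conductance
    synapse driven by a theta-modulated (nonhomogeneous) Poisson input.
    All parameter values are illustrative only."""
    rng = random.Random(seed)
    # membrane parameters (illustrative): time constant, rest, threshold, reset (s, V)
    tau_m, v_rest, v_thresh, v_reset = 0.02, -70e-3, -54e-3, -70e-3
    # synapse: excitatory reversal (V), decay constant (s), weight relative to leak
    e_syn, tau_syn, g_peak = 0.0, 5e-3, 8.0
    # input rate in Hz, modulated at an 8 Hz "theta" frequency
    rate = lambda t: 200.0 * (1.0 + math.sin(2 * math.pi * 8.0 * t))
    v, g, spikes = v_rest, 0.0, []
    for i in range(int(t_max / dt)):
        t = i * dt
        if rng.random() < rate(t) * dt:      # Poisson input spike in this time bin
            g += g_peak
        g -= dt * g / tau_syn                # exponential synaptic decay
        # membrane equation: dv/dt = ((v_rest - v) + g*(e_syn - v)) / tau_m
        v += dt * ((v_rest - v) + g * (e_syn - v)) / tau_m
        if v >= v_thresh:                    # threshold crossing: emit spike, reset
            spikes.append(t)
            v = v_reset
    return spikes
```

With these made-up numbers the output spikes cluster near the peaks of the 8 Hz input-rate modulation; the paper's full model couples a PC/IC pair with such dynamics to produce precession, which this sketch does not attempt.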
Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with a similar architecture that is subject to a clocking rhythm, independently of any involvement in spatial tasks. Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed-rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in computational neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org.
In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed morphological data when generating those cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to incorporate known, published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators such as the NEURON simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any newly available data can be included in NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks. Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation.
New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently of the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising developments during the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from genomics, transcriptomics, proteomics, and metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems. However, sharing such large amounts of frequently heterogeneous and distributed experimental data requires both standard data formats and public repositories. Standardization and a public storage system are also important for modeling, since they make it possible to share models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks such as numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor.
It is often feasible to divide a 3D-reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balancing in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available. However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources, and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum.
The main advantages of the proposed method are the explicit specification of its assumptions, the possibility of including system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models. However, the range of current strength needed for a depolarization block could easily be reached with random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block. We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, preventing conventional analysis methods from keeping pace.
Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. The toolbox can acquire data from common recording and simulation platforms and exchange data with external database engines and other analysis toolboxes, which makes analysis simpler and highly interoperable. PANDORA is open source and freely available to be used and modified (http://software.incf.org/software/pandora/home). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop. Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics.
First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental computational model was developed to test whether multiple, independent spike initiation zones in the C-interneuron are sufficient to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground. Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made into understanding cerebellar cortical processing, but neural coding at the output stage of the cerebellum, in the deep cerebellar nuclei (DCN), remains poorly understood. The DCN are unlikely to be just a relay nucleus, because Purkinje cell inhibition has to be turned into an excitatory output signal and DCN neurons exhibit complex intrinsic properties. In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals.
Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. We obtained a good match of our simulations with DCN current-clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow deinactivation of rebound currents. Fast rebound bursts due to the T-type calcium current and slow rebounds due to the persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by the HCN current. Our findings suggest that active properties of DCN neurons could play a crucial role in signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior.
This open source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation channel gating, the mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform. Abstract No Abstract Available Abstract We have developed a simulation tool within the NEURON simulator to assist in organization, verification, and analysis of simulations. This tool, denominated Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB. Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems.
This is similar to how single cells function as systems of protein networks, where, for example, ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from what it does alone. The dynamics of a single cell affect those of others and vice versa; that is, cells interact with each other. Interactions are made by different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically, because every cell contracts to produce heartbeats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics, representing neural information. In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of limbs can be performed. This chapter illustrates several mathematical techniques through examples from modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data. Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation.
For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors, where perforant path synapses have conductances twice that of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits as expected in mature animals. Predictions were made for passive-cell current clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. The layer-oriented approach to declarative languages for biological modeling. PLoS computational biology We present a new approach to modeling languages for computational biology, which we call the layer-oriented approach. The approach stems from the observation that many diverse biological phenomena are described using a small set of mathematical formalisms (e.g. differential equations), while at the same time different domains and subdomains of computational biology require that models are structured according to the accepted terminology and classification of that domain. Our approach uses distinct semantic layers to represent the domain-specific biological concepts and the underlying mathematical formalisms.
Additional functionality can be transparently added to the language by adding more layers. This approach is specifically concerned with declarative languages, and throughout the paper we note some of the limitations inherent to declarative approaches. The layer-oriented approach is a way to specify explicitly how high-level biological modeling concepts are mapped to a computational representation, while abstracting away details of particular programming languages and simulation environments. To illustrate this process, we define an example language for describing models of ionic currents, and use a general mathematical notation for semantic transformations to show how to generate model simulation code for various simulation environments. We use the example language to describe a Purkinje neuron model and demonstrate how the layer-oriented approach can be used for solving several practical issues of computational neuroscience model development. We discuss the advantages and limitations of the approach in comparison with other modeling language efforts in the domain of computational biology and outline some principles for extensible, flexible modeling language design. We conclude by describing in detail the semantic transformations defined for our language. Animals;Computational Biology;Computer Simulation;Humans;Models, Biological;Natural Language Processing;Programming Languages;Terminology as Topic ModelDB: an environment for running and storing computational models and their results applied to neuroscience. Journal of the American Medical Informatics Association : JAMIA Research groups within the Human Brain Project are developing technologies to help organize and make accessible the vast quantities of information being accumulated in the neurosciences. The goal of this work is to provide systems that enable this complex information from many diverse sources to be synthesized into a coherent theory of nervous system function. 
Our initial approach to this problem has been to create several small databases. While addressing the issues of each individual database, we are also considering how each might be incorporated into an integrated cluster of databases. In this paper, we describe a pilot project in which we construct a database of computational models of neuronal function. This database allows models to be created and run and their results reviewed through a World Wide Web interface. Because models encapsulate knowledge in a formal manner about how neuronal systems function, we also discuss how this database forms a natural center for our initial attempts at creating a cluster of related databases. General issues of database development in the context of the Web are also discussed. Brain;Computer Communication Networks;Computer Simulation;Databases, Factual;Humans;Interinstitutional Relations;Models, Biological;Neurosciences;Pilot Projects;Research;Systems Integration;User-Computer Interface A systematic comparison of the MetaCyc and KEGG pathway databases. BMC bioinformatics Toward the prediction of class I and II mouse major histocompatibility complex-peptide-binding affinity: in silico bioinformatic step-by-step guide using quantitative structure-activity relationships. Methods in molecular biology (Clifton, N.J.) The contrasting properties of conservation and correlated phylogeny in protein functional residue prediction. BMC bioinformatics A resource for benchmarking the usefulness of protein structure models. BMC bioinformatics BACKGROUND: Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on the knowledge of the three-dimensional structure of the protein of interest. However it is often the case that an experimental structure is not available and that models of different quality are used instead. 
On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. RESULTS: This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess which is the specific threshold of accuracy required to perform the task effectively. CONCLUSIONS: The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Pre-computed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. IMPLEMENTATION, AVAILABILITY AND REQUIREMENTS: Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODEL-DB/MODEL-DB_web/MODindex.php. Operating system(s): Platform independent. Programming language: Perl-BioPerl (program); mySQL, Perl DBI and DBD modules (database); php, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI Blast package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Benchmarking;Databases, Protein;Models, Molecular;Protein Conformation;Software PaxDb, a database of protein abundance averages across all three domains of life. Molecular & cellular proteomics : MCP Alternative metrics for noise exposure among construction workers. The Annals of occupational hygiene Impact of Transcription Units rearrangement on the evolution of the regulatory network of gamma-proteobacteria.
BMC genomics Quantitative Ultrasound and bone's response to exercise: a meta analysis. Bone Prediction of the intelligibility for speech in real-life background noises for subjects with normal hearing. Ear and hearing Complementing computationally predicted regulatory sites in Tractor_DB using a pattern matching approach. In silico biology Making health technology assessment information available for decision making: the development of a Thai database. Journal of the Medical Association of Thailand = Chotmaihet thangphaet SenseLab: new developments in disseminating neuroscience information. Briefings in bioinformatics This article presents the latest developments in neuroscience information dissemination through the SenseLab suite of databases: NeuronDB, CellPropDB, ORDB, OdorDB, OdorMapDB, ModelDB and BrainPharm. These databases include information related to: (i) neuronal membrane properties and neuronal models, and (ii) genetics, genomics, proteomics and imaging studies of the olfactory system. We describe here: the new features for each database, the evolution of SenseLab's unifying database architecture and instances of SenseLab database interoperation with other neuroscience online resources. Databases, Factual;Humans;Information Dissemination;Information Storage and Retrieval;Internet;Neurosciences;Software;Systems Integration Cytoplasmic electric fields and electroosmosis: possible solution for the paradoxes of the intracellular transport of biomolecules. PloS one The objective of the paper is to show that electroosmotic flow might play an important role in the intracellular transport of biomolecules. The paper presents two mathematical models describing the role of electroosmosis in the transport of the negatively charged messenger proteins to the negatively charged nucleus and in the recovery of the fluorescence after photobleaching. The parameters of the models were derived from an extensive review of the literature data.
Computer simulations were performed within the COMSOL 4.2a software environment. The first model demonstrated that the presence of electroosmosis might intensify the flux of messenger proteins to the nucleus and allow the efficient transport of the negatively charged phosphorylated messenger proteins against the electrostatic repulsion of the negatively charged nucleus. The second model revealed that the presence of the electroosmotic flow made the time of fluorescence recovery dependent on the position of the bleaching spot relative to the cellular membrane. The magnitude of the electroosmotic flow effect was shown to be quite substantial, i.e. increasing the flux of the messengers onto the nucleus up to 4-fold relative to pure diffusion and resulting in up to a 3-fold change in the values of fluorescence recovery time, and therefore the apparent diffusion coefficient determined from fluorescence recovery after photobleaching experiments. Based on the results of the modeling and on the universal nature of the electroosmotic flow, the potential wider implications of electroosmotic flow in the intracellular and extracellular biological processes are discussed. Both models are available for download at ModelDB. Animals;Biological Transport;Cytoplasm;Electroosmosis;Humans;Models, Theoretical Reassembly and interfacing neural models registered on biological model databases. Genome informatics. International Conference on Genome Informatics The importance of modeling and simulation of biological processes is growing for further understanding of living systems at all scales, from molecular to cellular, organic, and individuals. In the field of neuroscience, there are so-called platform simulators, the de facto standard neural simulators. More than a hundred neural models are registered on the model database. These models are executable in corresponding simulation environments. However, the usability of the registered models is not sufficient.
In order to make use of a model, users have to identify the input, output, and internal state variables and parameters of the model. The roles and units of each variable and parameter are not explicitly defined in the model files; they are suggested only implicitly in the papers where the simulation results are demonstrated. In this study, we propose a novel method of reassembling and interfacing models registered on biological model databases. The method was applied to the neural models registered on one of the typical biological model databases, ModelDB. The results are described in detail with the hippocampal pyramidal neuron model. The model is executable in the NEURON simulator environment, which demonstrates that somatic EPSP amplitude is independent of synapse location. Input and output parameters and variables were identified successfully, and the results of the simulation were recorded in an organized form with annotations. Computational Biology;Computer Simulation;Databases, Factual;Hippocampus;Models, Neurological Semi-automated population of an online database of neuronal models (ModelDB) with citation information, using PubMed for validation. Neuroinformatics Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method and make available computer scripts which automate the process of citation entry. We use an Open Citation project Perl module (PARSER) for parsing citation data, which is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article.
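The parse-then-validate citation workflow just described can be illustrated with a rough Python analogue. The authors' actual tools are Perl scripts built on the PARSER module; the regular expression, field names, and sample citation below are simplified assumptions for illustration, and only the well-documented NCBI E-utilities efetch URL format is taken as given.

```python
import re
from urllib.parse import urlencode

# Very rough heuristic for "Authors. Title. Journal Year;Vol:Pages" citations;
# a real parser (like PARSER) handles many more edge cases.
CITATION_RE = re.compile(
    r"^(?P<authors>[^.]+)\.\s+(?P<title>[^.]+)\.\s+"
    r"(?P<journal>.+?)\s+(?P<year>\d{4})"
)

def parse_citation(raw):
    """Split a plain-text citation into named fields, or None on failure."""
    m = CITATION_RE.match(raw)
    return m.groupdict() if m else None

def efetch_url(pmid):
    """Build an NCBI E-utilities efetch URL to retrieve a PubMed record,
    whose fields could then be compared against the parsed citation."""
    base = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/efetch.fcgi"
    return base + "?" + urlencode({"db": "pubmed", "id": pmid, "retmode": "xml"})

# Hypothetical citation string, for demonstration only.
fields = parse_citation("Smith J. A sample title. J Neurosci 2004;24:1-10")
```

A validation pass would fetch the PubMed record for a candidate PMID and accept the citation only if title and year agree with the parsed fields.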
Databases, Bibliographic;Models, Neurological;Online Systems;Periodicals as Topic;PubMed;Publishing The NIF LinkOut Broker: A Web Resource to Facilitate Federated Data Integration using NCBI Identifiers Neuroinformatics Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that often it is futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies.
Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past 2 decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information's (NCBI's) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. A spatially extended model for macroscopic spike-wave discharges Journal of Computational Neuroscience Summary This chapter constitutes mini-proceedings of the Workshop on Physiology Databases and Analysis Software that was a part of the Annual Computational Neuroscience Meeting CNS*2007 that took place in July 2007 in Toronto, Canada (http://www.cnsorg.org).
The main aim of the workshop was to bring together researchers interested in developing and using automated analysis tools and database systems for electrophysiological data. Selected topics discussed, including a review of some current and potential applications of Computational Intelligence (CI) in electrophysiology; database and electrophysiological data exchange platforms, languages, and formats; and exemplary analysis problems, are presented in this chapter. The authors hope that the chapter will be useful not only to those already involved in the field of electrophysiology, but also to CI researchers, whose interest will be sparked by its contents. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. We seek to study these dynamics with respect to the following compartments: neurons, glia, and extracellular space. We are particularly interested in the slower timescale dynamics that determine overall excitability, and set the stage for transient episodes of persistent oscillations, working memory, or seizures. In this second of two companion papers, we present an ionic current network model composed of populations of Hodgkin–Huxley type excitatory and inhibitory neurons embedded within extracellular space and glia, in order to investigate the role of microenvironmental ionic dynamics on the stability of persistent activity. We show that these networks reproduce seizure-like activity if glial cells fail to maintain the proper microenvironmental conditions surrounding neurons, and produce several experimentally testable predictions.
Our work suggests that the stability of persistent states to perturbation is set by glial activity, and that how the response to such perturbations decays or grows may be a critical factor in a variety of disparate transient phenomena such as working memory, burst firing in the neonatal brain or spinal cord, up states, seizures, and cortical oscillations. Abstract The spatial variation of the extracellular action potentials (EAP) of a single neuron contains information about the size and location of the dominant current source of its action potential generator, which is typically in the vicinity of the soma. Using this dependence in reverse in a three-component realistic probe + brain + source model, we solved the inverse problem of characterizing the equivalent current source of an isolated neuron from the EAP data sampled by an extracellular probe at multiple independent recording locations. We used a dipole for the model source because there is extensive evidence it accurately captures the spatial roll-off of the EAP amplitude, and because, as we show, dipole localization, beyond a minimum cell-probe distance, is a more accurate alternative to approaches based on monopole source models. Dipole characterization is separable into a linear dipole moment optimization where the dipole location is fixed, and a second, nonlinear, global optimization of the source location. We solved the linear optimization on a discrete grid via the lead fields of the probe, which can be calculated for any realistic probe + brain model by the finite element method. The global source location was optimized by means of Tikhonov regularization that jointly minimizes model error and dipole size. The particular strategy chosen reflects the fact that the dipole model is used in the near field, in contrast to the typical prior applications of dipole models to EKG and EEG source analysis.
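The linear step of the dipole characterization just described, fitting a dipole moment at a fixed location with Tikhonov regularization, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the lead-field rows, regularization weight lam, and function names are assumptions, and the finite-element lead-field computation and the nonlinear search over source locations are omitted.

```python
def dipole_moment(lead, v, lam=1e-3):
    """Tikhonov-regularized least squares for a fixed-location dipole:
    minimize ||L p - v||^2 + lam * ||p||^2, where L is the n x 3 lead-field
    matrix and v the n measured potentials. Solved via the 3x3 normal
    equations (L^T L + lam I) p = L^T v using Cramer's rule."""
    # Normal-equation matrix A = L^T L and right-hand side b = L^T v.
    A = [[sum(r[i] * r[j] for r in lead) for j in range(3)] for i in range(3)]
    b = [sum(r[i] * vi for r, vi in zip(lead, v)) for i in range(3)]
    for i in range(3):
        A[i][i] += lam  # ridge term penalizing dipole size

    def det3(M):
        return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
                - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
                + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

    d = det3(A)
    p = []
    for k in range(3):  # Cramer's rule: replace column k with b
        Ak = [row[:] for row in A]
        for i in range(3):
            Ak[i][k] = b[i]
        p.append(det3(Ak) / d)
    return p
```

In the full method this fit would be repeated on a grid of candidate source locations, keeping the location whose regularized residual is smallest.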
We applied dipole localization to data collected with stepped tetrodes whose detailed geometry was measured via scanning electron microscopy. The optimal dipole could account for 96% of the power in the spatial variation of the EAP amplitude. Among various model error contributions to the residual, we address especially the error in probe geometry, and the extent to which it biases estimates of dipole parameters. This dipole characterization method can be applied to any recording technique that has the capabilities of taking multiple independent measurements of the same single units. Abstract In these companion papers, we study how the interrelated dynamics of sodium and potassium affect the excitability of neurons, the occurrence of seizures, and the stability of persistent states of activity. In this first paper, we construct a mathematical model consisting of a single conductance-based neuron together with intra- and extracellular ion concentration dynamics. We formulate a reduction of this model that permits a detailed bifurcation analysis, and show that the reduced model is a reasonable approximation of the full model. We find that competition between intrinsic neuronal currents, sodium–potassium pumps, glia, and diffusion can produce very slow and large-amplitude oscillations in ion concentrations similar to what is seen physiologically in seizures. Using the reduced model, we identify the dynamical mechanisms that give rise to these phenomena. These models reveal several experimentally testable predictions. Our work emphasizes the critical role of ion concentration homeostasis in the proper functioning of neurons, and points to important fundamental processes that may underlie pathological states such as epilepsy.
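The slow extracellular-potassium balance these companion papers analyze can be sketched in a deliberately minimal form: glial uptake and diffusion to a bath reservoir remove potassium, while neuronal release adds it. The sigmoidal uptake term and every parameter value below are illustrative stand-ins, not the published model, and the neuron itself is reduced to a constant release rate.

```python
import math

def simulate_ko(ko0=8.0, k_bath=4.0, source=0.0, g_glia=6.6, eps=1.2,
                dt=1e-3, t_end=50.0):
    """Euler-integrate a toy extracellular potassium ([K]o) balance:
      d[K]o/dt = source - glial_uptake([K]o) - eps * ([K]o - k_bath)
    'source' stands in for net neuronal K+ release; the sigmoidal glial
    uptake and all constants are illustrative assumptions."""
    ko = ko0
    for _ in range(int(t_end / dt)):
        glia = g_glia / (1.0 + math.exp((18.0 - ko) / 2.5))  # glial buffering
        diff = eps * (ko - k_bath)                           # diffusion to bath
        ko += dt * (source - glia - diff)
    return ko
```

Even this caricature reproduces the qualitative point of the abstract: with no release the concentration relaxes back toward the bath value, while sustained release or weakened uptake shifts the steady state upward.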
Abstract This paper introduces dyadic brain modeling, the simultaneous, computational modeling of the brains of two interacting agents, to explore ways in which our understanding of macaque brain circuitry can ground new models of brain mechanisms involved in ape interaction. Specifically, we assess a range of data on gestural communication of great apes as the basis for developing an account of the interactions of two primates engaged in ontogenetic ritualization, a proposed learning mechanism through which a functional action may become a communicative gesture over repeated interactions between two individuals (the ‘dyad’). The integration of behavioral, neural, and computational data in dyadic (or, more generally, social) brain modeling has broad application to comparative and evolutionary questions, particularly for the evolutionary origins of cognition and language in the human lineage. We relate this work to the neuroinformatics challenges of integrating and sharing data to support collaboration between primatologists, neuroscientists and modelers that will help speed the emergence of what may be called comparative neuroprimatology. Abstract The phase response curve (PRC) reflects the dynamics of the interplay between diverse intrinsic conductances that lead to spike generation. PRCs measure the spike time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. A purely positive PRC is a signature of type I (saddle-node) dynamics, while type II (subcritical Hopf) dynamics yield a biphasic PRC with both negative and positive lobes. Previous computational work hypothesized that cholinergic modulation of the M-type potassium current can switch a neuron with type II dynamics to type I dynamics.
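The PRC measurement itself can be illustrated with a deliberately simple stand-in model. The sketch below uses a leaky integrate-and-fire neuron, which is an assumption for illustration only (the study above uses recorded pyramidal neurons and conductance-based models); its purely positive, monotonically growing PRC exemplifies the type I signature just described.

```python
import math

def lif_period(i_ext=1.5, tau=10.0, v_th=1.0):
    """Analytic interspike interval of a leaky integrate-and-fire neuron
    dV/dt = (-V + i_ext) / tau, reset to 0 at threshold v_th."""
    return tau * math.log(i_ext / (i_ext - v_th))

def lif_prc(phase, kick=0.05, i_ext=1.5, tau=10.0, v_th=1.0):
    """Spike advance (as a fraction of the period) caused by a small
    depolarizing voltage kick delivered at phase in [0, 1)."""
    T = lif_period(i_ext, tau, v_th)
    t = phase * T
    v = i_ext * (1.0 - math.exp(-t / tau)) + kick  # perturbed voltage
    if v >= v_th:
        return (T - t) / T                          # kick triggers a spike now
    # Remaining time to threshold from voltage v, from the analytic trajectory.
    t_rem = tau * math.log((i_ext - v) / (i_ext - v_th))
    return (T - t - t_rem) / T
```

Sampling lif_prc over phases in [0, 1) traces out a PRC that is everywhere positive and grows toward the end of the cycle; a type II cell would instead show a negative lobe at early phases.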
We recorded from layer 2/3 pyramidal neurons in cortical slices, and found that cholinergic action, consistent with downregulation of slow voltage-dependent potassium currents such as the M-current, indeed changed the PRC from type II to type I. We then explored the potential specific K-current-dependent mechanisms for this switch using a series of computational models. In all of these models, we show that a decrease in spike-frequency adaptation due to downregulation of the M-current is associated with the switch in PRC type. Interestingly, the spike-dependent I_AHP is downregulated at lower ACh concentrations than the M-current. Our simulations showed that the type II nature of the PRC is amplified by low ACh levels, while the PRC became type I at high ACh concentrations. We further explored the spatial aspects of ACh modulation in a compartmental model. This work suggests that cholinergic modulation of slow potassium currents may shape neuronal responding from “resonator” to “integrator.” Abstract Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double precision values by each subtree at each time step. Splitting cells is useful in attaining load balance in neural network simulations, especially when there is a wide range of cell sizes and the number of cells is about the same as the number of processors. For compute-bound simulations load balance results in almost ideal runtime scaling. Application of the cell splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing. Abstract Cardiac fibroblasts are involved in the maintenance of myocardial tissue structure. However, little is known about ion currents in human cardiac fibroblasts.
It has been recently reported that cardiac fibroblasts can interact electrically with cardiomyocytes through gap junctions. Ca2+-activated K+ currents (IK(Ca)) of cultured human cardiac fibroblasts were characterized in this study. In whole-cell configuration, depolarizing pulses evoked outwardly rectifying IK(Ca) in these cells, the amplitude of which was suppressed by paxilline (1 μM) or iberiotoxin (200 nM). A large-conductance, Ca2+-activated K+ (BKCa) channel with a single-channel conductance of 162 ± 8 pS was also observed in human cardiac fibroblasts. Western blot analysis revealed the presence of the α-subunit of BKCa channels. The dynamic Luo-Rudy model was applied to predict cell behavior during direct electrical coupling of cardiomyocytes and cardiac fibroblasts. In the simulation, electrically coupled cardiac fibroblasts also exhibited action potentials; however, they were electrically inert when there was no gap-junctional coupling. The simulation predicts that changes in gap-junction coupling conductance can influence the configuration of the cardiac action potential and cardiomyocyte excitability. IK(Ca) can be elicited by simulated action potential waveforms of cardiac fibroblasts when they are electrically coupled to cardiomyocytes. This study demonstrates that a BKCa channel is functionally expressed in human cardiac fibroblasts. The activity of these BKCa channels present in human cardiac fibroblasts may contribute to the functional activities of heart cells through the transfer of electrical signals between these two cell types. Abstract The large number of variables involved in many biophysical models can conceal potentially simple dynamical mechanisms governing the properties of their solutions and the transitions between them as parameters are varied. To address this issue, we extend a novel model reduction method, based on “scales of dominance,” to multicompartment models.
We use this method to systematically reduce the dimension of a two-compartment conductance-based model of a crustacean pyloric dilator (PD) neuron that exhibits distinct modes of oscillation: tonic spiking, intermediate bursting and strong bursting. We divide trajectories into intervals dominated by a smaller number of variables, resulting in a locally reduced hybrid model whose dimension varies between two and six in different temporal regimes. The reduced model exhibits the same modes of oscillation as the 16-dimensional model over a comparable parameter range, and requires fewer ad hoc simplifications than a more traditional reduction to a single, globally valid model. The hybrid model highlights low-dimensional organizing structure in the dynamics of the PD neuron, and the dependence of its oscillations on parameters such as the maximal conductances of calcium currents. Our technique could be used to build hybrid low-dimensional models from any large multicompartment conductance-based model in order to analyze the interactions between different modes of activity. Abstract Background Contrast enhancement within primary stimulus representations is a common feature of sensory systems that regulates the discrimination of similar stimuli. Whereas most sensory stimulus features can be mapped onto one or two dimensions of quality or location (e.g., frequency or retinotopy), the analogous similarities among odor stimuli are distributed high-dimensionally, necessarily yielding a chemotopically fragmented map upon the surface of the olfactory bulb. While olfactory contrast enhancement has been attributed to decremental lateral inhibitory processes among olfactory bulb projection neurons modeled after those in the retina, the two-dimensional topology of this mechanism is intrinsically incapable of mediating effective contrast enhancement on such fragmented maps. Consequently, current theories are unable to explain the existence of olfactory contrast enhancement.
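A contrast-enhancement mechanism that does not depend on the spatial arrangement of glomeruli can be caricatured in a few lines. The sketch below is an illustrative feedforward approximation of broad feedback inhibition (not the paper's biophysical bulb model): each unit keeps its own glomerular drive minus an inhibition term proportional to total activity, then rectifies. The weight and the example patterns are invented for illustration.

```python
import numpy as np

def ntce_sketch(glom, w_inh=0.2):
    """Each unit receives its own glomerular drive minus broad
    inhibition proportional to total bulb activity, then rectifies.
    Note the computation never references a unit's position."""
    return np.maximum(glom - w_inh * glom.sum(), 0.0)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

g1 = np.array([1.0, 0.9, 0.3, 0.2, 0.1])    # two similar odor patterns
g2 = np.array([0.9, 1.0, 0.35, 0.25, 0.15])
m1, m2 = ntce_sketch(g1), ntce_sketch(g2)

sim_in, sim_out = cosine(g1, g2), cosine(m1, m2)  # output patterns are less similar
```

Because the inhibition depends only on summed activity, relabeling (permuting) the glomeruli permutes the output identically, so the sharpening is independent of where glomeruli sit on the bulb surface.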
Results We describe a novel neural circuit mechanism, non-topographical contrast enhancement (NTCE), which enables contrast enhancement among high-dimensional odor representations exhibiting unpredictable patterns of similarity. The NTCE algorithm relies solely on local intraglomerular computations and broad feedback inhibition, and is consistent with known properties of the olfactory bulb input layer. Unlike mechanisms based upon lateral projections, NTCE does not require built-in foreknowledge of the similarities in molecular receptive ranges expressed by different olfactory bulb glomeruli, and is independent of the physical location of glomeruli within the olfactory bulb. Conclusion Non-topographical contrast enhancement demonstrates how intrinsically high-dimensional sensory data can be represented and processed within a physically two-dimensional neural cortex while retaining the capacity to represent stimulus similarity. In a biophysically constrained computational model of the olfactory bulb, NTCE successfully mediates contrast enhancement among odorant representations in the natural, high-dimensional similarity space defined by the olfactory receptor complement, and underlies the concentration-independence of odor quality representations. Abstract Mathematical neuronal models are normally expressed using differential equations. The Parker-Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration time step. The method has been limited to polynomial equations, but we present division and power operations that expand its scope. We apply the Parker-Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin-Huxley type neuron, comparing the results with those obtained using the Runge-Kutta and Bulirsch-Stoer methods.
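The core Parker-Sochacki idea (rebuilding a Maclaurin series of the solution at every step and truncating adaptively once terms are small enough) can be demonstrated on a small polynomial ODE. The sketch below integrates y' = -y², whose series coefficients follow from a Cauchy product, and checks against the exact solution y(t) = y0/(1 + y0·t); the step size, tolerance and order cap are illustrative assumptions, and this is not the paper's neuron implementation.

```python
def ps_step(y0, h, tol=1e-15, max_order=40):
    """One Parker-Sochacki step for y' = -y^2: build Maclaurin
    coefficients a[n] around the current point, truncating the
    series adaptively once the added term falls below tol."""
    a = [y0]
    val, h_pow = y0, 1.0
    for n in range(max_order):
        # (y^2)_n is the Cauchy product of the series with itself
        cauchy = sum(a[k] * a[n - k] for k in range(n + 1))
        a.append(-cauchy / (n + 1))     # from (n+1)*a[n+1] = -(y^2)_n
        h_pow *= h
        term = a[-1] * h_pow
        val += term
        if abs(term) < tol:
            break                       # adaptive order control
    return val

y, t, h = 1.0, 0.0, 0.1
while t < 1.0 - 1e-12:
    y = ps_step(y, h)
    t += h

exact = 1.0 / (1.0 + t)   # analytic solution for y(0) = 1
```

The error tolerance is enforced by raising the series order, not by shrinking the time step, which is the property the abstract highlights.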
Benchmark simulations demonstrate an improved speed/accuracy trade-off for the method relative to these established techniques. Abstract Background Previous one-dimensional network modeling of the cerebellar granular layer has been successfully linked with a range of cerebellar cortex oscillations observed in vivo. However, the recent discovery of gap junctions between Golgi cells (GoCs), which may cause oscillations by themselves, has raised the question of how gap-junction coupling affects GoC and granular-layer oscillations. To investigate this question, we developed a novel two-dimensional computational model of the GoC-granule cell (GC) circuit with and without gap junctions between GoCs. Results Isolated GoCs coupled by gap junctions had a strong tendency to generate spontaneous oscillations without affecting their mean firing frequencies in response to distributed mossy fiber input. Conversely, when GoCs were synaptically connected in the granular layer, gap junctions increased the power of the oscillations, but the oscillations were primarily driven by the synaptic feedback loop between GoCs and GCs, and the gap junctions did not change oscillation frequency or the mean firing rate of either GoCs or GCs. Conclusion Our modeling results suggest that gap junctions between GoCs increase the robustness of cerebellar cortex oscillations that are primarily driven by the feedback loop between GoCs and GCs. The robustness effect of gap junctions on synaptically driven oscillations observed in our model may be a general mechanism, also present in other regions of the brain. Abstract Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours or days of supercomputing time.
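One reason indirect current matching can be so much cheaper than direct voltage fitting: once the channel time courses are fixed (e.g., derived from the recorded voltage), the total membrane current is linear in the unknown maximal conductances, so they can be estimated by ordinary least squares. The toy below uses two invented channel traces (not the paper's thalamocortical models) and recovers the conductances from a synthetic current.

```python
import numpy as np

t = np.linspace(0.0, 50.0, 500)            # ms, illustrative
x1 = np.exp(-t / 5.0)                      # gating/driving-force trace, channel 1
x2 = t * np.exp(-t / 10.0)                 # trace for channel 2

g_true = np.array([3.0, 7.0])              # "unknown" maximal conductances
i_total = g_true[0] * x1 + g_true[1] * x2  # synthetic measured current

X = np.column_stack([x1, x2])              # current is linear in the conductances
g_fit, *_ = np.linalg.lstsq(X, i_total, rcond=None)
```

A single linear solve replaces a high-dimensional nonlinear search, which is why such fits complete in minutes rather than days.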
Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. In consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamocortical neuron types, including fast/regular spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics and intrinsic noise. We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data. Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes from development or neurodegeneration. Abstract NMDA receptors are among the crucial elements of central nervous system models. Recent studies show that both the conductance and the kinetics of these receptors change in a voltage-dependent manner in some parts of the brain. Therefore, several models have been introduced to simulate their current. However, on the one hand, kinetic models, which are able to simulate these voltage-dependent phenomena, are computationally expensive for the modeling of large neural networks. On the other hand, classic exponential models, which are computationally less expensive, are not able to simulate the voltage dependency of these receptors accurately.
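For reference, the most widely used closed-form description of the NMDA receptor's voltage dependence is the Jahr and Stevens (1990) magnesium-block factor combined with a difference-of-exponentials time course. The sketch below uses that standard formulation with illustrative parameter values; it is not the modified model proposed in the abstract.

```python
import math

def mg_block(v, mg_out=1.0):
    """Fraction of NMDA conductance unblocked at potential v (mV),
    Jahr & Stevens (1990) form; mg_out is [Mg2+]o in mM."""
    return 1.0 / (1.0 + (mg_out / 3.57) * math.exp(-0.062 * v))

def nmda_current(v, t, g_max=1.0, e_rev=0.0, tau_rise=2.0, tau_decay=100.0):
    """NMDA current (arbitrary units) at time t (ms) after release:
    difference-of-exponentials gating scaled by the Mg2+ block."""
    s = math.exp(-t / tau_decay) - math.exp(-t / tau_rise)
    return g_max * s * mg_block(v) * (v - e_rev)

unblocked = [mg_block(v) for v in (-70.0, -30.0, 0.0, 40.0)]
```

The block factor rises monotonically with depolarization, producing the characteristic region of negative slope conductance; what it cannot capture is the voltage dependence of the kinetics themselves, which is the gap the abstract's modified exponential models address.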
In this study, we have modified these classic models to endow them with voltage-dependent conductance and time constants. Temperature sensitivity and desensitization of these receptors are also taken into account. We show that it is possible to simulate the most important physiological aspects of the NMDA receptor’s behavior using only three to four differential equations, significantly fewer than in previous kinetic models. Consequently, our model is both fast and physiologically plausible, and is therefore a suitable candidate for the modeling of large neural networks. Abstract Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons. This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal cell firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that there is an obvious rhythmic gating effect of the γ-oscillating interneuron network on the pyramidal neuron’s signal transmission. This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and appropriate inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of coherence and oscillation frequency of network activities.
Abstract Background Propagation of simulated action potentials (APs) was previously studied in short single chains and in two-dimensional sheets of myocardial cells [1-3]. The present study was undertaken to examine propagation in a long single chain of cells of various lengths, and with varying numbers of gap-junction (gj) channels, and to compare propagation velocity with cable properties such as the length constant (λ). Methods and Results Simulations were carried out using the PSpice program as previously described. When the electric field (EF) mechanism was dominant (0, 1, and 10 gj channels), the longer the chain length, the faster the overall velocity (θov). There seems to be no simple explanation for this phenomenon. In contrast, when the local-circuit current mechanism was dominant (100 gj channels or more), θov was slightly slowed with lengthening of the chain. Increasing the number of gj channels produced an increase in θov and caused the firing order to become more uniform. The end effect was more pronounced at longer chain lengths and at greater numbers of gj channels. When there were no or only a few gj channels (namely 0, 10, or 30), the voltage change (ΔVm) in the two cells (#50 and #52) contiguous to the cell injected with current (#51) was nearly zero, i.e., there was a sharp discontinuity in voltage between the adjacent cells. When there were many gj channels (e.g., 300, 1000, 3000), there was an exponential decay of voltage on either side of the injected cell, with the length constant (λ) increasing at higher numbers of gj channels. The effect of increasing the number of gj channels on increasing λ was relatively small compared to the larger effect on θov. θov became very non-physiological at 300 gj channels or higher. Conclusion Thus, when there were only 0, 1, or 10 gj channels, θov increased with increase in chain length, whereas at 100 gj channels or higher, θov did not increase with chain length.
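The steady-state picture described above (near-discontinuous voltage with few gj channels, exponential decay with a longer length constant as coupling grows) can be reproduced with a simple resistive-chain calculation. The sketch below uses arbitrary conductance values, not the PSpice model: it solves the steady-state node equations of a chain of cells with membrane conductance gm and junctional conductance gj, injecting current into the middle cell.

```python
import numpy as np

def chain_profile(n=101, gm=1.0, gj=10.0, i_inj=1.0):
    """Steady-state voltages of n resistively coupled cells:
    membrane (leak) conductance gm per cell, coupling gj between
    neighbors, current injected into the middle cell. Solves G v = I."""
    G = np.zeros((n, n))
    for k in range(n):
        G[k, k] = gm
        if k > 0:
            G[k, k] += gj
            G[k, k - 1] = -gj
        if k < n - 1:
            G[k, k] += gj
            G[k, k + 1] = -gj
    I = np.zeros(n)
    I[n // 2] = i_inj
    return np.linalg.solve(G, I)

v_low, v_high = chain_profile(gj=1.0), chain_profile(gj=30.0)
mid = 101 // 2
# relative voltage two cells from the injection site:
spread_low = v_low[mid + 2] / v_low[mid]
spread_high = v_high[mid + 2] / v_high[mid]
```

As gj grows relative to gm, the geometric decay per cell slows, i.e., λ increases and current spreads further along the chain, matching the qualitative result in the abstract.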
When there were only 0, 10, or 30 gj channels, there was a very sharp decrease in ΔVm in the two contiguous cells on either side of the injected cell, whereas at 300, 1000, or 3000 gj channels, the voltage decay was exponential along the length of the chain. The effect of increasing the number of gj channels on the spread of current was relatively small compared to the large effect on θov. Abstract This article provides a demonstration of an analytical technique that can be used to investigate the causes of perceptual phenomena. The technique is based on the concept of the ideal observer, an optimal signal classifier that makes decisions that maximize the probability of a correct response. To demonstrate the technique, an analysis was conducted to investigate the role of the auditory periphery in the production of temporal masking effects. The ideal observer classified output from four models of the periphery. Since the ideal observer is the best of all possible observers, if it demonstrates masking effects, then all other observers must as well. If it does not demonstrate masking effects, then nothing about the periphery requires masking to occur, and therefore masking would occur somewhere else. The ideal observer exhibited several forward masking effects but did not exhibit backward masking, implying that the periphery has a causal role in forward but not backward masking. A general discussion of the strengths of the technique and supplementary equations are also included. Abstract Understanding the human brain and its function in health and disease represents one of the greatest scientific challenges of our time. In the post-genomic era, an overwhelming accumulation of new data, at all levels of exploration from DNA to human brain imaging, has been acquired.
This accumulation of facts has not given rise to a corresponding increase in the understanding of integrated functions in this vast area of research, which involves a large number of fields extending from genetics to psychology. Neuroinformatics (NI) is uniquely placed at the intersection between neuroscience and information technology, and emerges as an area of critical importance for facilitating future conceptual development in neuroscience by creating databases which transcend different organizational levels and allow for the development of computational models from the subcellular to the global brain level. Abstract This paper studied synaptic and dendritic integration with different spatial distributions of synapses on the dendrites of a biophysically detailed layer 5 pyramidal neuron model. It has been observed that temporally synchronous and spatially clustered synaptic inputs make dendrites perform a highly nonlinear integration. The effect of the clustering degree of the synaptic distribution on neuronal responsiveness is investigated by changing the number of top apical dendrites to which active synapses are allocated. The neuron shows maximum responsiveness to synaptic inputs with an intermediate clustering degree of spatial distribution, indicating complex interactions among dendrites in the presence of nonlinear synaptic and dendritic integration. Abstract This paper describes a pilot query interface that has been constructed to help us explore a “concept-based” approach for searching the Neuroscience Information Framework (NIF). The query interface is concept-based in the sense that the search terms submitted through the interface are selected from a standardized vocabulary of terms (concepts) that are structured in the form of an ontology. The NIF contains three primary resources: the NIF Resource Registry, the NIF Document Archive, and the NIF Database Mediator.
These NIF resources are very different in their nature and therefore pose challenges when designing a single interface from which searches can be automatically launched against all three resources simultaneously. The paper first discusses briefly several background issues involving the use of standardized biomedical vocabularies in biomedical information retrieval, and then presents a detailed example that illustrates how the pilot concept-based query interface operates. The paper concludes by discussing certain lessons learned in the development of the current version of the interface. Abstract Simulations of orientation selectivity in visual cortex have shown that layer 4 complex cells lacking orientation tuning are ideal for providing global inhibition that scales with contrast in order to produce simple cells with contrast-invariant orientation tuning (Lauritzen and Miller in J Neurosci 23:10201–10213, 2003). Inhibitory cortical cells have been shown to be electrically coupled by gap junctions (Fukuda and Kosaka in J Neurosci 120:5–20, 2003). Such coupling promotes, among other effects, spike synchronization and coordination of postsynaptic IPSPs (Beierlein et al. in Nat Neurosci 3:904–910, 2000; Galarreta and Hestrin in Nat Rev Neurosci 2:425–433, 2001). Consequently, it was expected (Miller in Cereb Cortex 13:73–82, 2003) that electrical coupling would promote nonspecific functional responses consistent with the complex inhibitory cells seen in layer 4, which provide broad inhibition in response to stimuli of all orientations (Miller et al. in Curr Opin Neurobiol 11:488–497, 2001). This was tested using a mechanistic modeling approach. The orientation selectivity model of Lauritzen and Miller (J Neurosci 23:10201–10213, 2003) was reproduced with and without electrical coupling between complex inhibitory neurons.
Although extensive coupling promotes uniform firing in complex cells, there were no detectable improvements in contrast-invariant orientation selectivity unless there were coincident changes in complex cell firing rates to offset the untuned excitatory component that grows with contrast. Thus, changes in firing rates (with or without coupling), rather than synchronization of complex inhibitory neurons alone, could improve contrast-invariant orientation tuning of simple cells. Abstract Coral polyps contract when electrically stimulated, and a wave of contraction travels from the site of stimulation at a constant speed. Models of coral nerve networks were optimized to match one of three different experimentally observed behaviors. To search for model parameters that reproduce the experimental observations, we applied genetic algorithms to increasingly more complex models of a coral nerve net. In a first stage of optimization, individual neurons responded with spikes to multiple, but not single, pulses of activation. In a second stage, we used these neurons as the starting point for the optimization of a two-dimensional nerve net. This strategy yielded a network with parameters that reproduced the experimentally observed spread of excitation. Abstract Spike-wave discharges are a distinctive feature of epileptic seizures. So far, they have not been reported in spatially extended neural field models. We study a space-independent version of the Amari neural field model with two competing inhibitory populations. We show that this competition leads to robust spike-wave dynamics if the inhibitory populations operate on different timescales. The spike-wave oscillations present fold/homoclinic-type bursting. From this result we predict parameters of the extended Amari system where spike-wave oscillations produce a spatially homogeneous pattern. We propose this mechanism as a prototype of macroscopic epileptic spike-wave discharges.
To our knowledge this is the first example of robust spike-wave patterns in a spatially extended neural field model. Inverse Current Source Density Method in Two Dimensions: Inferring Neural Activation from Multielectrode Recordings. Neuroinformatics Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively.
Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php . Operating system(s): Platform independent. Programming language: Perl and BioPerl (program); mySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI Blast package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages.
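The sensitivity to numerical details is easy to demonstrate: floating-point addition is not associative, so merely reordering a sum (as different simulators, compilers, or parallel reductions do) can change the result. A minimal, deterministic example:

```python
import math

values_a = [1e16, -1e16, 1.0]   # the same three numbers ...
values_b = [1e16, 1.0, -1e16]   # ... in a different order

naive_a = sum(values_a)          # 1e16 - 1e16 = 0, then + 1.0  -> 1.0
naive_b = sum(values_b)          # 1.0 is absorbed into 1e16    -> 0.0

# math.fsum tracks the round-off of partial sums exactly,
# so it is independent of summation order:
exact_a, exact_b = math.fsum(values_a), math.fsum(values_b)
```

Compensated summation only addresses the reduction step; integration order, event scheduling and parallel decomposition differ between simulators as well, which is why exact cross-simulator replicability is generally out of reach.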
Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI, which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to dynamically create its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to duplication of effort and hinders model reuse.
Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell includes organelles and intracellular solutions, and it is separated from the outer environment of extracellular liquid surrounding the cell by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charges and concentrations cause, respectively, electrical and chemical potentials, generating transport of materials across the membrane. Here we look at the cores of mathematical modeling associated with the dynamic behavior of single cells, as well as the bases of numerical simulations. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms.
This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e. insilico ML, insilico IDE and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML to describe mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program in which users can simulate and/or create a model with graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command-line interface for manipulating large-scale models, based on Python, a powerful scripting language. ISIDE exports ISML models into C++ source code, CellML format and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB is formed from three databases, i.e. a database of ISML models (Model DB), time-series data (Timeseries DB) and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp . Abstract Science requires that results be reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results.
In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the cell cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method and make available computer scripts which automate the process of citation entry. We use an Open Citation Project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. To understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain (H2 approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance.
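The time-domain reduction mentioned above, balanced truncation, can be sketched for a generic stable linear system. Below is the standard square-root algorithm applied to a toy tridiagonal "passive cable" (illustrative matrices, not the quasi-active cell models of the paper): Gramians from Lyapunov equations, a balancing transform from an SVD, then truncation to the dominant Hankel singular values.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

n, k = 12, 4                                    # full and reduced dimensions
A = (np.diag(-2.2 * np.ones(n))                 # toy passive-cable dynamics (stable)
     + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1))
B = np.zeros((n, 1)); B[0, 0] = 1.0             # current input at one end
C = np.zeros((1, n)); C[0, -1] = 1.0            # voltage readout at the other

# Gramians: A P + P A^T = -B B^T and A^T Q + Q A = -C^T C
P = solve_continuous_lyapunov(A, -B @ B.T)
Q = solve_continuous_lyapunov(A.T, -C.T @ C)

eps = 1e-12 * np.eye(n)                         # guard against round-off asymmetry
Zp = np.linalg.cholesky((P + P.T) / 2 + eps)
Zq = np.linalg.cholesky((Q + Q.T) / 2 + eps)

U, s, Vt = np.linalg.svd(Zq.T @ Zp)             # s = Hankel singular values
T = Zp @ Vt.T / np.sqrt(s)                      # balancing transform
Tinv = (U / np.sqrt(s)).T @ Zq.T

Ar = (Tinv @ A @ T)[:k, :k]                     # keep the k dominant balanced states
Br = (Tinv @ B)[:k]
Cr = (C @ T)[:, :k]

dc_full = (-C @ np.linalg.solve(A, B)).item()   # DC gain of each system
dc_red = (-Cr @ np.linalg.solve(Ar, Br)).item()
```

The classical error bound guarantees that the frequency-response error of the truncated system is at most twice the sum of the discarded Hankel singular values, which is what licenses the aggressive dimension reductions the abstract reports.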
We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front-ends requires strong bioinformatic skills. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing those challenges. While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by a large number of selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is an XML-based language for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments and the relationship to other standardisation initiatives in the wider computational neuroscience field.
We also present a list of NeuroML resources which are currently available, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework which makes explicit how models are grounded in empirical evidence, explain (or fail to explain) existing empirical results, and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of a model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience.
Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp. 741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu.
Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and to build on them for further research. Its use can help the field of computational neuroscience enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA-A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA-A reversal potential (E_GABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity.
Ideally, such interfaces should provide high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). Such a form of the reconstruction files is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic type (pyramidal cell dendrites and olivary axonal projection terminals), and labeling method (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida.
None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of the study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions coming to the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductances of their dendritic membrane. This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other hand. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high and low depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodical impulse trains at the neuron’s output, associated with receiving a short series of presynaptic action potentials, are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for a synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli.
These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single neuronal models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives.
One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS–US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive, biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. On mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses uniformly distributed along the dendrites.
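The temporal-difference rule that these dopamine models build on has a compact tabular form. The sketch below is the textbook TD(0) update on a toy linear chain of states with a terminal reward, not the paper's LSTM network; all state counts and learning parameters are illustrative.

```python
# Generic tabular TD(0) on a 5-state chain with reward 1.0 on the final
# transition. This is the textbook rule, not the paper's LSTM model.

def td_chain(n_states=5, alpha=0.1, gamma=1.0, episodes=300):
    v = [0.0] * (n_states + 1)        # v[n_states] is the terminal state
    for _ in range(episodes):
        for s in range(n_states):
            r = 1.0 if s == n_states - 1 else 0.0   # reward at the end
            delta = r + gamma * v[s + 1] - v[s]     # TD prediction error
            v[s] += alpha * delta
    return v

values = td_chain()
```

After training, the reward prediction has propagated backwards along the chain, so even the earliest state predicts the delayed reward, which is the basic mechanism behind the backward shift of the phasic dopamine signal.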
For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and effectiveness of somatopetal transmission of the current (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization in general and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron upon different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and interburst intervals decreasing with a rise in the intensity of activation. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two patterns of activity, periodic doublets and continuous discharges) is considerably scantier than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of patterns of output discharges of different complexities, including stochastic ones. A relatively short length of the apical dendritic subtree of layer 2/3 neurons and, correspondingly, a smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those in layer 5 pyramidal neurons, are morphological factors responsible for the predominance of periodic spike doublets.
As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites, and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in the processes of formation of the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using non-homogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of any involvement in spatial tasks.
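The integrate-and-fire-with-conductance-synapse ingredient of such models can be sketched minimally as follows. This is a generic leaky integrate-and-fire neuron driven by a homogeneous Poisson excitatory input, not the paper's PC/IC circuit with theta modulation, and every parameter value is an assumed round number.

```python
import random

# Generic conductance-based LIF driven by Poisson excitatory input.
# NOT the paper's PC/IC model; all parameters are illustrative.
random.seed(42)

def run_lif(t_end=1.0, dt=1e-4, rate=1000.0,
            tau_m=0.02, tau_e=0.005, w=0.1,
            e_leak=-70.0, e_exc=0.0, v_thresh=-54.0, v_reset=-70.0):
    v, g_e, spikes, t = e_leak, 0.0, [], 0.0
    while t < t_end:
        if random.random() < rate * dt:    # Poisson presynaptic event
            g_e += w
        g_e -= dt * g_e / tau_e            # synaptic conductance decay
        dv = (-(v - e_leak) + g_e * (e_exc - v)) / tau_m
        v += dt * dv
        if v >= v_thresh:                  # threshold crossing -> spike
            spikes.append(t)
            v = v_reset
        t += dt
    return spikes

spike_times = run_lif()
```

Making the presynaptic rate itself a function of time (a non-homogeneous Poisson process, as in the paper) only requires replacing the constant `rate` with `rate(t)` inside the loop.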
Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed-rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database of neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in computational neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org. In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed data on morphologies when generating those cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators such as the NEURON simulator of Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any new and available data can be included into NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks.
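The channel formulas mentioned in the chapter above have a compact computable form. The sketch below evaluates the commonly tabulated rate functions for the Hodgkin–Huxley delayed-rectifier potassium gate in the modern convention (membrane potential in mV, rest near -65 mV); the chapter's own formulas may use a shifted voltage convention.

```python
import math

# Standard HH delayed-rectifier K+ gating rates (modern convention, mV, 1/ms).
# The chapter's own parameterization may differ by a voltage shift.

def alpha_n(v):
    return 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))

def beta_n(v):
    return 0.125 * math.exp(-(v + 65.0) / 80.0)

def n_inf(v):
    """Steady-state activation of the K+ gate at membrane potential v."""
    a, b = alpha_n(v), beta_n(v)
    return a / (a + b)

# Near rest (-65 mV) roughly a third of the gates are open,
# and depolarization opens more of them.
```

Plotting `n_inf` against voltage gives the familiar sigmoidal activation curve; the corresponding time constant is `1 / (alpha_n(v) + beta_n(v))`.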
Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which subsequently will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionalities to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is with the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently from the means of running the model simulation. This allows not only sharing of models but also sharing of research tools. Other promising developments of the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from genomics, transcriptomics, proteomics, and metabolomics experiments and for gaining insights into the interaction of the constituents of biological systems.
However, sharing such large amounts of frequently heterogeneous and distributed experimental data requires both standard data formats and public repositories. Standardization and a public storage system are also important for modeling because they make it possible to share models irrespective of the software tools used. Furthermore, rapid model development strongly benefits from available software packages that relieve the modeler of recurring tasks such as numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balance in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available.
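The direct-elimination core of the multicompartment method above reduces, for an unbranched cable, to the classic O(n) tridiagonal (Thomas) solve. The sketch below shows only that unbranched core on assumed toy numbers; the subtree splitting and interprocessor bookkeeping of the actual NEURON scheme are not reproduced here.

```python
# Thomas algorithm: O(n) Gaussian elimination for a tridiagonal system,
# the unbranched-cable core of the method described above (toy numbers).

def thomas(a, b, c, d):
    """Solve a tridiagonal system: a = sub-, b = main, c = super-diagonal."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                       # forward elimination
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):              # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# A small diagonally dominant system, like a discretized passive cable.
a = [0.0, -1.0, -1.0, -1.0]
b = [3.0, 3.0, 3.0, 3.0]
c = [-1.0, -1.0, -1.0, 0.0]
d = [1.0, 0.0, 0.0, 1.0]
x = thomas(a, b, c, d)
```

Branch points add off-tridiagonal entries, but as the abstract notes, a tree ordered from the leaves inward still eliminates in O(n) with no fill-in, which is why splitting at sparse connection points parallelizes so cleanly.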
However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multielectrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources, and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets.
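The forward-modeling idea behind inverse CSD can be shown in one dimension with two point sources and two electrodes: build the forward matrix mapping source strengths to potentials, then invert it. This is only the linear-algebra skeleton under an assumed toy geometry and unit conductivity, not the paper's 2D spline-based method.

```python
import math

# Skeleton of inverse CSD: potentials phi = F @ s for point sources s, with
# F[i][j] = 1 / (4*pi*sigma*dist(electrode_i, source_j)).
# Toy 1D geometry, unit conductivity; NOT the paper's iCSD 2D method.
sigma = 1.0
electrodes = [0.0, 1.0]          # electrode positions (mm, assumed)
sources = [0.2, 0.8]             # source positions (mm, assumed)

def forward_matrix():
    return [[1.0 / (4.0 * math.pi * sigma * abs(e - s))
             for s in sources] for e in electrodes]

def solve2(m, rhs):
    """Invert a 2x2 linear system by Cramer's rule."""
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    return [(rhs[0] * m[1][1] - m[0][1] * rhs[1]) / det,
            (m[0][0] * rhs[1] - rhs[0] * m[1][0]) / det]

true_csd = [1.5, -1.5]           # a source/sink pair (arbitrary units)
F = forward_matrix()
phi = [F[0][0] * true_csd[0] + F[0][1] * true_csd[1],
       F[1][0] * true_csd[0] + F[1][1] * true_csd[1]]
estimated_csd = solve2(F, phi)   # round-trip recovery of the CSD
```

The real method's advantages listed above come from how the forward matrix is built (source shapes, boundary assumptions, tissue conductivity), while the inversion step remains this same linear solve at larger scale.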
Spiking neural network simulation: memory-optimal synaptic event scheduling. Journal of Computational Neuroscience. Summary One of the more important recent additions to the NEURON simulation environment is a tool called ModelView, which simplifies the task of understanding exactly what biological attributes are represented in a computational model. Here, we illustrate how ModelView contributes to the understanding of models and discuss its utility as a neuroinformatics tool for analyzing models in online databases and as a means for facilitating interoperability among simulators in computational neuroscience. Abstract Background Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess the specific threshold of accuracy required to perform the task effectively. Conclusions The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis.
Precomputed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. Implementation, availability and requirements Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODELDB/MODELDB_web/MODindex.php. Operating system(s): Platform independent. Programming language: Perl, BioPerl (program); MySQL, Perl DBI and DBD modules (database); PHP, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI BLAST package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No. Abstract Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. However, in reality, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have met problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies.
Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past 2 decades that facilitate provenance tracking and model sharing. Abstract This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to create dynamically its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. Abstract Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. 
Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages, or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “NeuroIT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. Abstract Cells are the basic units of biological structure and function. They make up tissues and our bodies. A single cell includes organelles and intracellular solutions, and it is separated from the outer environment of extracellular liquid surrounding the cell by its cell membrane (plasma membrane), generating differences in the concentrations of ions and molecules, including enzymes. The differences in ionic charges and concentrations cause, respectively, electrical and chemical potentials, driving the transport of materials across the membrane. Here we look at the cores of mathematical modeling associated with the dynamic behaviors of single cells, as well as the bases of numerical simulations. Abstract Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms.
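The balance of chemical and electrical potential across the membrane described above is captured by the Nernst equation, E = (RT/zF) ln([X]_out/[X]_in). The sketch below evaluates it for typical textbook mammalian K+ concentrations; the concentration values are illustrative assumptions, not taken from the chapter.

```python
import math

# Nernst equilibrium potential E = (R*T)/(z*F) * ln(c_out/c_in).
R = 8.314       # gas constant, J/(mol*K)
F = 96485.0     # Faraday constant, C/mol
T = 310.0       # body temperature, K

def nernst(c_out, c_in, z=1):
    """Equilibrium potential in millivolts for an ion of valence z."""
    return 1000.0 * (R * T) / (z * F) * math.log(c_out / c_in)

# Typical textbook K+ concentrations (mM): 5 outside, 140 inside,
# giving an equilibrium potential near -89 mV.
e_k = nernst(5.0, 140.0)
```

The sign conventions fall out naturally: an ion that is more concentrated inside (like K+) has a negative equilibrium potential, while one more concentrated outside (like Na+) has a positive one.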
This article provides a guide to entering a new model into ModelDB. Abstract In this chapter, usage of the insilico platform is demonstrated. The insilico platform is composed of three blocks, i.e., insilico ML, insilico IDE, and insilico DB. Insilico ML (ISML) (Asai et al. 2008) is a language specification based on XML for describing mathematical models of physiological functions. Insilico IDE (ISIDE) (Kawazu et al. 2007; Suzuki et al. 2008, 2009) is a software program with which users can simulate and/or create a model using graphical representations corresponding to the concepts of ISML, such as modules and edges. ISIDE also has a command line interface, based on the Python scripting language, for manipulating large-scale models. ISIDE exports ISML models to C++ source code, CellML format, and FreeFEM++ format for further analysis or simulation. Insilico Sim (ISSim) (Heien et al. 2009), which is a part of ISIDE, is a simulator for models written in ISML. Insilico DB comprises three databases: a database of ISML models (Model DB), time-series data (Timeseries DB), and morphological data (Morphology DB). These databases are open to the public at the website www.physiome.jp. Abstract Science requires that results be reproducible. This is naturally expected for wet-lab experiments, and it is equally important for model-based results published in the literature. Reproducibility, in general, requires standards that provide the necessary information and tools that enable others to reuse this information. In computational biology, reproducibility requires not only a coded form of the model but also a coded form of the experimental setup needed to reproduce the analysis of the model. Well-established databases and repositories store and provide mathematical models. Recently, these databases have started to distribute simulation setups together with the model code. These developments facilitate the reproduction of results.
In this chapter, we outline the necessary steps towards reproducing model-based results in computational biology. We exemplify the workflow using a prominent example model of the Cell Cycle and state-of-the-art tools and standards. Abstract Citations play an important role in medical and scientific databases by indicating the authoritative source of the data. Manual citation entry is tedious and prone to errors. We describe a method, and make available computer scripts, which automate the process of citation entry. We use an Open Citation Project Perl module (PARSER) for parsing citation data that is then used to retrieve PubMed records to supply the (validated) reference. Our Perl scripts are available via a link in the web references section of this article. Abstract The accurate simulation of a neuron’s ability to integrate distributed synaptic input typically requires the simultaneous solution of tens of thousands of ordinary differential equations. For, in order to understand how a cell distinguishes between input patterns, we apparently need a model that is biophysically accurate down to the space scale of a single spine, i.e., 1 μm. We argue here that one can retain this highly detailed input structure while dramatically reducing the overall system dimension if one is content to accurately reproduce the associated membrane potential at a small number of places, e.g., at the site of action potential initiation, under subthreshold stimulation. The latter hypothesis permits us to approximate the active cell model with an associated quasi-active model, which in turn we reduce by both time-domain (Balanced Truncation) and frequency-domain (${\cal H}_2$ approximation of the transfer function) methods. We apply and contrast these methods on a suite of typical cells, achieving up to four orders of magnitude in dimension reduction and an associated speedup in the simulation of dendritic democratization and resonance.
We also append a threshold mechanism and indicate that this reduction has the potential to deliver an accurate quasi-integrate-and-fire model. Abstract Biomedical databases are a major resource of knowledge for research in the life sciences. The biomedical knowledge is stored in a network of thousands of databases, repositories, and ontologies. These data repositories differ substantially in granularity of data, storage formats, database systems, supported data models, and interfaces. In order to make full use of available data resources, the high number of heterogeneous query methods and front ends demands considerable bioinformatics skills. Consequently, the manual inspection of database entries and citations is a time-consuming task to which methods from computer science should be applied. Concepts and algorithms from information retrieval (IR) play a central role in facing these challenges. While originally developed to manage and query less structured data, information retrieval techniques are becoming increasingly important for the integration of life science data repositories and associated information. This chapter provides an overview of IR concepts and their current applications in the life sciences. Enriched by numerous selected references to further literature, the following sections successively build a practical guide for biologists and bioinformaticians. Abstract NeuroML is an XML-based language for describing detailed neuronal models, which can contain multiple active conductances and complex morphologies. Networks of such cells positioned and synaptically connected in 3D can also be described. In this chapter we present an overview of the history of NeuroML, a brief description of the current version of the language, plans for future developments, and the relationship to other standardisation initiatives in the wider computational neuroscience field.
We also present a list of currently available NeuroML resources, such as language specifications, services on the NeuroML website, examples of models in this format, simulation platform support, and other applications for generating and visualising highly detailed neuronal networks. These resources illustrate how NeuroML can be a key part of the toolchain for researchers addressing complex questions of neuronal system function. Abstract We present principles for an integrated neuroinformatics framework that makes explicit how models are grounded in empirical evidence, explain (or fail to explain) existing empirical results, and make testable predictions. The new ontological framework makes explicit how models bring together structural, functional, and related empirical observations. We emphasize schematics of a model’s operation linked to summaries of empirical data (SEDs) used in both the design and testing of the model, with tests comparing SEDs to summaries of simulation results (SSRs) from the model. We stress the importance of protocols for models as well as experiments. We complement the structural ontology of nested brain structures with a functional ontology of Brain Operating Principles (BOPs) for observed neural function and an ontological framework for grounding models in empirical data. We present an implementation of this ontological framework in the Brain Operation Database (BODB), an environment in which modelers and experimentalists can work together by making use of their shared empirical data, models, and expertise. Abstract We assess the challenges of studying action and language mechanisms in the brain, both singly and in relation to each other, to provide a novel perspective on neuroinformatics, integrating the development of databases for encoding – separately or together – neurocomputational models and empirical data that serve systems and cognitive neuroscience.
Summary A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to representing and integrating complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp. 741–745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7–11, 2004); Shepherd et al. (Trends Neurosci. 21, 460–468, 1998); Sivakumaran et al. (Bioinformatics 19, 408–415, 2003); Smolen et al. (Neuron 26, 567–580, 2000); Vadigepalli et al. (OMICS 7, 235–252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294–308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, as well as synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu.
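The conductance-based formalism that simulators such as SNNAP integrate can be made concrete with a minimal sketch. The following Python fragment is an illustration only, not SNNAP code: it integrates the classic Hodgkin–Huxley squid-axon equations (standard textbook parameter values) with the forward Euler method and counts the resulting spikes.

```python
import math

def hh_simulate(i_ext=10.0, t_max=100.0, dt=0.01):
    """Forward-Euler integration of the Hodgkin-Huxley squid-axon model.
    i_ext in uA/cm^2, time in ms, voltage in mV. Returns the voltage trace."""
    c_m = 1.0                                  # membrane capacitance, uF/cm^2
    g_na, g_k, g_l = 120.0, 36.0, 0.3          # peak conductances, mS/cm^2
    e_na, e_k, e_l = 50.0, -77.0, -54.387      # reversal potentials, mV

    # Classic rate functions for the m, h, n gating variables.
    def a_m(v): return 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
    def b_m(v): return 4.0 * math.exp(-(v + 65.0) / 18.0)
    def a_h(v): return 0.07 * math.exp(-(v + 65.0) / 20.0)
    def b_h(v): return 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
    def a_n(v): return 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
    def b_n(v): return 0.125 * math.exp(-(v + 65.0) / 80.0)

    v = -65.0                                  # resting potential
    m = a_m(v) / (a_m(v) + b_m(v))             # gates start at steady state
    h = a_h(v) / (a_h(v) + b_h(v))
    n = a_n(v) / (a_n(v) + b_n(v))
    trace = []
    for _ in range(int(t_max / dt)):
        i_ion = (g_na * m**3 * h * (v - e_na)
                 + g_k * n**4 * (v - e_k)
                 + g_l * (v - e_l))
        v += dt * (i_ext - i_ion) / c_m
        m += dt * (a_m(v) * (1.0 - m) - b_m(v) * m)
        h += dt * (a_h(v) * (1.0 - h) - b_h(v) * h)
        n += dt * (a_n(v) * (1.0 - n) - b_n(v) * n)
        trace.append(v)
    return trace

trace = hh_simulate()
# Count upward zero crossings as spikes.
spikes = sum(1 for a, b in zip(trace, trace[1:]) if a < 0.0 <= b)
```

With a sustained 10 uA/cm^2 drive the model fires tonically, which is the kind of behavior a GUI-driven simulator lets a user explore without writing this integration loop by hand.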
Conclusion ModelDB provides a resource for the computational neuroscience community that enables investigators to increase their understanding of published models by enabling them to run the models as published and to build on them for further research. Its use can help the field of computational neuroscience enter a new era of expedited numerical experimentation. Abstract Paired-pulse inhibition (PPI) of the population spike observed in extracellular field recordings is widely used as a readout of hippocampal network inhibition. PPI reflects GABA-A receptor-mediated inhibition of principal neurons through local interneurons. However, because of its polysynaptic nature, it is difficult to assign PPI changes to precise synaptic mechanisms. Here we used a detailed network model of the dentate gyrus to simulate PPI of granule cell action potentials and analyze its network properties. Our computational analysis indicates that PPI results mainly from a combination of perisomatic feedforward and feedback inhibition of granule cells by basket cells. Feedforward inhibition mediated by basket cells appeared to be the most significant source of PPI. Our simulations suggest that PPI depends more on somatic than on dendritic inhibition of granule cells. Furthermore, PPI was modulated by changes in the GABA-A reversal potential (EGABA) and by alterations in the intrinsic excitability of granule cells. In summary, computer modeling provides a useful tool for determining the role of synaptic and intrinsic cellular mechanisms in paired-pulse field potential responses. Abstract Translating basic neuroscience research into experimental neurology applications often requires functional interfacing of the central nervous system (CNS) with artificial devices designed to monitor and/or stimulate brain electrical activity.
Ideally, such interfaces should provide high temporal and spatial resolution over a large area of tissue during stimulation and/or recording of neuronal activity, with the ultimate goal of eliciting/detecting electrical excitation at the single-cell level and observing the emerging spatiotemporal correlations within a given functional area. Activity patterns generated by CNS neurons have typically been correlated with a sensory stimulus, a motor response, or a potentially cognitive process. Abstract Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D’Alessandro/morpho or www.krasnow.gmu.edu/LNeuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neuritic types (pyramidal cell dendrites and olivary axonal projection terminals), and labeling methods (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida.
None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstructing systems. The aim of this study was to elucidate the biophysical mechanisms able to determine specific transformations of the patterns of output signals of neurons (neuronal impulse codes) depending on the spatiotemporal organization of synaptic actions arriving at the dendrites. We studied mathematical models of neocortical layer 5 pyramidal neurons built according to the results of computer reconstruction of their dendritic arborizations and experimental data on the voltage-dependent conductances of their dendritic membrane. This work is a continuation of our previous studies, which showed the existence of certain relations between the complexity of neural impulse codes, on the one hand, and the complexity, size, metrical asymmetry of branching, and nonlinear membrane properties of the dendrites, on the other. This relation determines synchronous (with some phase shifts) or asynchronous transitions of asymmetrical dendritic subtrees between high- and low-depolarization states during the generation of output impulse patterns in response to distributed tonic activation of dendritic inputs. In this work we demonstrate for the first time that the appearance and pattern of transformations of complex periodic impulse trains at the neuron’s output associated with receiving a short series of presynaptic action potentials are determined not only by the time of arrival of such a series, but also by its spatial addressing to asymmetric dendritic subtrees; the latter, in this case, may be in the same (synchronous transitions) or different (asynchronous transitions) electrical states. Biophysically, this phenomenon is based on a significant excess of the driving potential for synaptic excitatory current in low-depolarization regions, as compared with that in high-depolarization dendritic regions receiving phasic synaptic stimuli.
These findings open a novel aspect of the functioning of neurons and neuronal networks. Abstract Electrical models of neurons are one of the rather rare cases in biology where a concise quantitative theory accounts for a huge range of observations and works well to predict and understand physiological properties. The mark of a successful theory is that people take it for granted and use it casually. Single-neuron models are no longer remarkable: with the theory well in hand, most interesting questions using models have moved to the networks of neurons in which they are embedded, and the networks of signalling pathways that are in turn embedded in neurons. Nevertheless, good single-neuron models are still rather rare and valuable entities, and it is an important goal in neuroinformatics (and this chapter) to make their generation a well-tuned process. The electrical properties of single neurons can be accurately modeled using multicompartmental modeling. Such models are biologically motivated and have a close correspondence with the underlying biophysical properties of neurons and their ion channels. These multicompartment models are also important as building blocks for detailed network models. Finally, the compartmental modeling framework is also well suited for embedding molecular signaling pathway models, which are important for studying synaptic plasticity. This chapter introduces the theory and practice of multicompartmental modeling. Abstract Dopaminergic neuron activity has been modeled during learning and appetitive behavior, most commonly using the temporal-difference (TD) algorithm. However, a proper representation of elapsed time and of the exact task is usually required for the model to work. Most models use timing elements such as delay-line representations of time that are not biologically realistic for intervals in the range of seconds. The interval-timing literature provides several alternatives.
One of them is that timing could emerge from general network dynamics, instead of coming from a dedicated circuit. Here, we present a general rate-based learning model based on long short-term memory (LSTM) networks that learns a time representation when needed. Using a naïve network learning its environment in conjunction with TD, we reproduce dopamine activity in appetitive trace conditioning with a constant CS–US interval, including probe trials with unexpected delays. The proposed model learns a representation of the environment dynamics in an adaptive, biologically plausible framework, without recourse to delay lines or other special-purpose circuits. Instead, the model predicts that the task-dependent representation of time is learned by experience, is encoded in ramp-like changes in single-neuron activity distributed across small neural networks, and reflects a temporal integration mechanism resulting from the inherent dynamics of recurrent loops within the network. The model also reproduces the known finding that trace conditioning is more difficult than delay conditioning and that the learned representation of the task can be highly dependent on the types of trials experienced during training. Finally, it suggests that the phasic dopaminergic signal could facilitate learning in the cortex. On mathematical models of pyramidal neurons localized in neocortical layers 2/3, whose reconstructed dendritic arborizations possessed passive linear or active nonlinear membrane properties, we studied the effect of the morphology of the dendrites on their passive electrical transfer characteristics and also on the formation of patterns of spike discharges at the output of the cell under conditions of tonic activation via excitatory synapses uniformly distributed along the dendrites.
For this purpose, we calculated morphometric characteristics of the size, complexity, metric asymmetry, and efficacy of somatopetal current transfer (with estimation of the sensitivity of this efficacy to changes in the uniform membrane conductance) for the reconstructed dendritic arborization as a whole and also for its apical and basal subtrees. Spatial maps of the membrane potential and intracellular calcium concentration, which corresponded to certain temporal patterns of spike discharges generated by the neuron at different intensities of synaptic activation, were superimposed on the 3D image and dendrograms of the neuron. These maps were considered “spatial autographs” of the above patterns. The main discharge pattern included periodic two-spike bursts (doublets) generated with relatively stable intraburst interspike intervals and with interburst intervals that decreased as the intensity of activation rose. Under conditions of intense activation, the interburst intervals became close to the intraburst intervals, so the cell began to generate continuous trains of action potentials. Such a repertoire (consisting of two activity patterns, periodic doublets and continuous discharges) is considerably more limited than that described earlier in pyramidal neurons of neocortical layer 5. Under analogous conditions of activation, we observed in the latter cells a variety of output discharge patterns of different complexities, including stochastic ones. The relatively short length of the apical dendritic subtree of layer 2/3 neurons and, correspondingly, the smaller metric asymmetry (differences between the lengths of the apical and basal dendritic branches and paths), as compared with those of layer 5 pyramidal neurons, are the morphological factors responsible for the predominance of periodic spike doublets.
As a result, there were two combinations of different electrical states of the sites of the dendritic arborization (“spatial autographs”). In the case of doublets, these were high depolarization of the apical dendrites vs. low depolarization of the basal dendrites, and the reverse combination; only the latter (reverse) combination corresponded to the case of continuous discharges. The relative simplicity and uniformity of spike patterns in these cells apparently promotes the predominance of network interaction in shaping the activity of pyramidal neurons of layers 2/3 and, thereby, a higher efficiency of the processes of intracortical association. Abstract Phase precession is one of the best-known examples within the temporal coding hypothesis. Here we present a biophysical spiking model for phase precession in hippocampal CA1 which focuses on the interaction between place cells and local inhibitory interneurons. The model’s functional block is composed of a place cell (PC) connected with a local inhibitory cell (IC) which is modulated by the population theta rhythm. Both cells receive excitatory inputs from the entorhinal cortex (EC). These inputs are both theta modulated and space modulated. The dynamics of the two neuron types are described by integrate-and-fire models with conductance synapses, and the EC inputs are described using non-homogeneous Poisson processes. Phase precession in our model is caused by increased drive to specific PC/IC pairs when the animal is in their place field. The excitation increases the IC’s firing rate, and this modulates the PC’s firing rate such that both cells precess relative to theta. Our model implies that phase coding in place cells may not be independent of rate coding. The absence of restrictive connectivity constraints in this model predicts the generation of phase precession in any network with similar architecture and subject to a clocking rhythm, independently of its involvement in spatial tasks.
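Theta-modulated non-homogeneous Poisson inputs of the kind used for the EC drive can be generated with the standard thinning algorithm of Lewis and Shedler. The Python sketch below is illustrative only: the 8 Hz theta-modulated rate profile and its parameters are assumptions for the example, not the paper's actual input statistics. Candidate events are drawn at a constant upper-bound rate, and each is kept with probability rate(t)/rate_max.

```python
import math
import random

def inhomogeneous_poisson(rate_fn, t_max, rate_max, seed=0):
    """Generate spike times on [0, t_max] from a time-varying rate (Hz)
    by thinning: draw candidates at rate_max, keep with prob rate(t)/rate_max.
    rate_fn(t) must never exceed rate_max."""
    rng = random.Random(seed)
    spikes, t = [], 0.0
    while True:
        t += rng.expovariate(rate_max)      # next candidate event
        if t > t_max:
            break
        if rng.random() < rate_fn(t) / rate_max:
            spikes.append(t)                # accepted spike
    return spikes

# Example: 20 Hz mean rate, fully modulated by an 8 Hz theta rhythm.
theta_rate = lambda t: 20.0 * (1.0 + math.cos(2.0 * math.pi * 8.0 * t))
train = inhomogeneous_poisson(theta_rate, t_max=10.0, rate_max=40.0)
```

Over 10 s the train averages roughly the 20 Hz mean rate, with spikes clustered near the peaks of each theta cycle; this is the generic recipe behind theta- and space-modulated synthetic input trains.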
Abstract We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed-rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database of neural models. Abstract Detailed cell and network morphologies are becoming increasingly important in computational neuroscience. Great efforts have been undertaken to systematically record and store the anatomical data of cells. This effort is visible in databases such as NeuroMorpho.org. In order to make use of these fast-growing data within computational models of networks, it is vital to include detailed morphological data when generating cell and network geometries. For this purpose we developed the Neuron Network Generator NeuGen 2.0, which is designed to include known and published anatomical data of cells and to automatically generate large networks of neurons. It offers export functionality to classic simulators such as the NEURON simulator by Hines and Carnevale (2003). NeuGen 2.0 is designed in a modular way, so any newly available data can be included in NeuGen 2.0. Also, new brain areas and cell types can be defined, with the possibility of constructing user-defined cell types and networks.
Therefore, NeuGen 2.0 is a software package that grows with each new piece of anatomical data, which will continue to increase the morphological detail of automatically generated networks. In this paper we introduce NeuGen 2.0 and apply its functionality to the CA1 hippocampus. Runtime and memory benchmarks show that NeuGen 2.0 is applicable to generating very large networks with high morphological detail. Abstract This chapter provides a brief history of the development of software for simulating biologically realistic neurons and their networks, beginning with the pioneering work of Hodgkin and Huxley and others who developed the computational models and tools that are used today. I also present a personal and subjective view of some of the issues that came up during the development of GENESIS, NEURON, and other general platforms for neural simulation. This is offered in the hope that developers and users of the next generation of simulators can learn from some of the good and bad design elements of the last generation. New simulator architectures such as GENESIS 3 allow the use of standard, well-supported external modules or specialized tools for neural modeling that are implemented independently of the means of running the model simulation. This allows not only the sharing of models but also the sharing of research tools. Other promising developments of the past few years include standard simulator-independent declarative representations for neural models, the use of modern scripting languages such as Python in place of simulator-specific ones, and the increasing use of open-source software solutions. Abstract Modeling is a means for integrating the results from Genomics, Transcriptomics, Proteomics, and Metabolomics experiments and for gaining insights into the interactions of the constituents of biological systems.
However, sharing such large amounts of frequently heterogeneous and distributed experimental data requires both standard data formats and public repositories. Standardization and a public storage system are also important for modeling because they make it possible to share models irrespective of the software tools used. Furthermore, rapid model development benefits strongly from available software packages that relieve the modeler of recurring tasks such as the numerical integration of rate equations or parameter estimation. In this chapter, the most common standard formats used for model encoding and some of the major public databases in this scientific field are presented. The main features of currently available modeling software are discussed, and proposals for the application of such tools are given. Abstract When a multicompartment neuron is divided into subtrees such that no subtree has more than two connection points to other subtrees, the subtrees can be placed on different processors and the entire system remains amenable to direct Gaussian elimination with only a modest increase in complexity. Accuracy is the same as with standard Gaussian elimination on a single processor. It is often feasible to divide a 3D reconstructed neuron model onto a dozen or so processors and experience almost linear speedup. We have also used the method for purposes of load balancing in network simulations when some cells are so large that their individual computation time is much longer than the average processor computation time, or when there are many more processors than cells. The method is available in the standard distribution of the NEURON simulation program. Conclusion The Axiope team has found a well-defined niche in the neuroscience software environment and is in the process of writing a software suite that may fill it. It is too early to say whether they will succeed, as the main components of the software suite are not yet available.
However they may fare, they have thrown down the gauntlet to the neuroscience community: “Tools for efficient data analysis are coming online: will you use them?” Abstract The recent development of large multi-electrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multi-electrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility of including system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multi-electrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets. Abstract Under sustained input current of increasing strength, neurons eventually stop firing, entering a depolarization block. This is a robust effect that is not usually explored in experiments or explicitly implemented or tested in models.
However, the range of current strength needed for a depolarization block could easily be reached with the random background activity of only a few hundred excitatory synapses. Depolarization block may thus be an important property of neurons that should be better characterized in experiments and explicitly taken into account in models at all implementation scales. Here we analyze the spiking dynamics of CA1 pyramidal neuron models using the same set of ionic currents on both an accurate morphological reconstruction and on its reduction to a single compartment. The results show the specific ion channel properties and kinetics that are needed to reproduce the experimental findings, and how their interplay can drastically modulate the neuronal dynamics and the input current range leading to a depolarization block. We suggest that this can be one of the rate-limiting mechanisms protecting a CA1 neuron from excessive spiking activity. Abstract Neuronal recordings and computer simulations produce ever-growing amounts of data, preventing conventional analysis methods from keeping pace. Such large datasets can be automatically analyzed by taking advantage of the well-established relational database paradigm. Raw electrophysiology data can be entered into a database by extracting its interesting characteristics (e.g., firing rate). Compared to storing the raw data directly, this database representation is several orders of magnitude more efficient in storage space and processing time. Using two large electrophysiology recording and simulation datasets, we demonstrate that the database can be queried, transformed, and analyzed. This process is relatively simple and easy to learn because it takes place entirely in Matlab, using our database analysis toolbox, PANDORA. It is capable of acquiring data from common recording and simulation platforms and exchanging data with external database engines and other analysis toolboxes, making analysis simpler and highly interoperable.
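PANDORA itself is a Matlab toolbox, but the workflow it embodies — extract summary measures from raw recordings, store them relationally, then query — can be sketched with the Python standard-library sqlite3 module. Everything in this sketch (cell names, spike times, the 3 Hz query threshold) is invented for illustration and is not PANDORA's API or data.

```python
import sqlite3

# Hypothetical recordings: cell id -> spike times in seconds (made-up values).
recordings = {
    "cell_a": [0.10, 0.25, 0.40, 0.55, 0.70],
    "cell_b": [0.05, 0.95],
}
DURATION = 1.0  # recording length, s

# Store only extracted characteristics (firing rate, spike count),
# not the raw voltage traces -- the key space/time saving of the approach.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE measures (cell TEXT, firing_rate REAL, n_spikes INTEGER)")
for cell, spikes in recordings.items():
    db.execute("INSERT INTO measures VALUES (?, ?, ?)",
               (cell, len(spikes) / DURATION, len(spikes)))

# Relational query over the extracted measures: cells firing above 3 Hz.
fast = [row[0] for row in
        db.execute("SELECT cell FROM measures WHERE firing_rate > 3 ORDER BY cell")]
```

Once the measures live in a table, the same SELECT-style queries scale from two cells to thousands of recordings or simulation runs without changing the analysis code.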
PANDORA is available to be freely used and modified because it is open-source (http://software.incf.org/software/pandora/home). Abstract This chapter is devoted to the detailed discussion of several numerical simulations wherein we use a model to generate data, and then we examine how well we can use L = 1, 2, … of the time series for state variables of the model to estimate fixed parameters within the model and the time series of the state variables not presented to or known to the model. These are “twin experiments” and have often been used to exercise the methods one adopts for approximating the path integral for the statistical data assimilation problem. Abstract Sensitization of the defensive shortening reflex in the leech has been linked to a segmentally repeated trisynaptic positive feedback loop. Serotonin from the R-cell enhances S-cell excitability, S-cell impulses cross an electrical synapse into the C-interneuron, and the C-interneuron excites the R-cell via a glutamatergic synapse. The C-interneuron has two unusual characteristics. First, impulses take longer to propagate from the S soma to the C soma than in the reverse direction. Second, impulses recorded from the electrically unexcitable C soma vary in amplitude when extracellular divalent cation concentrations are elevated, with smaller impulses failing to induce synaptic potentials in the R-cell. A compartmental, computational model was developed to test the sufficiency of multiple, independent spike initiation zones in the C-interneuron to explain these observations. The model displays asymmetric delays in impulse propagation across the S–C electrical synapse and graded impulse amplitudes in the C-interneuron in simulated high divalent cation concentrations. Abstract Before we delve into the general structure of using information from measurements to complete models of those measurements, we will illustrate many of the questions involved by taking a look at some well-trodden ground.
Completing a model means that we have estimated all the unknown parameters in the model, allowing us to predict the development of the model in its state space given a set of initial conditions and a statement of the forces acting to drive it. Abstract Significant inroads have been made to understand cerebellar cortical processing but neural coding at the output stage of the cerebellum in the deep cerebellar nuclei (DCN) remains poorly understood. The DCN are unlikely to be just a relay nucleus because Purkinje cell inhibition has to be turned into an excitatory output signal, and DCN neurons exhibit complex intrinsic properties. In particular, DCN neurons exhibit a range of rebound spiking properties following hyperpolarizing current injection, raising the question of how this could contribute to signal processing in behaving animals. Computer modeling presents an ideal tool to investigate how intrinsic voltage-gated conductances in DCN neurons could generate the heterogeneous firing behavior observed, and what input conditions could result in rebound responses. To enable such an investigation we built a compartmental DCN neuron model with a full dendritic morphology and appropriate active conductances. We generated a good match of our simulations with DCN current clamp data we recorded in acute slices, including the heterogeneity in the rebound responses. We then examined how inhibitory and excitatory synaptic input interacted with these intrinsic conductances to control DCN firing. We found that the output spiking of the model reflected the ongoing balance of excitatory and inhibitory input rates and that changing the level of inhibition performed an additive operation. Rebound firing following strong Purkinje cell input bursts was also possible, but only if the chloride reversal potential was more negative than −70 mV to allow deinactivation of rebound currents.
Fast rebound bursts due to T-type calcium current and slow rebounds due to persistent sodium current could be differentially regulated by synaptic input, and the pattern of these rebounds was further influenced by HCN current. Our findings suggest that active properties of DCN neurons could play a crucial role for signal processing in the cerebellum. Abstract Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior. This open-source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML-based schema, and multiple levels of granularity within a modern object-oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multicompartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation channel gating, the mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plugin development is also presented. Further development of KInNeSS is ongoing with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform.
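The core computation that compartmental simulators like KInNeSS perform, integrating coupled membrane-potential equations across connected compartments, can be illustrated at a deliberately tiny scale. The sketch below uses two passive compartments with invented parameter values (no active channels), so it shows only the coupling and integration scheme, not the full feature set described above:

```python
import numpy as np

def simulate_two_compartments(i_inj=0.1, t_max=50.0, dt=0.01):
    """Two passive compartments coupled by an axial conductance.
    Units: nF, uS, nA, mV, ms. Current is injected into compartment 0."""
    c = np.array([0.1, 0.1])          # membrane capacitance (nF)
    g_leak = np.array([0.01, 0.01])   # leak conductance (uS)
    e_leak = -65.0                    # leak reversal potential (mV)
    g_axial = 0.05                    # axial coupling conductance (uS)
    v = np.full(2, e_leak)            # both compartments start at rest
    for _ in range(int(t_max / dt)):
        i_leak = g_leak * (e_leak - v)
        i_axial = g_axial * (v[::-1] - v)   # current from the other compartment
        i_ext = np.array([i_inj, 0.0])
        v += dt * (i_leak + i_axial + i_ext) / c
    return v

v_final = simulate_two_compartments()
```

At steady state the injected compartment sits a few millivolts above the coupled one, which itself sits above rest: the depolarization spreads passively through the axial conductance, attenuated by the second compartment's leak.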
Abstract No Abstract Available Abstract We have developed a simulation tool within the NEURON simulator to assist in organization, verification, and analysis of simulations. This tool, called the Neural Query System (NQS), provides a relational database system, a query function based on the SELECT function of Structured Query Language, and data-mining tools. We show how NQS can be used to organize, manage, verify, and visualize parameters for both single-cell and network simulations. We demonstrate an additional use of NQS to organize simulation output and relate outputs to parameters in a network model. The NQS software package is available at http://senselab.med.yale.edu/senselab/SimToolDB. Abstract Networks of cells form tissues and organs, where aggregations of cells operate as systems. This is similar to how single cells function as systems of protein networks, where, for example, ion channel currents of a single cell are integrated to produce a whole-cell membrane potential. A cell in a network may behave differently from how it behaves alone. Dynamics of a single cell affect those of others and vice versa, that is, cells interact with each other. Interactions arise through different mechanisms. Cardiac cells forming cardiac tissue and the heart interact electrochemically through cell-to-cell connections called gap junctions, by which an action potential generated at the sinoatrial node conducts through the heart, allowing coordinated muscle contractions from the atrium to the ventricle. They also interact mechanically because every cell contracts to produce heart beats. Neuronal cells in the nervous system interact via chemical synapses, by which neuronal networks exhibit spatiotemporal spiking dynamics, representing neural information.
In a neuronal network in charge of movement control of a musculoskeletal system, such spatiotemporal dynamics directly correspond to coordinated contractions of a number of skeletal muscles so that a desired motion of limbs can be performed. This chapter illustrates several mathematical techniques through examples from modeling of cellular networks. Abstract Despite the central position of CA3 pyramidal cells in the hippocampal circuit, the experimental investigation of their synaptic properties has been limited. Recent slice experiments from adult rats characterized AMPA and NMDA receptor unitary synaptic responses in CA3b pyramidal cells. Here, excitatory synaptic activation is modeled to infer biophysical parameters, aid analysis interpretation, explore mechanisms, and formulate predictions by contrasting simulated somatic recordings with experimental data. Reconstructed CA3b pyramidal cells from the public repository NeuroMorpho.Org were used to allow for cell-specific morphological variation. For each cell, synaptic responses were simulated for perforant pathway and associational/commissural synapses. Means and variability for peak amplitude, time-to-peak, and half-height width in these responses were compared with equivalent statistics from experimental recordings. Synaptic responses mediated by AMPA receptors are best fit with properties typical of previously characterized glutamatergic receptors where perforant path synapses have conductances twice that of associational/commissural synapses (0.9 vs. 0.5 nS) and more rapid peak times (1.0 vs. 3.3 ms). Reanalysis of passive-cell experimental traces using the model shows no evidence of a CA1-like increase of associational/commissural AMPA receptor conductance with increasing distance from the soma. Synaptic responses mediated by NMDA receptors are best fit with rapid kinetics, suggestive of NR2A subunits as expected in mature animals.
Predictions were made for passive-cell current clamp recordings, combined AMPA and NMDA receptor responses, and local dendritic depolarization in response to unitary stimulations. Models of synaptic responses in active cells suggest altered axial resistivity and the presence of synaptically activated potassium channels in spines. Abstract What is the role of higher-order spike correlations for neuronal information processing? Common data analysis methods to address this question are devised for the application to spike recordings from multiple single neurons. Here, we present a new method which evaluates the subthreshold membrane potential fluctuations of one neuron, and infers higher-order correlations among the neurons that constitute its presynaptic population. This has two important advantages: very large populations of up to several thousands of neurons can be studied, and spike sorting is not required. Moreover, this new approach truly emphasizes the functional aspects of higher-order statistics, since we infer exactly those correlations which are seen by a neuron. Our approach is to represent the subthreshold membrane potential fluctuations as presynaptic activity filtered with a fixed kernel, as would be the case for a leaky integrator neuron model. This allows us to adapt the recently proposed method CuBIC (cumulant-based inference of higher-order correlations from the population spike count; Staude et al., J Comput Neurosci 29(1–2):327–350, 2010c), with which the maximal order of correlation can be inferred. By numerical simulation we show that our new method is reasonably sensitive to weak higher-order correlations, and that only short stretches of membrane potential are required for their reliable inference. Finally, we demonstrate its remarkable robustness against violations of the simplifying assumptions made for its construction, and discuss how it can be employed to analyze in vivo intracellular recordings of membrane potentials.
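The central representation above, a subthreshold membrane potential modeled as the presynaptic population spike count convolved with a fixed exponential kernel, as a leaky integrator would produce, can be sketched directly. The counts and time constant below are synthetic and purely illustrative; the sketch shows only the filtering step, not the CuBIC cumulant test itself:

```python
import numpy as np

def leaky_integrator_trace(pop_counts, tau=10.0, dt=1.0):
    """Filter a presynaptic population spike-count series with an
    exponential kernel exp(-t/tau), as a leaky integrator would."""
    t = np.arange(0.0, 10.0 * tau, dt)          # truncate kernel at 10 tau
    kernel = np.exp(-t / tau)
    # Causal convolution: v[t] = sum over s <= t of counts[s] * kernel[t - s]
    return np.convolve(pop_counts, kernel)[: len(pop_counts)] * dt

rng = np.random.default_rng(0)
counts = rng.poisson(2.0, size=1000)            # synthetic population spike counts
v = leaky_integrator_trace(counts)              # surrogate "membrane potential"
```

Because the filter is linear, the cumulants of the count process survive (scaled) in the filtered trace, which is what makes a cumulant-based test applicable to the membrane potential rather than to sorted spike trains.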
Abstract The precise mapping of how complex patterns of synaptic inputs are integrated into specific patterns of spiking output is an essential step in the characterization of the cellular basis of network dynamics and function. Relative to other principal neurons of the hippocampus, the electrophysiology of CA1 pyramidal cells has been extensively investigated. Yet, the precise input-output relationship is to date unknown even for this neuronal class. CA1 pyramidal neurons receive laminated excitatory inputs from three distinct pathways: recurrent CA1 collaterals on basal dendrites, CA3 Schaffer collaterals, mostly on oblique and proximal apical dendrites, and entorhinal perforant pathway on distal apical dendrites. We implemented detailed computer simulations of pyramidal cell electrophysiology based on three-dimensional anatomical reconstructions and compartmental models of available biophysical properties from the experimental literature. To investigate the effect of synaptic input on axosomatic firing, we stochastically distributed a realistic number of excitatory synapses in each of the three dendritic layers. We then recorded the spiking response to different stimulation patterns. For all dendritic layers, synchronous stimuli resulted in trains of spiking output and a linear relationship between input and output firing frequencies. In contrast, asynchronous stimuli evoked non-bursting spike patterns and the corresponding firing frequency input-output function was logarithmic. The regular/irregular nature of the input synaptic intervals was only reflected in the regularity of output interburst intervals in response to synchronous stimulation, and never affected firing frequency. Synaptic stimulations in the basal and proximal apical trees across individual neuronal morphologies yielded remarkably similar input-output relationships.
Results were also robust with respect to the detailed distributions of dendritic and synaptic conductances within a plausible range constrained by experimental evidence. In contrast, the input-output relationship in response to distal apical stimuli showed dramatic differences from the other dendritic locations as well as among neurons, and was more sensitive to the exact channel densities. Abstract Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats. In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database (http://www.ebi.ac.uk/biomodels/) is aimed at addressing exactly these needs. It is a freely accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats.
BioModels Database also provides features such as online simulation and the extraction of components from large-scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems, and to study the clustering of models based upon their annotations. Model deposition to the database today is advised by several publishers of scientific journals. The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge (https://sourceforge.net/projects/biomodels/) under the GNU General Public License. Abstract How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive? Schema theory is a computational framework that allows the simulation of perceptuomotor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We present first its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions.
We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performances of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performances of agrammatic aphasics measured using sentence-picture matching tasks. This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community. Abstract Background The "inverse" problem is related to the determination of unknown causes on the basis of the observation of their effects. This is the opposite of the corresponding "direct" problem, which relates to the prediction of the effects generated by a complete description of some agencies. The solution of an inverse problem entails the construction of a mathematical model and starts from a set of experimental data. In this respect, inverse problems are often ill-conditioned, as the experimental data available are often insufficient to unambiguously solve the mathematical model. Several approaches to solving inverse problems are possible, both computational and experimental, some of which are mentioned in this article. In this work, we will describe in detail the attempt to solve an inverse problem which arose in the study of an intracellular signaling pathway.
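One computational approach to such inverse problems, and the one reported in the results that follow, is a genetic algorithm: a population of candidate parameter vectors is scored against the data, the best candidates are kept, and mutated copies replace the rest. This toy sketch fits a single decay-rate parameter of dx/dt = -kx to invented noiseless data; the actual model in the study has many mass-action parameters and ran on a parallel platform:

```python
import numpy as np

def simulate(k, t):
    """Toy kinetic model: x(t) = exp(-k t)."""
    return np.exp(-k * t)

def ga_fit(t, data, pop_size=40, n_gen=60, seed=1):
    """Estimate k with a minimal genetic algorithm:
    truncation selection plus Gaussian mutation."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(0.0, 5.0, pop_size)              # candidate k values
    for _ in range(n_gen):
        errors = [np.sum((simulate(k, t) - data) ** 2) for k in pop]
        elite = pop[np.argsort(errors)[: pop_size // 4]]   # keep best 25%
        children = rng.choice(elite, pop_size - len(elite))
        children = np.abs(children + rng.normal(0.0, 0.1, len(children)))
        pop = np.concatenate([elite, children])
    errors = [np.sum((simulate(k, t) - data) ** 2) for k in pop]
    return pop[int(np.argmin(errors))]

t = np.linspace(0.0, 5.0, 50)
data = simulate(1.7, t)          # synthetic "measurements" with true k = 1.7
k_hat = ga_fit(t, data)
```

Running the ensemble from several seeds and keeping only parameters with a small coefficient of variation across solutions, as the study below does, is what separates identifiable parameters from ill-conditioned ones.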
Results Using the Genetic Algorithm to find the suboptimal solution to the optimization problem, we have estimated a set of unknown parameters describing a kinetic model of a signaling pathway in the neuronal cell. The model is composed of mass action ordinary differential equations, where the kinetic parameters describe protein-protein interactions, protein synthesis and degradation. The algorithm has been implemented on a parallel platform. Several potential solutions of the problem have been computed, each solution being a set of model parameters. A subset of parameters has been selected on the basis of their small coefficient of variation across the ensemble of solutions. Conclusion Despite the lack of sufficiently reliable and homogeneous experimental data, the genetic algorithm approach has allowed us to estimate the approximate value of a number of model parameters in a kinetic model of a signaling pathway: these parameters have been assessed to be relevant for the reproduction of the available experimental data. Abstract Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown. Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms.
First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axodendritic feedback on pyramidal cells is only effective in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase the theta/gamma ratio. Moreover, electric fields differentially affect average theta frequency depending on specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. Abstract The brain is extraordinarily complex, containing 10^11 neurons linked with 10^14 connections. We can improve our understanding of individual neurons and neuronal networks by describing their behavior in mathematical and computational models. This chapter provides an introduction to neural modeling, laying the foundation for several basic models and surveying key topics. After some discussion on the motivations of modelers and the uses of neural models, we explore the properties of electrically excitable membranes. We describe in some detail the Hodgkin–Huxley model, the first neural model to describe biophysically the behavior of biological membranes. We explore how this model can be extended to describe a variety of excitable membrane behaviors, including axonal propagation, dendritic processing, and synaptic communication. This chapter also covers mathematical models that replicate basic neural behaviors through more abstract mechanisms.
We briefly explore efforts to extend single-neuron models to the network level and provide several examples of insights gained through this process. Finally, we list common resources, including modeling environments and repositories, that provide the guidance and parameter sets necessary to begin building neural models. Abstract We have developed a program NeuroText to populate the neuroscience databases in SenseLab (http://senselab.med.yale.edu/senselab) by mining the natural language text of neuroscience articles. NeuroText uses a two-step approach to identify relevant articles. The first step (preprocessing), aimed at 100% sensitivity, identifies abstracts containing database keywords. In the second step, potentially relevant abstracts identified in the first step are processed for specificity dictated by database architecture, and neuroscience, lexical and semantic contexts. NeuroText results were presented to the experts for validation using a dynamically generated interface that also allows expert-validated articles to be automatically deposited into the databases. Of the test set of 912 articles, 735 were rejected at the preprocessing step. For the remaining articles, the accuracy of predicting database-relevant articles was 85%. Twenty-two articles were erroneously identified. NeuroText deferred decisions on 29 articles to the expert. A comparison of NeuroText results versus the experts’ analyses revealed that the program failed to correctly identify articles’ relevance due to concepts that did not yet exist in the knowledge base or due to vaguely presented information in the abstracts. NeuroText uses two “evolution” techniques (supervised and unsupervised) that play an important role in the continual improvement of the retrieval results. Software that uses the NeuroText approach can facilitate the creation of curated, special-interest bibliography databases. Abstract Dendrites play an important role in neuronal function and connectivity.
This chapter introduces the first section of the book focusing on the morphological features of dendritic tree structures and the role of dendritic trees in the circuit. We provide an overview of quantitative procedures for data collection, analysis, and modeling of dendrite shape. Our main focus lies on the description of morphological complexity and how one can use this description to unravel neuronal function in dendritic trees and neural circuits. Abstract The chapter is organised in two parts: In the first part, the focus is on a combined power spectral and nonlinear behavioural analysis of a neural mass model of the thalamocortical circuitry. The objective is to study the effectiveness of such “multimodal” analytical techniques in model-based studies investigating the neural correlates of abnormal brain oscillations in Alzheimer’s disease (AD). The power spectral analysis presented here is a study of the “slowing” (decreasing dominant frequency of oscillation) within the alpha frequency band (8–13 Hz), a hallmark of electroencephalogram (EEG) dynamics in AD. Analysis of the nonlinear dynamical behaviour focuses on the bifurcating property of the model. The results show that the alpha rhythmic content is maximal at close proximity to the bifurcation point—an observation made possible by the “multimodal” approach adopted herein. Furthermore, a slowing in alpha rhythm is observed for increasing inhibitory connectivity—a consistent feature of our research into neuropathological oscillations associated with AD. In the second part, we have presented power spectral analysis on a model that implements multiple feedforward and feedback connectivities in the thalamocorticothalamic circuitry, and is thus more advanced in terms of biological plausibility. This study looks at the effects of synaptic connectivity variation on the power spectra within the delta (1–3 Hz), theta (4–7 Hz), alpha (8–13 Hz) and beta (14–30 Hz) bands.
An overall slowing of EEG with decreasing synaptic connectivity is observed, indicated by a decrease of power within the alpha and beta bands and an increase in power within the theta and delta bands. Thus, the model behaviour conforms to longitudinal studies in AD indicating an overall slowing of EEG. Abstract Neuronal processes grow under a variety of constraints, both immediate and evolutionary. Their pattern of growth provides insight into their function. This chapter begins by reviewing morphological metrics used in analyses and computational models. Molecular mechanisms underlying growth and plasticity are then discussed, followed by several types of modeling approaches. Computer simulation of morphology can be used to describe and reproduce the statistics of neuronal types or to evaluate growth and functional hypotheses. For instance, models in which branching is probabilistically determined by diameter produce realistic virtual dendrites of most neuronal types, though more complicated statistical models are required for other types. Virtual dendrites grown under environmental and/or functional constraints are also discussed, offering a broad perspective on dendritic morphology. Abstract Chopper neurons in the cochlear nucleus are characterized by intrinsic oscillations with short average interspike intervals (ISIs) and relative level independence of their response (Pfeiffer, Exp Brain Res 1:220–235, 1966; Blackburn and Sachs, J Neurophysiol 62:1303–1329, 1989), properties which are unattained by models of single chopper neurons (e.g., Rothman and Manis, J Neurophysiol 89:3070–3082, 2003a). In order to achieve short ISIs, we optimized the time constants of the Rothman and Manis single-neuron model with genetic algorithms. Some parameters in the optimization, such as the temperature and the capacitance of the cell, turned out to be crucial for the required acceleration of their response.
In order to achieve the relative level independence, we have simulated an interconnected network consisting of Rothman and Manis neurons. The results indicate that by stabilization of intrinsic oscillations, it is possible to simulate the physiologically observed level independence of ISIs. As previously reviewed and demonstrated (Bahmer and Langner, Biol Cybern 95:371–379, 2006a), chopper neurons show a preference for ISIs which are multiples of 0.4 ms. It was also demonstrated that the network consisting of two optimized Rothman and Manis neurons which activate each other with synaptic delays of 0.4 ms shows a preference for ISIs of 0.8 ms. Oscillations with various multiples of 0.4 ms as ISIs may be derived from neurons in a more complex network that is activated by simultaneous input of an onset neuron and several auditory nerve fibers. Abstract Recently, a class of two-dimensional integrate-and-fire models has been used to faithfully model spiking neurons. This class includes the Izhikevich model, the adaptive exponential integrate-and-fire model, and the quartic integrate-and-fire model. The bifurcation types for the individual neurons have been thoroughly analyzed by Touboul (SIAM J Appl Math 68(4):1045–1079, 2008). However, when the models are coupled together to form networks, the networks can display bifurcations that an uncoupled oscillator cannot. For example, the networks can transition from firing with a constant rate to burst firing. This paper introduces a technique to reduce a full network of this class of neurons to a mean-field model, in the form of a system of switching ordinary differential equations. The reduction uses population density methods and a quasi-steady-state approximation to arrive at the mean-field system. Reduced models are derived for networks with different topologies and different model neurons with biologically derived parameters.
The mean-field equations are able to qualitatively and quantitatively describe the bifurcations that the full networks display. Extensions and higher-order approximations are discussed. Conclusions Our proposed database schema for managing heterogeneous data is a significant departure from conventional approaches. It is suitable only when the following conditions hold:
• The number of classes of entity is numerous, while the number of actual instances in most classes is expected to be very modest.
• The number (and nature) of the axes describing an arbitrary fact (as an N-ary association) varies greatly.
We believe that nervous system data is an appropriate problem domain to test such an approach. Abstract Stereotactic human brain atlases, either in print or electronic form, are useful not only in functional neurosurgery, but also in neuroradiology, human brain mapping, and neuroscience education. The existing atlases represent structures on 2D plates taken at variable, often large intervals, which limit their applications. To overcome this problem, we propose a hybrid interpolation approach to build high-resolution brain atlases from the existing ones. In this approach, all section regions of each object are grouped into two types of components: simple and complex. A NURBS-based method is designed for interpolation of the simple components, and a distance map-based method for the complex components. Once all individual objects in the atlas are interpolated, the results are combined hierarchically in a bottom-up manner to produce the interpolation of the entire atlas. In the procedure, different knowledge-based and heuristic strategies are used to preserve various topological relationships. The proposed approach has been validated quantitatively and used for interpolation of two stereotactic brain atlases: the Talairach-Tournoux atlas and the Schaltenbrand-Wahren atlas.
The interpolations produced are of high resolution and feature high accuracy, 3D consistency, smooth surfaces, and preserved topology. They potentially open new applications for electronic stereotactic brain atlases, such as atlas reformatting, accurate 3D display, and 3D nonlinear warping against normal and pathological scans. The proposed approach is also potentially useful in other applications that require interpolation and 3D modeling from sparse and/or variable inter-section interval data. An example of 3D modeling of an infarct from MR diffusion images is presented. Abstract Quantitative neuroanatomical data are important for the study of many areas of neuroscience, and the complexity of problems associated with neuronal structure requires that research from multiple groups across many disciplines be combined. However, existing neuron-tracing systems, simulation environments, and tools for the visualization and analysis of neuronal morphology data use a variety of data formats, making it difficult to exchange data in a readily usable way. The NeuroML project was initiated to address these issues, and here we describe an extensible markup language standard, MorphML, which defines a common data format for neuronal morphology data and associated metadata to facilitate data and model exchange, database creation, model publication, and data archiving. We describe the elements of the standard in detail and outline the mappings between this format and those used by a number of popular applications for reconstruction, simulation, and visualization of neuronal morphology. Abstract A major part of biology has become a class of physical and mathematical sciences.
We have started to feel, though still a little suspicious, that it will become possible to predict biological events that will happen in the future of one’s life and to control some of them if so desired, based upon an understanding of the genomic information of individuals and the physical and chemical principles governing the physiological functions of living organisms at multiple scales and levels, from molecules to cells and organs. Abstract A half-center oscillator (HCO) is a common circuit building block of central pattern generator networks that produce rhythmic motor patterns in animals. Here we constructed an efficient relational database table with the resulting characteristics of the Hill et al. (J Comput Neurosci 10:281–302, 2001) simple conductance-based HCO model. The model consists of two reciprocally inhibitory neurons and replicates the electrical activity of the oscillator interneurons of the leech heartbeat central pattern generator under a variety of experimental conditions. Our long-range goal is to understand how this basic circuit building block produces functional activity under a variety of parameter regimes and how different parameter regimes influence stability and modulatability. By using the latest developments in computer technology, we simulated and stored large amounts of data (on the order of terabytes). We systematically explored the parameter space of the HCO and corresponding isolated neuron models using a brute-force approach. We varied a set of selected parameters (maximal conductances of intrinsic and synaptic currents) in all combinations, resulting in about 10 million simulations. We classified these HCO and isolated neuron model simulations by their activity characteristics into identifiable groups and quantified their prevalence.
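The brute-force sweep-and-classify workflow described above can be sketched in miniature. The parameter names, the toy "classifier", and the table layout below are illustrative assumptions, not the actual schema or classification criteria used by the authors:

```python
import itertools
import sqlite3

# Hypothetical conductance grid (nS); the real study varied the maximal
# conductances of several intrinsic and synaptic currents over ~10^7 combos.
g_values = [0.0, 0.5, 1.0]
params = ["g_h", "g_CaS", "g_syn"]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sims (g_h REAL, g_CaS REAL, g_syn REAL, activity TEXT)")

def classify(g_h, g_CaS, g_syn):
    # Toy stand-in for classifying a simulation's activity into a group.
    if g_syn == 0.0:
        return "isolated"
    return "bursting" if g_h > 0.0 and g_CaS > 0.0 else "spiking"

for combo in itertools.product(g_values, repeat=len(params)):
    conn.execute("INSERT INTO sims VALUES (?, ?, ?, ?)", (*combo, classify(*combo)))

# Quantify the prevalence of each activity group, as in the study.
for activity, n in conn.execute(
        "SELECT activity, COUNT(*) FROM sims GROUP BY activity"):
    print(activity, n)
```

The point of the relational layout is that prevalence questions like the one in the abstract reduce to simple GROUP BY queries, even when the table holds millions of rows.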
By querying the database, we compared the activity characteristics of the identified groups of our simulated HCO models with those of our simulated isolated neuron models and found that regularly bursting neurons compose only a small minority of functional HCO models; the vast majority was composed of spiking neurons. Abstract This paper describes how an emerging standard neural network modelling language can be used to configure a general-purpose neural multichip system, by describing the process of writing and loading neural network models on the SpiNNaker neuromimetic hardware. It focuses on the implementation of a SpiNNaker module for PyNN, a simulator-independent language for neural network modelling. We successfully extend PyNN to deal with different non-standard (e.g. Izhikevich) cell types, rapidly switch between them, and load applications on parallel hardware by orchestrating the software layers below it, so that they are abstracted away from the final user. Finally, we run some simulations in PyNN and compare them against other simulators, successfully reproducing single neuron and network dynamics and validating the implementation. Abstract The present study examines the biophysical properties and functional implications of Ih in hippocampal area CA3 interneurons with somata in strata radiatum and lacunosum-moleculare. Characterization studies showed a small maximum h-conductance (2.6 ± 0.3 nS, n = 11), shallow voltage dependence with a hyperpolarized half-maximal activation (V1/2 = −91 mV), and kinetics characterized by double-exponential functions. The functional consequences of Ih were examined with regard to temporal summation and impedance measurements. For temporal summation experiments, 5-pulse mossy fiber input trains were activated. Blocking Ih with 50 μM ZD7288 resulted in an increase in temporal summation, suggesting that Ih supports sensitivity of response amplitude to relative input timing.
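Temporal summation of a 5-pulse train like the one above can be illustrated with alpha-function EPSPs. The amplitudes, time constants, and inter-pulse interval are arbitrary illustrative choices, and letting a shorter effective time constant stand in for intact Ih is a caricature, not a conductance model:

```python
import math

def alpha_epsp(t, onset, tau):
    """Alpha-function EPSP, peak-normalized to 1.0 at t = onset + tau."""
    s = t - onset
    return (s / tau) * math.exp(1.0 - s / tau) if s > 0.0 else 0.0

def summation_ratio(tau, interval=20.0, n_pulses=5, dt=0.1, t_end=300.0):
    """Ratio of the largest compound peak to the first-pulse peak."""
    onsets = [i * interval for i in range(n_pulses)]
    trace = [sum(alpha_epsp(i * dt, o, tau) for o in onsets)
             for i in range(int(t_end / dt))]
    first_peak = max(trace[:int(interval / dt)])
    return max(trace) / first_peak

# A short effective time constant (Ih intact) sums less across the train
# than a long one (Ih blocked with ZD7288), qualitatively as reported.
print(summation_ratio(tau=5.0), summation_ratio(tau=15.0))
```

Slower decay leaves more residual depolarization at each subsequent pulse, which is the mechanism behind the increased summation after ZD7288.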
Impedance was assessed by applying sinusoidal current commands. From impedance measurements, we found that Ih did not confer theta-band resonance, but instead flattened the impedance–frequency relations. Double immunolabeling for hyperpolarization-activated cyclic nucleotide-gated proteins and glutamate decarboxylase 67 suggests that all four subunits are present in GABAergic interneurons from the strata considered for electrophysiological studies. Finally, a model of Ih was employed in computational analyses to confirm and elaborate upon the contributions of Ih to impedance and temporal summation. Abstract Modelling and simulation methods are gaining increasing importance for the understanding of biological systems. The growing number of available computational models makes support for the maintenance and retrieval of those models essential to the community. This article discusses which model information is helpful for efficient retrieval and how existing similarity measures and ranking techniques can be used to enhance the retrieval process, i.e. model reuse. With the development of new tools and modelling formalisms, there is also an increasing demand for performing searches independent of the models’ encoding. Therefore, the presented approach is not restricted to certain model storage formats. Instead, the model meta-information is used for retrieval and ranking of the search results. Meta-information includes general information about the model and its encoded species and reactions, but also information about the model behaviour and related simulation experiment descriptions. Abstract To understand the details of brain function, a large-scale system model that reflects anatomical and neurophysiological characteristics needs to be implemented.
Though numerous computational models of different brain areas have been proposed, their integration into a large-scale model has not yet been accomplished, because these models were described in different programming languages and, mostly, used different data formats. This paper introduces a platform for collaborative brain system modeling (PLATO) on which one can construct computational models using several programming languages and connect them at the I/O level with a common data format. As an example, a whole visual system model including eye movement, eye optics, the retinal network, and visual cortex is being developed. Preliminary results demonstrate that the integrated model successfully simulates the signal processing flow at the different stages of the visual system. Abstract Brain rhythms are the most prominent signals measured noninvasively in humans with magneto/electroencephalography (MEG/EEG). MEG/EEG-measured rhythms have been shown to be functionally relevant, and signature changes are used as markers of disease states. Despite the importance of understanding the underlying neural mechanisms creating these rhythms, relatively little is known about their in vivo origin in humans. There are obvious challenges in linking the extracranially measured signals directly to neural activity with invasive studies in humans, and although animal models are well suited for such studies, the connection to human brain function under cognitively relevant tasks is often lacking. Biophysically principled computational neural modeling provides an attractive means to bridge this critical gap. Here, we describe a method for creating a computational neural model capturing the laminar structure of cortical columns, and how this model can be used to make predictions on the cellular- and circuit-level mechanisms of brain oscillations measured with MEG/EEG.
Specifically, we describe how the model can be used to simulate current dipole activity, the common macroscopic signal inferred from MEG/EEG data. We detail the development and application of the model to study the spontaneous somatosensory mu-rhythm, containing mu-alpha (7–14 Hz) and mu-beta (15–29 Hz) components. We describe a novel prediction on the neural origin of the mu-rhythm that accurately reproduces many characteristic features of MEG data and accounts for changes in the rhythm with attention, detection, and healthy aging. While the details of the model are specific to the somatosensory system, the model design and application are based on general principles of cortical circuitry and MEG/EEG physics, and are thus amenable to the study of rhythms in other frequency bands and sensory systems. Abstract GABAergic interneurons in cortical circuits control the activation of principal cells and orchestrate network activity patterns, including oscillations at different frequency ranges. Recruitment of interneurons depends on the integration of convergent synaptic inputs along the dendrosomatic axis; however, dendritic processing in these cells is still poorly understood. In this chapter, we summarise our results on the cable properties, electrotonic structure, and dendritic processing in “basket cells” (BCs; Nörenberg et al. 2010), one of the most prevalent types of cortical interneurons mediating perisomatic inhibition. In order to investigate integrative properties, we performed two-electrode whole-cell patch clamp recordings, visualised and reconstructed the recorded interneurons, and created passive single-cell models with biophysical properties derived from the experiments. Our results indicate that membrane properties, in particular membrane resistivity, are inhomogeneous along the somatodendritic axis of the cell. The derived values and the gradient of membrane resistivity are different from those obtained for excitatory principal cells.
The divergent passive membrane properties of BCs facilitate rapid signalling from proximal basal dendritic inputs but at the same time increase synapse-to-soma transfer for slow signals from the distal apical dendrites. Our results demonstrate that BCs possess distinct integrative properties. Future computational models investigating the diverse functions of neuronal circuits need to consider this diversity and incorporate realistic dendritic properties not only of excitatory principal cells but also of various types of inhibitory interneurons. Abstract New surgical and localization techniques allow for precise and personalized evaluation and treatment of intractable epilepsies. These techniques include the use of subdural and depth electrodes for localization, and the potential use of cell-targeted stimulation using optogenetics as part of treatment. Computer modeling of seizures, also individualized to the patient, will be important in order to make full use of the potential of these new techniques. This is because epilepsy is a complex dynamical disease involving multiple scales across both time and space. These complex dynamics make prediction extremely difficult. Cause and effect are not cleanly separable, as multiple embedded causal loops allow for many scales of unintended consequence. We demonstrate here a small model of sensory neocortex that can be used to look at the effects of microablations or microstimulation. We show that ablations in this network can either prevent spread or prevent occurrence of the seizure. In this example, focal electrical stimulation was not able to terminate a seizure, but selective stimulation of inhibitory cells, a future possibility through the use of optogenetics, was efficacious. Abstract The basal ganglia form a complex network of nuclei often assumed to perform selection, yet their individual roles and how they influence each other are still largely unclear.
In particular, the ties between the external and internal parts of the globus pallidus are paradoxical, as anatomical data suggest a potent inhibitory projection between them, while electrophysiological recordings indicate that they have similar activities. Here we introduce a theoretical study that reconciles both views of the intrapallidal projection by providing a plausible characterization of the relationship between the external and internal globus pallidus. Specifically, we developed a mean-field model of the whole basal ganglia, whose parameterization is optimized to best respect a collection of numerous anatomical and electrophysiological data. We first obtained models respecting all our constraints, hence anatomical and electrophysiological data on the intrapallidal projection are globally consistent. This model furthermore predicts that both aforementioned views of the intrapallidal projection may be reconciled when this projection is weakly inhibitory, thus making it possible to support similar neural activity in both nuclei and for the entire basal ganglia to select between actions. Second, the model predicts that afferent projections are substantially unbalanced towards the external segment, as it receives the strongest excitation from the STN and the weakest inhibition from the striatum. Finally, our study strongly suggests that the intrapallidal connection pattern is not focused but diffuse, as the latter pattern is more efficient for the overall selection performed in the basal ganglia. Abstract Background The information coming from biomedical ontologies and computational pathway models is expanding continuously: research communities keep this process up, and their advances are generally shared by means of dedicated resources published on the web. In fact, such models are shared to provide the characterization of molecular processes, while biomedical ontologies detail a semantic context for the majority of those pathways.
Recent advances in both fields pave the way for a scalable information integration based on aggregate knowledge repositories, but the lack of overall standard formats impedes this progress. Indeed, having different objectives and different abstraction levels, most of these resources "speak" different languages. Semantic web technologies are explored here as a means to address some of these problems. Methods Employing an extensible collection of interpreters, we developed OREMP (Ontology Reasoning Engine for Molecular Pathways), a system that abstracts the information from different resources and combines them together into a coherent ontology. Continuing this effort, we present OREMPdb; once different pathways are fed into OREMP, species are linked to the external ontologies referenced and to the reactions in which they participate. Exploiting these links, the system builds species-sets, which encapsulate species that operate together. Composing all of the reactions together, the system computes all of the reaction paths from and to all of the species-sets. Results OREMP has been applied to the curated branch of BioModels (2011/04/15 release), which overall contains 326 models, 9244 reactions, and 5636 species. OREMPdb is the semantic dictionary created as a result, which is made of 7360 species-sets. For each of these sets, OREMPdb links the original pathway and the original paper where this information first appeared. Abstract Conductance-based neuron models are frequently employed to study the dynamics of biological neural networks. For speed and ease of use, these models are often reduced in morphological complexity. Simplified dendritic branching structures may process inputs differently than full branching structures, however, and could thereby fail to reproduce important aspects of biological neural processing. It is not yet well understood which processing capabilities require detailed branching structures.
Therefore, we analyzed the processing capabilities of fully or partially branched reduced models. These models were created by collapsing the dendritic tree of a full morphological model of a globus pallidus (GP) neuron while preserving its total surface area and electrotonic length, as well as its passive and active parameters. Dendritic trees were either collapsed into single cables (unbranched models) or the full complement of branch points was preserved (branched models). Both reduction strategies allowed us to compare dynamics between all models using the same channel density settings. Full model responses to somatic inputs were generally preserved by both types of reduced model, while dendritic input responses could be more closely preserved by branched than by unbranched reduced models. However, features strongly influenced by local dendritic input resistance, such as active dendritic sodium spike generation and propagation, could not be accurately reproduced by any reduced model. Based on our analyses, we suggest that there are intrinsic differences in processing capabilities between unbranched and branched models. We also indicate suitable applications for different levels of reduction, including fast searches of full model parameter space. Summary Processing text from the scientific literature has become a necessity due to the burgeoning amounts of information that are fast becoming available, stemming from advances in electronic information technology. We created a program, NeuroText (http://senselab.med.yale.edu/textmine/neurotext.pl), designed specifically to extract information relevant to neuroscience-specific databases, NeuronDB and CellPropDB (http://senselab.med.yale.edu/senselab/), housed at the Yale University School of Medicine. NeuroText extracts relevant information from the neuroscience literature in a two-step process: each step parses text at a different level of granularity.
NeuroText uses an expert-mediated knowledge base and combines the techniques of indexing, contextual parsing, semantic and lexical parsing, and supervised and unsupervised learning to extract information. The constraints, metadata elements, and rules for information extraction are stored in the knowledge base. NeuroText was created as a pilot project to process 3 years of publications in the Journal of Neuroscience and was subsequently tested on 40,000 PubMed abstracts. We also present here a template to create a domain-nonspecific knowledge base that, when linked to a text-processing tool like NeuroText, can be used to extract knowledge in other fields of research. Abstract Background We present a software tool called SENB, which allows the geometric and biophysical neuronal properties in a simple computational model of a Hodgkin–Huxley (HH) axon to be changed. The aim of this work is to develop a didactic and easy-to-use computational tool in the NEURON simulation environment, which allows graphical visualization of both the passive and active conduction parameters and the geometric characteristics of a cylindrical axon with HH properties. Results The SENB software offers several advantages for teaching and learning electrophysiology. First, SENB offers ease and flexibility in determining the number of stimuli. Second, SENB allows immediate and simultaneous visualization, in the same window and time frame, of the evolution of the electrophysiological variables. Third, SENB calculates parameters such as time and space constants, stimulus frequency, cellular area and volume, sodium and potassium equilibrium potentials, and propagation velocity of the action potentials. Furthermore, it allows the user to see all this information immediately in the main window. Finally, with just one click SENB can save an image of the main window as evidence.
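Two of the quantities SENB reports, the ionic equilibrium potentials and the passive space and time constants, follow from textbook formulas. The concentrations, temperature, and cable parameters below are illustrative squid-axon-like values, not numbers taken from SENB itself:

```python
import math

def nernst_mV(conc_out, conc_in, z=1, T=279.45):
    """Nernst equilibrium potential in mV (T in kelvin, ~6.3 C here)."""
    R, F = 8.314, 96485.0
    return 1000.0 * (R * T) / (z * F) * math.log(conc_out / conc_in)

def cable_constants(Rm=1000.0, Ra=35.4, Cm=1.0, diam_um=500.0):
    """Passive space constant (cm) and time constant (ms).
    Rm: ohm*cm^2, Ra: ohm*cm, Cm: uF/cm^2, diameter in microns."""
    a = diam_um * 1e-4 / 2.0              # radius in cm
    lam = math.sqrt(Rm * a / (2.0 * Ra))  # lambda = sqrt(Rm*a / (2*Ra))
    tau = Rm * Cm * 1e-3                  # tau = Rm*Cm, converted to ms
    return lam, tau

# Approximate squid-axon concentrations (mM), for illustration only.
E_Na = nernst_mV(440.0, 50.0)   # roughly +52 mV
E_K = nernst_mV(20.0, 400.0)    # roughly -72 mV
lam, tau = cable_constants()
print(E_Na, E_K, lam, tau)
```

With these values the space constant comes out near 0.6 cm and the time constant is exactly Rm*Cm = 1 ms, the kind of derived quantities the tool displays alongside the simulated traces.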
Conclusions The SENB software is didactic and versatile, and can be used to improve and facilitate the teaching and learning of the underlying mechanisms in the electrical activity of an axon using the biophysical properties of the squid giant axon. Abstract Grid cells (GCs) in the medial entorhinal cortex (mEC) have the property of having their firing activity spatially tuned to a regular triangular lattice. Several theoretical models for grid field formation have been proposed, but most assume that place cells (PCs) are a product of the grid cell system. There is, however, an alternative possibility that is supported by various strands of experimental data. Here we present a novel model for the emergence of grid-like firing patterns that stands on two key hypotheses: (1) spatial information in GCs is provided by PC activity, and (2) grid fields result from a combined synaptic plasticity mechanism involving inhibitory and excitatory neurons mediating the connections between PCs and GCs. Depending on the spatial location, each PC can contribute excitatory or inhibitory inputs to GC activity. The nature and magnitude of the PC input are a function of the distance to the place field center, which is inferred from rate decoding. A biologically plausible learning rule drives the evolution of the connection strengths from PCs to a GC. In this model, PCs compete for GC activation, and the plasticity rule favors efficient packing of the space representation. This leads to grid-like firing patterns. In a new environment, GCs continuously recruit new PCs to cover the entire space. The model described here makes important predictions and can represent the feedforward connections from hippocampal area CA1 to deeper mEC layers. Abstract Because of its highly branched dendrite, the Purkinje neuron requires significant computational resources if coupled electrical and biochemical activity are to be simulated.
To address this challenge, we developed a scheme for reducing the geometric complexity while preserving the essential features of activity in both the soma and a remote dendritic spine. We merged our previously published biochemical model of calcium dynamics and lipid signaling in the Purkinje neuron, developed in the Virtual Cell modeling and simulation environment, with an electrophysiological model based on a Purkinje neuron model available in NEURON. A novel reduction method was applied to the Purkinje neuron geometry to obtain a model with fewer compartments that is tractable in Virtual Cell. Most of the dendritic tree was subject to reduction, but we retained the neuron’s explicit electrical and geometric features along a specified path from spine to soma. Further, unlike previous simplification methods, the dendrites that branch off along the preserved explicit path are retained as reduced branches. We conserved axial resistivity and adjusted passive properties and active channel conductances for the reduction in surface area, and cytosolic calcium for the reduction in volume. Rallpacks are used to validate the reduction algorithm and show that it can be generalized to other complex neuronal geometries. For the Purkinje cell, we found that current injections at the soma were able to produce similar trains of action potentials and membrane potential propagation in the full and reduced models in NEURON; the reduced model produces identical spiking patterns in NEURON and Virtual Cell. Importantly, our reduced model can simulate communication between the soma and a distal spine; an alpha function applied at the spine to represent synaptic stimulation gave similar results in the full and reduced models for potential changes associated with both the spine and the soma. Finally, we combined phosphoinositol signaling and electrophysiology in the reduced model in Virtual Cell.
Thus, a strategy has been developed to combine electrophysiology and biochemistry as a step toward merging neuronal and systems biology modeling. Abstract The advent of techniques with the ability to scan massive changes in cellular makeup (genomics, proteomics, etc.) has revealed the compelling need for analytical methods to interpret and make sense of those changes. Computational models built on a sound physicochemical and mechanistic basis are unavoidable when integrating, interpreting, and simulating high-throughput experimental data. Another powerful role of computational models is predicting new behavior, provided they are adequately validated. Mitochondrial energy transduction has traditionally been studied with thermodynamic models. More recently, kinetic or thermokinetic models have been proposed, leading the way toward an understanding of the control and regulation of mitochondrial energy metabolism and its interaction with cytoplasmic and other compartments. In this work, we outline, step by step, the methods that should be followed to build a computational model of mitochondrial energetics in isolation or integrated into a network of cellular processes. Depending on the question addressed by the modeler, the methodology explained herein can be applied with different levels of detail, from the mitochondrial energy-producing machinery in a network of cellular processes to the dynamics of a single enzyme during its catalytic cycle. Abstract The voltage and time dependence of ion channels can be regulated, notably by phosphorylation, interaction with phospholipids, and binding to auxiliary subunits. Many parameter variation studies have set conductance densities free while leaving kinetic channel properties fixed, as the experimental constraints on the latter are usually better than those on the former.
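The distinction drawn above, between conductance densities and kinetic channel properties, can be made concrete with the standard Boltzmann steady-state activation curve. The half-activation voltage and slope used here are arbitrary illustrative values, not those of any channel in the study:

```python
import math

def m_inf(V, V_half, k):
    """Steady-state activation of a voltage-gated channel (Boltzmann)."""
    return 1.0 / (1.0 + math.exp(-(V - V_half) / k))

def conductance(V, g_max, V_half, k):
    """Effective conductance: the density parameter g_max scales the
    curve, while the kinetic parameters V_half and k shape it."""
    return g_max * m_inf(V, V_half, k)

# Freeing only g_max rescales the curve; freeing V_half shifts it along
# the voltage axis -- a change that no g_max value can reproduce.
g_dense = conductance(-40.0, g_max=2.0, V_half=-35.0, k=5.0)
g_shift = conductance(-40.0, g_max=1.0, V_half=-45.0, k=5.0)
print(g_dense, g_shift)
```

This is why freeing kinetic parameters such as V_half can improve fits that density parameters alone cannot: the two kinds of parameter act on the voltage dependence in fundamentally different ways.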
Because individual cells can tightly regulate their ion channel properties, we suggest that kinetic parameters may be profitably set free during model optimization in order to both improve matches to data and refine kinetic parameters. To this end, we analyzed the parameter optimization of reduced models of three electrophysiologically characterized and morphologically reconstructed globus pallidus neurons. We performed two automated searches with different types of free parameters. First, conductance density parameters were set free. Even the best resulting models exhibited unavoidable problems that were due to limitations in our channel kinetics. We next set channel kinetics free for the optimized density matches and obtained significantly improved model performance. Some kinetic parameters consistently shifted to similar new values in multiple runs across the three models, suggesting the possibility of tailored improvements to channel models. These results suggest that optimized channel kinetics can improve model matches to experimental voltage traces, particularly for channels characterized under different experimental conditions than the recorded data to be matched by the model. The resulting shifts in channel kinetics from the original template provide valuable guidance for future experimental efforts to determine the detailed kinetics of channel isoforms and possible modulated states in particular types of neurons. Abstract Electrical synapses continuously transfer signals bidirectionally from one cell to another, directly or indirectly via intermediate cells. Electrical synapses are common in many brain structures, such as the inferior olive, the subcoeruleus nucleus, and the neocortex, between neurons and between glial cells. In the cortex, interneurons have been shown to be electrically coupled and have been proposed to participate in large, continuous cortical syncytia, as opposed to smaller spatial domains of electrically coupled cells.
However, to explore the significance of these findings it is imperative to map the electrical synaptic microcircuits, in analogy with in vitro studies on monosynaptic and disynaptic chemical coupling. Since “walking” from cell to cell over large distances with a glass pipette is challenging, microinjection of (fluorescent) dyes diffusing through gap junctions remains so far the only method available to decipher such microcircuits, even though technical limitations exist. Based on circuit theory, we derive analytical descriptions of the AC electrical coupling in networks of isopotential cells. We then suggest an operative electrophysiological protocol to distinguish between direct electrical connections and connections involving one or more intermediate cells. This method allows inferring the number of intermediate cells, generalizing the conventional coupling coefficient, which provides limited information. We validate our method through computer simulations, theoretical and numerical methods, and electrophysiological paired recordings. Abstract Because electrical coupling among the neurons of the brain is much faster than chemical synaptic coupling, it is natural to hypothesize that gap junctions may play a crucial role in the mechanisms underlying very fast oscillations (VFOs), i.e., oscillations at more than 80 Hz. There is now a substantial body of experimental and modeling literature supporting this hypothesis. A series of modeling papers, starting with work by Roger Traub and collaborators, have suggested that VFOs may arise from expanding waves propagating through an “axonal plexus”, a large random network of electrically coupled axons. Traub et al. also proposed a cellular automaton (CA) model to study the mechanisms of VFOs in the axonal plexus. In this model, the expanding waves take the appearance of topologically circular “target patterns”. Random external stimuli initiate each wave. We therefore call this kind of VFO “externally driven”.
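A toy version of such a cellular automaton can be sketched as follows. The network size, coupling degree, and refractory length are arbitrary choices, and propagation is made perfectly reliable; the sketch only shows how a single external stimulus launches an expanding wave through a random network of electrically coupled axons:

```python
import random

random.seed(1)
N = 200                 # number of axons in the "plexus"
REFRACTORY = 3          # steps an axon stays unexcitable after firing

# Random plexus: each axon is electrically coupled to a few others.
neighbours = [random.sample(range(N), 4) for _ in range(N)]

state = [0] * N         # 0 = resting, 1 = firing, <0 = refractory
state[0] = 1            # a random external stimulus initiates the wave

fired_total = 0
for _ in range(30):
    nxt = list(state)
    for i in range(N):
        if state[i] == 1:                 # a firing axon excites neighbours
            fired_total += 1
            nxt[i] = -REFRACTORY          # enter the refractory period
            for j in neighbours[i]:
                if state[j] == 0 and nxt[j] == 0:
                    nxt[j] = 1            # reliable propagation, no failures
        elif state[i] < 0:
            nxt[i] = state[i] + 1         # recover toward rest
    state = nxt

print(fired_total, "firing events")
```

Making propagation probabilistic instead of reliable is the knob that, in the full model, separates the externally driven regime from the reentrant one.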
Using a computational model, we show that an axonal plexus can also exhibit a second, distinctly different kind of VFO over a wide parameter range. These VFOs arise from activity propagating around cycles in the network. Once triggered, they persist without any source of excitation. With idealized, regular connectivity, they take the appearance of spiral waves. We call these VFOs “reentrant”. The behavior of the axonal plexus depends on the reliability with which action potentials propagate from one axon to the next, which, in turn, depends on the somatic membrane potential Vs and the gap junction conductance ggj. To study these dependencies, we impose a fixed value of Vs, then study the effects of varying Vs and ggj. Not surprisingly, propagation becomes more reliable with rising Vs and ggj. Externally driven VFOs occur when Vs and ggj are so high that propagation never fails. For lower Vs or ggj, propagation is nearly reliable, but fails in rare circumstances. Surprisingly, the parameter regime where this occurs is fairly large. Even a single propagation failure can trigger reentrant VFOs in this regime. Lowering Vs and ggj further, one finds a third parameter regime in which propagation is unreliable and no VFOs arise. We analyze these three parameter regimes by means of computations using model networks adapted from Traub et al., as well as much smaller model networks. Abstract Research with barn owls suggested that sound source location is represented topographically in the brain by an array of neurons, each tuned to a narrow range of locations. However, research with small-headed mammals has offered an alternative view, in which location is represented by the balance of activity in two opponent channels broadly tuned to the left and right auditory space. Both channels may be present in each auditory cortex, although the channel representing contralateral space may be dominant.
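The opponent-channel account can be sketched with two broadly tuned channels whose balance of activity codes azimuth. The sigmoidal tuning curves and their slope are illustrative assumptions, not the fitted model from the EEG study:

```python
import math

def channel(azimuth_deg, preferred_sign, slope=0.02):
    """Broadly tuned hemifield channel modelled as a sigmoid of azimuth;
    preferred_sign is +1 for the right channel, -1 for the left."""
    return 1.0 / (1.0 + math.exp(-slope * preferred_sign * azimuth_deg))

def decode(azimuth_deg, slope=0.02):
    """Read location back out of the balance of the two channels."""
    balance = channel(azimuth_deg, +1, slope) - channel(azimuth_deg, -1, slope)
    # For this logistic pair, balance == tanh(slope * azimuth / 2), so
    # the balance uniquely (if compressively) encodes azimuth:
    return 2.0 * math.atanh(balance) / slope

for az in (-60.0, 0.0, 60.0):
    print(az, round(decode(az), 6))
```

Because the balance flattens at lateral azimuths, a fixed amount of neural noise corresponds to a larger angular error there, which is the model's analogue of the decrease in spatial acuity with increasing reference azimuth mentioned below.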
Recent studies have suggested that opponent channel coding of space may also apply in humans, although these studies have used a restricted set of spatial cues or probed a restricted set of spatial locations, and there have been contradictory reports as to the relative dominance of the ipsilateral and contralateral channels in each cortex. The current study used electroencephalography (EEG) in conjunction with sound field stimulus presentation to address these issues and to inform the development of an explicit computational model of human sound source localization. Neural responses were compatible with the opponent channel account of sound source localization and with contralateral channel dominance in the left, but not the right, auditory cortex. A computational opponent channel model reproduced every important aspect of the EEG data and allowed inferences about the width of tuning in the spatial channels. Moreover, the model predicted the oft-reported decrease in spatial acuity measured psychophysically with increasing reference azimuth. Predictions of spatial acuity closely matched those measured psychophysically by previous authors. Abstract Calretinin is thought to be the main endogenous calcium buffer in cerebellar granule cells (GrCs). However, little is known about the impact of cooperative Ca2+ binding to calretinin on highly localized and more global (regional) Ca2+ signals in these cells. Using numerical simulations, we show that an essential property of calretinin is delayed equilibration with Ca2+. Therefore, the amount of Ca2+ that calretinin can accumulate relative to equilibrium levels depends on stimulus conditions. 
Based on our simulations of buffered Ca2+ diffusion near a single Ca2+ channel or a large cluster of Ca2+ channels, and on previous experimental findings that 150 μM 1,2-bis(o-aminophenoxy)ethane-N,N,N′,N′-tetraacetic acid (BAPTA) and endogenous calretinin have similar effects on GrC excitability, we estimated the concentration of mobile calretinin in GrCs to be in the range of 0.7–1.2 mM. Our results suggest that this estimate can provide a starting point for further analysis. We find that calretinin prominently reduces the action-potential-associated increase in cytosolic free Ca2+ concentration ([Ca2+]i) even at a distance of 30 nm from a single Ca2+ channel. In spite of a buildup of residual Ca2+, it maintains almost constant maximal [Ca2+]i levels during repetitive channel openings at frequencies below 80 Hz. This occurs because of accelerated Ca2+ binding as calretinin binds more Ca2+. Unlike the buffering of high Ca2+ levels within Ca2+ nano/microdomains sensed by large-conductance Ca2+-activated K+ channels, the buffering of regional Ca2+ signals by calretinin cannot be mimicked by any single concentration of BAPTA across different experimental conditions. Abstract The field of Computational Systems Neurobiology is maturing quickly. If it is to fulfil its central role in the new Integrative Neurobiology, the reuse of quantitative models needs to be facilitated. The community has to develop standards and guidelines in order to maximise the diffusion of its scientific production, but also to render it more trustworthy. In recent years, various projects have tackled the problems of the syntax and semantics of quantitative models. More recently, the international initiative BioModels.net launched three projects: (1) MIRIAM is a standard for curating and annotating models, in order to facilitate their reuse. 
(2) The Systems Biology Ontology is a set of controlled vocabularies intended to be used in conjunction with models, in order to characterise their components. (3) BioModels Database is a resource that allows biologists to store, search and retrieve published mathematical models of biological interest. We expect that these resources, together with the use of formal languages such as SBML, will support the fruitful exchange and reuse of quantitative models. Abstract Understanding the direction and quantity of information flowing in neuronal networks is a fundamental problem in neuroscience. Brains and neuronal networks must at the same time store information about the world and react to information in the world. We sought to measure how the activity of the network alters information flow from inputs to output patterns. Using neocortical column neuronal network simulations, we demonstrated that networks with greater internal connectivity reduced input/output correlations from excitatory synapses and decreased negative correlations from inhibitory synapses, as measured by Kendall’s τ correlation. Both of these changes were associated with a reduction in information flow, as measured by normalized transfer entropy (nTE). Information handling by the network reflected the degree of internal connectivity. With no internal connectivity, the feedforward network transformed inputs through nonlinear summation and thresholding. With greater connectivity strength, the recurrent network translated activity and information owing to the contribution of activity from intrinsic network dynamics. This dynamic contribution amounts to added information drawn from that stored in the network. At still higher internal synaptic strength, the network corrupted the external information, producing a state in which little external information came through. 
The association of increased information retrieved from the network with increased gamma power supports the notion of gamma oscillations playing a role in information processing. Abstract Intracellular Ca2+ concentrations play a crucial role in the physiological interaction between Ca2+ channels and Ca2+-activated K+ channels. The commonly used model, a Ca2+ pool with a short relaxation time, fails to simulate interactions occurring at multiple time scales. On the other hand, detailed computational models including various Ca2+ buffers and pumps can incur a large computational cost due to radial diffusion in large compartments, which may be undesirable when simulating morphologically detailed Purkinje cell models. We present a method using a compensating mechanism to replace radial diffusion, and compare the dynamics of different Ca2+ buffering models during the generation of a dendritic Ca2+ spike in a single-compartment model of a PC dendritic segment with Ca2+ channels of P- and T-type and Ca2+-activated K+ channels of BK- and SK-type. The Ca2+ dynamics models used are (1) a single Ca2+ pool; (2) two Ca2+ pools, for the fast and slow transients respectively; (3) detailed Ca2+ dynamics with buffers, pump, and diffusion; and (4) detailed Ca2+ dynamics with buffers, pump, and diffusion compensation. Our results show that detailed Ca2+ dynamics models have significantly better control over Ca2+-activated K+ channels and lead to physiologically more realistic simulations of Ca2+ spikes and bursting. Furthermore, the compensating mechanism largely eliminates the effect of removing diffusion from the model on Ca2+ dynamics over multiple time scales. Abstract This paper describes the capabilities of DISCO, an extensible approach that supports integrative Web-based information dissemination. 
DISCO is a component of the Neuroscience Information Framework (NIF), an NIH Neuroscience Blueprint initiative that facilitates integrated access to diverse neuroscience resources via the Internet. DISCO facilitates the automated maintenance of several distinct capabilities using a collection of files 1) that are maintained locally by the developers of participating neuroscience resources and 2) that are “harvested” on a regular basis by a central DISCO server. This approach allows central NIF capabilities to be updated as each resource’s content changes over time. DISCO currently supports the following capabilities: 1) resource descriptions, 2) “LinkOut” to a resource’s data items from NCBI Entrez resources such as PubMed, 3) Web-based interoperation with a resource, 4) sharing a resource’s lexicon and ontology, 5) sharing a resource’s database schema, and 6) participation by the resource in neuroscience-related RSS news dissemination. The developers of a resource are free to choose which DISCO capabilities their resource will participate in. Although DISCO is used by NIF to facilitate neuroscience data integration, its capabilities have general applicability to other areas of research. Abstract Spiking neural network simulations incorporating variable transmission delays require synaptic events to be scheduled prior to delivery. Conventional methods have memory requirements that scale with the total number of synapses in a network. We introduce novel scheduling algorithms for both discrete and continuous event delivery, where the memory requirement scales instead with the number of neurons. Superior algorithmic performance is demonstrated using large-scale benchmarking network simulations. 
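The neuron-scaling idea in the scheduling abstract above can be illustrated with a minimal sketch, assuming discrete time steps and a known maximum delay: each neuron keeps one small circular buffer of accumulator slots, so memory grows with the number of neurons times the maximum delay rather than with the total number of synapses. The class and method names are hypothetical, not the paper's actual algorithm.

```python
class NeuronEventQueue:
    """Per-neuron circular delay buffer for discrete synaptic event delivery.

    Hypothetical sketch: memory is O(max_delay) per neuron, independent of
    how many synapses target the neuron, because weights due at the same
    future step are summed into a single slot.
    """

    def __init__(self, max_delay_steps):
        # one accumulator slot per possible pending time step
        self.ring = [0.0] * (max_delay_steps + 1)
        self.now = 0  # index of the current time step's slot

    def schedule(self, weight, delay_steps):
        # accumulate the weight in the slot for (now + delay_steps);
        # no per-synapse event objects are ever stored
        assert 0 < delay_steps < len(self.ring)
        slot = (self.now + delay_steps) % len(self.ring)
        self.ring[slot] += weight

    def advance(self):
        # return the summed synaptic input due at the current step,
        # clear the slot for reuse, and move to the next step
        due = self.ring[self.now]
        self.ring[self.now] = 0.0
        self.now = (self.now + 1) % len(self.ring)
        return due
```

Two spikes arriving via different synapses with the same delay collapse into one slot, which is what decouples the memory footprint from the synapse count.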
More extreme swings of the South Pacific convergence zone due to greenhouse warming Nature The South Pacific convergence zone (SPCZ) is the Southern Hemisphere’s most expansive and persistent rain band, extending from the equatorial western Pacific Ocean southeastward towards French Polynesia. Owing to its strong rainfall gradient, a small displacement in the position of the SPCZ causes drastic changes to hydroclimatic conditions and the frequency of extreme weather events—such as droughts, floods and tropical cyclones—experienced by vulnerable island countries in the region. The SPCZ position varies from its climatological mean location with the El Niño/Southern Oscillation (ENSO), moving a few degrees northward during moderate El Niño events and southward during La Niña events. During strong El Niño events, however, the SPCZ undergoes an extreme swing—by up to ten degrees of latitude toward the Equator—and collapses to a more zonally oriented structure with commensurately severe weather impacts. Understanding changes in the characteristics of the SPCZ in a changing climate is therefore of broad scientific and socioeconomic interest. Here we present climate modelling evidence for a near doubling in the occurrences of zonal SPCZ events between the periods 1891–1990 and 1991–2090 in response to greenhouse warming, even in the absence of a consensus on how ENSO will change. We estimate the increase in zonal SPCZ events from an aggregation of the climate models in the Coupled Model Intercomparison Project phases 3 and 5 (CMIP3 and CMIP5) multi-model database that are able to simulate such events. The change is caused by a projected enhanced equatorial warming in the Pacific and may lead to more frequent occurrences of extreme events across the Pacific island nations most affected by zonal SPCZ events. 
Activated boron nitride as an effective adsorbent for metal ions and organic pollutants Scientific Reports We report novel activated boron nitride (BN) as an effective adsorbent for pollutants in water and air. The activated BN was synthesized by a simple structure-directed method that enabled us to control the surface area, pore volume, crystal defects and surface groups. The obtained BN exhibits an ultra-high surface area of 2078 m2/g, a large pore volume of 1.66 cm3/g and a distinctive multimodal microporous/mesoporous structure with pore sizes centered at ~1.3, ~2.7, and ~3.9 nm. More importantly, the novel activated BN exhibits excellent adsorption performance for various metal ions (Cr3+, Co2+, Ni2+, Ce3+, Pb2+) and organic pollutants (tetracycline, methyl orange and Congo red) in water, as well as volatile organic compounds (benzene) in air. The excellent reusability of the activated BN has also been confirmed. All these features render the activated BN a promising material for environmental remediation. Neurofitter: a parameter tuning package for a wide range of electrophysiological neuron models. Frontiers in neuroinformatics The increase in available computational power and the higher quality of experimental recordings have turned the tuning of neuron model parameters into a problem that can be solved by automatic global optimization algorithms. Neurofitter is a software tool that interfaces existing neural simulation software and sophisticated optimization algorithms with a new way to compute the error measure. This error measure represents how well a given parameter set is able to reproduce the experimental data. It is based on the phase-plane trajectory density method, which is insensitive to small phase differences between model and data. 
Neurofitter enables the effortless combination of many different time-dependent data traces into the error measure, allowing the neuroscientist to focus on the essential properties of the model. We show results obtained by applying Neurofitter to a simple single-compartmental model and a complex multi-compartmental Purkinje cell (PC) model. These examples show that the method is able to solve a variety of tuning problems and demonstrate details of its practical application. Sensory feedback, error correction, and remapping in a multiple oscillator model of place-cell activity. Frontiers in computational neuroscience Mammals navigate by integrating self-motion signals ("path integration") and occasionally fixing on familiar environmental landmarks. The rat hippocampus is a model system of spatial representation in which place cells are thought to integrate both sensory and spatial information from entorhinal cortex. The localized firing fields of hippocampal place cells and entorhinal grid cells demonstrate a phase relationship with the local theta (6-10 Hz) rhythm that may be a temporal signature of path integration. However, encoding self-motion in the phase of theta oscillations requires high temporal precision and is susceptible to idiothetic noise, neuronal variability, and a changing environment. We present a model based on oscillatory interference theory, previously studied in the context of grid cells, in which transient temporal synchronization among a pool of path-integrating theta oscillators produces hippocampal-like place fields. We hypothesize that a spatiotemporally extended sensory interaction with external cues modulates feedback to the theta oscillators. We implement a form of this cue-driven feedback and show that it can retrieve fixed points in the phase code of position. A single cue can smoothly reset oscillator phases to correct for both systematic errors and continuous noise in path integration. 
Further, simulations in which local and global cues are rotated against each other reveal a phase-code mechanism in which conflicting cue arrangements can reproduce experimentally observed distributions of "partial remapping" responses. This abstract model demonstrates that phase-code feedback can provide stability to the temporal coding of position during navigation and may contribute to the context-dependence of hippocampal spatial representations. While the anatomical substrates of these processes have not been fully characterized, our findings suggest several signatures that can be evaluated in future experiments.
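The oscillatory interference principle invoked in the place-cell abstract above can be illustrated with a minimal sketch (assumed constants and function names, not the authors' implementation): a velocity-controlled oscillator (VCO) runs faster than a baseline theta oscillator in proportion to the animal's velocity along a preferred direction, so the VCO's phase relative to theta accumulates in proportion to displacement, and the interference envelope of the two oscillators peaks at spatially periodic locations.

```python
import math

THETA_HZ = 8.0   # assumed baseline theta frequency (Hz)
BETA = 0.05      # assumed phase gain: cycles per cm of displacement

def vco_phase(path, pref_dir, dt=0.001):
    """Integrate the phase of a velocity-controlled oscillator.

    path: sequence of (vx, vy) velocities in cm/s, one sample per dt seconds.
    The VCO frequency is THETA_HZ plus a term proportional to the velocity
    component along pref_dir (radians), so its phase relative to a pure
    theta oscillator encodes displacement along that direction.
    """
    phase = 0.0
    for vx, vy in path:
        v_along = vx * math.cos(pref_dir) + vy * math.sin(pref_dir)
        phase += 2.0 * math.pi * (THETA_HZ + BETA * v_along) * dt
    return phase

def interference(path, pref_dir, dt=0.001):
    """Interference envelope of the VCO against baseline theta.

    Returns +1 when the accumulated displacement along pref_dir is a
    whole number of phase cycles (1/BETA cm apart), -1 halfway between.
    """
    theta_phase = 2.0 * math.pi * THETA_HZ * dt * len(path)
    return math.cos(vco_phase(path, pref_dir, dt) - theta_phase)
```

With BETA = 0.05 cycles/cm, running 20 cm along the preferred direction advances the relative phase by exactly one cycle, so the envelope returns to its peak; a 10 cm run lands at the trough. Summing several such envelopes with different preferred directions is the usual route from this one-dimensional sketch to grid- or place-like firing fields.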