
    GeNN: a code generation framework for accelerated brain simulations

    Large-scale numerical simulations of detailed brain circuit models are important for identifying hypotheses on brain functions and testing their consistency and plausibility. An ongoing challenge for simulating realistic models, however, is computational speed. In this paper, we present the GeNN (GPU-enhanced Neuronal Networks) framework, which aims to facilitate the use of graphics accelerators for computational models of large-scale neuronal networks to address this challenge. GeNN is an open-source library that generates code to accelerate the execution of network simulations on NVIDIA GPUs, through a flexible and extensible interface that does not require in-depth technical knowledge from the users. We present performance benchmarks showing that a 200-fold speedup compared to a single CPU core can be achieved for a network of one million conductance-based Hodgkin-Huxley neurons, although the speedup differs for other models. GeNN is available for Linux, Mac OS X and Windows platforms. The source code, user manual, tutorials, Wiki, in-depth example projects and all other related information can be found on the project website http://genn-team.github.io/genn/
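
    To make the code-generation idea concrete, the sketch below is a deliberately simplified, hypothetical illustration in Python: a neuron model is described as data and a plain C update loop is emitted from it. It is not GeNN's actual interface (GeNN's C++/CUDA API is documented on the project website), and the LIF_MODEL description and emit_update function are invented for this example.

        # Hypothetical illustration of the code-generation idea behind GeNN-style
        # tools. This is NOT GeNN's API; LIF_MODEL and emit_update are invented.
        LIF_MODEL = {
            "name": "LIF",
            "params": {"tau_m": 20.0, "v_rest": -65.0, "v_thresh": -50.0, "v_reset": -65.0},
            "state": ["V"],
            # Update rule written as a C expression template (hypothetical format)
            "update": "V += dt * ((v_rest - V) + I_syn) / tau_m;",
            "threshold": "V >= v_thresh",
            "reset": "V = v_reset;",
        }

        def emit_update(model, n_neurons, dt):
            """Emit a plain C update loop for one population (simplified, CPU-only)."""
            param_defs = "\n".join(
                f"    const float {k} = {v}f;" for k, v in model["params"].items()
            )
            return f"""
        void update_{model['name']}(float *V, const float *I_syn, int *spiked) {{
        {param_defs}
            const float dt = {dt}f;
            for (int i = 0; i < {n_neurons}; ++i) {{
                float V_i = V[i];
                float I = I_syn[i];
                // dynamics, threshold and reset substituted from the model description
                {model['update'].replace('V', 'V_i').replace('I_syn', 'I')}
                spiked[i] = ({model['threshold'].replace('V', 'V_i')}) ? 1 : 0;
                if (spiked[i]) {{ {model['reset'].replace('V', 'V_i')} }}
                V[i] = V_i;
            }}
        }}"""

        if __name__ == "__main__":
            print(emit_update(LIF_MODEL, n_neurons=1_000_000, dt=0.1))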

    SpikingLab: modelling agents controlled by Spiking Neural Networks in Netlogo

    The scientific interest attracted by Spiking Neural Networks (SNNs) has led to the development of tools for the simulation and study of neuronal dynamics, ranging from phenomenological models to the more sophisticated and biologically accurate Hodgkin-Huxley-based and multi-compartmental models. However, despite the multiple features offered by neural modelling tools, their integration with environments for the simulation of robots and agents can be challenging and time-consuming. The implementation of artificial neural circuits to control robots generally involves the following tasks: (1) understanding the simulation tools, (2) creating the neural circuit in the neural simulator, (3) linking the simulated neural circuit with the environment of the agent and (4) programming the appropriate interface in the robot or agent to use the neural controller. Accomplishing these tasks can be challenging, especially for undergraduate students or novice researchers. This paper presents an alternative tool that facilitates the simulation of simple SNN circuits using the multi-agent simulation and programming environment Netlogo (educational software that simplifies the study of, and experimentation with, complex systems). The engine proposed and implemented in Netlogo for the simulation of a functional SNN model is a simplification of integrate-and-fire (I&F) models. The characteristics of the engine (including neuronal dynamics, STDP learning and synaptic delay) are demonstrated through the implementation of an agent representing an artificial insect controlled by a simple neural circuit. The setup of the experiment and its outcomes are described in this work.
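
    As a rough, self-contained illustration of the ingredients named above (an integrate-and-fire update, STDP and a plastic weight), here is a minimal Python sketch. It is not SpikingLab's NetLogo code; the function names, parameter values and the toy spike-pairing scheme are all illustrative assumptions.

        import numpy as np

        # Generic sketch of a leaky integrate-and-fire (I&F) update plus pair-based
        # STDP. Not SpikingLab's NetLogo code; names and values are illustrative.

        def lif_step(v, i_syn, dt=1.0, tau_m=20.0, v_rest=-65.0, v_thresh=-50.0, v_reset=-65.0):
            """One Euler step of a leaky integrate-and-fire neuron; returns (v, spiked)."""
            v = v + dt * ((v_rest - v) + i_syn) / tau_m
            if v >= v_thresh:
                return v_reset, True
            return v, False

        def stdp_update(w, dt_spike, a_plus=0.01, a_minus=0.012, tau=20.0, w_max=1.0):
            """Pair-based STDP: potentiate if pre precedes post (dt_spike > 0), else depress."""
            if dt_spike > 0:
                w += a_plus * np.exp(-dt_spike / tau)
            else:
                w -= a_minus * np.exp(dt_spike / tau)
            return float(np.clip(w, 0.0, w_max))

        # Toy run: a constant drive scaled by one plastic weight (a deliberately
        # crude stand-in for synaptic input), with a regular presynaptic spike train.
        v, w = -65.0, 0.5
        last_pre, last_post = -np.inf, -np.inf
        for t in range(200):                     # ms
            if t % 17 == 0:                      # presynaptic spike (arbitrary period)
                last_pre = t
            v, spiked = lif_step(v, i_syn=40.0 * w)
            if spiked:
                last_post = t
                w = stdp_update(w, last_post - last_pre)
        print("final weight:", round(w, 3))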

    Disentangling astroglial physiology with a realistic cell model in silico

    Electrically non-excitable astroglia take up neurotransmitters, buffer extracellular K+ and generate Ca2+ signals that release molecular regulators of neural circuitry. The underlying machinery remains enigmatic, mainly because the sponge-like astrocyte morphology has been difficult to access experimentally or explore theoretically. Here, we systematically incorporate multi-scale, three-dimensional astroglial architecture into a realistic multi-compartmental cell model, which we constrain by empirical tests and integrate into the NEURON computational biophysical environment. This approach is implemented as ASTRO, a flexible astrocyte-model builder. As a proof of concept, we explore an in silico astrocyte to evaluate basic cell physiology features that are inaccessible experimentally. Our simulations suggest that currents generated by glutamate transporters or K+ channels have negligible distant effects on membrane voltage and that individual astrocytes can successfully handle extracellular K+ hotspots. We show how intracellular Ca2+ buffers affect Ca2+ waves and why the classical Ca2+ sparks-and-puffs mechanism is theoretically compatible with common readouts of astroglial Ca2+ imaging.
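
    As a toy illustration of what a multi-compartmental model is (independent of ASTRO or NEURON, whose actual interfaces are not reproduced here), the following Python sketch discretizes a passive cable into coupled compartments and lets a steady current injected at one end set up a voltage profile. All parameter values are arbitrary.

        import numpy as np

        # Toy multi-compartmental model: a passive cable discretized into N coupled
        # compartments, stepped with forward Euler. Not ASTRO or NEURON code; all
        # parameter values are arbitrary.

        N, dt, steps = 50, 0.025, 4000          # compartments, time step (ms), steps (100 ms total)
        c_m, g_leak, e_leak = 1.0, 0.1, -80.0   # uF/cm^2, mS/cm^2, mV
        g_axial = 2.0                           # coupling conductance between neighbours

        v = np.full(N, e_leak)
        i_inj = np.zeros(N)
        i_inj[0] = 5.0                          # steady current into the first compartment

        for _ in range(steps):
            # axial current from each neighbour (sealed ends: copy the edge values)
            v_left = np.concatenate(([v[0]], v[:-1]))
            v_right = np.concatenate((v[1:], [v[-1]]))
            i_axial = g_axial * (v_left - 2 * v + v_right)
            v = v + dt * (g_leak * (e_leak - v) + i_axial + i_inj) / c_m

        print("voltage profile after 100 ms (mV):", np.round(v[::10], 1))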

    A Comprehensive Workflow for General-Purpose Neural Modeling with Highly Configurable Neuromorphic Hardware Systems

    In this paper we present a methodological framework that meets novel requirements emerging from upcoming types of accelerated and highly configurable neuromorphic hardware systems. We describe in detail a device with 45 million programmable and dynamic synapses that is currently under development, and we sketch the conceptual challenges that arise from taking this platform into operation. More specifically, we aim at establishing this neuromorphic system as a flexible and neuroscientifically valuable modeling tool that can be used by non-hardware-experts. We consider various functional aspects to be crucial for this purpose, and we introduce a consistent workflow with detailed descriptions of all involved modules that implement the suggested steps: the integration of the hardware interface into the simulator-independent model description language PyNN; a fully automated translation between the PyNN domain and appropriate hardware configurations; an executable specification of the future neuromorphic system that can be seamlessly integrated into this biology-to-hardware mapping process as a test bench for all software layers and possible hardware design modifications; and an evaluation scheme that deploys models from a dedicated benchmark library, compares the results generated by virtual or prototype hardware devices with reference software simulations and analyzes the differences. The integration of these components into one hardware-software workflow provides an ecosystem for ongoing preparative studies that support the hardware design process and represents the basis for the maturity of the model-to-hardware mapping software. The functionality and flexibility of the latter are demonstrated with a variety of experimental results.
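
    PyNN itself is the simulator-independent description language referred to above. A minimal PyNN script of the kind such a workflow would translate might look like the sketch below; the NEST backend and all parameter values are assumptions made only for concreteness, and the same description could in principle be mapped to other software or hardware back-ends.

        # Minimal PyNN sketch of a simulator-independent model description.
        # The NEST backend is assumed here only for concreteness; the point of PyNN
        # is that the same script can target other software simulators or, via the
        # workflow described above, neuromorphic hardware. Parameters are placeholders.
        import pyNN.nest as sim

        sim.setup(timestep=0.1)  # ms

        stimulus = sim.Population(100, sim.SpikeSourcePoisson(rate=20.0))
        neurons = sim.Population(100, sim.IF_cond_exp(tau_m=20.0, v_thresh=-50.0))
        neurons.record("spikes")

        sim.Projection(
            stimulus, neurons,
            connector=sim.FixedProbabilityConnector(0.1),
            synapse_type=sim.StaticSynapse(weight=0.002, delay=1.0),
            receptor_type="excitatory",
        )

        sim.run(1000.0)  # ms

        spiketrains = neurons.get_data("spikes").segments[0].spiketrains
        print("mean rate (Hz):", sum(len(st) for st in spiketrains) / len(spiketrains))
        sim.end()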

    The Eyes Have It: Sex and Sexual Orientation Differences in Pupil Dilation Patterns

    Recent research suggests profound sex and sexual orientation differences in sexual response. These results, however, are based on measures of genital arousal, which have potential limitations such as volunteer bias and differential measures for the sexes. The present study introduces a measure less affected by these limitations. We assessed the pupil dilation of 325 men and women of various sexual orientations to male and female erotic stimuli. Results supported our hypotheses: in general, self-reported sexual orientation corresponded with pupil dilation to men and women. Among men, substantial dilation to both sexes was most common in bisexual-identified men. In contrast, among women, substantial dilation to both sexes was most common in heterosexual-identified women. Possible reasons for these differences are discussed. Because the measure of pupil dilation is less invasive than previous measures of sexual response, it allows for studying diverse age and cultural populations usually not included in sexuality research.

    The what and where of adding channel noise to the Hodgkin-Huxley equations

    One of the most celebrated successes in computational biology is the Hodgkin-Huxley framework for modeling electrically active cells. This framework, expressed through a set of differential equations, synthesizes the impact of ionic currents on a cell's voltage -- and the highly nonlinear impact of that voltage back on the currents themselves -- into the rapid push and pull of the action potential. Later studies confirmed that these cellular dynamics are orchestrated by individual ion channels, whose conformational changes regulate the conductance of each ionic current. Thus, kinetic equations familiar from physical chemistry are the natural setting for describing conductances; for small-to-moderate numbers of channels, these will predict fluctuations in conductances and stochasticity in the resulting action potentials. At first glance, the kinetic equations provide a far more complex (and higher-dimensional) description than the original Hodgkin-Huxley equations. This has prompted more than a decade of efforts to capture channel fluctuations with noise terms added to the Hodgkin-Huxley equations. Many of these approaches, while intuitively appealing, produce quantitative errors when compared to kinetic equations; others, as only very recently demonstrated, are both accurate and relatively simple. We review what works, what doesn't, and why, seeking to build a bridge to well-established results for the deterministic Hodgkin-Huxley equations. As such, we hope that this review will speed emerging studies of how channel noise modulates electrophysiological dynamics and function. We supply user-friendly Matlab simulation code of these stochastic versions of the Hodgkin-Huxley equations on the ModelDB website (accession number 138950) and at http://www.amath.washington.edu/~etsb/tutorials.html.
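
    As one concrete example of "noise terms added to the Hodgkin-Huxley equations", the sketch below applies an Euler-Maruyama step with Gaussian (Fox-Lu-style) noise on the gating variables, scaled by assumed channel counts. It is an illustration of the general idea, not the authors' Matlab code from ModelDB, and the channel counts and injected current are arbitrary.

        import numpy as np

        # Euler-Maruyama sketch of one approximate channel-noise scheme: Gaussian
        # (Fox-Lu-style) noise on the Hodgkin-Huxley gating variables, scaled by the
        # channel counts N_Na and N_K. Illustration only, not the authors' ModelDB
        # code; channel counts and injected current are arbitrary.

        def a_m(v): return 0.1 * (v + 40.0) / (1.0 - np.exp(-(v + 40.0) / 10.0))
        def b_m(v): return 4.0 * np.exp(-(v + 65.0) / 18.0)
        def a_h(v): return 0.07 * np.exp(-(v + 65.0) / 20.0)
        def b_h(v): return 1.0 / (1.0 + np.exp(-(v + 35.0) / 10.0))
        def a_n(v): return 0.01 * (v + 55.0) / (1.0 - np.exp(-(v + 55.0) / 10.0))
        def b_n(v): return 0.125 * np.exp(-(v + 65.0) / 80.0)

        C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3      # uF/cm^2, mS/cm^2
        ENa, EK, EL = 50.0, -77.0, -54.4            # mV
        N_Na, N_K = 6000, 1800                      # channel counts (arbitrary)
        dt, T, I_app = 0.01, 200.0, 7.0             # ms, ms, uA/cm^2

        rng = np.random.default_rng(0)

        def noisy_gate(x, a, b, N):
            """One Euler-Maruyama step of a gating variable; noise shrinks as 1/sqrt(N)."""
            drift = a * (1.0 - x) - b * x
            sigma = np.sqrt((a * (1.0 - x) + b * x) / N)
            return float(np.clip(x + dt * drift + np.sqrt(dt) * sigma * rng.standard_normal(), 0.0, 1.0))

        v, m, h, n = -65.0, 0.05, 0.6, 0.32
        spikes, above = 0, False
        for _ in range(int(T / dt)):
            I_ion = gNa * m**3 * h * (v - ENa) + gK * n**4 * (v - EK) + gL * (v - EL)
            v += dt * (I_app - I_ion) / C           # deterministic membrane equation
            m = noisy_gate(m, a_m(v), b_m(v), N_Na)
            h = noisy_gate(h, a_h(v), b_h(v), N_Na)
            n = noisy_gate(n, a_n(v), b_n(v), N_K)
            if v > 0.0 and not above:               # crude spike detection at 0 mV crossing
                spikes += 1
            above = v > 0.0
        print("spikes in", T, "ms:", spikes)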

    Intrinsic gain modulation and adaptive neural coding

    In many cases, the computation of a neural system can be reduced to a receptive field, or a set of linear filters, and a thresholding function, or gain curve, which determines the firing probability; this is known as a linear/nonlinear model. In some forms of sensory adaptation, these linear filters and gain curve adjust very rapidly to changes in the variance of a randomly varying driving input. An apparently similar but previously unrelated issue is the observation of gain control by background noise in cortical neurons: the slope of the firing rate vs. current (f-I) curve changes with the variance of background random input. Here, we show a direct correspondence between these two observations by relating variance-dependent changes in the gain of f-I curves to characteristics of the changing empirical linear/nonlinear model obtained by sampling. For the case in which the underlying system is fixed, we derive relationships between the change of the gain with respect to both mean and variance and the receptive fields obtained from reverse correlation on a white-noise stimulus. Using two conductance-based model neurons that display distinct gain modulation properties through a simple change in parameters, we show that the coding properties of both models quantitatively satisfy the predicted relationships. Our results describe how both variance-dependent gain modulation and adaptive neural computation result from intrinsic nonlinearity.
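
    The linear/nonlinear characterization mentioned above can be illustrated with a small simulation: drive a known LN "neuron" with white noise, recover the linear filter by reverse correlation (the spike-triggered average), and estimate the gain curve as the spike probability conditioned on the filtered stimulus. The filter shape, nonlinearity and all parameters below are illustrative assumptions, not values from the paper.

        import numpy as np

        # LN-model illustration: simulate a known linear filter + sigmoidal gain curve
        # driven by white noise, then recover the filter by reverse correlation (the
        # spike-triggered average, STA) and the gain curve by a histogram ratio.
        # All parameters are illustrative, not taken from the paper.

        rng = np.random.default_rng(1)
        T, L = 200_000, 40                          # samples, filter length (time steps)
        stim = rng.standard_normal(T)               # white-noise stimulus

        t = np.arange(L)
        true_filter = (t / 5.0) * np.exp(-t / 5.0)  # alpha-function kernel (illustrative)
        true_filter /= np.linalg.norm(true_filter)

        drive = np.convolve(stim, true_filter, mode="full")[:T]          # linear stage
        p_spike = 0.1 / (1.0 + np.exp(-(drive - 1.0) / 0.3))             # gain curve
        spikes = rng.random(T) < p_spike                                 # Bernoulli spiking

        # Reverse correlation: average the stimulus segment preceding each spike.
        spike_times = np.nonzero(spikes)[0]
        spike_times = spike_times[spike_times >= L]
        sta = np.mean([stim[s - L + 1:s + 1][::-1] for s in spike_times], axis=0)
        sta /= np.linalg.norm(sta)
        print("filter recovery (corr. with true filter):",
              round(float(np.corrcoef(sta, true_filter)[0, 1]), 3))

        # Gain curve estimate: P(spike | filtered stimulus), via a histogram ratio.
        proj = np.convolve(stim, sta, mode="full")[:T]
        bins = np.linspace(-3.0, 3.0, 13)
        p_all, _ = np.histogram(proj, bins)
        p_spk, _ = np.histogram(proj[spikes], bins)
        print("estimated gain curve:", np.round(p_spk / np.maximum(p_all, 1), 3))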

    Central synapses release a resource-efficient amount of glutamate.

    Why synapses release a certain amount of neurotransmitter is poorly understood. We combined patch-clamp electrophysiology with computer simulations to estimate how much glutamate is discharged at two distinct central synapses of the rat. We found that, regardless of some uncertainty over the synaptic microenvironment, synapses generate the maximal current per released glutamate molecule while maximizing signal information content. Our result suggests that synapses operate on a principle of resource optimization.

    Cannabinoid-mediated short-term plasticity in hippocampus

    Endocannabinoids modulate both excitatory and inhibitory neurotransmission in the hippocampus via activation of pre-synaptic cannabinoid receptors. Here, we present a model for cannabinoid-mediated short-term depression of excitation (DSE), based on our recently developed model for the equivalent phenomenon of suppression of inhibition (DSI). Furthermore, we derive a simplified formulation of the calcium-mediated endocannabinoid synthesis that underlies short-term modulation of neurotransmission in the hippocampus. The simplified model describes cannabinoid-mediated short-term modulation of both hippocampal inhibition and excitation and is ideally suited for large network studies. Moreover, the implementation of the simplified DSI/DSE model provides predictions on how both phenomena are modulated by the magnitude of the pre-synaptic cell's activity. In addition, we demonstrate the role of DSE in shaping the post-synaptic cell's firing behaviour, both qualitatively and quantitatively, as a function of eCB availability and the pre-synaptic cell's activity. Finally, we explore under which conditions the combination of DSI and DSE can temporarily shift the fine balance between excitation and inhibition. This highlights a mechanism by which eCBs might act in a neuro-protective manner during high neural activity.
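
    As a generic caricature of retrograde, activity-dependent suppression (not the calcium-to-eCB cascade derived in the paper), the sketch below lets postsynaptic activity drive a slow "eCB" variable that transiently scales down presynaptic release and then recovers. All names and values are illustrative assumptions.

        import numpy as np

        # Generic caricature of retrograde, activity-dependent suppression (DSI/DSE-like):
        # postsynaptic activity drives a slow "eCB" variable that transiently scales
        # down presynaptic release and then recovers. Not the calcium-to-eCB cascade
        # derived in the paper; names and values are illustrative.

        dt, T = 1.0, 3000.0                     # ms
        tau_ecb, k_ecb = 500.0, 0.02            # eCB decay time (ms) and production gain

        time = np.arange(0.0, T, dt)
        post_rate = np.where((time > 500) & (time < 1000), 50.0, 2.0)   # Hz: burst from 0.5 to 1 s

        ecb, release = 0.0, []
        for r in post_rate:
            ecb += dt * (-ecb / tau_ecb + k_ecb * r / 1000.0)           # rate converted to spikes/ms
            release.append(1.0 / (1.0 + ecb))                           # relative release probability

        release = np.array(release)
        print("release near the end of the burst:", round(float(release[time == 900][0]), 2))
        print("release 1 s after the burst      :", round(float(release[time == 2000][0]), 2))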

    Spatially distributed dendritic resonance selectively filters synaptic input

    An important task performed by a neuron is the selection of relevant inputs from among the thousands of synapses impinging on its dendritic tree. Synaptic plasticity enables this by strengthening a subset of synapses that are, presumably, functionally relevant to the neuron. A different selection mechanism exploits the resonance of the dendritic membranes to preferentially filter synaptic inputs based on their temporal rates. A widely held view is that a neuron has one resonant frequency and thus can pass through one rate. Here we demonstrate, through mathematical analyses and numerical simulations, that dendritic resonance is inevitably a spatially distributed property: the resonance frequency therefore varies along the dendrites, endowing neurons with a powerful spatiotemporal selection mechanism that is sensitive to both the dendritic location and the temporal structure of the incoming synaptic inputs.
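
    For intuition about membrane resonance in a single compartment (the paper's contribution is the spatially distributed case, which is not reproduced here), the sketch below computes the impedance magnitude of a quasi-active patch, leak plus capacitance plus one slow restorative current, and shows how varying the slow current's time constant shifts the resonance peak. All parameter values are illustrative.

        import numpy as np

        # Single-compartment caricature of subthreshold resonance: a quasi-active patch
        # (leak + capacitance + one slow restorative current) has an impedance |Z(f)|
        # that peaks at a preferred frequency. Varying the slow current's time constant
        # below stands in for position-dependent properties along a dendrite; the
        # paper's analysis treats the full spatially distributed cable. Values are
        # illustrative.

        C, g_L, g_w = 1.0, 0.1, 0.3             # uF/cm^2, mS/cm^2, mS/cm^2

        def impedance(f_hz, tau_w):
            """|Z| of the linearized patch; f in Hz, tau_w (slow-current time constant) in ms."""
            w = 2.0 * np.pi * f_hz / 1000.0                       # rad/ms
            Y = g_L + 1j * w * C + g_w / (1.0 + 1j * w * tau_w)   # total admittance
            return np.abs(1.0 / Y)

        freqs = np.linspace(0.2, 40.0, 2000)                      # Hz
        for tau_w in (50.0, 100.0, 200.0):
            z = impedance(freqs, tau_w)
            print(f"tau_w = {tau_w:5.0f} ms  ->  resonance peak near {freqs[np.argmax(z)]:4.1f} Hz")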