
Framework for modeling spiking neural networks on high-performance graphics processors

Posted on: 2011-09-07
Degree: Ph.D
Type: Thesis
University: University of California, Irvine
Candidate: Moorkanikara Nageswaran, Jayram
Full Text: PDF
GTID: 2448390002964312
Subject: Biology
Abstract/Summary:
Spiking neural network (SNN) models are emerging as a plausible paradigm for characterizing neural dynamics in the cerebral cortex. Traditionally, these SNN models have been simulated on large-scale clusters, supercomputers, or dedicated VLSI architectures. Alternatively, Graphics Processing Units (GPUs) can provide a low-cost, programmable, and high-performance computing platform for simulating SNNs. This thesis proposes a systematic framework for modeling and simulating biologically realistic, large-scale spiking neural networks on high-performance graphics processors.

The first part of the framework is a high-level specification for quickly building arbitrary, large-scale spiking neural networks for different applications. The specification includes features to capture the properties of biologically realistic neurons and synaptic plasticity, different types of connection topologies between neuronal groups, and techniques to probe and record the network state. The high-level SNN specification is converted to a sparse adjacency matrix representation and mapped onto the GPU. We further present a collection of new techniques for parallelism extraction, mapping of irregular communication, and compact adjacency matrix representation for effective simulation of SNNs on GPUs. These optimizations enable real-time, GPU-accelerated simulation of SNN models with 100K neurons and 10 million synaptic connections.

Another challenging problem faced by computational neuroscientists is the selection and tuning of parameter values that keep the network in a stable firing regime. The problem is exacerbated by increasingly complex network models that exhibit non-linear dynamics. The last part of the framework proposes an evolutionary approach for automated parameter tuning in spiking neural networks. The evolutionary approach generates a population of SNNs with different parameter values and simulates them; at the end of each simulation, a user-specified fitness condition is evaluated to determine the effectiveness of each member of the population. Using CPU-based evolutionary tuning, SNN models can be tuned at least 10x faster than a full parameter sweep for networks of 1,000 neurons with 5 parameters. Further speedup is obtained by GPU-accelerated simulation and fitness evaluation of the entire SNN population: the GPU-based evolutionary tuning technique is shown to be 6x to 20x faster than CPU-based evolutionary parameter tuning for networks of various sizes.

We applied the complete framework to several spike-based computation applications. In one application, a 128x128-pixel spike-based neuromorphic sensor was interfaced to a spiking neural network running on a GPU for real-time, convolution-based feature extraction. The work described in this thesis should be useful both to computational neuroscientists, who can use it for large-scale SNN simulation, and to computer scientists, who can gain insight into the high-performance computing challenges of simulating brain-inspired models.
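The abstract describes the compact adjacency-matrix mapping only at a high level. As a minimal sketch, assuming a compressed sparse row (CSR) layout and a one-block-per-fired-neuron work assignment (neither of which is spelled out in the abstract), the following CUDA kernel illustrates how spikes could be propagated through such a representation; all identifiers (propagate_spikes, row_ptr, col_idx, weight, current) are hypothetical and are not taken from the thesis.

// Illustrative sketch, not the thesis implementation: a minimal CUDA kernel that
// delivers spikes through a network stored in compressed sparse row (CSR) form.
// All names (propagate_spikes, row_ptr, col_idx, weight, current) are hypothetical.
#include <cuda_runtime.h>

__global__ void propagate_spikes(const int   *fired,      // ids of neurons that spiked this time step
                                 int          num_fired,  // number of spiking neurons
                                 const int   *row_ptr,    // CSR row pointers, length num_neurons + 1
                                 const int   *col_idx,    // CSR column indices: post-synaptic neuron ids
                                 const float *weight,     // synaptic weights, aligned with col_idx
                                 float       *current)    // per-neuron input current, accumulated here
{
    // One thread block per fired neuron; the block's threads stride over its synapse list.
    int f = blockIdx.x;
    if (f >= num_fired) return;

    int pre   = fired[f];
    int begin = row_ptr[pre];
    int end   = row_ptr[pre + 1];

    for (int s = begin + threadIdx.x; s < end; s += blockDim.x) {
        // atomicAdd: several pre-synaptic neurons may target the same post-synaptic neuron.
        atomicAdd(&current[col_idx[s]], weight[s]);
    }
}

// Host-side launch (sketch): one block per fired neuron, 128 threads per block.
//   propagate_spikes<<<num_fired, 128>>>(d_fired, num_fired, d_row_ptr, d_col_idx, d_weight, d_current);

Assigning a thread block to each fired neuron is one simple way to handle the irregular fan-out of spiking connectivity; the thesis's actual parallelism-extraction and communication-mapping techniques are more involved than this sketch.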
Keywords/Search Tags:Spiking neural, SNN, Neural network, Models, High-performance, Simulation, Framework, Graphics