Research On Novel Computational Models And Learning Algorithms Of Spiking Neural Networks

Posted on: 2024-07-16
Degree: Doctor
Type: Dissertation
Country: China
Candidate: X L Luo
Full Text: PDF
GTID: 1528307079950709
Subject: Computer Science and Technology
Abstract/Summary:
Drawing on the hierarchical structure of information processing in the brain, deep learning (DL) has achieved great breakthroughs and wide adoption in intelligent applications. However, existing DL models require vast computing resources and cannot be deployed at scale in power- and cost-sensitive settings, and they remain far from human intelligence. Brain-inspired spiking neural networks (SNNs) offer a new way of thinking about these problems. SNNs represent information with discrete binary spikes; they have rich neurodynamic characteristics, powerful spatio-temporal information processing ability, and ultra-low power consumption and latency when running on neuromorphic chips, which together give them enormous potential for innovation. However, because of the complex spatio-temporal dependencies and the non-differentiability of spike firing, SNNs still lack efficient network models and learning algorithms, which limits their practical performance. This dissertation therefore carries out research on both aspects; the main innovations are as follows:

(1) To address the low efficiency of sequence learning in feedforward SNNs, a membrane-voltage-driven sequence learning algorithm, FE-Learn (First Error Learning), is proposed. Unlike general sequence learning algorithms, which must reproduce the desired spike train exactly, FE-Learn only requires a spike to be fired within a small window around each desired spike; this relaxation of the target output greatly improves robustness. In each learning round, FE-Learn adjusts the membrane voltage only at the first erroneous spike, so that the neuron quickly converges to the desired spike train. FE-Learn also incorporates synaptic delay plasticity into parameter learning, improving learning under sparse input conditions, and its extension to multi-layer networks broadens its range of practical applications. Experiments show that FE-Learn achieves higher learning efficiency and better robustness than other sequence learning algorithms.

(2) For the Temporal Credit-Assignment (TCA) problem, which has long been studied in neuroscience and machine learning, an efficient threshold-driven plasticity (ETDP) algorithm for SNNs is proposed. ETDP uses the sum of the desired output spike counts corresponding to multiple effective clues as the label, and updates the weights through the critical threshold in the spike-threshold mapping relationship. Experiments show that ETDP can detect and bridge the gap between the occurrence of sensory signals and the arrival of feedback signals, assign the aggregate label to valid clues embedded in background activity, and thus handle the TCA problem effectively. In addition, ETDP adopts a strategy to prevent gradient explosion, and experimental results show that it achieves higher learning efficiency than other threshold-driven aggregate-label learning algorithms.

(3) For the problem of memorizing spatio-temporal sequences with variable time intervals, an SNN model inspired by biological neural columns, named CSTM (Column-based Spatial-Temporal Memory), is proposed. Most existing sequence memory models consider only the order of sequence items, ignoring the time intervals between them, so they cannot handle the variable-interval sequences found in reality. Unlike mainstream SNN models, CSTM arranges multiple neurons into columns, with each column responding to one input stimulus; synaptic connections between neurons of different columns chain the sequence items together, while synaptic transmission delays encode the inter-item time intervals. Through this structure, CSTM realizes associative memory of spatio-temporal sequences with variable time intervals. Experimental results show that after a single pass over the input, CSTM can memorize sequences containing thousands of items.

(4) To achieve fast real-time memorization of large numbers of sequences, an SNN-based online human-like sequential memory model (HLSM) is proposed. The brain can remember everyday experiences in real time and flexibly update memories through forgetting, whereas most current memory models rely solely on machine learning methods and artificial intelligence techniques and lack any simulation of the brain's memory mechanisms. HLSM is built on a column structure with a sparse distribution of information across column, neuron, and time scales, which greatly increases its memory capacity. HLSM also introduces neural oscillations to control the rhythm of sequence memory and to enable online learning and prediction on streaming data. In addition, HLSM dynamically manages stored memories through a forgetting mechanism. Experiments show that, compared with other memory models, HLSM achieves higher memory efficiency and memory capacity, and adapts quickly to changes in streaming data.
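To make the first-error idea in (1) concrete, the following minimal sketch pairs a discrete-time leaky integrate-and-fire (LIF) neuron with a one-error-per-round weight update. This is an illustration of the general mechanism only, not the dissertation's FE-Learn implementation: the function names (`lif_forward`, `first_error_update`) and all parameters (`v_th`, `tau`, `lr`, `window`) are hypothetical, and synaptic delay plasticity and multi-layer extension are omitted.

```python
import numpy as np

def lif_forward(weights, inputs, v_th=1.0, tau=0.9):
    """Simulate a discrete-time LIF neuron.

    inputs: array of shape (n_synapses, T); returns (spike times, voltages).
    """
    T = inputs.shape[1]
    v = 0.0
    spikes, voltages = [], []
    for t in range(T):
        v = tau * v + weights @ inputs[:, t]  # leaky integration of input current
        voltages.append(v)
        if v >= v_th:
            spikes.append(t)
            v = 0.0  # reset membrane voltage after firing
    return spikes, np.array(voltages)

def first_error_update(weights, inputs, desired, lr=0.05, window=1,
                       v_th=1.0, tau=0.9):
    """One learning round in the first-error spirit: locate the earliest
    mismatch between actual and desired spikes (outside a tolerance window)
    and nudge the weights so the membrane voltage moves toward / away from
    threshold at that time. Returns (weights, converged)."""
    actual, _ = lif_forward(weights, inputs, v_th, tau)
    # A desired spike with no actual spike nearby -> potentiate at that time.
    for t_d in desired:
        if not any(abs(t_a - t_d) <= window for t_a in actual):
            weights = weights + lr * inputs[:, t_d]
            return weights, False
    # An actual spike with no desired spike nearby -> depress at that time.
    for t_a in actual:
        if not any(abs(t_a - t_d) <= window for t_d in desired):
            weights = weights - lr * inputs[:, t_a]
            return weights, False
    return weights, True  # every spike is matched within the window
```

Note how the tolerance `window` implements the relaxed target described in the abstract: a spike anywhere within the window around a desired spike counts as correct, so only genuinely erroneous spikes trigger an update.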
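The delay-based chaining idea behind CSTM in (3) can likewise be sketched in miniature: one "column" per symbol, with a directed link between consecutive items that stores the observed inter-item interval, standing in for a synaptic transmission delay. This toy single-pass model is an assumption-laden illustration, not CSTM itself; it uses one unit per symbol (real columns contain multiple neurons, which among other things handle repeated symbols), and the class and method names are invented for this sketch.

```python
class DelaySequenceMemory:
    """Toy column-style sequence memory: links between consecutive items
    store the observed time interval, mimicking a synaptic delay. A single
    pass over the input is enough to store the sequence."""

    def __init__(self):
        self.links = {}  # symbol -> (next_symbol, delay)

    def learn(self, sequence):
        """sequence: list of (symbol, timestamp) pairs; one-shot learning."""
        for (a, ta), (b, tb) in zip(sequence, sequence[1:]):
            self.links[a] = (b, tb - ta)

    def recall(self, start_symbol, start_time=0):
        """Replay the stored sequence from start_symbol, reproducing the
        original inter-item intervals (shifted to start_time)."""
        out = [(start_symbol, start_time)]
        sym, t = start_symbol, start_time
        while sym in self.links and len(out) < 10_000:  # guard against cycles
            sym, delay = self.links[sym]
            t += delay
            out.append((sym, t))
        return out
```

Recall can begin from any stored item, so the structure supports the associative, interval-preserving replay that the abstract attributes to the column-based design.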
Keywords/Search Tags: Spiking Neural Networks, Sequential Learning, Aggregate-Label Learning, Sequential Memory Model, Sequential Prediction Model