
Research On Memristor Forgetting Effect Modeling And Memristive Spiking Neural Networks

Posted on: 2020-05-08
Degree: Doctor
Type: Dissertation
Country: China
Candidate: E R Zhou
Full Text: PDF
GTID: 1488306548992069
Subject: Electronic Science and Technology
Abstract/Summary:
With the development of the intelligent society and the popularity of the Internet of Things, the conventional von Neumann architecture cannot meet the needs of high-performance, low-power computing under the constraints of the memory wall and power consumption. To compensate for its shortcomings, researchers have proposed a new model of in-memory computing, including neuromorphic computing and logic computing. The emergence of memristors further enhances the feasibility of in-memory computing, and memristor-based in-memory computing is widely researched. Memristive spiking neural networks, a kind of memristor-based neuromorphic computing, have the advantages of biological plausibility and ultra-low power consumption and are now broadly studied. However, this research is still in its infancy: device fabrication and modeling, circuit design, and architecture design all need to be studied in depth. This dissertation addresses two problems in the study of memristive spiking neural networks: existing memristor models cannot describe the forgetting effect well, and existing architectures are hardware-unfriendly. The main work and contributions of this dissertation are as follows:

(1) We propose a new forgetting-effect memristor (FEM) model and, based on it, a general forgetting-effect describing method (FEDM). We analyze the shortcomings of existing models and set the values of the window function at the boundaries to a specific value in the FEM model, thus avoiding the boundary problems of existing models. The FEM model is then developed into a general method that uses the combined result of dopant drift and dopant diffusion to determine the value of the window function and the change rate of the inner state. The FEDM can be applied to models based on different mechanisms, and the models improved by the FEDM solve the problems of previous models. The improved models describe the forgetting effect better
and can be better applied to the application development of memristors.

(2) A hardware-friendly supervised memristive spiking neural network is proposed, and the corresponding supervised learning algorithm is designed. The proposed network uses a time-based encoding scheme, and the spike is simplified to a step signal, which reduces the hardware complexity of the CMOS neurons. There is only one sub-connection between each pair of pre- and post-synaptic neurons, which overcomes the area inefficiency found in the existing literature. The proposed supervised learning algorithm successfully trains the network and obtains good results on several datasets such as Iris and WBC. At the same time, simulation results show that the forgetting effect has a significant impact on the performance of the network, which means that memristors without, or with only a weak, forgetting effect should be chosen in practical applications.

(3) We propose a hardware-friendly shallow memristive spiking neural network, together with a learning algorithm based on spike-timing-dependent plasticity (STDP). The network employs a time-based encoding scheme, every neuron generates at most one spike during a processing period, and the spike is reduced to a step signal; this greatly reduces neuron complexity. The complexity of the overall network is further reduced by avoiding complex lateral-inhibition and adaptive-threshold strategies, and a simple voting circuit is used as the classifier. Furthermore, a pruning algorithm is designed for the proposed network; it reduces the architectural complexity by pruning similar feature-learning neurons, confused feature-learning neurons, and non-significant synapses. Simulation results show that the network performs well on the MNIST dataset, outperforming other unsupervised SNNs that use time-based encoding schemes. Simulation results also show that the forgetting effect has a great influence on
the performance of the network; hence, memristors without, or with only a weak, forgetting effect should be chosen in practical applications. The pruning algorithm is verified to distinctly reduce the number of feature-learning neurons and synapses and to improve the performance of the network.

(4) A deep, hardware-friendly, unsupervised memristive spiking neural network is proposed based on the shallow memristive spiking neural network, and the corresponding STDP learning algorithm is also presented. The deep network is properly adjusted so that it is both hardware-friendly and achieves good performance. The hard lateral-inhibition mechanism is employed only behind the pooling layers, which reduces the complexity of the network structure, and the pooling layers use an earliest-pooling strategy, which reduces the hardware complexity of the pooling-layer neurons. Simulation results show that the network successfully learns features at different levels. The number of feature kernels in the different convolutional layers has an evident impact on network performance: for the 1C-FC and 2C-FC networks, the suitable numbers of feature kernels are 4 and 4-30, respectively. In addition, the number of fully connected neurons, the threshold voltage of neurons during detection, and the number of voting neurons also influence network performance. Simulation results again verify that the forgetting effect of memristors has a great impact on network performance, which means that memristors without, or with only a weak, forgetting effect should be chosen in practical applications.
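As an illustration of contribution (1), the drift-plus-diffusion state update of a forgetting-effect memristor can be sketched as below. This is a minimal sketch, not the dissertation's fitted model: the window function shape, the boundary value `f_b`, the drift coefficient `k_drift`, and the decay constant `tau_forget` are all hypothetical choices for illustration.

```python
def window(x, p=2, f_b=1e-3):
    # Joglekar-style window clamped to a small nonzero value f_b at the
    # boundaries (illustrative choice), so the state never locks at x=0 or x=1.
    return max(1.0 - (2.0 * x - 1.0) ** (2 * p), f_b)

def fem_step(x, v, dt, k_drift=1.0, tau_forget=50.0):
    """One Euler step of a forgetting-effect memristor state x in [0, 1].

    dx/dt = [ k_drift * v          (dopant drift, driven by voltage v)
              - x / tau_forget ]   (dopant diffusion -> forgetting)
            * f(x)                 (window function)

    All parameter values are illustrative.
    """
    dx = (k_drift * v - x / tau_forget) * window(x) * dt
    return min(max(x + dx, 0.0), 1.0)

# Program the device, then let it idle: with zero input, the diffusion
# term relaxes the state, i.e. the device "forgets".
x = 0.1
for _ in range(100):            # apply a write voltage
    x = fem_step(x, v=1.0, dt=0.01)
x_programmed = x
for _ in range(500):            # no input: state decays spontaneously
    x = fem_step(x, v=0.0, dt=0.01)
assert x < x_programmed         # forgetting: the state has relaxed
```

The key point mirrored from the text is that the window function never reaches zero at the boundaries, so the state equation cannot get stuck there, and the drift and diffusion contributions are combined inside one rate expression.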
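The shallow network of contribution (3) — time-based encoding, step-signal spikes, at most one spike per neuron per period, and an STDP-style rule — can be sketched as follows. The threshold `theta`, the learning rate, and the simple potentiate-before/depress-after rule are illustrative assumptions, not the dissertation's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def run_neuron(t_pre, w, theta=1.0, t_max=1.0, dt=0.01):
    """Time-based encoding: input i emits a step signal at time t_pre[i]
    and stays high. The post-neuron integrates the weighted active steps
    and fires at most once, when the sum crosses theta. Returns the
    post-spike time (t_max if it never fires)."""
    for t in np.arange(0.0, t_max, dt):
        active = (t_pre <= t).astype(float)   # step inputs remain high
        if w @ active >= theta:
            return t
    return t_max

def stdp_update(w, t_pre, t_post, lr=0.05):
    """Simplified STDP: inputs that spiked at or before the post-spike
    are potentiated, later ones depressed; weights clipped to [0, 1]."""
    dw = np.where(t_pre <= t_post, lr, -lr)
    return np.clip(w + dw, 0.0, 1.0)

w = rng.uniform(0.2, 0.6, size=8)
t_pre = rng.uniform(0.0, 0.8, size=8)   # earlier spike = stronger input
for _ in range(20):
    t_post = run_neuron(t_pre, w)
    w = stdp_update(w, t_pre, t_post)
# After training, the earliest-spiking inputs carry the largest weights.
```

Because each neuron fires at most once and every spike is a step signal, the neuron only needs a threshold comparison over latched inputs — this is the hardware-complexity reduction the text describes.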
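The earliest-pooling strategy of contribution (4) can be sketched as below. Under time-based encoding an earlier spike represents a stronger activation, so forwarding the earliest spike in each window plays the role of max-pooling; in hardware, the pooling neuron simply fires when the first of its inputs fires (effectively an OR of step signals), with no accumulation needed. The window size and the spike-time map here are illustrative.

```python
import numpy as np

def earliest_pool(spike_times, k=2):
    """Earliest-pooling over a 2-D map of spike times.

    Each k x k window forwards the earliest spike it sees (the minimum
    spike time). Illustrative sketch; k=2 is an assumed window size.
    """
    h, w = spike_times.shape
    out = np.empty((h // k, w // k))
    for i in range(0, h - h % k, k):
        for j in range(0, w - w % k, k):
            out[i // k, j // k] = spike_times[i:i + k, j:j + k].min()
    return out

t_map = np.array([[0.3, 0.9, 0.5, 0.7],
                  [0.8, 0.2, 0.6, 0.4],
                  [0.1, 0.5, 0.9, 0.9],
                  [0.7, 0.6, 0.8, 0.2]])
pooled = earliest_pool(t_map, k=2)
# pooled -> [[0.2, 0.4],
#            [0.1, 0.2]]  (earliest time in each 2x2 window)
```

This is why the strategy reduces the hardware complexity of the pooling-layer neurons relative to pooling schemes that must compare analog magnitudes.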
Keywords/Search Tags:Memristor, Forgetting effect, Memristive spiking neural networks, Supervised learning, Unsupervised learning