
Episodic Memory In Neural Networks Based On Synaptic Computing

Posted on: 2010-09-28    Degree: Doctor    Type: Dissertation
Country: China    Candidate: M Xia    Full Text: PDF
GTID: 1118360302980614    Subject: Control theory and control engineering
Abstract/Summary:
Episodic information processing, for instance episodic memory, plays an important role in many functions of the brain. Episodic memory refers to the ability to encode and represent the temporal order of discrete elements occurring in a sequence. At the neural level, reproducible sequential neural activity has been shown to be crucial in a variety of cases, such as the processing of sensory information, animal communication, and motor control and coordination. Many of the behaviors that we produce are sequential in nature. All these intriguing experimental observations pose the problem of how to generate robust sequences of neural activity.

This work makes an intensive study of episodic memory based on synaptic computing in neural networks. The influence of dynamic synapses, chaotic neurons, nonlinear functions, and partially connected structures on the performance of neural networks is discussed. We further study the recurrent sequence behavior of pattern transfer, investigate the mechanism and connection conditions for recurrent sequence behavior, and analyze the stability of pattern transfer in the network. The main research contents and innovative points are as follows:

(1) Associative memory with dynamic synapses and nonlinear function constitution

Firstly, we introduce dynamic synapses and nonlinear function constitution into the associative memory model. Neurophysiological experiments show that the strength of synaptic connections between two neurons can undergo substantial changes on a short time scale. We embed such dynamic synapses into artificial neural networks. Furthermore, nonlinear function constitution, a new method against spurious states, is proposed, which improves the conventional Hebbian learning rule based on the linear outer-product method.
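As a point of reference, the conventional Hebbian outer-product rule and sign-activation recall that the nonlinear function constitution improves upon can be sketched as follows. This is a minimal standard Hopfield associative memory, not the thesis's modified rule, whose exact form is not given in the abstract:

```python
import numpy as np

def hebbian_weights(patterns):
    """Conventional Hebbian outer-product learning; self-couplings zeroed."""
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    return w

def recall(w, state, steps=20):
    """Synchronous recall with a sign activation until a fixed point is reached."""
    for _ in range(steps):
        nxt = np.sign(w @ state)
        nxt[nxt == 0] = 1.0          # break ties toward +1
        if np.array_equal(nxt, state):
            break
        state = nxt
    return state
```

At low load (few patterns relative to the number of neurons), a stored pattern is a fixed point of `recall`, and a lightly corrupted pattern is pulled back into its basin of attraction; the thesis's nonlinear constitution aims to enlarge exactly these basins and suppress spurious states.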
Simulation results show that our methods can effectively increase error tolerance; furthermore, an associative memory network with the new method can both enlarge the basin of attraction and increase the storage capacity.

(2) Performance of associative memory using partially connected networks

From both a neurobiological viewpoint and an implementation perspective, neural networks are partially connected. In order to imitate this characteristic of real neural systems, a network with normally distributed connectivity is proposed. We investigate the effect of the neurons' mean degree on the basin of attraction and the storage capacity, and we further study how the network's structure influences its performance. Owing to the partial connectivity of the normal-distribution network, the associative memory attains a higher storage capacity: the highest storage capacity is obtained when the mean degree is 0.64N, which is twice that of the traditional Hopfield network. However, the basin of attraction shrinks and the robustness is reduced as the connection mean degree decreases. A comparison of the performance of the ER network, the small-world network, and the delta-distribution network shows that the storage capacity of an associative memory network does not depend simply on the mean degree of the neurons but also on the structure of the connectivity, whereas the basin of attraction depends only on the mean degree of the neurons.

(3) Dynamic depression control of chaotic neural networks for associative memory

A chaotic neural network constructed from chaotic neurons exhibits complex dynamics and has potential applications in associative dynamics and information processing.
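The normally distributed partial connectivity of point (2) can be conveyed by a small mask-generation sketch. The abstract does not specify the standard deviation of the degree distribution, so `std_deg` below is an assumed parameter; the mask would be applied elementwise to a full Hebbian weight matrix:

```python
import numpy as np

def normal_degree_mask(n, mean_deg, std_deg, rng):
    """Connectivity mask in which each neuron's in-degree is drawn
    from a clipped normal distribution (no self-connections)."""
    mask = np.zeros((n, n), dtype=bool)
    for i in range(n):
        # sample this neuron's in-degree, clipped to a valid range
        k = int(np.clip(rng.normal(mean_deg, std_deg), 0, n - 1))
        others = np.delete(np.arange(n), i)
        mask[i, rng.choice(others, size=k, replace=False)] = True
    return mask
```

A partially connected associative memory would then use `w * mask` in place of the fully connected weights; the abstract's reported optimum corresponds to `mean_deg = 0.64 * n`.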
However, the state of a chaotic neural network wanders around all the stored patterns and cannot be stabilized to one of the stored patterns or a periodic orbit because of the chaotic characteristics of the network, which hampers the application of chaotic associative dynamics to information processing. In this work, a dynamic depression control method imposed on the internal state of the neurons of a chaotic neural network is proposed. In this method, the decay parameters and the scaling parameter for the refractoriness are time-varying, determined by the internal state of the neurons. Owing to dynamic depression control, chaos is controlled in a self-adaptive manner and no target needs to be specified in advance. Furthermore, a theoretical analysis of dynamic depression control is presented. Numerical simulations show that the chaos in the chaotic neural network can be controlled with dynamic depression control and that the network can be stabilized to a stored pattern if the control strength parameter is chosen suitably.

(4) Episodic memory with dynamic synapses and chaotic neurons

Among the various dynamics of neural networks, dynamically depressing synapses and chaotic behavior have been regarded as intriguing characteristics of biological neurons. In this work, episodic memory based on dynamic synapses and chaotic neurons is proposed. The influence of dynamic synapses and chaotic neurons on episodic memory storage capacity, transition time, steady-state period, and stability is investigated. By introducing dynamic synapses into an episodic memory, we found that the storage capacity can be enlarged, the transition time between patterns in the sequence can be shortened, and the stability of the sequence can be enhanced.
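The flavor of depression control on a chaotic neuron can be conveyed by a minimal single-neuron sketch. The map below is an Aihara-type chaotic neuron; the rule that relaxes the refractory scaling `alpha` toward a smaller value is a hypothetical stand-in for the thesis's state-dependent control law, which is not given in the abstract:

```python
import numpy as np

def controlled_chaotic_neuron(steps, k=0.7, alpha0=1.0, a=0.5,
                              alpha_min=0.3, beta=0.05, eps=0.02):
    """Aihara-type chaotic neuron map with a hypothetical depression
    rule that gradually relaxes the refractory scaling alpha."""
    y, alpha = 0.1, alpha0
    xs = []
    for _ in range(steps):
        x = 1.0 / (1.0 + np.exp(-y / eps))   # steep sigmoid output
        alpha -= beta * (alpha - alpha_min)  # depression control (assumed form)
        y = k * y - alpha * x + a            # internal-state update
        xs.append(x)
    return np.array(xs)
```

In this toy setting the output is irregular early on, while the weakened refractoriness eventually captures the neuron in a stable state, mirroring the claim that the controlled network settles onto a stored pattern without a pre-specified target.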
Owing to the chaotic neurons, the steady-state period of the episodic memory can be adjusted by changing the parameter values of the chaotic neurons.

(5) Episodic memory with controllable steady-state period

In existing episodic memory models using hetero-associations, the steady-state period is fixed, which does not accord with natural neural systems. A novel sparsely connected neural network for episodic memory with a controllable steady-state period is proposed in this study. By introducing a new exponential-kernel sampling function and a sampling interval parameter, the steady-state period can be controlled, and the number of steady-state time steps equals the sampling interval parameter. Owing to the exponential-kernel sampling function, the episodic memory capacity is enlarged compared with existing episodic memory models. Owing to the sparse, Gaussian-distributed connectivity, the model makes efficient use of synapse resources, although the episodic memory storage capacity is decreased compared with fully connected networks. The study also gives a significant result: networks of different dimensions have the same synapse connection efficiency if they have the same connection mean degree. Furthermore, we embed the nonlinear function constitution into the episodic memory. Simulation results show that a neural network with nonlinear function constitution can effectively increase the episodic memory storage capacity. In existing episodic memory models using hetero-associations, the steady-state period is the same for each pattern. In this study, a novel neural network for episodic memory with a controllable steady-state period based on coherent spin-interaction is proposed. By introducing a new sampling function, the steady-state period can be controlled by a scale parameter and by the overlap between the input pattern and the stored patterns.
Owing to the coherent spin-interaction, the episodic storage capacity is enlarged compared with existing episodic memory models. Furthermore, the episodic storage capacity grows exponentially with the dimension of the neural network.
Keywords/Search Tags:dynamic depressing synapse, chaotic neurons, chaos control, Hopfield neural network, associative memory, episodic memory