
The Network Synchronization And Application In A Memory Model Based On An Adaptive Synaptic Learning Rule

Posted on: 2013-04-30
Degree: Doctor
Type: Dissertation
Country: China
Candidate: C K Yan
Full Text: PDF
GTID: 1228330377958192
Subject: Control Science and Engineering

Abstract/Summary:
Cognitive neuroscience is a young multidisciplinary science concerned with the study of the nervous system and cognitive function. It overlaps with disciplines such as dynamics, computing science, and neurobiology, and new theoretical approaches are being formed in the hope of clarifying the brain mechanisms underlying human activity.

The research object of cognitive neuroscience is the nervous system. Learning in neural networks rests on the synaptic plasticity of neurons, and synaptic learning rules have been one of the field's hot spots. An adaptive synaptic learning rule is presented in this thesis, and the LaSalle invariance principle is employed to prove its effectiveness. The evidence shows that the algorithm has good adaptability: it applies not only to the symmetric networks common in engineering but also to the asymmetric synaptic connections found in biological nervous systems. Then, motivated by the characteristics of biological neural networks, a new algorithm is designed to improve the generative method of the NW (Newman-Watts) small-world network, so that the resulting network better reproduces the high irregularity and asymmetry of real neural networks. The adaptive algorithm and the improved NW network are used to simulate network firing with the fast-slow-scale ML neuron model as the basic unit. The results show that the synaptic connection weights converge to an appropriate strength, under which the network reaches synchronization. Moreover, a dynamic correlation coefficient is defined to portray this dynamic synchronization process. The ISI (inter-spike interval) of the synchronization orbit in these networks exhibits a typical period-doubling bifurcation.
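The abstract does not give the details of the improved NW generative method or of the adaptive rule itself. As a point of reference only, a standard Newman-Watts construction (a ring lattice plus random shortcut edges, with no rewiring) can be sketched as follows; the function name and parameters are illustrative, not from the thesis.

```python
import random

def nw_small_world(n, k, p, seed=0):
    """Newman-Watts small-world graph: a ring lattice in which each
    node links to its k nearest neighbours on each side, plus random
    shortcut edges added with probability p. Unlike Watts-Strogatz,
    no edges are removed, so the underlying lattice stays intact."""
    rng = random.Random(seed)
    edges = set()
    # Ring lattice: connect node i to i+1, ..., i+k (mod n).
    for i in range(n):
        for j in range(1, k + 1):
            edges.add(frozenset((i, (i + j) % n)))
    lattice_edges = len(edges)
    # Shortcuts: add each absent edge independently with probability p.
    for i in range(n):
        for j in range(i + 1, n):
            e = frozenset((i, j))
            if e not in edges and rng.random() < p:
                edges.add(e)
    return edges, lattice_edges
```

The thesis's improvement targets the irregularity and asymmetry of biological networks, which this undirected sketch does not capture; a directed variant of the shortcut step would be one way to model asymmetric connections.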
This is a further step beyond the bifurcation analysis of a traditional single-neuron model and advances our understanding of neuron-population activity. Physiological experiments have found that network synchronization can encode information and realize certain functions of neural networks, but the neurons within a network are not identical. Therefore, this thesis introduces distribution functions over random parameters to model the non-identity of the network. Under the adaptive synaptic learning rule, numerical simulations show that a non-identical neuron population can still be synchronized, meaning the learning rule is robust to parameter mismatch. Synchronization is not affected by the initial differences among individual neuron orbits generated by non-identity: neurons in different discharge rhythms, or on chaotic orbits, can be synchronized to the same orbit after a complicated transition process. Analysis shows that the period and discharge rhythm of the synchronization orbit are determined entirely by the characteristics of the network topology, especially the mean values of the neuron parameter distributions; under different parameter distributions the network can be synchronized to various periodic orbits or even chaotic attractors. In order to express the phase synchronization of the network, an average phase difference is defined. With it we obtain not only the phase synchronization between any two neurons but also that of the whole network, which shows the computing method is effective.

Learning and memory are among the most important activities of the brain, and the hippocampus is the key structure for both.
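The thesis's exact definition of the average phase difference is not reproduced in this abstract. A common spike-based construction assigns each neuron a phase that grows by 2π per inter-spike interval, linearly interpolated between successive spikes, and averages the pairwise phase difference over sampling times. The sketch below assumes that definition; all names are illustrative.

```python
import math

def spike_phase(t, spikes):
    """Phase at time t: 2*pi*k at the k-th spike, linearly
    interpolated between consecutive spike times."""
    for k in range(len(spikes) - 1):
        if spikes[k] <= t < spikes[k + 1]:
            frac = (t - spikes[k]) / (spikes[k + 1] - spikes[k])
            return 2 * math.pi * (k + frac)
    raise ValueError("t lies outside the spike train")

def mean_phase_diff(spikes_a, spikes_b, times):
    """Average absolute phase difference between two spike trains,
    sampled at the given times; 0 indicates phase synchronization."""
    diffs = [abs(spike_phase(t, spikes_a) - spike_phase(t, spikes_b))
             for t in times]
    return sum(diffs) / len(diffs)
```

Averaging this quantity over all neuron pairs would give a network-level measure, matching the abstract's pairwise-to-whole-network usage.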
In this thesis, based on the physiological and anatomical structure of the hippocampus, a simplified mathematical model is built for numerical analysis and simulation of hippocampal memory function from a computational-neuroscience perspective. First, the model is used to analyze memory expression in CA1 under the PP (perforant path) signal from the entorhinal cortex and the postsynaptic current from the SC (Schaffer collaterals). The results show that a single subthreshold signal cannot produce memory; rather, stochastic resonance between the two inputs is the cause of memory. Signal-to-noise ratio (SNR) calculations then show that the SNR peaks at a certain SC synaptic strength, that is, a typical stochastic resonance arises when the weak PP signal is detected. Finally, to address the memory traces that remain unclear under stochastic resonance alone, adaptive synaptic learning is introduced to strengthen the links between neurons. Simulations indicate that after a sufficiently long period of learning the hippocampal network matures, so that memories expressed incompletely in the immature network become clear expressions in the mature one. These models and results suggest a possible mechanism for memory and play a significant role in understanding how memory is generated.

Many researchers have studied neural networks and their dynamics, yet few new single-neuron models have been proposed. In addition, the encoding mechanism of neural networks must be clarified when studying artificial intelligence; many studies address hypotheses about the coding, but there is still no unified point of view. We derive a new dynamical model from a physical neural circuit based on Hamilton's principle. The discharge of a neuron can be simulated successfully, showing that this model can reproduce all the discharge rhythms observed in experiments.
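The SNR computation is likewise not detailed in the abstract. A minimal estimator takes the power of a single DFT bin at the signal frequency and divides it by the mean power of nearby background bins; scanning this value over SC synaptic strength would then reveal the stochastic-resonance peak. A sketch of the estimator, with illustrative names:

```python
import math

def power_at(signal, freq_bin):
    """Power of one DFT bin of a real signal (single-bin, Goertzel-style)."""
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * freq_bin * i / n)
             for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq_bin * i / n)
             for i, s in enumerate(signal))
    return (re * re + im * im) / n

def snr_at(signal, freq_bin, background_bins):
    """SNR estimate: power in the signal bin divided by the
    mean power of the chosen background (noise) bins."""
    bg = sum(power_at(signal, b) for b in background_bins) / len(background_bins)
    return power_at(signal, freq_bin) / bg
```

In a stochastic-resonance sweep, `signal` would be the simulated CA1 membrane response and `freq_bin` the frequency of the weak PP input; the SNR-versus-synaptic-strength curve should be non-monotonic with a single peak.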
Furthermore, we discuss the generalized energy consumption of the system while the neuron fires. The varying patterns of this energy may carry part of the code for information transmission between neurons.
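The circuit-based Hamiltonian of the thesis is not given in this abstract. To illustrate what tracking generalized energy along a trajectory means, the sketch below integrates a generic unit harmonic-oscillator Hamiltonian H = (p² + q²)/2 with symplectic Euler and records H at each step; everything here is a stand-in for the actual neuron model.

```python
def simulate_energy(q0, p0, dt, steps):
    """Symplectic Euler on H = (p^2 + q^2) / 2, returning the
    generalized energy sampled along the trajectory. A symplectic
    scheme keeps the energy bounded instead of drifting."""
    q, p = q0, p0
    energies = []
    for _ in range(steps):
        p -= dt * q          # dp/dt = -dH/dq = -q
        q += dt * p          # dq/dt =  dH/dp =  p
        energies.append(0.5 * (p * p + q * q))
    return energies
```

For the thesis's model, the same bookkeeping applied during a firing event would yield the energy pattern whose variations may encode transmitted information.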
Keywords/Search Tags: neural system, synaptic learning, synchronization, learning and memory, Hamilton function