
Synergistic Learning in Neural Networks: Theories and Applications

Posted on: 2016-04-05    Degree: Doctor    Type: Dissertation
Country: China    Candidate: Y K Li    Full Text: PDF
GTID: 1318330482472517    Subject: Circuits and Systems
Abstract/Summary:
In neuroscience, it is widely believed that learning and memory are primarily based on synaptic plasticity, a neural mechanism that modifies the strength of connections between neurons. A large body of experimental and theoretical results on synaptic plasticity has accumulated in neuroscience. As its counterpart in machine learning, the modification of connection strengths (weights) endows artificial neural networks with a powerful learning ability for solving a wide range of problems. Independently of synaptic modification, recent experimental results have revealed that a single neuron can also change its intrinsic excitability to fit its synaptic input. This mechanism is referred to in the literature as neuronal intrinsic plasticity (IP).

In computational modeling, intrinsic plasticity corresponds to the modification of activation functions. Computational learning rules for intrinsic plasticity have been developed from the hypothesis that a neuron maximizes information transmission while maintaining a stable response level. Motivated by this plasticity mechanism, we propose a novel spike-based neuronal intrinsic plasticity rule. Beyond studying the intrinsic plasticity mechanism on its own, we combine it with the traditional study of synaptic plasticity, and we define this combination of the two plasticity mechanisms in computational modeling and artificial learning systems as synergistic learning.

We first introduce synaptic plasticity and intrinsic plasticity separately, and then show how synergistic learning contributes to modeling neural learning systems and how it benefits the learning performance of artificial neural networks in machine learning. In computational neuroscience, we study a computational model with both intrinsic and synaptic plasticity that reproduces experimental results in the visual system; this result shows how synergistic learning works in computational modeling. In machine learning, synaptic plasticity inspires the weight-modification algorithms of artificial neural networks, and we introduce the modification of activation functions into artificial neural networks as an intrinsic plasticity rule. In this work, we use information-theoretic learning methods for both weight modification (synaptic plasticity) and activation-function modification (intrinsic plasticity); this result shows that synergistic learning improves the performance of artificial neural networks.

This thesis is grounded in computational modeling in theoretical neuroscience and ultimately focuses on engineering applications in information technology, aiming to link biological neural mechanisms to brain-like information-processing techniques.
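To make the idea of synergistic learning concrete, the following is a minimal sketch of how a synaptic-plasticity rule and an intrinsic-plasticity rule can operate together on a single sigmoidal unit. It is not the thesis's actual algorithm: Oja's Hebbian rule stands in for the information-theoretic weight update, and the activation-function update follows a Triesch-style information-maximization IP rule that drives the output toward an exponential distribution with a fixed target mean. All variable names, learning rates, and the toy input are illustrative assumptions.

```python
import numpy as np

# Sketch of synergistic learning on one sigmoidal unit:
#   - synaptic plasticity: Oja's rule (stand-in for the weight-learning rule)
#   - intrinsic plasticity: Triesch-style rule adapting the sigmoid's gain and
#     bias so that the output approaches an exponential distribution with
#     target mean mu (information maximization at a stable response level)

rng = np.random.default_rng(0)

n_inputs = 10
w = rng.normal(scale=0.1, size=n_inputs)   # synaptic weights
a, b = 1.0, 0.0                            # intrinsic parameters: gain and bias

eta_w  = 1e-3    # synaptic learning rate (illustrative)
eta_ip = 1e-3    # intrinsic-plasticity learning rate (illustrative)
mu     = 0.2     # target mean activity for the IP rule

for step in range(10000):
    x = rng.normal(size=n_inputs)            # presynaptic activity (toy input)
    u = w @ x                                # total synaptic drive
    y = 1.0 / (1.0 + np.exp(-(a * u + b)))   # sigmoidal activation

    # Synaptic plasticity: Oja's rule (Hebbian term with weight normalization).
    w += eta_w * y * (x - y * w)

    # Intrinsic plasticity: gradient rule that pushes the output distribution
    # toward an exponential with mean mu, i.e. maximizes output entropy
    # subject to a fixed mean response level.
    b += eta_ip * (1.0 - (2.0 + 1.0 / mu) * y + (y ** 2) / mu)
    a += eta_ip * (1.0 / a + u - (2.0 + 1.0 / mu) * u * y + u * (y ** 2) / mu)

print(f"gain a = {a:.3f}, bias b = {b:.3f}, |w| = {np.linalg.norm(w):.3f}")
```

The point of the sketch is the division of labor: the weight update shapes what the neuron responds to, while the IP update reshapes its activation function so that the resulting responses stay informative around a homeostatic mean level; running both rules at once is the sense in which the two mechanisms learn synergistically.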
Keywords/Search Tags: intrinsic plasticity, synergistic learning, homeostasis, information maximization, information-theoretic learning