
Application Research On Synergistic Learning Mechanism In Small-Size And Anti-Noise Neural Network

Posted on: 2021-02-06
Degree: Master
Type: Thesis
Country: China
Candidate: Y H Hu
Full Text: PDF
GTID: 2428330623968653
Subject: Engineering
Abstract/Summary:
Synaptic plasticity (SP) and intrinsic plasticity (IP) are important learning rules of the biological brain. Artificial neural networks adopted the concept of weights by drawing on the principle of SP, and although a few studies have also considered IP, the synergistic effect of the two types of plasticity rules on artificial neural networks remains unclear. So far, research on synergistic learning has been limited to shallow networks with information-entropy loss functions, and only data-fitting applications have been studied. Small-size neural networks with anti-noise ability are an important research direction in the neural network field, and the brain is a model of efficiency, energy saving, and noise robustness; it is speculated that the two types of plasticity may provide the neural basis for the brain's efficient and complex information processing.

In this thesis, we explore how SP and IP learn synergistically in neural networks and compare the synergy under information-entropy and non-information-entropy loss functions. We conduct experiments on data fitting, multi-classification tasks, and noise immunity. The main findings are as follows:

(1) In data-fitting applications, the synergistic learning algorithm speeds up network learning and improves fitting quality. The IP rule uses a local information-maximization algorithm, and combining it with synaptic learning rules improves both the speed and the quality of network learning. We discuss how the slope and offset of the activation function change for each neuron in the hidden and output layers; the IP rule increases the average slope of the activation function.

(2) Under certain conditions, the synergistic learning algorithm speeds up the learning of small-size neural networks on multi-classification tasks and improves learning quality. The local information maximization in the IP rule can be combined with synaptic learning rules other than the error-entropy-minimization algorithm, such as entropy minimization. Synergistic learning works in both shallow and deep networks, though with different effects. In an artificial neural network with the IP rule, fewer neurons achieve the same performance as more neurons in a traditional artificial neural network; at the same time, the IP rule makes the network more robust to the synaptic learning rate.

(3) In noisy environments, the synergistic learning algorithm steadily improves network performance. Under different noise conditions, synergistic learning networks obtain higher accuracy and smaller standard deviation than plain artificial neural networks; the smaller standard deviation indicates that synergistic learning enhances the robustness of the network to noise.

(4) Under a non-information-entropy loss function, synergistic learning still maintains faster learning and higher learning quality on small-size networks. The newly introduced loss function, trained by back-propagation together with synergistic learning, improves network performance; we thereby extend the synergistic learning algorithm to non-entropy neural networks.

These results show that neural networks using the synergistic learning algorithm consume fewer resources and are more robust, with the advantage most pronounced in small-size networks. This provides new ideas for the development of small-size neural networks.
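The abstract does not spell out the update equations. As a minimal sketch, assuming the "local information maximization" IP rule resembles Triesch's intrinsic plasticity update (each neuron adapts a gain a and bias b of its sigmoid so the output approaches an exponential distribution with target mean mu) and taking SP as plain gradient descent on a non-entropy (MSE) loss, one synergistic step on a single neuron might look like the following; all names (eta_sp, eta_ip, mu, synergistic_step) are illustrative, not the thesis's notation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def synergistic_step(w, a, b, x, t, eta_sp=0.1, eta_ip=0.001, mu=0.2):
    """One SP + IP update for a single sigmoid neuron.
    x: input vector, t: scalar target, w: weight vector,
    a, b: intrinsic gain and bias of the activation function."""
    u = w @ x                      # synaptic drive
    y = sigmoid(a * u + b)         # activation with intrinsic parameters

    # SP: gradient descent on the MSE loss L = 0.5 * (y - t)^2
    delta = (y - t) * y * (1.0 - y) * a
    w = w - eta_sp * delta * x

    # IP (Triesch-style): local rule pushing the output distribution
    # toward an exponential with mean mu, i.e. maximizing output
    # entropy under a fixed-mean constraint
    db = eta_ip * (1.0 - (2.0 + 1.0 / mu) * y + (y ** 2) / mu)
    da = eta_ip / a + db * u
    return w, a + da, b + db
```

Calling `w, a, b = synergistic_step(w, a, b, x, t)` inside a training loop alternates the two rules each step; note that the IP update uses only locally available quantities (u and y), which is what makes it cheap enough for small-size networks.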
Keywords/Search Tags: intrinsic plasticity, synaptic plasticity, synergistic learning, artificial neural network, information entropy