
Sequential Learning Of RBF Neural Networks With Applications

Posted on: 2020-02-05    Degree: Master    Type: Thesis
Country: China    Candidate: Z Q Shao    Full Text: PDF
GTID: 2428330578977898    Subject: Electronic and communication engineering
Abstract/Summary:
In many practical scenarios it is difficult to obtain a complete set of training samples in advance; instead, the training samples arrive in sequence. With the growth of big data, the rapid increase in data volume causes problems for neural network learning, such as high algorithmic complexity and slow response. Sequential learning can effectively address these problems. Radial basis function (RBF) neural networks have been widely used in sequential learning frameworks because of their universal function approximation ability and simple network structure.

In this thesis, a similarity-based self-growing, self-replacing, and self-deleting algorithm (S-GRP) is proposed to automatically determine the hidden-layer structure of an RBF neural network. The novelty of incoming samples is judged by a generalized maximum-minimum (GMM) criterion, and samples that satisfy both the similarity criterion and the neuron-importance criterion are added to the network as new hidden-layer centers. A neuron replacement strategy is introduced on top of the addition and deletion strategies: GMM clustering is used to obtain spare hidden-layer neurons, and instead of directly deleting unimportant hidden-layer neurons, priority is given to replacing them with spare neurons that satisfy the importance criterion. If both the hidden-layer neuron and the spare neuron have low importance, both are deleted. Experimental results demonstrate that the S-GRP algorithm can smoothly generate RBF neural networks with a simple structure and good generalization ability.

To improve the robustness of the RBF training algorithm to noise and to avoid error divergence, an adaptive robust extended Kalman filter algorithm (AEKF-OR) is proposed. The measurement noise covariance matrix is modeled with an inverse Wishart distribution, the process noise covariance matrix is approximated by a covariance-matching method, and a forgetting factor is introduced. The AEKF-OR algorithm is robust to outliers in the data and improves the estimation accuracy of the algorithm.

To address class imbalance in online sequential data, an RBF sequential learning algorithm called the sampling and fuzzy self-growing self-deleting algorithm (S-FGP) is proposed. First, the fuzzy self-growing and self-deleting algorithm (FGP) is proposed to construct an RBF neural network using suitable samples as hidden-layer centers. The S-FGP algorithm then combines a sample-sampling process with the FGP training process: during learning, unimportant majority-class samples are screened out and removed based on training errors, while the added hidden-layer centers are used as pseudo-samples, together with the minority-class samples learned so far, to generate additional minority-class samples and drive the training set toward balance. The S-FGP algorithm effectively improves the classification accuracy of minority-class samples while minimizing the accuracy loss on majority-class samples under online sequential class imbalance.
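Since the abstract does not reproduce the full algorithmic details, the following is a minimal, generic sketch of a sequentially grown RBF network with a distance-based novelty (growth) criterion, in the spirit of, but not identical to, the S-GRP rules described above. The class name, the thresholds, the shared Gaussian width, and the LMS update for existing weights are all assumptions made for illustration.

```python
import numpy as np

# Illustrative sketch only: a generic sequentially grown RBF network,
# not the thesis's S-GRP algorithm (no GMM criterion, replacement, or deletion).
class SequentialRBF:
    def __init__(self, dist_threshold=1.0, width=1.0, lr=0.05):
        self.centers = []                     # hidden-layer centers
        self.weights = []                     # output weights, one per center
        self.dist_threshold = dist_threshold  # novelty threshold (assumed)
        self.width = width                    # shared Gaussian width (assumed)
        self.lr = lr                          # LMS learning rate (assumed)

    def _phi(self, x):
        # Gaussian activations of all hidden neurons for input x
        c = np.asarray(self.centers)
        return np.exp(-np.sum((c - x) ** 2, axis=1) / (2 * self.width ** 2))

    def predict(self, x):
        if not self.centers:
            return 0.0
        return float(np.dot(self._phi(x), self.weights))

    def learn(self, x, y):
        x = np.asarray(x, dtype=float)
        err = y - self.predict(x)
        if self.centers:
            nearest = np.min(np.linalg.norm(np.asarray(self.centers) - x, axis=1))
        else:
            nearest = np.inf
        # Grow a neuron when the sample is far from all existing centers and
        # the prediction error is still large; otherwise adapt existing weights.
        if nearest > self.dist_threshold and abs(err) > 0.1:
            self.centers.append(x)
            self.weights = list(self.weights) + [err]
        elif self.centers:
            phi = self._phi(x)
            self.weights = list(np.asarray(self.weights) + self.lr * err * phi)

# Example: learn y = sin(x) from a stream of samples (illustrative only)
rng = np.random.default_rng(0)
net = SequentialRBF(dist_threshold=0.5, width=0.8, lr=0.1)
for _ in range(500):
    x = rng.uniform(-3.0, 3.0, size=1)
    net.learn(x, float(np.sin(x[0])))
```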
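For the Kalman-filter-based training step, the following is a minimal sketch of one extended-Kalman-style update of the RBF output weights with a residual-based adaptive measurement-noise estimate. It is a generic illustration rather than the thesis's AEKF-OR algorithm (which models the measurement noise covariance with an inverse Wishart distribution); the forgetting factor `rho` and the covariance-matching form of the noise update are assumptions.

```python
import numpy as np

def ekf_weight_update(w, P, phi, y, R, Q, rho=0.95):
    """One sequential update of RBF output weights (illustrative sketch).

    w   : (m,)   current output weights
    P   : (m,m)  weight error covariance
    phi : (m,)   hidden-layer activations for the current sample
    y   : float  target value
    R   : float  current measurement-noise variance estimate
    Q   : (m,m)  process-noise covariance (e.g. small diagonal)
    rho : forgetting factor for the adaptive noise estimate (assumed)
    """
    e = y - phi @ w                        # innovation (prediction error)
    s = phi @ P @ phi + R                  # innovation variance
    K = P @ phi / s                        # Kalman gain
    w_new = w + K * e                      # weight update
    P_new = P - np.outer(K, phi) @ P + Q   # covariance update
    # Residual-based (covariance-matching) adaptation of R with forgetting,
    # clipped to stay positive.
    R_new = max(rho * R + (1 - rho) * (e * e - phi @ P @ phi), 1e-6)
    return w_new, P_new, R_new, e
```

The adaptive step exploits the identity E[e^2] = phi' P phi + R, smoothing the resulting estimate of R with a forgetting factor so that recent residuals dominate; a heavier-tailed or variational treatment of the noise, as described above, would replace this simple matching rule.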
Keywords/Search Tags: Sequential learning, RBF neural network, RBF training algorithm, Sequential class imbalance problem