Convergence Analysis and Algorithm Construction of Stochastic Radial Basis Function Neural Networks

Posted on: 2021-01-21
Degree: Master
Type: Thesis
Country: China
Candidate: Y A Zhang
Full Text: PDF
GTID: 2428330602982559
Subject: Mathematics

Abstract/Summary:

Artificial neural networks are intelligent models for information processing that simulate the organization and mechanisms of the brain's nervous system. Thanks to their powerful self-learning ability, artificial neural networks can greatly reduce human labour, and they have therefore been widely researched and applied. Theoretical research on artificial neural networks can effectively explain their applicable principles and potential problems, so that the networks can be improved and developed further. On the other hand, neural networks with random learning can overcome the slow convergence and local minima of traditional gradient-based learning algorithms, and they have recently become a research hotspot.

This thesis mainly studies stochastic radial basis function (RBF) neural networks whose centres and smoothing factors are drawn from a uniform distribution. This random learning method is an important complement to existing ways of determining the centres and smoothing factors of RBF networks, and it also improves the training efficiency of the network. The main results of this thesis are developed by analyzing the convergence of random RBF networks and by constructing the corresponding models and algorithms.

(1) To explore the function approximation ability of RBF networks with random weights, the convergence analysis method for random-weight feedforward neural networks is applied. First, the properties of the generalized δ function are used to construct a limiting integral expression of the function to be approximated; second, the Monte Carlo method is used to evaluate the integral in this expression, proving that RBF networks with random weights can approximate any continuous function. The theoretical analysis of the convergence behaviour shows that the approximation error decreases gradually as the number of hidden-layer neurons increases, indicating that this efficient function approximator has the potential to handle big-data problems. To verify the theoretical results, fitting and classification tests were performed with random-weight RBF networks on relevant data sets. The experimental results show that stochastic RBF networks do have fast learning ability and strong function approximation ability.

(2) Based on the regularization model of stochastic RBF networks, an adaptive regularization model and the corresponding ARBFNN algorithm are proposed through the adaptive selection of the regularization factor. The basic idea of the adaptive selection is to turn the regularization factor from a constant into a function of the output-layer weights; an iterative algorithm then determines the value of the regularization factor together with the output-layer weights at each step of the iteration. Furthermore, an appropriate regularization function must be chosen to ensure the convexity of the adaptive regularization model, so that the model has a global minimum independent of the initial conditions. To verify the effectiveness of the algorithm and the generalization ability of the model, the adaptive regularization model of the random-weight RBF networks is tested on data sets from the UCI machine learning repository. The results show that, compared with other algorithms, the ARBFNN algorithm can reduce the training error of stochastic RBF networks and improve their generalization ability, while also being able to handle large data sets.

Keywords/Search Tags: Neural networks, Radial basis function, Convergence analysis, Regularized model, Adaptive
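The random-weight RBF training scheme described in (1) — centres and smoothing factors drawn uniformly at random, with only the output weights fitted by linear least squares — can be sketched as follows. The target function, parameter ranges, and hidden-layer sizes below are illustrative assumptions, not the thesis's exact experimental setup; the sketch only shows the general tendency of the approximation error to fall as hidden nodes are added.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_features(x, centers, widths):
    """Gaussian RBF hidden-layer outputs phi_j(x) = exp(-(x - c_j)^2 / (2 s_j^2))."""
    return np.exp(-((x - centers) ** 2) / (2.0 * widths ** 2))

def fit_random_rbf(x, y, n_hidden):
    """Centres and smoothing factors are drawn uniformly at random;
    only the output-layer weights are trained, by linear least squares."""
    centers = rng.uniform(x.min(), x.max(), size=n_hidden)
    widths = rng.uniform(0.1, 1.0, size=n_hidden)
    H = rbf_features(x, centers, widths)       # hidden-layer output matrix
    w, *_ = np.linalg.lstsq(H, y, rcond=None)  # closed-form output weights
    return centers, widths, w

def rmse(x, y, centers, widths, w):
    pred = rbf_features(x, centers, widths) @ w
    return np.sqrt(np.mean((pred - y) ** 2))

# Toy target: a smooth continuous function on [0, 1].
x = np.linspace(0.0, 1.0, 400).reshape(-1, 1)
y = np.sin(2 * np.pi * x[:, 0]) + 0.3 * np.cos(5 * np.pi * x[:, 0])

errors = {}
for n_hidden in (5, 20, 80):
    c, s, w = fit_random_rbf(x, y, n_hidden)
    errors[n_hidden] = rmse(x, y, c, s, w)
    print(n_hidden, errors[n_hidden])
```

Because the hidden layer is fixed at random, training reduces to one linear solve, which is what gives the method its speed; increasing `n_hidden` enlarges the random feature basis and typically shrinks the fitting error, consistent with the convergence analysis.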
 
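The adaptive-regularization idea in (2) — making the regularization factor a function of the output-layer weights and updating both together in an iteration — can be illustrated with a small sketch. The specific regularization function `lam(w) = 1 / (1 + ||w||^2)` used here is a hypothetical stand-in, not the function chosen in the thesis; the thesis additionally requires the regularization function to be chosen so that the model remains convex.

```python
import numpy as np

rng = np.random.default_rng(1)

def ridge_solve(H, y, lam):
    """Regularized least-squares output weights: w = (H^T H + lam I)^{-1} H^T y."""
    n = H.shape[1]
    return np.linalg.solve(H.T @ H + lam * np.eye(n), H.T @ y)

def adaptive_ridge(H, y, n_iter=20, tol=1e-8):
    """Alternate between solving for the output weights with the current
    regularization factor and updating the factor as a function of the
    weights (hypothetical choice lam(w) = 1 / (1 + ||w||^2))."""
    lam = 1.0                               # initial regularization factor
    w = ridge_solve(H, y, lam)
    for _ in range(n_iter):
        lam_new = 1.0 / (1.0 + w @ w)       # regularization factor as a function of w
        w = ridge_solve(H, y, lam_new)
        if abs(lam_new - lam) < tol:        # stop when the factor stabilizes
            lam = lam_new
            break
        lam = lam_new
    return w, lam

# Toy random-feature design matrix and noisy linear targets.
H = rng.normal(size=(100, 10))
w_true = rng.normal(size=10)
y = H @ w_true + 0.1 * rng.normal(size=100)

w_hat, lam_hat = adaptive_ridge(H, y)
print(lam_hat)
```

Each iteration costs one ridge solve, so the factor is determined at essentially no extra cost alongside the weights, which matches the abstract's description of determining the two simultaneously within the iterative process.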
