
Algorithm Design And Convergence Analysis For Fractional-order Neural Networks

Posted on: 2018-03-17
Degree: Master
Type: Thesis
Country: China
Candidate: Y Q Wen
Full Text: PDF
GTID: 2428330596468752
Subject: Mathematics
Abstract/Summary:
Artificial neural networks are widely applied in fields such as pattern recognition, function approximation, signal prediction, and automatic control because of their impressive capacity for intelligent computation. Among the many kinds of neural networks, single-hidden-layer feedforward networks, which possess the universal approximation capability, have been investigated most thoroughly. The BP algorithm, based on the gradient descent method, is the most popular and important learning algorithm for training single-hidden-layer feedforward networks. However, the BP algorithm is not efficient enough, owing to its slow convergence and time-consuming training process. A fractional adaptive learning (FAL) method, the fractional gradient descent method, has attracted increasing attention for its simplicity and speed. To combine the benefits of the two kinds of algorithms, we propose a neural network training algorithm based on the FAL method, called the FAL-BP algorithm. We introduce a specific Caputo differential operator to update the weight sequence of the network during training, and, with the help of the mean value theorem of calculus and inequality analysis techniques, we establish convergence results for the FAL-BP neural network. Furthermore, numerical experiments show that there is an optimal fractional order for each problem.

The main contributions of this thesis are as follows:

1. A FAL-BP neural network algorithm is proposed. Under suitable conditions on the activation functions and the learning rate, we combine the FAL method and the BP algorithm to update the weights, and the monotonic decrease of the error function is guaranteed.

2. The strong (weak) convergence of the proposed fractional-order training algorithm is rigorously proved, which rules out divergent behavior from a theoretical point of view.

3. Numerical simulations are reported to illustrate the effectiveness of the proposed FAL-BP neural networks and to support the theoretical results. Benchmark UCI datasets are used to compare the performance of FAL-BP neural networks with that of BP neural networks trained by integer-order gradient descent. The FAL-BP algorithm performs better on the classification datasets.
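To make the idea concrete, the sketch below shows fractional gradient descent on a simple quadratic, not the thesis's exact FAL-BP update for network weights. It uses one common first-order truncation of the Caputo fractional gradient of order alpha in (0, 1); the choice of lower terminal `c`, the learning rate, and the test function are illustrative assumptions, not taken from the thesis.

```python
import numpy as np
from math import gamma

def caputo_fractional_gd(grad_f, w0, alpha=0.8, lr=0.2, c=None, steps=200):
    """Gradient descent driven by a Caputo-type fractional gradient.

    Uses the common first-order truncation
        D^alpha f(w) ~ f'(w) * |w - c|^(1 - alpha) / Gamma(2 - alpha),
    where c is the lower terminal of the fractional operator
    (here an illustrative assumption, set one unit below the start).
    """
    w = np.asarray(w0, dtype=float)
    if c is None:
        c = w - 1.0  # illustrative lower terminal
    for _ in range(steps):
        frac_grad = grad_f(w) * np.abs(w - c) ** (1.0 - alpha) / gamma(2.0 - alpha)
        w = w - lr * frac_grad
    return w

# Minimise f(w) = ||w||^2 / 2, whose gradient is w; the iterates
# approach the minimiser at the origin.
w_star = caputo_fractional_gd(lambda w: w, w0=[2.0, -1.5])
```

For alpha = 1 the scaling factor is 1 and the integer-order gradient descent step is recovered, which is how the fractional order interpolates between the two regimes the abstract compares.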
Keywords/Search Tags:Fractional Calculus, Back-propagation, Caputo Derivative, Monotonicity, Convergence