
Research On Complex Gradient Learning Algorithms Based On Complex Stepsize

Posted on: 2020-04-23
Degree: Master
Type: Thesis
Country: China
Candidate: L H Xu
Full Text: PDF
GTID: 2428330602458089
Subject: Mathematics
Abstract/Summary:
In recent years, great progress has been made in the study of complex-valued neural networks. In training complex-valued neural networks, the most popular algorithm is the complex gradient learning algorithm, which can be implemented in batch mode or online mode. However, this algorithm still has shortcomings, such as easily falling into local minima and slow convergence. In general, the stepsize of the complex gradient learning algorithm is set to a positive real number. Compared with a real-valued learning rate, a complex-valued learning rate has several advantages. First, the search space of the complex gradient learning algorithm is expanded from a half line to a half plane, making the learning process freer. Second, the Hessian matrix information of the objective function can be approximated more effectively, which makes the algorithm more accurate. Third, the convergence of the complex gradient learning algorithm can be accelerated by escaping from saddle points faster. This thesis makes a further study of the complex gradient learning algorithm with complex stepsize. The specific work is as follows:

(1) To optimize the complex gradient learning algorithm, we study the complex gradient learning algorithm with complex stepsize. We first introduce a batch complex gradient learning algorithm with complex stepsize and adaptive momentum, and prove that the gradient of the error function tends to zero and that the weight sequence converges to a fixed point. Simulations show that the complex gradient learning algorithm with an adaptive momentum term and a complex learning rate is superior to the complex gradient learning algorithm with a complex learning rate alone.

(2) When sample redundancy is high, the online learning mode is a better choice than the batch learning mode. We therefore extend the convergence results of the batch complex gradient learning algorithm with complex stepsize to the online learning mode, and verify the effectiveness of the algorithm both theoretically and numerically.

(3) To improve the convergence rate of the online complex gradient learning algorithm, we further analyze the online complex gradient algorithm with momentum and complex stepsize. Simulation results on an approximation problem show that the online complex gradient learning algorithm with momentum terms converges faster than the corresponding algorithm without momentum.
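As a rough illustration of the batch update with complex stepsize and momentum described in (1), the following sketch minimises a toy quadratic E(w) = |w - t|^2 over a single complex weight w. The function name, the target, and the stepsize values are illustrative choices, not taken from the thesis; the Wirtinger gradient of E with respect to conj(w) is (w - t).

```python
def complex_gradient_descent(target, eta, mu=0.0, steps=200):
    """Minimise E(w) = |w - target|^2 over complex w, using a
    complex stepsize eta and a fixed momentum factor mu."""
    w = 0.0 + 0.0j       # initial weight
    delta = 0.0 + 0.0j   # previous update (momentum buffer)
    for _ in range(steps):
        grad = w - target              # dE/d conj(w), Wirtinger gradient
        delta = -eta * grad + mu * delta
        w = w + delta
    return w

# Real vs. complex stepsize on the same toy problem: both converge
# to the target when |1 - eta| < 1, but the complex stepsize searches
# over a half plane rather than a half line.
w_real = complex_gradient_descent(3 - 2j, eta=0.1)
w_cplx = complex_gradient_descent(3 - 2j, eta=0.1 + 0.05j)
```

For this quadratic the iteration contracts at rate |1 - eta| per step (with mu = 0), so any eta in the open disk |1 - eta| < 1 converges; a real stepsize only reaches the interval (0, 2) of that disk.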
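The online mode of (2) and (3) updates the weight after each individual sample rather than after a full sweep. A minimal sketch, assuming a single complex linear neuron with noise-free targets; the ground-truth weight, learning rate, and momentum factor below are illustrative values, not from the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)
w_true = 1.5 - 0.5j      # ground-truth weight (assumed for the demo)
w = 0.0 + 0.0j           # weight being learned
delta = 0.0 + 0.0j       # momentum buffer
eta = 0.05 + 0.02j       # complex learning rate (illustrative)
mu = 0.5                 # momentum factor (illustrative)

for _ in range(2000):    # one random sample per update: online mode
    x = rng.standard_normal() + 1j * rng.standard_normal()
    y = w_true * x
    grad = (w * x - y) * np.conj(x)   # Wirtinger gradient of |w*x - y|^2
    delta = -eta * grad + mu * delta
    w = w + delta
```

Because the gradient here is computed from one sample at a time, highly redundant data does not slow the updates down, which is the motivation given in (2) for preferring online over batch learning.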
Keywords/Search Tags:Complex-Valued Neural Networks, Complex Gradient Method, Momentum, Complex Learning Rate