
Optimization Of Convolutional Neural Networks Based On Fractional-order Momentum Method And Its Application In Image Recognition

Posted on: 2022-11-21
Degree: Master
Type: Thesis
Country: China
Candidate: J Jian
Full Text: PDF
GTID: 2518306773480424
Subject: Automation Technology
Abstract/Summary:
Convolutional neural networks are widely used in image recognition. Different momentum methods can improve recognition accuracy, and various techniques are used to avoid overfitting and underfitting. To improve recognition accuracy, fractional-order methods can be combined with convolutional neural networks. This thesis introduces fractional calculus theory into the structural design and weight optimization of convolutional neural networks. The fractional-order hyperparameter can be adjusted adaptively to improve network performance. Using the Tustin generating function, the Borges fractal derivative, and the Hausdorff fractal derivative, the weights and parameters of the convolutional neural network are optimized, allowing the weight and bias parameters to be tuned more flexibly. The main contributions are as follows.

1) A fractional-order average momentum method based on the Tustin generating function is proposed to train the parameters of a convolutional neural network during backpropagation. Using the classical MNIST dataset as training and test data, the accuracy and effectiveness of this method for image recognition with convolutional neural networks are verified. The experimental results show that the fractional-order average momentum method based on the Tustin generating function improves both the recognition accuracy and the convergence speed of the network.

2) An adaptive momentum algorithm based on the Borges difference is proposed. The adaptive moment estimation (Adam) algorithm is used to update the weight and bias parameters, allowing the momentum information during training to be adjusted more flexibly. The Borges difference form is derived from the definition of the Borges fractal derivative and combined with the gradient algorithm of the convolutional neural network, yielding a nonlinear adjustment method for the network's parameters based on the Borges difference. Experiments on the Fashion-MNIST and CIFAR-10 datasets show that the proposed nonlinear optimization algorithm based on the Borges difference optimizes the model better than the corresponding integer-order optimization algorithm, accelerating convergence and improving recognition accuracy.

3) A root mean square propagation (RMSProp) algorithm based on the Borges difference and an RMSProp algorithm based on the Hausdorff difference are presented for adjusting the parameters of a convolutional neural network. Experiments on the CIFAR-10 dataset show that both proposed RMSProp variants achieve better training results than the integer-order RMSProp algorithm, and they allow the convergence speed to be tuned more conveniently.
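The fractional-order momentum update of contribution 1) can be sketched in Python. The abstract does not give the explicit Tustin-discretized coefficients, so this illustrative sketch substitutes Grünwald-Letnikov binomial weights for the fractional difference operator; the names `gl_coeffs` and `fractional_momentum_sgd`, the truncated memory length, and all hyperparameter values are assumptions for illustration, not the thesis's exact formulation.

```python
import numpy as np

def gl_coeffs(alpha, n):
    """First n Grünwald-Letnikov coefficients c_j = (-1)^j * C(alpha, j),
    computed by the standard recurrence c_j = c_{j-1} * (j - 1 - alpha) / j."""
    c = np.empty(n)
    c[0] = 1.0
    for j in range(1, n):
        c[j] = c[j - 1] * (j - 1 - alpha) / j
    return c

def fractional_momentum_sgd(grad_fn, w0, alpha=0.9, lr=0.1, memory=5, steps=100):
    """Gradient descent whose momentum term is a fractional-order (order alpha)
    weighted sum of the most recent gradients, truncated to `memory` terms
    (the short-memory principle), instead of a plain exponential average."""
    w = np.asarray(w0, dtype=float)
    c = gl_coeffs(alpha, memory)
    history = []  # most recent gradient first
    for _ in range(steps):
        g = grad_fn(w)
        history.insert(0, g)
        history = history[:memory]
        v = sum(cj * gj for cj, gj in zip(c, history))
        w = w - lr * v
    return w
```

Truncating the history implements the short-memory principle commonly used with fractional-order operators: the full fractional difference would weight every past gradient, which is impractical during training.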
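Contribution 2) combines the Borges difference with Adam. The Borges q-derivative of f at x is (1 + qx) f'(x); one illustrative reading is to deform each raw gradient by such a factor before the standard Adam moment updates. The deformation via |w|, the function name `adam_borges`, and every constant below are assumptions made for this sketch, not the thesis's exact update rule.

```python
import numpy as np

def adam_borges(grad_fn, w0, lr=0.02, b1=0.9, b2=0.999, eps=1e-8,
                q=0.1, steps=600):
    """Standard Adam, except each raw gradient is first deformed by a
    Borges-style factor (1 + q * |w|) (illustrative assumption)."""
    w = np.asarray(w0, dtype=float)
    m = np.zeros_like(w)
    v = np.zeros_like(w)
    for t in range(1, steps + 1):
        g = (1.0 + q * np.abs(w)) * grad_fn(w)  # Borges-deformed gradient
        m = b1 * m + (1.0 - b1) * g             # first-moment estimate
        v = b2 * v + (1.0 - b2) * g * g         # second-moment estimate
        m_hat = m / (1.0 - b1 ** t)             # bias correction
        v_hat = v / (1.0 - b2 ** t)
        w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w
```

Because q scales with the current parameter value, the deformation acts as the kind of nonlinear, state-dependent adjustment of momentum information the abstract describes; setting q = 0 recovers plain Adam.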
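For contribution 3), the Hausdorff fractal derivative of f with respect to x^beta is (x^(1-beta)/beta) f'(x). An illustrative RMSProp variant therefore rescales the gradient by |w|^(1-beta)/beta inside an otherwise standard RMSProp loop; the use of |w|, the name `rmsprop_hausdorff`, and the hyperparameter values are assumptions of this sketch, not the thesis's exact method.

```python
import numpy as np

def rmsprop_hausdorff(grad_fn, w0, lr=0.05, rho=0.9, eps=1e-8,
                      beta=0.8, steps=300):
    """Standard RMSProp where each gradient is rescaled by the Hausdorff
    fractal-derivative factor |w|^(1 - beta) / beta (illustrative)."""
    w = np.asarray(w0, dtype=float)
    s = np.zeros_like(w)  # running average of squared gradients
    for _ in range(steps):
        g = grad_fn(w) * (np.abs(w) ** (1.0 - beta)) / beta
        s = rho * s + (1.0 - rho) * g * g
        w = w - lr * g / (np.sqrt(s) + eps)
    return w
```

Since RMSProp divides by the root of the running squared-gradient average, a smooth rescaling of the gradient mainly changes the transient behavior, which is consistent with the abstract's claim that the fractal-difference variants let the convergence speed be tuned; beta = 1 recovers integer-order RMSProp.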
Keywords/Search Tags: Borges difference, Convolutional neural networks, Momentum, Adam algorithm, Nonlinear adaptive tuning method