
Research On Momentum Optimization Methods Of Convolutional Neural Network Based On Fractional-order Calculus Theory

Posted on: 2022-02-16
Degree: Master
Type: Thesis
Country: China
Candidate: T Kan
GTID: 2518306314493584
Subject: Operational Research and Cybernetics

Abstract/Summary:
The recognition accuracy and convergence speed are the key indicators for verifying the effectiveness of image recognition technology. To effectively improve the recognition accuracy and convergence speed of convolutional neural networks (CNNs), the insufficient adaptability and stability of the stochastic classical momentum (SCM) algorithm and the adaptive moment estimation (Adam) algorithm must be addressed. Therefore, different fractional-order operations are used to establish momentum optimization methods for neural networks based on fractional-order calculus theory. The main work is as follows:

(1) A parameter training method via the Grünwald-Letnikov (G-L) difference is proposed for CNNs. To update the parameters of the CNN more smoothly, a fractional-order momentum training method is constructed from the G-L difference operation: the SCM and Adam algorithms are improved by replacing the integer-order difference with the fractional-order difference (a minimal sketch of this update appears after the abstract). Meanwhile, linear and nonlinear methods for adjusting the fractional order are discussed. The proposed methods therefore improve the flexibility and adaptability of CNN parameter updates.

(2) For the G-L-based fractional-order training method, the historical momentum information is effectively truncated and an appropriate interval for the order is selected. Meanwhile, two different adaptive adjustment methods are used to tune the fractional order. The validity of the model is verified on the MNIST and CIFAR-10 datasets: compared with the traditional SGD and Adam methods, the proposed methods improve the recognition accuracy and the convergence speed of the CNN.

(3) Parameter training methods via the Hausdorff difference are proposed for CNNs. To update the parameters of the CNN more smoothly, training methods based on the Hausdorff difference operation are proposed that self-adaptively adjust the momentum and the learning rate; the SCM and Adam algorithms are improved by replacing the integer-order difference with the Hausdorff difference of the chosen orders. Meanwhile, two types of adaptive adjustment methods for setting the order are discussed. The CNN can thus be trained smoothly by retaining the momentum information and self-adaptively adjusting the learning rate, which improves the stability and adaptability of CNN parameter updating.

(4) For the Hausdorff-difference-based training method, an appropriate interval for the order is determined by analyzing the influence of the order on the momentum information. Meanwhile, the order is adjusted adaptively according to the updates of the CNN parameters. The validity of the proposed tuning methods is verified on the Fashion-MNIST and CIFAR-10 datasets: compared with the integer-order SGD and Adam algorithms, the proposed Hausdorff-difference methods improve the recognition accuracy and the convergence speed of the CNN.
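The abstract does not give the update rules explicitly, so the following is a minimal illustrative sketch, not the thesis's exact method. It assumes the fractional-order momentum replaces the first-order difference of classical momentum, v_t - mu*v_{t-1} = -lr*g_t, with a truncated G-L difference sum_{j=0}^{K} c_j v_{t-j} = -lr*g_t, where c_j = (-1)^j * binom(alpha, j). All names (FractionalMomentum, history_len, alpha) are hypothetical.

    # Minimal sketch of fractional-order (G-L) momentum; assumptions as
    # stated in the lead-in. Not the thesis's exact formulation.
    from collections import deque
    import numpy as np

    def gl_coefficients(alpha, K):
        # c_j = (-1)^j * binom(alpha, j) via the recursion
        # c_0 = 1, c_j = c_{j-1} * (1 - (alpha + 1) / j)
        c = np.empty(K + 1)
        c[0] = 1.0
        for j in range(1, K + 1):
            c[j] = c[j - 1] * (1.0 - (alpha + 1.0) / j)
        return c

    class FractionalMomentum:
        # Solves the truncated G-L relation
        #   sum_{j=0}^{K} c_j v_{t-j} = -lr * g_t
        # for the new velocity v_t (c_0 = 1), then applies it.
        def __init__(self, lr=0.01, alpha=0.9, history_len=5):
            self.lr = lr
            self.c = gl_coefficients(alpha, history_len)
            self.history = deque(maxlen=history_len)  # v_{t-1}, v_{t-2}, ...

        def step(self, params, grad):
            v = -self.lr * grad
            for j, v_past in enumerate(self.history, start=1):
                v -= self.c[j] * v_past  # weighted, truncated momentum history
            self.history.appendleft(v)
            return params + v

    # Toy usage: minimize f(w) = ||w||^2 / 2, whose gradient is w itself.
    opt = FractionalMomentum(lr=0.1, alpha=0.9, history_len=5)
    w = np.array([5.0, -3.0])
    for _ in range(200):
        w = opt.step(w, grad=w)
    print(w)  # should approach the origin

For alpha = 1 the recursion gives c_1 = -1 and c_j = 0 for j >= 2, so the rule collapses to the integer-order momentum step v_t = v_{t-1} - lr*g_t; for 0 < alpha < 1 the older velocities enter with small decaying weights, which corresponds to the "truncated historical momentum information" the abstract refers to.

The Hausdorff-difference variant is described only qualitatively in the abstract. One common reading of the Hausdorff (structural) derivative is that the iteration step t -> t+1 is measured on the scale t^alpha, so the effective step size is scaled by (t+1)^alpha - t^alpha; the snippet below illustrates only that scaling idea and is an assumption, not the thesis's rule.

    # Hedged sketch: momentum step whose learning rate is scaled by the
    # Hausdorff measure of one iteration, delta_t = (t+1)^alpha - t^alpha,
    # which decays over iterations for 0 < alpha < 1 (an assumption).
    def hausdorff_step(w, grad, v, t, lr=0.01, mu=0.9, alpha=0.8):
        delta_t = (t + 1) ** alpha - t ** alpha
        v = mu * v - lr * delta_t * grad  # self-adapted effective learning rate
        return w + v, v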
Keywords/Search Tags: Convolutional neural network, Grünwald-Letnikov difference, Hausdorff difference, Adaptive adjustment, Recognition accuracy