
Algorithm Design And Analysis For Fractional Order Complex-Valued Neural Networks

Posted on: 2019-11-20    Degree: Master    Type: Thesis
Country: China    Candidate: G L Yang    Full Text: PDF
GTID: 2428330620464858    Subject: Mathematics
Abstract/Summary:
Complex-valued neural networks are a class of networks that solve complex-domain problems by using complex-valued variables. The gradient descent method is one of the most popular algorithms for training complex-valued neural networks. At present, most established networks are integer-order models. Compared with classical integer-order models, models built in terms of fractional calculus possess significant advantages in both memory storage and hereditary characteristics. Based on the properties of fractional differential systems and the geometric meaning of complex numbers, a fractional-order complex-valued neural network has better memory properties than its integer-order counterpart.

This thesis is developed on the split-complex neural network (SCNN) model. The fractional derivative is used to train split-complex neural networks. Starting from the definition of the gradient of the error function with respect to the weights, two weight-update methods are proposed. The monotonicity of the proposed algorithms is established using the mean value theorem of calculus and inequality analysis techniques. Furthermore, the norms of the gradients of the error function with respect to the weights are proven to approach zero as the number of iterations tends to infinity. This property guarantees the deterministic convergence of the proposed algorithm from a mathematical point of view. In addition, numerical simulations effectively verify its competitive performance and illustrate the theoretical results.

The main contributions of this work are as follows:
1. A fractional-order complex-valued neural network algorithm is proposed. Using the fractional steepest descent method (FSDM), a fractional-order complex-valued BP neural network (FCBPNN) based on Faà di Bruno's formula is described.
2. A fractional-order complex-valued neural network learning algorithm based on the Caputo definition is proposed. Under suitable conditions on the activation functions and the learning rate, the split-complex back-propagation method is combined with the fractional-order derivative to update the weights (as sketched at the end of this abstract), and the monotone decrease of the error function is guaranteed.
3. The convergence of the proposed fractional-order complex-valued neural network algorithm is rigorously proved, which rules out divergent behavior from a theoretical point of view.
4. Numerical simulations are reported to illustrate the effectiveness of the proposed fractional-order complex-valued neural network algorithm and to support the theoretical results. The XOR problem is used to compare the error curves of complex BP neural networks with different fractional orders and with the integer order.
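To make the weight-update idea concrete, the following is a minimal sketch combining the standard Caputo fractional derivative with a generic fractional-gradient update applied separately to the real and imaginary parts of a split-complex weight. The symbols E (error function), w = w^R + i w^I (weight), η (learning rate), α (fractional order), and the lower terminal c are generic notation assumed here for illustration; the thesis's exact update rule and conditions may differ.

\[
{}^{C}_{c}D_{w}^{\alpha}E(w) \;=\; \frac{1}{\Gamma(1-\alpha)}\int_{c}^{w}\frac{E'(\tau)}{(w-\tau)^{\alpha}}\,d\tau,
\qquad 0<\alpha<1,
\]
\[
w^{R}_{k+1} \;=\; w^{R}_{k}-\eta\,{}^{C}_{c}D_{w^{R}}^{\alpha}E\Big|_{w^{R}=w^{R}_{k}},
\qquad
w^{I}_{k+1} \;=\; w^{I}_{k}-\eta\,{}^{C}_{c}D_{w^{I}}^{\alpha}E\Big|_{w^{I}=w^{I}_{k}},
\]
\[
{}^{C}_{c}D_{w}^{\alpha}E \;\approx\; \frac{\partial E}{\partial w}\cdot\frac{|w-c|^{1-\alpha}}{\Gamma(2-\alpha)}
\quad\text{(a commonly used first-order approximation).}
\]

Taking α → 1 recovers the classical split-complex BP gradient step, which is why error curves for different fractional orders can be compared directly against the integer-order case, e.g. on the XOR problem.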
Keywords/Search Tags:Complex-Valued Neural Networks, Fractional Derivative, Monotonicity, Convergence