
Research and Analysis of Input-Weights-Dependent Complex-Valued Neural Network Algorithms

Posted on: 2020-11-10    Degree: Master    Type: Thesis
Country: China    Candidate: Q Liu    Full Text: PDF
GTID: 2518306500483444    Subject: Mathematics
Abstract/Summary:
The complex-valued neural network is a type of neural network extended from the real to the complex domain. The fully complex extreme learning machine (CELM) is an efficient algorithm that converges faster than the complex backpropagation (CBP) algorithm; however, it requires more hidden neurons to reach competitive performance. An upper-layer solution-aware (USA) algorithm has been proposed for training single-hidden-layer feedforward neural networks, which can be seen as an effective compromise between BP and ELM: it performs much better than its counterparts, the extreme learning machine (ELM) and gradient-descent-based backpropagation (BP) networks. Combining the advantages of the CBP and CELM algorithms, this paper proposes two input-weights-dependent algorithms for training complex-valued neural networks, which effectively resolve the non-analyticity of activation functions during network training. The input-weights-dependent complex-valued (IWDCV) learning algorithm, based on Wirtinger calculus, handles the non-analyticity of common activation functions during training. The monotonicity of the proposed algorithms is established using Wirtinger calculus and the Taylor formula. Furthermore, the norms of the gradients of the error function with respect to the weights are proven to tend to zero as the number of iterations tends to infinity.

The main contributions of this work are as follows:

1. We propose an input-weights-dependent split complex-valued algorithm for typical single-hidden-layer split complex-valued networks. During training, the input weights are iteratively updated by gradient descent, while the output weights are always treated as a nonlinear mapping of the input weights, expressed through the Moore-Penrose generalized inverse. This algorithm resolves the conflict between analyticity and boundedness of activation functions in the complex field. Detailed experiments comparing it with CELM and CBP show that the proposed algorithm achieves better generalization ability with a more compact architecture.

2. We propose an input-weights-dependent fully complex-valued algorithm based on Wirtinger calculus for typical single-hidden-layer fully complex-valued networks. During training, only the input weights are unknown parameters that need to be iteratively tuned by gradient descent; the output weights are determined by the updated input-weight sequence through a least-squares step, giving a strongly nonlinear relationship between the input and output weights. The algorithm works with well-defined fully complex derivatives.

3. The convergence of the proposed input-weights-dependent fully complex-valued algorithm based on Wirtinger calculus is rigorously proved, ruling out divergent behavior from a theoretical point of view.

4. Numerical simulations of the input-weights-dependent fully complex-valued algorithm, including regression and classification problems, have been carried out to compare its error curves with those of other algorithms.
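The input-weights-dependent scheme described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the thesis's actual implementation: it assumes a split tanh activation and a squared-error loss, and treats the output weights as the Moore-Penrose least-squares solution recomputed after every gradient step on the input weights; the helper names (`split_tanh`, `train_iwdcv`) are hypothetical.

```python
import numpy as np

def split_tanh(z):
    # Split activation: tanh applied to the real and imaginary parts
    # separately. It is bounded on the whole complex plane, side-stepping
    # the analyticity-vs-boundedness conflict of fully complex activations.
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def train_iwdcv(X, T, n_hidden=10, lr=0.005, epochs=200, seed=0):
    # Only the input weights W are free parameters; the output weights V
    # are always the least-squares solution V = pinv(H) @ T, i.e. a
    # nonlinear mapping of W, recomputed after every update of W.
    rng = np.random.default_rng(seed)
    n_in = X.shape[1]
    W = 0.1 * (rng.standard_normal((n_in, n_hidden))
               + 1j * rng.standard_normal((n_in, n_hidden)))
    losses = []
    for _ in range(epochs):
        Z = X @ W                      # hidden pre-activations
        H = split_tanh(Z)              # hidden-layer outputs
        V = np.linalg.pinv(H) @ T      # output weights as a mapping of W
        err = H @ V - T
        losses.append(0.5 * np.sum(np.abs(err) ** 2))
        # Gradient of the reduced objective L(W) = min_V 0.5*||H(W)V - T||^2.
        # Because V is the least-squares minimizer, the partial gradient with
        # V held fixed equals the full input-weights-dependent gradient
        # (envelope-theorem argument; a simplification of the thesis's
        # Wirtinger-calculus derivation).
        G = err @ V.conj().T           # gradient w.r.t. conj(H)
        Gz = ((1 - np.tanh(Z.real) ** 2) * G.real
              + 1j * (1 - np.tanh(Z.imag) ** 2) * G.imag)
        W = W - lr * (X.conj().T @ Gz)
    V = np.linalg.pinv(split_tanh(X @ W)) @ T
    return W, V, np.array(losses)

# Toy complex regression: target is the product of the two input coordinates.
rng = np.random.default_rng(42)
X = rng.standard_normal((50, 2)) + 1j * rng.standard_normal((50, 2))
T = (X[:, 0] * X[:, 1]).reshape(-1, 1)
W, V, losses = train_iwdcv(X, T, n_hidden=20)
```

Note the design choice the abstract emphasizes: because V is re-solved by the pseudoinverse at every step, gradient descent effectively runs on the reduced objective over W alone, which is what distinguishes this scheme from CBP (all weights by gradient descent) and CELM (input weights fixed at random).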
Keywords/Search Tags: Complex-valued, Backpropagation, Extreme learning machine, Wirtinger calculus, Convergence