
Research On Neural Network Model For Solving L1-norm Problem

Posted on: 2019-05-25
Degree: Doctor
Type: Dissertation
Country: China
Candidate: C P Li
Full Text: PDF
GTID: 1368330545974042
Subject: Operational Research and Cybernetics
Abstract/Summary:
Unlike traditional numerical methods, neural networks offer parallel processing, distributed storage, and related advantages. Since Hopfield and Tank first applied neural networks to linear programming problems in the 1980s, the use of neural networks for solving optimization problems has received wide attention, and important results have been obtained. However, many existing models still suffer from shortcomings such as a large number of variables and high complexity. Motivated by the current state of research on l1-norm problems, this dissertation studies neural networks for solving them. In addition, since l1-norm problems can be reformulated as minimax problems, a class of minimax problems is also studied. Feasible and effective neural networks are proposed for both classes of problems, and their behavior is rigorously analyzed. The proposed models have good stability and convergence properties and overcome the shortcomings of existing models. The specific contributions are as follows:

1. A new neural network for a class of minimax problems with equality constraints is designed by using variable substitution and the saddle point theorem to construct two projective equations equivalent to the saddle point conditions. A suitable energy function is constructed, and the model is proved to be Lyapunov stable when the objective is convex-concave on the set defined by the equality constraints. For any initial point, the state trajectory converges to an equilibrium point of the system, and the output trajectory converges to an exact saddle point of the minimax problem. Compared with some existing models, the proposed network requires the fewest neurons, has low complexity, and needs weaker stability conditions. Since it can be applied to a large class of optimization and related problems, it has considerable application value.

2. A new neural network for a special class of l1-norm problems with equality constraints is designed by introducing a new variable and using variable substitution. Compared with some existing neural networks, the model has the fewest neurons and low complexity. Its stability and asymptotic stability are proved by introducing a Lyapunov function.

3. A new one-layer neural network is designed for a class of least absolute deviation problems with inequality constraints by using variable substitution and the saddle point theorem; the problem is transformed into a minimax problem by introducing a new variable. Compared with some existing neural networks, the proposed model requires fewer neurons and a simpler network structure. Its stability and asymptotic stability are proved by introducing a Lyapunov function.

4. A new one-layer neural network is designed for a class of least absolute deviation problems with equality constraints by using variable substitution and the saddle point theorem; the problem is transformed into a minimax problem by introducing a new variable. Compared with some existing neural networks, the proposed model requires the fewest neurons and a simpler network structure. Its stability and asymptotic stability are proved by introducing a Lyapunov function.

Simulation results demonstrate the validity of the proposed models, and the neural networks for l1-norm problems perform well on the image recovery problem.
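To make the saddle-point construction above concrete, the following is a minimal numerical sketch, not the dissertation's model: the problem min ||x||_1 s.t. Ax = b is rewritten via ||x||_1 = max_{|u_i|<=1} u^T x as a minimax problem whose saddle points satisfy a projective equation z = P_Omega(z - tau*F(z)); the sketch solves that equation with a standard Korpelevich extragradient iteration rather than a continuous-time network. All names and parameters (solve_l1_equality, tau, iters) are illustrative assumptions.

```python
import numpy as np

def project(z, n, m):
    """Project z = (x, u, y) onto Omega = R^n x [-1,1]^n x R^m."""
    x, u, y = z[:n], z[n:2*n], z[2*n:]
    return np.concatenate([x, np.clip(u, -1.0, 1.0), y])

def saddle_operator(z, A, b):
    """Monotone operator F of the saddle-point conditions of
    min_x max_{u in [-1,1]^n, y}  u^T x + y^T (A x - b)."""
    m, n = A.shape
    x, u, y = z[:n], z[n:2*n], z[2*n:]
    Fx = u + A.T @ y          # gradient of the Lagrangian in x
    Fu = -x                   # negative gradient in u (u is a max variable)
    Fy = -(A @ x - b)         # negative gradient in y (y is a max variable)
    return np.concatenate([Fx, Fu, Fy])

def solve_l1_equality(A, b, iters=50000):
    """Extragradient iteration on the projective equation
    z = P_Omega(z - tau * F(z)); step tau < 1/L, L = Lipschitz constant of F."""
    m, n = A.shape
    L = np.sqrt(1.0 + np.linalg.norm(A, 2) ** 2)
    tau = 0.9 / L
    z = np.zeros(2 * n + m)
    for _ in range(iters):
        z_half = project(z - tau * saddle_operator(z, A, b), n, m)
        z = project(z - tau * saddle_operator(z_half, A, b), n, m)
    return z[:n]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((8, 20))
    x_true = np.zeros(20)
    x_true[[1, 5, 11]] = [1.5, -2.0, 0.7]   # sparse ground truth
    b = A @ x_true
    x_hat = solve_l1_equality(A, b)
    print("||A x - b|| =", np.linalg.norm(A @ x_hat - b))
    print("||x||_1     =", np.linalg.norm(x_hat, 1))
```

The extragradient step is used here only because its convergence for this monotone, Lipschitz saddle-point operator is standard; the dissertation's networks instead follow continuous-time dynamics whose stability is established with Lyapunov functions.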
Keywords/Search Tags: l1-norm problems, minimax problems, saddle points, neural networks, Lyapunov function, stability, convergence