
Research on Neural Network Algorithms for a Class of Non-Lipschitz Optimization Problems

Posted on: 2018-07-16
Degree: Master
Type: Thesis
Country: China
Candidate: W J Li
Full Text: PDF
GTID: 2348330536482365
Subject: Basic mathematics
Abstract/Summary:
This thesis studies a class of non-smooth, non-convex, non-Lipschitz continuous optimization problems that arise in image restoration, variable selection, signal processing, and related applications. Because existing algorithms for such problems are not fully satisfactory in effectiveness, convergence, and stability, the main goal of this thesis is to construct a neural network algorithm with a fast convergence rate, a simple and stable structure, and a complete convergence analysis.

The thesis first generalizes the notion of a Clarke stationary point and defines a generalized stationary point of the optimization model with stronger optimality properties. A smoothing function is constructed for the non-Lipschitz term of the objective function, overcoming its non-smoothness, and the constraint is handled by a projection operator; on this basis a neural network for the optimization model is built. The relationship between the unconstrained model and the constrained model whose generalized stationary points lie in the feasible region is also analyzed.

Under the condition that the level set of the objective function is bounded, it is proved that the solution of the proposed neural network exists globally, is uniformly bounded, and is unique, and that any accumulation point of the trajectory is a generalized stationary point of the optimization model. Under suitable additional conditions, this accumulation point is shown to be unique. It is further proved that the neural network associated with optimization models satisfying certain properties can be transformed into a smoothing gradient system, and a class of optimization models whose generalized stationary points are interior points of the feasible region is identified. In addition, since the Kurdyka-Łojasiewicz exponent is an important tool for analyzing the convergence rate of algorithms, the objective functions of a class of optimization models are shown to satisfy the Kurdyka-Łojasiewicz property within a certain range. Finally, the convergence of the neural network is verified on several numerical examples.
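To make the construction described above concrete, the Python sketch below illustrates one common way such a method is set up: the non-Lipschitz penalty sum |x_i|^p is replaced by a smooth surrogate, the feasible set is handled by a projection operator, and the resulting "neural network" (a projection-type gradient flow) is integrated by forward Euler. This is a minimal sketch under illustrative assumptions; the specific model, the surrogate (x_i^2 + mu^2)^(p/2), the box feasible set, and all parameter values are not the thesis's exact formulation.

import numpy as np

# Hedged sketch: the model, smoothing surrogate, box constraint and parameter
# values are illustrative assumptions, not the thesis's exact formulation.

def smoothed_penalty(x, p=0.5, mu=1e-2):
    """Smooth surrogate of the non-Lipschitz term sum_i |x_i|^p:
    (x_i^2 + mu^2)^(p/2) is smooth for mu > 0 and tends to |x_i|^p as mu -> 0."""
    return np.sum((x**2 + mu**2) ** (p / 2))

def grad_smoothed_penalty(x, p=0.5, mu=1e-2):
    """Gradient of the smoothed penalty."""
    return p * x * (x**2 + mu**2) ** (p / 2 - 1)

def smoothed_objective(A, b, x, lam=0.1, p=0.5, mu=1e-2):
    """f_mu(x) = ||Ax - b||^2 + lam * sum_i (x_i^2 + mu^2)^(p/2)."""
    r = A @ x - b
    return r @ r + lam * smoothed_penalty(x, p, mu)

def project_box(x, lo=-1.0, hi=1.0):
    """Projection onto a box feasible region (stands in for the projection operator)."""
    return np.clip(x, lo, hi)

def projection_neural_network(A, b, lam=0.1, p=0.5, mu=1e-2,
                              alpha=1e-2, dt=1e-2, steps=5000):
    """Forward-Euler discretisation of the projection-type gradient flow
       dx/dt = -x + P_Omega(x - alpha * grad f_mu(x))."""
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = 2 * A.T @ (A @ x - b) + lam * grad_smoothed_penalty(x, p, mu)
        x = x + dt * (-x + project_box(x - alpha * grad))
    return x

# Usage on synthetic data: the smoothed objective should decrease along the flow.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
x_true = np.zeros(50)
x_true[:3] = [0.8, -0.6, 0.5]
b = A @ x_true
x_hat = projection_neural_network(A, b)
print("smoothed objective at start:", smoothed_objective(A, b, np.zeros(50)))
print("smoothed objective at end:  ", smoothed_objective(A, b, x_hat))

In the thesis's setting, the smoothing parameter would be driven toward zero (or otherwise incorporated into the analysis) so that accumulation points of the trajectory can be related to generalized stationary points of the original non-Lipschitz problem; the fixed mu above is kept only to keep the sketch short.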
Keywords/Search Tags: non-Lipschitz continuous, neural network, generalized stationary point, convergence