
Study On Global Stability Of Neural Networks For Solving Optimization Problem

Posted on: 2010-01-31
Degree: Master
Type: Thesis
Country: China
Candidate: C H Dan
Full Text: PDF
GTID: 2178360302959044
Subject: Operational Research and Cybernetics
Abstract/Summary:
In the 1980s, the well-known American physicists Hopfield and Tank proposed a novel artificial neural network for solving linear programming problems. Since then, increasing attention has been paid to the research and application of such networks. Compared with traditional optimization algorithms, the artificial neural network approach has several merits, such as a faster convergence rate, suitability for hardware implementation, and applicability to real-time control. To apply an artificial neural network to solve optimization problems, the network is required to be completely stable; that is, all of its output trajectories must converge to a stable equilibrium point or a set of stable equilibrium points. It is therefore valuable to study the stability of neural networks for solving programming problems.

This thesis is divided into five chapters. In Chapter 1, we review the research background and development of neural networks and show why it is necessary to investigate the stability of optimization neural networks.

In Chapter 2, in order to prepare for the neural networks proposed for solving programming problems, we introduce the basic theory of optimization problems.

In Chapter 3, we discuss the global stability of two classes of optimization neural networks. In Section 1, we investigate a projection neural network for solving degenerate quadratic programming problems; by constructing a suitable Lyapunov function, we present conditions that ensure the global exponential stability of this network. In Section 2, we investigate a primal-dual neural network for solving the convex quadratic program and its dual.

In Chapter 4, we discuss the global stability of a class of delayed Lagrange neural networks. By the Lyapunov functional method and the LaSalle invariance principle, we give some sufficient conditions ensuring the global exponential stability of this network.

In Chapter 5, we discuss the global stability of a class of delayed projection neural networks. By using Lyapunov functionals and linear matrix inequality (LMI) methods, we derive some sufficient conditions ensuring the global exponential stability of this network.
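For orientation, the following is a minimal sketch of the kind of model and stability notion summarized above. The generic projection dynamics, the feasible set Omega, the step size alpha, and the Lyapunov candidate V are illustrative assumptions for a standard projection neural network, not the thesis's exact formulations or conditions.

% Illustrative sketch (assumed generic form, not the thesis's exact model):
% a projection neural network for the convex quadratic program
%   min (1/2) x^T Q x + c^T x   subject to   x in Omega,
% where P_Omega denotes the projection onto the closed convex set Omega.
\begin{align*}
  \dot{x}(t) &= -x(t) + P_{\Omega}\bigl(x(t) - \alpha\,(Q\,x(t) + c)\bigr), \qquad \alpha > 0,\\
  V(x) &= \tfrac{1}{2}\,\lVert x - x^{*}\rVert^{2} \qquad \text{(a typical Lyapunov candidate, with $x^{*}$ an equilibrium point).}
\end{align*}
% Global exponential stability of the type discussed in Chapters 3-5 asserts that
%   \lVert x(t) - x^{*}\rVert \le M\,\lVert x(0) - x^{*}\rVert\, e^{-\varepsilon t}
% for some constants M > 0 and \varepsilon > 0.

Under this generic form, an equilibrium of the dynamics corresponds to a solution of the quadratic program, which is why Lyapunov-function arguments of the kind described above translate into convergence guarantees for the underlying optimization problem.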
Keywords/Search Tags:Neural networks, Optimization problem, Global stability, Global exponential stability, Variational inequality, Lyapunov function, LaSalle invariance principle, Linear matrix inequality