
Ordinary differential equation methods for some optimization problems

Posted on: 2007-08-31
Degree: Ph.D
Type: Thesis
University: Hong Kong Baptist University (Hong Kong)
Candidate: Zhang, Quanju
Full Text: PDF
GTID: 2440390005969566
Subject: Mathematics
Abstract/Summary:
The contributions of this thesis are to propose new neural network models based on ordinary differential equation systems (ODEs) and to develop a differential-algebraic equation (DAE) method for some optimization problems. Ordinary differential equation systems can be related to neural network models, usually implemented by analog integrated circuits, through which scientific computation can be carried out online in real time. The ODE-based neural network method has therefore attracted broad attention since it was first proposed in the 1980s, and various neural network models for different optimization problems have been studied over the past three decades. The main feature of the neural network approach is that a continuous path starting from the initial point is generated, and this path eventually converges to a solution of the optimization problem. The DAE method also differs from traditional optimization algorithms, in which a sequence of iterative points is generated to find the optimal solution: instead, a set of differential equations coupled with a set of algebraic equations is employed to track the optimization problem's solution. This thesis focuses on both the ODE-based neural network method and the DAE method for several optimization problems.

The first contribution of the thesis is a novel recurrent time-continuous neural network model that performs nonlinear fractional optimization subject to interval constraints on each of the optimization variables. The network is proved to be complete in the sense that the set of minimizers of the objective function under the interval constraints coincides with the set of equilibria of the neural network. It is also shown that the network is primal and globally convergent: its trajectory cannot escape from the feasible region and converges to an exact optimal solution for any initial point chosen in the feasible interval region.
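The abstract does not give the network's governing equations. As an illustration only, a standard projected gradient flow of the kind such models are built on can be sketched for a linear fractional objective over a box; the problem data, step size, and iteration count below are all hypothetical, not taken from the thesis:

```python
import numpy as np

# Hypothetical example: minimize the linear fractional objective
#   f(x) = (c^T x + c0) / (d^T x + d0)
# over the box l <= x <= u (the "interval constraints" of the abstract).
c, c0 = np.array([2.0, -1.0]), 1.0
d, d0 = np.array([1.0, 1.0]), 3.0
l, u = np.array([0.0, 0.0]), np.array([2.0, 2.0])

def grad_f(x):
    # Quotient-rule gradient of f; the denominator is positive on the box.
    num, den = c @ x + c0, d @ x + d0
    return (c * den - d * num) / den**2

def project(x):
    # Projection onto the feasible box: clip each coordinate to [l_i, u_i].
    return np.clip(x, l, u)

# Projected gradient flow  dx/dt = P(x - grad f(x)) - x, integrated with
# forward Euler; the trajectory cannot leave the feasible region, and its
# equilibria are exactly the constrained stationary points.
x = np.array([1.0, 1.0])              # any feasible initial point
for _ in range(5000):
    x = x + 0.01 * (project(x - grad_f(x)) - x)
# x approaches the box-constrained minimizer of f
```

For this data the trajectory settles at the corner (0, 2), where the fractional objective attains its constrained minimum.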
Simulation results further demonstrate the global convergence and good performance of the proposed neural network on nonlinear fractional programming problems with interval constraints.

A differential-algebraic equation method for solving convex quadratic programming is also proposed in this thesis. With this method, optimal solutions are located by tracking the trajectories of a set of ordinary differential equations coupled with a set of algebraic equations. It is proved that the DAE algorithm converges to an optimal solution in finite time when the optima all lie on a face of the feasible set. Furthermore, in the process of deriving numerical schemes for the proposed DAEs, the well-known path-following interior-point method is recovered, and hence it can be viewed as a special case of the new DAE method.

Illustrative numerical results indicate that the proposed DAE method provides an alternative to both traditional optimization methods and the neural network method for solving convex quadratic programming problems. This is the second contribution of the thesis.
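The thesis's DAE formulation is not spelled out in the abstract. As a minimal sketch of the general idea (a differential equation for the primal variables coupled with an algebraic condition that must hold along the whole trajectory), consider an equality-constrained convex QP; the matrices, starting point, and step size here are invented for illustration:

```python
import numpy as np

# Hypothetical equality-constrained convex QP:
#   minimize 1/2 x^T Q x + q^T x   subject to   A x = b.
Q = np.array([[4.0, 1.0], [1.0, 3.0]])
q = np.array([1.0, 1.0])
A = np.array([[1.0, 1.0]])        # single constraint x1 + x2 = 1
b = np.array([1.0])

def rhs(x):
    # Differential part coupled with the algebraic condition A x = b:
    # pick the multiplier lam so the velocity stays in null(A), i.e.
    # solve  (A A^T) lam = -A (Q x + q)  at every instant.
    g = Q @ x + q
    lam = np.linalg.solve(A @ A.T, -A @ g)
    return -(g + A.T @ lam)       # projected negative gradient

# Forward Euler from a feasible start; the trajectory remains on the
# constraint manifold and descends the objective toward the KKT point.
x = np.array([1.0, 0.0])          # satisfies A x = b
for _ in range(2000):
    x = x + 0.01 * rhs(x)
# x approaches the constrained minimizer
```

For this data the KKT conditions give the minimizer (0.4, 0.6); the algebraic equation A x = b is preserved exactly by construction, since the velocity is always orthogonal to the constraint normals.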
Keywords/Search Tags: Ordinary differential equation, Neural network, Method, Optimization, Thesis