
Optimization Algorithm And The Virtual Enterprise Partner Selection Problems

Posted on: 2009-02-22
Degree: Master
Type: Thesis
Country: China
Candidate: Y L Gu
Full Text: PDF
GTID: 2120360245999919
Subject: Computational Mathematics
Abstract/Summary:
A new assumption on the scalar of the conjugate gradient method is given to ensure that the search direction is a sufficient descent direction, and a new class of memory gradient methods with errors is presented. Because the new method accounts for errors and allows a wider range of the scalar, it is applicable to many difficult problems encountered in practice. Global convergence of the new algorithm is established under the assumptions that the gradient of the objective function is uniformly continuous and that the Armijo step size rule is used. By combining the quasi-Newton equation with the new method, a quasi-Newton method with errors is modified so that it possesses the global convergence property. Numerical results show that the new algorithms are efficient.

We propose a new non-monotone step size rule and analyze the global convergence of a Lampariello-modified diagonal-sparse quasi-Newton method with errors. The new rule is similar to the Grippo non-monotone step size rule and contains it as a special case. It permits a larger step size in each line search procedure while preserving the global convergence of the Lampariello-modified diagonal-sparse quasi-Newton method. Numerical results show that the new algorithm is efficient.

Finally, we consider virtual enterprise partner selection and management. A three-step strategy is proposed, an optimization model based on a genetic simulated annealing algorithm is elaborated, and a management model for virtual enterprise partners is given.
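The non-monotone step size idea above can be illustrated with a minimal sketch of a Grippo-type backtracking line search, where a step is accepted by comparing against the maximum of recent function values rather than only the current one. This is an illustrative assumption about the general technique, not the thesis's actual rule; all names and parameters here are hypothetical.

```python
import numpy as np

def nonmonotone_armijo(f, grad, x, d, f_hist, sigma=1e-4, beta=0.5, max_backtracks=50):
    """Backtracking search: accept alpha when
    f(x + alpha*d) <= max(f_hist) + sigma * alpha * grad(x)^T d.
    With a history window of length 1 this reduces to the classical
    (monotone) Armijo rule, mirroring the 'special case' remark above."""
    slope = grad(x) @ d    # must be negative: d is a descent direction
    f_ref = max(f_hist)    # non-monotone reference value
    alpha = 1.0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= f_ref + sigma * alpha * slope:
            return alpha
        alpha *= beta
    return alpha

# Minimal usage on a convex quadratic, stepping along the negative gradient:
f = lambda x: 0.5 * x @ x
grad = lambda x: x
x = np.array([3.0, -4.0])
d = -grad(x)
alpha = nonmonotone_armijo(f, grad, x, d, f_hist=[f(x)])
```

Because the reference value is a maximum over recent iterates, the condition is easier to satisfy, which is why a larger step size can typically be accepted than under the monotone rule.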
Keywords/Search Tags: Memory Gradient Method, Diagonal-Sparse Quasi-Newton Method, Error, Convergence, Virtual Enterprise Partner Selection