
Large-scale Limited Memory Methods Discussed

Posted on: 2010-06-19  Degree: Master  Type: Thesis
Country: China  Candidate: L Liu  Full Text: PDF
GTID: 2208360275998409  Subject: Applied Mathematics
Abstract/Summary:
Limited memory quasi-Newton methods are known to be effective techniques for solving large-scale unconstrained optimization problems: they build simple approximations of the Hessian matrix and require minimal storage. Recently, Wei Zengxin, Zhang Jianzhong, et al. have proposed modified BFGS-type methods that use both the available gradient and function value information and perform well in numerical experiments.

In this paper we make two contributions. First, based on the modified BFGS-type methods, we present a unified form governed by a parameter θ, which can be seen as an extension of the BFGS-type methods. Under suitable conditions, global convergence is established for twice continuously differentiable, uniformly convex problems, and R-linear convergence is proved. Numerical tests on commonly used large-scale optimization problems indicate that our implementation is competitive.

Second, based on the compact limited memory methods in Liu and Nocedal's paper, we also present a compact limited memory version of the unified form. Its simplicity is one of its main appeals: it does not require storing a Hessian matrix but obtains the search direction directly. In this sense it is well suited to large-scale optimization.
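As background for the compact limited memory approach, the classic Liu–Nocedal two-loop recursion computes the L-BFGS search direction directly from the stored pairs (sᵢ, yᵢ) without ever forming a Hessian. The sketch below is illustrative only, assuming the standard L-BFGS update with the usual s'y/y'y initial scaling; it is not the thesis's unified θ-parameterized form.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion: returns -H_k @ grad, where H_k is the
    implicit L-BFGS inverse-Hessian approximation built from the
    stored pairs s_i = x_{i+1} - x_i and y_i = g_{i+1} - g_i."""
    q = grad.copy()
    rhos = [1.0 / y.dot(s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * s.dot(q)
        alphas.append(a)
        q -= a * y
    # Initial scaling H_0 = gamma * I with gamma = s'y / y'y
    if s_list:
        gamma = s_list[-1].dot(y_list[-1]) / y_list[-1].dot(y_list[-1])
        q *= gamma
    # Second loop: oldest pair to newest
    for s, y, rho, a in zip(s_list, y_list, rhos, reversed(alphas)):
        b = rho * y.dot(q)
        q += (a - b) * s
    return -q
```

With m stored pairs in n dimensions, each call costs O(mn) and storage is O(mn), which is exactly the property that makes limited memory methods attractive at large scale.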
Keywords/Search Tags: Large-scale optimization, Limited memory methods, Modified BFGS methods, Global convergence, Compact form