
On Inexact Newton Methods With Automatic Differentiation And Their Extension

Posted on: 2003-10-14
Degree: Doctor
Type: Dissertation
Country: China
Candidate: H B Zhang
Full Text: PDF
GTID: 1100360065962169
Subject: Management Science and Engineering
Abstract/Summary:
Nonlinear optimization plays an important role in many fields such as scientific computation and engineering analysis. Newton's method is one of the most efficient methods for solving nonlinear optimization problems. As modifications of Newton's method, inexact Newton methods are popular for solving medium- and large-scale nonlinear optimization problems. Among them, the CF-PCG (Newton-PCG) method is an efficient improvement of Newton's method; it is proposed on the basis of an analysis of Newton's method and the preconditioned conjugate gradient (PCG) method. Numerical derivatives can be evaluated by symbolic differentiation (SD), finite difference approximation, or automatic differentiation (AD). AD has significant advantages over the other two approaches, and one of its most important applications is to improve optimization algorithms by evaluating the relevant derivative information efficiently.

The aims of this work are: to establish and study new algorithms, the CF-PCG algorithms with AD; and to establish and study the extended CF-PCG algorithm (ECFPCG).

The CF-PCG algorithms with AD are proposed on the basis of the CF-PCG algorithms with SD; besides replacing SD with AD, there are other significant modifications to the algorithms. Theoretical analysis and numerical experiments show that the CF-PCG algorithm with AD is an improvement over Newton's method with AD, and that the improvement is achieved under mild conditions. For problems of the same dimension, the larger the cost of evaluating the objective function, the greater the improvement of the new algorithm over Newton's method with AD. Even in the worst case, when the computational cost of the objective function is negligible, the improvement is strictly increasing in n and tends to +∞ approximately at the rate ln n / ln 2 as n → +∞. The results also show that the improvement of the CF-PCG algorithm with AD over Newton's method with AD is much greater than the improvement of the CF-PCG algorithm with SD over Newton's method with SD.

Methods whose order of convergence is greater than 2 are called high-order methods. The improved method of tangent hyperbolas (IMTH method) is a popular high-order method. In this thesis, combining AD techniques with the ideas of the CF-PCG algorithms, an extended CF-PCG algorithm model is derived. Theoretical analysis and experimental results indicate that Algorithm ECFPCG1 and Algorithm ECFPCG2, obtained by specifying the parameters, are much more efficient than the IMTH method; roughly speaking, the relative efficiency of these algorithms over the IMTH method tends to +∞ at the asymptotic rate ln n / ln 3 as n tends to +∞.
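The abstract describes these ingredients only at a high level. As an illustrative companion, the following is a minimal sketch, not the thesis's CF-PCG algorithm: an inexact Newton iteration in which automatic differentiation (here via JAX) supplies the gradient and Hessian-vector products, and an unpreconditioned conjugate gradient solver computes each Newton step only approximately. The test objective, the forcing tolerance 0.1 * gnorm, the iteration limits, and the helper names hvp, cg, and inexact_newton are assumptions made for the sketch, not taken from the dissertation.

```python
import jax
import jax.numpy as jnp


def objective(x):
    # Smooth, strictly convex test function with minimizer x = (1, ..., 1);
    # chosen only so the sketch converges without a line search.
    return jnp.sum(jnp.cosh(x - 1.0))


grad_f = jax.grad(objective)


def hvp(x, v):
    # Hessian-vector product via forward-over-reverse AD: no full Hessian is formed.
    return jax.jvp(grad_f, (x,), (v,))[1]


def cg(matvec, b, tol, max_iter):
    # Plain conjugate gradients (preconditioning omitted); stopping at a loose
    # residual tolerance is what makes the outer Newton step "inexact".
    x = jnp.zeros_like(b)
    r = b - matvec(x)
    p = r
    rs = jnp.dot(r, r)
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rs / jnp.dot(p, Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = jnp.dot(r, r)
        if jnp.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x


def inexact_newton(x0, outer_iters=50, grad_tol=1e-6):
    x = x0
    for _ in range(outer_iters):
        g = grad_f(x)
        gnorm = jnp.linalg.norm(g)
        if gnorm < grad_tol:
            break
        # Solve H d = -g only to a tolerance proportional to the gradient norm.
        d = cg(lambda v: hvp(x, v), -g, tol=0.1 * gnorm, max_iter=x.size)
        x = x + d
    return x


x_star = inexact_newton(jnp.zeros(10))
```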
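For the high-order part, here is a similarly hedged sketch: a classical tangent-hyperbolas (Halley-type) step for a one-dimensional objective, with the first three derivatives obtained by repeated automatic differentiation. This is not the IMTH method or the ECFPCG algorithms studied in the thesis; the objective g and the iteration count are illustrative.

```python
import jax
import jax.numpy as jnp


def g(x):
    return jnp.cosh(x - 1.0)   # strictly convex, minimizer at x = 1

dg = jax.grad(g)               # first derivative
d2g = jax.grad(dg)             # second derivative
d3g = jax.grad(d2g)            # third derivative


def halley_step(x):
    # Halley's update for the root of g'(x) = 0:
    #   x+ = x - 2 g' g'' / (2 (g'')^2 - g' g''')
    g1, g2, g3 = dg(x), d2g(x), d3g(x)
    return x - 2.0 * g1 * g2 / (2.0 * g2 ** 2 - g1 * g3)


x = jnp.array(0.0)
for _ in range(5):
    x = halley_step(x)         # converges cubically to x = 1
```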
Keywords/Search Tags: Nonlinear Optimization, Inexact Newton Methods, Automatic Differentiation.