Vector optimization is an important problem in mathematics and has significant applications in real life. Newton's method is a fundamental tool for solving nonlinear equations and nonlinear optimization problems, and it plays an important role in both equation solving and classical optimization. It is therefore natural to extend the classical Newton method to vector optimization problems, and it is equally important to analyze the convergence of the extended Newton methods and to estimate their errors. In this thesis, we continue the study of the convergence of extended Newton methods for solving unconstrained vector optimization problems under partial orders induced by general closed convex pointed cones. For the extended Newton methods (with or without the line-search scheme), using the majorizing function technique, we establish quadratic convergence criteria and estimate the radius of the convergence ball under the assumption that the Hessians of the objective functions satisfy an L-average Lipschitz condition. Our main results, concerning three types of convergence properties, are as follows.

(i) Semilocal convergence (Theorems 4.1 and 4.2). Kantorovich-type convergence theorems for the extended Newton methods (with or without the line-search scheme) are established, providing explicit convergence criteria. Under the assumptions that, around the initial point, the objective function is strongly K-convex, its second derivative satisfies an L-average Lipschitz condition, and the relevant parameters satisfy suitable inequalities, the generated sequence converges quadratically to a K-minimizer, and an error estimate is provided. In particular, as an application, when the objective function F satisfies the classical Lipschitz condition or the γ-condition, quadratic convergence criteria are established (Theorems 5.1 and 5.4), together with error estimates.

(ii) Local convergence (Theorems 4.3 and 4.4). Local convergence theorems for the extended Newton methods (with or without the line-search scheme) are established, and the radius of the convergence ball is estimated. Assuming that a solution x* exists, that the objective function is strongly K-convex, and that its second derivative satisfies an L-average Lipschitz condition, we determine an open ball centered at x* such that, for every initial point in this ball, the sequence generated by the algorithm is well defined and converges quadratically to a local K-minimizer. In particular, as an application, when the objective function F satisfies the classical Lipschitz condition or the γ-condition, local convergence theorems for the extended Newton methods (with or without the line-search scheme) are established (Theorems 5.2 and 5.5) and the radius of the convergence ball is estimated.

(iii) Global convergence (Theorem 5.3). A global convergence theorem for the extended Newton method with the line-search scheme is established. Under the Lipschitz condition on the second derivative, sufficient conditions at the convergence point are given that ensure the global convergence of the extended Newton method; the result applies not only to the Armijo line-search scheme but also to the Goldstein/Wolfe line-search schemes.
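To make the role of the line-search scheme in (iii) concrete, the following is a minimal sketch of the classical scalar-valued analogue only: Newton's method with Armijo backtracking for minimizing a smooth, strongly convex real-valued function (the case of the ordering cone K being the nonnegative half-line). The names and parameters (f, grad, hess, sigma, beta) are illustrative assumptions, not the thesis's notation, and the actual extended Newton method computes its direction from a subproblem formulated over the cone K.

import numpy as np

def newton_armijo(f, grad, hess, x0, sigma=1e-4, beta=0.5, tol=1e-10, max_iter=100):
    """Newton's method with Armijo backtracking for min f(x), f: R^n -> R
    smooth and strongly convex. Classical scalar-valued sketch only; it is
    not the vector-optimization method analyzed in the thesis."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:          # first-order stationarity reached
            break
        d = np.linalg.solve(hess(x), -g)      # Newton direction: H(x) d = -grad f(x)
        slope = float(g @ d)                  # predicted decrease; negative for a descent direction
        t = 1.0
        while f(x + t * d) > f(x) + sigma * t * slope:   # Armijo sufficient-decrease test
            t *= beta                          # backtrack the step length
        x = x + t * d
    return x

# Usage: a strongly convex quadratic with minimizer A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
x_star = newton_armijo(
    f=lambda x: 0.5 * x @ A @ x - b @ x,
    grad=lambda x: A @ x - b,
    hess=lambda x: A,
    x0=np.array([5.0, 5.0]),
)
print(x_star, np.linalg.solve(A, b))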
time,significantly improving its original research results;Furthermore,under the more general assumption of continuity,the convergence results of the extended Newton method have become special cases of this thesis. |