First, we propose a new trust region algorithm with simple quadratic models for unconstrained optimization. Under certain conditions, the global convergence of the new method is proved. Numerical results show that the algorithm is efficient and attractive for large-scale optimization problems.

Second, we propose a new trust region algorithm with simple quadratic models and a line search rule. When the trial step results in an increase in the objective function, the algorithm does not re-solve the subproblem; instead, it performs a new inexact, larger Armijo line search to obtain the next iterate. Under certain conditions, the global convergence of this method is proved. Numerical results show that the algorithm is efficient and attractive for large-scale optimization problems.

Finally, the HS (Hestenes-Stiefel) conjugate gradient method for minimizing a continuously differentiable function f on R^n is modified to have the global convergence property. First, it is shown that, by using a reverse modulus of continuity function and a forcing function, the new method for unconstrained optimization works for a continuously differentiable function under the Curry-Altman step size rule and a bounded level set. Second, by a comparison technique, some general convergence properties of the new method with the Armijo step size rule are established. Numerical results show that the new algorithms are efficient.
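The fallback step in the second algorithm relies on an Armijo-type line search along the rejected trial direction. The following is a minimal sketch of a standard backtracking Armijo search, not the thesis's specific "inexact larger" variant; the function name, parameters, and constants are illustrative assumptions:

```python
import numpy as np

def armijo_backtracking(f, grad_f, x, d, beta=0.5, sigma=1e-4, max_iter=50):
    """Backtracking Armijo line search along a descent direction d.

    Returns a step size t satisfying the sufficient decrease condition
        f(x + t*d) <= f(x) + sigma * t * grad_f(x)^T d.
    beta, sigma, and max_iter are illustrative defaults, not the
    constants used in the thesis.
    """
    fx = f(x)
    slope = grad_f(x).dot(d)  # should be negative for a descent direction
    t = 1.0
    for _ in range(max_iter):
        if f(x + t * d) <= fx + sigma * t * slope:
            return t
        t *= beta  # shrink the step and try again
    return t

# Usage on a simple quadratic f(x) = 0.5 * ||x||^2 with the
# steepest descent direction as the (rejected) trial direction:
f = lambda x: 0.5 * x.dot(x)
grad_f = lambda x: x
x = np.array([3.0, -4.0])
d = -grad_f(x)
t = armijo_backtracking(f, grad_f, x, d)
assert f(x + t * d) < f(x)  # the accepted step decreases f
```

In the trust region setting, this search replaces a full re-solve of the quadratic subproblem whenever the trial step increases the objective, which is what makes the method cheap on large-scale problems.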