In this thesis, we propose two methods for solving convex constrained nonlinear equations, denoted Algorithm A and Algorithm B. Algorithm A is based on nonmonotone techniques and the Levenberg-Marquardt (L-M) method; Algorithm B is based on the ideas of the supermemory gradient method and the gradient projection method. In Algorithm A, a trial step is obtained by solving a quadratic programming subproblem; if the trial step is not accepted, a modified Armijo-type nonmonotone line search is performed to generate a new iterate. In Algorithm B, a new iterate is generated by combining the supermemory gradient method with the gradient projection method, using a derivative-free line search technique; this reduces the computational cost and makes the method suitable for large-scale nonlinear equations. Under reasonable assumptions, both algorithms are proven to be globally and locally convergent. Numerical results are also reported to demonstrate the efficiency of the proposed methods.
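For reference, the problem setting and the acceptance rule behind the nonmonotone line search can be sketched as follows. This is the standard formulation of the convex constrained problem and the classical Grippo-Lampariello-Lucidi nonmonotone Armijo condition; the merit function $f$, the direction $d_k$, the memory length $M$, and the constant $\sigma$ below are illustrative assumptions, and the modified condition actually used in Algorithm A may differ:
\[
F(x) = 0, \qquad x \in \Omega,
\]
where $F:\mathbb{R}^n \to \mathbb{R}^n$ is continuous and $\Omega \subseteq \mathbb{R}^n$ is a nonempty closed convex set. The gradient projection step in Algorithm B relies on the projection operator
\[
P_\Omega(x) = \arg\min_{y \in \Omega} \|y - x\|,
\]
and a nonmonotone Armijo-type search accepts a step length $\alpha_k$ along a direction $d_k$ whenever
\[
f(x_k + \alpha_k d_k) \le \max_{0 \le j \le \min(k,\,M)} f(x_{k-j}) + \sigma \alpha_k \nabla f(x_k)^{\top} d_k,
\]
with merit function $f(x) = \tfrac{1}{2}\|F(x)\|^2$, constant $\sigma \in (0,1)$, and memory length $M \ge 0$; taking the maximum over the last $M+1$ values of $f$ rather than over $f(x_k)$ alone is what makes the search nonmonotone.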