It is generally believed that the SOR iteration is obtained by adding a relaxation parameter to the Gauss-Seidel method, but in essence the SOR method arises from modifying the JOR method with the Seidel technique. In this paper, a new fast iterative method is proposed: we add a relaxation parameter directly to the Gauss-Seidel method to obtain a new method, called the SOR2 method, together with its extension, called the SSOR2 method. Their convergence theorems are proved, and strategies for selecting the optimal relaxation factor are given. We then apply the SOR2 method to linear saddle point problems to obtain the SOR2-like method and the GSOR2 method; this paper gives convergence proofs and optimal iteration parameter selection strategies for both. Numerical experiments show that, for strongly diagonally dominant matrices, the SOR2 method converges faster than the SOR method, and its optimal relaxation factor is easier to compute; the SSOR2 method requires less time to converge than the SSOR method. For linear saddle point problems, the SOR2-like method requires less time to converge than the SOR-like method, and in theory the GSOR2 method converges faster than the GSOR method. The larger the dimension of the matrix, the more pronounced these advantages become.

The structure of this thesis is arranged as follows. The introduction surveys recent progress on saddle point problems. Chapter 1 presents the origin of the SOR iteration and of the SOR-like method for solving saddle point problems. Chapter 2 proposes the SOR2 method for solving linear systems. Chapter 3 presents the SSOR2 method and proves its convergence. Chapter 4 applies the SOR2 method to linear saddle point problems, yielding a new iteration denoted the SOR2-like method. Chapter 5 first reviews the GSOR method and then proposes the GSOR2 method for solving augmented linear systems.
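For orientation, the classical SOR iteration that the thesis takes as its starting point can be sketched as follows. This is a minimal textbook implementation, not the thesis's own code; the SOR2 method itself is not specified in the abstract, so only the standard component-wise SOR sweep (Gauss-Seidel update relaxed by a factor omega) is shown here, on a strongly diagonally dominant test matrix of the kind the experiments target. The matrix, right-hand side, and parameter values are illustrative choices, not taken from the thesis.

```python
def sor_sweep(A, b, x, omega):
    """One in-place SOR sweep: each component gets a Gauss-Seidel
    update, blended with the old value via the relaxation factor omega."""
    n = len(b)
    for i in range(n):
        s = sum(A[i][j] * x[j] for j in range(n) if j != i)
        gs = (b[i] - s) / A[i][i]          # Gauss-Seidel value for x[i]
        x[i] = (1.0 - omega) * x[i] + omega * gs
    return x

def solve_sor(A, b, omega=1.1, tol=1e-10, max_iter=500):
    """Iterate SOR sweeps until successive iterates stop changing."""
    x = [0.0] * len(b)
    for _ in range(max_iter):
        old = x[:]
        sor_sweep(A, b, x, omega)
        if max(abs(x[i] - old[i]) for i in range(len(b))) < tol:
            break
    return x

# Strongly diagonally dominant example; exact solution is (1, 1, 1).
A = [[4.0, 1.0, 0.0],
     [1.0, 4.0, 1.0],
     [0.0, 1.0, 4.0]]
b = [5.0, 6.0, 5.0]
x = solve_sor(A, b)
```

Taking omega = 1 recovers plain Gauss-Seidel; the thesis's point is that the SOR2 variant attaches the relaxation parameter to Gauss-Seidel in a different way than the component-wise blending above, which is what yields the distinct convergence behavior claimed in the abstract.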