
Analysis On The Convergence Of Several Iterative Methods For Solving Linear Systems

Posted on: 2015-10-04
Degree: Doctor
Type: Dissertation
Country: China
Candidate: Q F Xue
Full Text: PDF
GTID: 1220330434451282
Subject: Basic mathematics
Abstract/Summary:
Large sparse linear systems must be solved throughout science and engineering. Despite the rapid development of computer technology, such systems still pose serious challenges: large storage requirements, long solution times, and high computational complexity. It is therefore necessary to seek algorithms with a small amount of computation and good numerical stability. Because they save memory, parallelize easily, and converge quickly, iterative methods have always been an active research topic in this field.

It is well known that the convergence and the convergence speed of an iterative method form the theoretical foundation of its application. To improve both, extrapolated iterative methods and preconditioned iterative methods have been proposed, obtained by applying the extrapolation technique and the preconditioning technique, respectively, to the classical iterative methods. Rich results exist for both approaches, but many issues remain open. For example, the convergence of extrapolated iterative methods still needs further analysis for different classes of matrices; in particular, the relationship between the convergence of the extrapolated Gauss-Seidel iterative method and H-matrices requires further study. It is also worth investigating whether classical preconditioned iterative methods can be generalized and improved to obtain effective and practical preconditioned iterative methods. Moreover, motivated by parallel computing, researchers have proposed two-stage iterative methods and studied their convergence and the influence of the number of inner iterations on convergence speed. It remains to be determined, however, whether their convergence and convergence speed improve on those of the corresponding outer iterative methods.
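To make the extrapolation technique mentioned above concrete, here is a minimal Python sketch of the extrapolated Gauss-Seidel method, x_{k+1} = ω·GS(x_k) + (1−ω)·x_k; the splitting shown and the dense solver used are illustrative assumptions, not the thesis's formulation.

```python
import numpy as np

def extrapolated_gauss_seidel(A, b, omega=1.0, tol=1e-10, max_iter=500):
    """Extrapolated Gauss-Seidel: x_{k+1} = omega*GS(x_k) + (1-omega)*x_k,
    where GS is the classical Gauss-Seidel step for the splitting
    A = (D - L) - U (D diagonal, L/U strictly lower/upper parts)."""
    D_minus_L = np.tril(A)        # D - L: lower triangle including diagonal
    U = D_minus_L - A             # strict upper part, signed so A = (D - L) - U
    x = np.zeros_like(b, dtype=float)
    for k in range(max_iter):
        gs = np.linalg.solve(D_minus_L, U @ x + b)   # one Gauss-Seidel sweep
        x_new = omega * gs + (1.0 - omega) * x       # extrapolation step
        if np.linalg.norm(x_new - x) < tol:
            return x_new, k + 1
        x = x_new
    return x, max_iter
```

With ω = 1 the scheme reduces to classical Gauss-Seidel; the extrapolation parameter ω shifts the eigenvalues of the iteration matrix, which is exactly the mechanism whose admissible range the thesis analyzes.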
Based on the above considerations, this thesis studies the convergence of several iterative methods.

1. The thesis discusses the relationship between the convergence of the extrapolated Gauss-Seidel iterative method and H-matrices. First, we establish the relationship between the convergence of the extrapolated Gauss-Seidel and the Jacobi iterative methods, and give the range of extrapolation parameters that guarantees convergence. Second, an upper bound for the spectral radius of the extrapolated Gauss-Seidel iteration matrix is obtained by means of the optimal scaling matrix. We also provide several equivalent conditions for general H-matrices based on the Gauss-Seidel iterative method and its extrapolated version, respectively.

2. The thesis then studies the convergence of stationary two-stage and stationary alternating two-stage iterative methods. Comparison results for stationary two-stage iterative methods and their outer counterparts (the standard iterative methods) are presented under mild conditions: with proper splittings, the standard iterative methods converge faster than the corresponding stationary two-stage iterative methods. Similar results are obtained for stationary alternating two-stage iterative methods.

3. Preconditioned AOR iterative methods are discussed. First, we propose a new choice of the parameters (r, t) for the preconditioner I + C proposed by Zhang Y. et al. For a nonsingular M-matrix, we prove that the preconditioned AOR iterative method with the new choice is convergent and converges faster than the original AOR method. Second, using the theory of matrix splittings, we establish the convergence of a class of preconditioned AOR methods with the preconditioners P1α→k, together with comparison theorems on the influence of the parameters on the convergence speed when the coefficient matrices are strictly diagonally dominant L-matrices.
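The stationary two-stage iteration compared above can be sketched in Python as follows; the outer and inner splittings chosen here (Gauss-Seidel-type outer, Jacobi-type inner) are illustrative assumptions, not necessarily those studied in the thesis.

```python
import numpy as np

def stationary_two_stage(A, b, s=2, tol=1e-10, max_iter=1000):
    """Stationary two-stage iteration with illustrative splittings:
    outer A = M - N with M = tril(A) (Gauss-Seidel type), and inner
    M = F - G with F = diag(M) (Jacobi type).  Each outer step replaces
    the exact solve M y = N x + b by s inner Jacobi sweeps."""
    M = np.tril(A)
    N = M - A                        # outer splitting A = M - N
    F = np.diag(np.diag(M))
    G = F - M                        # inner splitting M = F - G
    x = np.zeros_like(b, dtype=float)
    for k in range(max_iter):
        rhs = N @ x + b
        y = x.copy()                 # inner sweeps start from the outer iterate
        for _ in range(s):
            y = np.linalg.solve(F, G @ y + rhs)
        if np.linalg.norm(y - x) < tol:
            return y, k + 1
        x = y
    return x, max_iter
```

As the number of inner iterations s grows, each outer step approaches the exact solve M y = N x + b, i.e. the outer (standard) iteration itself; the thesis's comparison results quantify how the truncated inner solve slows convergence.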
The obtained results not only indicate that the preconditioned AOR methods are more effective and more competitive for larger parameters, but also generalize those of the preconditioned Gauss-Seidel methods obtained by Li et al. Finally, we present a class of block preconditioners, the corresponding preconditioned block AOR methods, and the multistage preconditioned block AOR methods, and analyze their convergence. Comparison results on convergence are provided for the block AOR, the preconditioned block AOR, and the corresponding multistage preconditioned block methods when the coefficient matrices are nonsingular Z-matrices and strictly diagonally dominant Z-matrices, respectively. The obtained results indicate that the preconditioned block AOR methods accelerate the convergence of the original block iterative methods, while the multistage preconditioned block AOR methods accelerate the convergence of the original iterative methods step by step.

The correctness of the obtained results is illustrated by numerical examples.
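As background for the AOR framework discussed above, here is a minimal Python sketch of the classical AOR(r, ω) iteration together with a left-preconditioned variant. The preconditioner P = I + S used below is the well-known Gunawardena-type choice for unit-diagonal matrices, shown only as an illustrative stand-in; the thesis's preconditioners I + C and P1α→k are not reproduced here.

```python
import numpy as np

def aor(A, b, r=0.8, omega=1.0, tol=1e-10, max_iter=2000):
    """Classical AOR(r, omega) iteration for A = D - L - U:
    (D - r*L) x_{k+1} = [(1-omega)*D + (omega-r)*L + omega*U] x_k + omega*b."""
    D = np.diag(np.diag(A))
    L = D - np.tril(A)               # strict lower part, signed so A = D - L - U
    U = -np.triu(A, 1)               # strict upper part
    Mr = D - r * L
    T = (1.0 - omega) * D + (omega - r) * L + omega * U
    x = np.zeros_like(b, dtype=float)
    for k in range(max_iter):
        x_new = np.linalg.solve(Mr, T @ x + omega * b)
        if np.linalg.norm(x_new - x) < tol:
            return x_new, k + 1
        x = x_new
    return x, max_iter

def preconditioned_aor(A, b, **kw):
    """AOR applied to the left-preconditioned system P A x = P b, with the
    Gunawardena-type choice P = I + S, where S holds the negated first
    superdiagonal of A (this assumes a unit diagonal, as for L-matrices
    normalized in the usual way)."""
    n = A.shape[0]
    S = np.zeros_like(A, dtype=float)
    for i in range(n - 1):
        S[i, i + 1] = -A[i, i + 1]
    P = np.eye(n) + S
    return aor(P @ A, P @ b, **kw)
```

Note that AOR contains the familiar special cases: r = ω gives SOR, r = ω = 1 gives Gauss-Seidel, and r = 0, ω = 1 gives Jacobi, which is why comparison theorems for preconditioned AOR generalize those for preconditioned Gauss-Seidel.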
Keywords/Search Tags: extrapolated Gauss-Seidel iterative method, two-stage iterative method, preconditioned AOR iterative method, multistage preconditioner, convergence