A Study Of Accelerated Technology Of Relaxation Iterative Algorithm

Posted on: 2020-03-06    Degree: Master    Type: Thesis
Country: China    Candidate: Q Kong    Full Text: PDF
GTID: 2370330596975275    Subject: Mathematics
Abstract/Summary:
Elliptic partial differential equations arise in many areas of mathematics, physics and engineering. Discretizing them with a second-order finite-difference method produces a series of large linear systems, and iterative methods are now widely used to solve such systems. Projection methods such as the Conjugate Gradient (CG) method and the Generalized Minimum Residual (GMRES) method require effective preconditioners and face restrictions on self-adaptive meshes. Basic iterative methods such as the Jacobi method, the Richardson method and the Successive Over-Relaxation (SOR) method are therefore still of interest for these linear systems. Because the Jacobi method is simple and well suited to large-scale parallel simulation, many studies in recent years have sought to accelerate it. These studies follow two main approaches. The first uses schedules of relaxation parameters to control the residual, as in the Scheduled Relaxation Jacobi (SRJ) method and the Delayed Over-Relaxation (DOR) method. The second adds Anderson acceleration (Anderson mixing) to the Jacobi iteration, as in the Anderson Jacobi method and the Alternating Anderson Jacobi (AAJ) method.

Building on these accelerated variants of Jacobi, this dissertation studies accelerated algorithms for the SRJ method, examining both their convergence performance and their computing time. The work consists of three parts. First, we review four main accelerated Jacobi algorithms: the Weighted Jacobi method, the SRJ method, the Anderson Jacobi method and the Alternating Anderson Jacobi method. Second, we employ Anderson acceleration and a minimum residual algorithm within each SRJ iteration cycle to improve the convergence rate of Jacobi, obtaining the Alternating Anderson Scheduled Relaxation Jacobi (AASRJ) method and the Minimum Residual Scheduled Relaxation Jacobi (MRSRJ) method, respectively, and we give the corresponding computational procedures. Finally, we compare the convergence performance and CPU time of SRJ, AAJ, AASRJ, MRSRJ and the optimal SRJ, i.e. the Chebyshev Jacobi (CJ) method, in numerical experiments on three kinds of problems: a Laplace model problem, a variable-coefficient problem and a radiation diffusion model problem. We summarize the convergence properties of AASRJ and MRSRJ for solving large linear algebraic systems, and we show that AASRJ is highly efficient, being competitive with CJ, while both AASRJ and MRSRJ converge faster than SRJ in all cases. Their simplicity and efficiency make the AASRJ and MRSRJ methods attractive alternatives for large, sparse linear systems.
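As a concrete illustration of the two acceleration ideas mentioned above (relaxation/weighting and Anderson mixing), the following Python sketch combines weighted Jacobi sweeps with an Anderson mixing step applied every few iterations, in the spirit of the AAJ family. It is a minimal sketch, not the dissertation's implementation: the test matrix (a 1-D Laplace model problem), the relaxation factor `omega`, the history window `m` and the mixing period `p` are illustrative assumptions.

```python
import numpy as np


def alternating_anderson_jacobi(A, b, x0, omega=0.8, m=5, p=4,
                                tol=1e-8, max_iter=50000):
    """Weighted Jacobi sweeps with an Anderson mixing step every p iterations.

    Illustrative sketch only; omega, m and p are assumed demo values.
    """
    x = x0.copy()
    d = np.diag(A)                       # Jacobi uses the diagonal of A
    X_hist, F_hist = [], []              # histories of iterates and update directions
    for k in range(max_iter):
        r = b - A @ x
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            return x, k
        f = omega * (r / d)              # weighted Jacobi update direction
        X_hist.append(x.copy())
        F_hist.append(f.copy())
        if len(X_hist) > m + 1:          # keep at most m differences in the window
            X_hist.pop(0)
            F_hist.pop(0)
        if (k + 1) % p == 0 and len(F_hist) > 1:
            # Anderson mixing: fit the current direction f by differences of the
            # stored history (small least-squares problem), then correct the step.
            dX = np.diff(np.array(X_hist), axis=0).T   # n x (window-1)
            dF = np.diff(np.array(F_hist), axis=0).T
            gamma, *_ = np.linalg.lstsq(dF, f, rcond=None)
            x = x + f - (dX + dF) @ gamma
        else:
            x = x + f                    # plain weighted Jacobi sweep
    return x, max_iter


if __name__ == "__main__":
    # 1-D Laplace model problem as a small self-contained test case.
    n = 50
    A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    b = np.ones(n)
    x, iters = alternating_anderson_jacobi(A, b, np.zeros(n))
    print("iterations:", iters, "relative residual:",
          np.linalg.norm(b - A @ x) / np.linalg.norm(b))
```

The "alternating" idea is that most iterations remain cheap Jacobi sweeps, while the occasional least-squares mixing step supplies the acceleration; the SRJ-based variants studied in the dissertation replace the fixed weight with a schedule of relaxation parameters.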
Keywords/Search Tags:the Jacobi method, Anderson acceleration, Minimum Residual algorithm, the Alternating Anderson Jacobi method, the Scheduled Relaxation Jacobi method