
Analysis Of Convergence Of And Error Bound For PSD Iterative Method

Posted on: 2007-06-01
Degree: Master
Type: Thesis
Country: China
Candidate: X M Lin
GTID: 2120360185458627
Subject: Computational Mathematics

Abstract/Summary:
Solving large systems of linear equations is at the core of large-scale scientific and engineering computation, and many authors have studied the problem (see [18]-[29]). With the rapid development of computers, the scale of these problems keeps growing; direct methods have largely given way to iterative methods, which have become one of the principal tools for solving large linear systems. An iterative method usually cannot produce the exact solution of a linear system in finitely many arithmetic operations; it can only approach the solution gradually. Even though each individual arithmetic operation is exact, an iterative method yields only an approximate solution, so convergence and error bounds are its central concerns.

This thesis deals mainly with the convergence and error bound of the preconditioned simultaneous displacement (PSD) method, whose convergence has been studied by many authors (see [1]-[13]). In Chapter 2 we point out an error in Theorem 3.3 of D. J. Evans and N. M. Missirlis in article [1]. At the same time, a sufficient condition for convergence of the PSD method is given, for comparison, when the coefficient matrix A of the linear system Ax = b is symmetric positive definite. In §3.2 an example shows that our sufficient condition covers a wider range than Theorem 3.3 of [1]. Furthermore, following an approach analogous to [14] and starting from the functional relationship between the eigenvalues of the PSD iteration matrix and those of the associated Jacobi matrix, we give a complete analysis of when the PSD method converges, together with the optimum values of the involved parameters under different conditions.
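The abstract later characterizes the PSD method as an extrapolation of the SSOR method. The following is a minimal numerical sketch under that reading, assuming the standard splitting A = D - L - U and the SSOR-type preconditioner M = (D - ωL)D⁻¹(D - ωU); the function name `psd_sketch` and the parameter names `tau` and `omega` are illustrative, and the exact normalization used in the thesis may differ.

```python
import numpy as np

def psd_sketch(A, b, tau, omega, x0=None, tol=1e-10, max_iter=500):
    """PSD-type iteration sketched as a tau-extrapolation of an SSOR-type
    preconditioner M = (D - omega*L) D^{-1} (D - omega*U).
    This is an illustrative reading of the method, not the thesis's own
    formulation."""
    n = A.shape[0]
    D = np.diag(np.diag(A))
    L = -np.tril(A, -1)          # splitting A = D - L - U
    U = -np.triu(A, 1)
    M1 = D - omega * L
    M2 = D - omega * U
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    for k in range(max_iter):
        r = b - A @ x
        if np.linalg.norm(r) < tol:
            return x, k
        # apply the preconditioner: z = M^{-1} r = M2^{-1} D M1^{-1} r
        z = np.linalg.solve(M2, D @ np.linalg.solve(M1, r))
        x = x + tau * z          # tau-extrapolated update
    return x, max_iter
```

For a small symmetric positive definite test matrix and moderate parameters (e.g. `tau = omega = 1`), the residual decreases steadily, illustrating the "gradual approach" to the exact solution described above.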
(1) Under the assumptions that A is a consistently ordered matrix with nonvanishing diagonal elements and that the eigenvalues of the Jacobi matrix of A are real, we obtain necessary and sufficient conditions for the PSD method to converge. The result is equivalent to Theorem 1 of article [9]. Under the same conditions, the optimal parameters and the corresponding spectral radius of the PSD method are those given in [8].

(2) When A is a consistently ordered matrix with nonvanishing diagonal elements and the eigenvalues of the Jacobi matrix of A are purely imaginary or zero, we obtain necessary and sufficient conditions for the PSD method to converge. In Chapter 3 the optimal parameters and the corresponding spectral radius of the PSD method are given in Table 3.3. Moreover, under a further assumption on the parameters, the PSD method is the optimal extrapolation of the SSOR method; otherwise its convergence rate coincides with that of the SSOR method.

As the order of A increases, it becomes difficult to obtain the exact solution of the linear system Ax = b directly, so it is necessary to have an error bound that can be checked at each iteration. In Chapter 4, under the assumption that the coefficient matrix A of the linear system is symmetric positive definite and consistently ordered, we derive an error bound for the PSD iterative method that depends on an inner product of vectors. An example is given to illustrate the effectiveness and practicability of the error bound.
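The thesis derives an error bound that depends on an inner product of vectors; that specific bound is not reproduced here, but the general idea of a computable per-iteration bound for a symmetric positive definite system can be sketched with the standard residual estimate ‖x − x*‖₂ ≤ ‖r‖₂ / λ_min(A), which is an assumption for illustration, not the thesis's own bound. The matrix and iterate below are likewise made up for the example.

```python
import numpy as np

# Illustrative per-iteration check: for symmetric positive definite A,
# the error e = x - x* satisfies ||e||_2 <= ||r||_2 / lambda_min(A),
# since e = A^{-1} r and ||A^{-1}||_2 = 1 / lambda_min(A).
A = np.array([[4.0, -1.0], [-1.0, 3.0]])   # hypothetical SPD matrix
b = np.array([1.0, 2.0])
x_star = np.linalg.solve(A, b)             # exact solution (for comparison)

x = np.array([0.2, 0.7])                   # some approximate iterate
r = b - A @ x                              # computable residual
lam_min = np.linalg.eigvalsh(A).min()      # smallest eigenvalue of A
bound = np.linalg.norm(r) / lam_min        # computable error bound
err = np.linalg.norm(x - x_star)           # true error (unknown in practice)
```

The point of such a bound is that `r` and `lam_min` are computable without knowing `x_star`, so the quality of each iterate can be judged during the iteration.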
Keywords/Search Tags: PSD iterative method, symmetric positive definite matrix, convergence, optimal parameter, optimal spectral radius