6 CHAPTER 2. ITERATIVE METHODS
– SSOR.
Symmetric Successive Overrelaxation (SSOR) has no advantage over SOR as a stand-
alone iterative method; however, it is useful as a preconditioner for nonstationary meth-
ods.
• Nonstationary Methods
– Conjugate Gradient (CG).
The conjugate gradient method derives its name from the fact that it generates a se-
quence of conjugate (or orthogonal) vectors. These vectors are the residuals of the iter-
ates. They are also the gradients of a quadratic functional, the minimization of which
is equivalent to solving the linear system. CG is an extremely effective method when
the coefficient matrix is symmetric positive definite, since storage for only a limited
number of vectors is required.
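The short recurrences and limited storage described above can be illustrated with a minimal sketch in Python with NumPy. This is not the pseudocode given later in this chapter, only an illustration: at every step the method keeps just the iterate x, the residual r, and one search direction p.

```python
import numpy as np

def cg(A, b, tol=1e-10, max_iter=1000):
    """Conjugate gradient sketch for symmetric positive definite A."""
    x = np.zeros_like(b)
    r = b - A @ x            # residual, also the negative gradient of the quadratic
    p = r.copy()             # search direction, conjugate to previous directions
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)   # step length minimizing along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p   # new direction, A-conjugate to the old ones
        rs_old = rs_new
    return x

# A small symmetric positive definite example system
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = cg(A, b)
```

In exact arithmetic the residuals generated this way are mutually orthogonal and the directions are A-conjugate, which is what the method's name refers to.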
– Minimum Residual (MINRES) and Symmetric LQ (SYMMLQ).
These methods are computational alternatives for CG for coefficient matrices that are
symmetric but possibly indefinite. SYMMLQ will generate the same solution iterates
as CG if the coefficient matrix is symmetric positive definite.
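As a concrete illustration (assuming SciPy is available, which the text does not require), `scipy.sparse.linalg.minres` can be applied to a symmetric indefinite matrix for which CG is not applicable:

```python
import numpy as np
from scipy.sparse.linalg import minres

# Symmetric but indefinite matrix (eigenvalues of mixed sign),
# so CG does not apply but MINRES does.
A = np.array([[2.0,  0.0, 1.0],
              [0.0, -3.0, 0.0],
              [1.0,  0.0, 1.0]])
b = np.array([1.0, 2.0, 3.0])

x, info = minres(A, b)   # info == 0 signals successful convergence
```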
– Conjugate Gradient on the Normal Equations: CGNE and CGNR.
These methods are based on the application of the CG method to one of two forms of
the normal equations for Ax = b. CGNE solves the system (AA^T)y = b for y and then
computes the solution x = A^T y. CGNR solves (A^T A)x = b̃ for the solution vector
x, where b̃ = A^T b. When the coefficient matrix A is nonsymmetric and nonsingular,
the normal equations matrices AA^T and A^T A will be symmetric and positive definite,
and hence CG can be applied. The convergence may be slow, since the spectrum of the
normal equations matrices will be less favorable than the spectrum of A.
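A CGNR sketch (an illustration, not the book's pseudocode) shows the key implementation point: CG is applied to A^T A implicitly, using one multiplication by A and one by A^T per iteration, so A^T A is never formed.

```python
import numpy as np

def cgnr(A, b, tol=1e-12, max_iter=1000):
    """CG applied to (A^T A) x = A^T b without forming A^T A."""
    x = np.zeros(A.shape[1])
    r = b - A @ x            # residual of the original system
    z = A.T @ r              # residual of the normal equations
    p = z.copy()
    zz_old = z @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = zz_old / (Ap @ Ap)   # p^T (A^T A) p equals ||Ap||^2
        x += alpha * p
        r -= alpha * Ap
        z = A.T @ r
        zz_new = z @ z
        if np.sqrt(zz_new) < tol:
            break
        p = z + (zz_new / zz_old) * p
        zz_old = zz_new
    return x

A = np.array([[3.0, 1.0], [0.0, 2.0]])   # nonsymmetric, nonsingular
b = np.array([5.0, 4.0])
x = cgnr(A, b)
```

Note that the squared condition number of A^T A is what makes the convergence slow in the ill-conditioned case.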
– Generalized Minimal Residual (GMRES).
The Generalized Minimal Residual method computes a sequence of orthogonal vectors
(like MINRES), and combines these through a least-squares solve and update. How-
ever, unlike MINRES (and CG) it requires storing the whole sequence, so that a large
amount of storage is needed. For this reason, restarted versions of this method are used.
In restarted versions, computation and storage costs are limited by specifying a fixed
number of vectors to be generated. This method is useful for general nonsymmetric
matrices.
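A restarted GMRES(m) sketch (again only an illustration) makes the storage limit explicit: at most m + 1 orthogonal basis vectors are kept, and after m Arnoldi steps a small least-squares problem is solved and the method restarts from the updated iterate.

```python
import numpy as np

def gmres_restarted(A, b, m=5, tol=1e-10, max_restarts=50):
    """GMRES(m): storage is bounded by m + 1 Krylov basis vectors."""
    n = b.shape[0]
    x = np.zeros(n)
    for _ in range(max_restarts):
        r = b - A @ x
        beta = np.linalg.norm(r)
        if beta < tol:
            break
        Q = np.zeros((n, m + 1))     # orthonormal Krylov basis
        H = np.zeros((m + 1, m))     # upper Hessenberg projection of A
        Q[:, 0] = r / beta
        k_used = m
        for k in range(m):
            w = A @ Q[:, k]
            for j in range(k + 1):   # modified Gram-Schmidt orthogonalization
                H[j, k] = Q[:, j] @ w
                w -= H[j, k] * Q[:, j]
            H[k + 1, k] = np.linalg.norm(w)
            if H[k + 1, k] < 1e-14:  # lucky breakdown: solution found
                k_used = k + 1
                break
            Q[:, k + 1] = w / H[k + 1, k]
        # least-squares solve: minimize || beta*e1 - H y || over y
        e1 = np.zeros(k_used + 1)
        e1[0] = beta
        y, *_ = np.linalg.lstsq(H[:k_used + 1, :k_used], e1, rcond=None)
        x += Q[:, :k_used] @ y
    return x

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 4.0]])
b = np.array([3.0, 4.0, 5.0])
x = gmres_restarted(A, b, m=3)
```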
– BiConjugate Gradient (BiCG).
The Biconjugate Gradient method generates two CG-like sequences of vectors, one
based on a system with the original coefficient matrix A, and one on A^T. Instead of
orthogonalizing each sequence, they are made mutually orthogonal, or “bi-orthogonal”.
This method, like CG, uses limited storage. It is useful when the matrix is nonsymmet-
ric and nonsingular; however, convergence may be irregular, and there is a possibility
that the method will break down. BiCG requires a multiplication with the coefficient
matrix and with its transpose at each iteration.
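The two coupled sequences can be sketched as follows (an illustration, not the book's pseudocode); note the one multiplication with A and one with A^T in each pass:

```python
import numpy as np

def bicg(A, b, tol=1e-10, max_iter=1000):
    """BiConjugate Gradient sketch for nonsymmetric, nonsingular A."""
    x = np.zeros_like(b)
    r = b - A @ x
    rt = r.copy()            # shadow residual for the A^T sequence
    p, pt = r.copy(), rt.copy()
    rho = rt @ r             # bi-orthogonality inner product
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rho / (pt @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rt -= alpha * (A.T @ pt)   # the transpose multiplication
        if np.linalg.norm(r) < tol:
            break
        rho_new = rt @ r
        beta = rho_new / rho       # breakdown risk if rho_new is (near) zero
        p = r + beta * p
        pt = rt + beta * pt
        rho = rho_new
    return x

A = np.array([[4.0, 1.0], [2.0, 3.0]])   # nonsymmetric, nonsingular
b = np.array([1.0, 2.0])
x = bicg(A, b)
```

The division by rho is where breakdown can occur: if the two sequences become (nearly) orthogonal to each other, the recurrence is not defined.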
– Quasi-Minimal Residual (QMR).
The Quasi-Minimal Residual method applies a least-squares solve and update to the
BiCG residuals, thereby smoothing out the irregular convergence behavior of BiCG.
Also, QMR largely avoids the breakdown that can occur in BiCG. On the other hand,
it does not effect a true minimization of either the error or the residual, and while it
converges smoothly, it does not substantially improve on BiCG.
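For reference (assuming SciPy is available, which the text does not require), a QMR solve of a small nonsymmetric system looks like this:

```python
import numpy as np
from scipy.sparse.linalg import qmr

A = np.array([[4.0, 1.0, 0.0],
              [2.0, 5.0, 1.0],
              [0.0, 1.0, 3.0]])   # nonsymmetric, nonsingular
b = np.array([1.0, 2.0, 3.0])

x, info = qmr(A, b)   # info == 0 signals successful convergence
```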
– Conjugate Gradient Squared (CGS).