Page 1: Krylov-Subspace Methods - II

Krylov-Subspace Methods - II

Lecture 7

Alessandra Nardi

Thanks to Prof. Jacob White, Deepak Ramaswamy, Michal Rewienski, and Karen Veroy

Page 2: Krylov-Subspace Methods - II

Last lectures review

• Overview of iterative methods to solve Mx = b
  – Stationary
  – Non-stationary
• QR factorization
  – Modified Gram-Schmidt algorithm
  – Minimization view of QR
• General subspace minimization algorithm
• Generalized Conjugate Residual (GCR) algorithm
  – Krylov subspace
  – Simplification in the symmetric case
  – Convergence properties
• Eigenvalue and eigenvector review
  – Norms and spectral radius
  – Spectral mapping theorem

Page 3: Krylov-Subspace Methods - II

Approximately solve $Mx = b$.

Approximate $x^{k+1}$ as a weighted sum of $w^0, w^1, \ldots, w^k$:
$$x^{k+1} = \sum_{i=0}^{k} \alpha_i\, w^i$$

$$r^{k+1} = b - M x^{k+1} = r^0 - \sum_{i=0}^{k} \alpha_i\, M w^i$$

Residual minimizing idea: pick the $\alpha_i$'s to minimize
$$\|r^{k+1}\|_2^2 = (r^{k+1})^T r^{k+1} = \Big\| r^0 - \sum_{i=0}^{k} \alpha_i\, M w^i \Big\|_2^2$$

Arbitrary Subspace Methods: Residual Minimization

Page 4: Krylov-Subspace Methods - II

Minimizing
$$\|r^{k+1}\|_2^2 = \Big\| r^0 - \sum_{i=0}^{k} \alpha_i\, M w^i \Big\|_2^2$$
is easy if the vectors $M w^i$ are orthogonal.

Create a set of vectors $p^0, p^1, \ldots, p^k$ such that
$$\mathrm{span}\{p^0, p^1, \ldots, p^k\} = \mathrm{span}\{w^0, w^1, \ldots, w^k\}$$
and
$$(M p^i)^T M p^j = 0 \ \text{ for } i \ne j, \quad \text{i.e., } M p^i \text{ is orthogonal to } M p^j.$$

Use Gram-Schmidt on the $M w^i$'s!

Arbitrary Subspace Methods: Residual Minimization
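As a rough illustration (my example, not the lecture's code): minimizing $\|r^0 - \sum_i \alpha_i M w^i\|_2$ is a least-squares problem in the columns of $MW$, so NumPy's `lstsq` can stand in for the Gram-Schmidt orthogonalization of the $M w^i$'s. The matrix, subspace, and dimensions below are made up.

```python
import numpy as np

# Minimal sketch: residual minimization over an arbitrary subspace.
# W = [w^0 ... w^k]; lstsq solves min_alpha ||b - (M W) alpha||_2,
# which is what Gram-Schmidt on the M w^i accomplishes on the slide.
rng = np.random.default_rng(0)
n, k = 8, 3
M = rng.standard_normal((n, n)) + n * np.eye(n)  # made-up invertible matrix
b = rng.standard_normal(n)                       # r^0 = b when x^0 = 0
W = rng.standard_normal((n, k + 1))              # arbitrary subspace basis
alpha, *_ = np.linalg.lstsq(M @ W, b, rcond=None)
x = W @ alpha                                    # best approximation in span{w^i}
print(np.linalg.norm(b - M @ x))                 # minimized residual norm
```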

Page 5: Krylov-Subspace Methods - II

For Krylov-subspace methods, the weighted sum is built from powers of $M$:
$$x^{k+1} = \sum_{i=0}^{k} \alpha_i\, M^i r^0 = \wp_k(M)\, r^0$$

$$r^{k+1} = r^0 - \sum_{i=0}^{k} \alpha_i\, M^{i+1} r^0 = \big(I - M\,\wp_k(M)\big)\, r^0$$

where $\wp_k$ is a $k$th-order polynomial.

Equivalently, $x^{k+1} = \sum_{i=0}^{k} \alpha_i\, w^i$ with
$$\mathrm{span}\{w^0, w^1, \ldots, w^k\} = \mathrm{span}\{b, Mb, \ldots, M^k b\},$$
the Krylov subspace.

Krylov Subspace Methods
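A minimal sketch of subspace generation (illustrative, not from the lecture): the Krylov basis $\{b, Mb, \ldots, M^k b\}$ needs only matrix-vector products with $M$; the matrix is never factored.

```python
import numpy as np

# Build the (unorthogonalized) Krylov basis {b, Mb, ..., M^k b}.
def krylov_basis(M, b, k):
    vecs = [b]
    for _ in range(k):
        vecs.append(M @ vecs[-1])       # next power of M applied to b
    return np.column_stack(vecs)        # n x (k+1); columns span K_{k+1}(M, b)
```

In floating point these columns become nearly parallel as $k$ grows, which is why practical methods such as GCR orthogonalize search directions as they go rather than using the raw powers.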

Page 6: Krylov-Subspace Methods - II

Krylov Subspace Methods: Subspace Generation

Note: for any $\alpha_0 \ne 0$,
$$\mathrm{span}\{r^0, r^1\} = \mathrm{span}\{r^0,\ r^0 - \alpha_0 M r^0\} = \mathrm{span}\{r^0, M r^0\}$$

The set of residuals can also be used as a representation of the Krylov subspace.

Generalized Conjugate Residual Algorithm: nice because the residuals generate the next search directions.

Page 7: Krylov-Subspace Methods - II

$$\alpha_k = \frac{(r^k)^T M p^k}{(M p^k)^T M p^k}$$
Determine the optimal stepsize in the $k$th search direction.

$$x^{k+1} = x^k + \alpha_k\, p^k, \qquad r^{k+1} = r^k - \alpha_k\, M p^k$$
Update the solution (trying to minimize the residual) and the residual.

$$p^{k+1} = r^{k+1} - \sum_{j=0}^{k} \frac{(M r^{k+1})^T M p^j}{(M p^j)^T M p^j}\ p^j$$
Compute the new orthogonalized search direction (using the most recent residual).

Krylov-Subspace Methods: Generalized Conjugate Residual Method (k-th step)
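A minimal GCR sketch following the k-th step above (illustrative, not the lecture's reference code). It stores all $p^j$ and $M p^j$, as GCR must, and orthogonalizes with modified Gram-Schmidt for stability; the direction for step $k$ is formed from the current residual at the top of the loop, which is the same recursion as computing $p^{k+1}$ from $r^{k+1}$ at the end of step $k$.

```python
import numpy as np

# Minimal GCR sketch: solve M x = b by residual minimization
# over the Krylov subspace, one orthogonalized direction per step.
def gcr(M, b, tol=1e-10, max_iter=None):
    n = len(b)
    x, r = np.zeros(n), b.astype(float)          # x^0 = 0, so r^0 = b
    P, MP = [], []                               # stored p^j and M p^j
    for k in range(max_iter if max_iter is not None else n):
        p, Mp = r.copy(), M @ r                  # candidate direction from r^k
        for pj, Mpj in zip(P, MP):               # enforce (Mp^k)^T (Mp^j) = 0
            beta = (Mp @ Mpj) / (Mpj @ Mpj)
            p, Mp = p - beta * pj, Mp - beta * Mpj
        alpha = (r @ Mp) / (Mp @ Mp)             # optimal step in direction p^k
        x, r = x + alpha * p, r - alpha * Mp     # update solution and residual
        P.append(p); MP.append(Mp)
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
    return x
```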

Page 8: Krylov-Subspace Methods - II

$$\alpha_k = \frac{(r^k)^T M p^k}{(M p^k)^T M p^k}$$
Vector inner products, $O(n)$; matrix-vector product, $O(n)$ if sparse.

$$x^{k+1} = x^k + \alpha_k\, p^k, \qquad r^{k+1} = r^k - \alpha_k\, M p^k$$
Vector adds, $O(n)$.

$$p^{k+1} = r^{k+1} - \sum_{j=0}^{k} \frac{(M r^{k+1})^T M p^j}{(M p^j)^T M p^j}\ p^j$$
$O(k)$ inner products, total cost $O(nk)$.

If $M$ is sparse, as $k$ (the number of iterations) approaches $n$,
$$\text{total cost} = O(n) + O(2n) + \cdots + O(kn) = O(n^3).$$

Better converge fast!

Krylov-Subspace Methods: Generalized Conjugate Residual Method (Computational Complexity for k-th step)

Page 9: Krylov-Subspace Methods - II

Summary

• What an iterative non-stationary method is: $x^{(k+1)} = x^{(k)} + \alpha_k p^k$
• How to calculate:
  – the search directions ($p^k$)
  – the step along the search directions ($\alpha_k$)
• Krylov subspace → GCR
• GCR is $O(k^2 n)$
  – Better converge fast!

Now look at the convergence properties of GCR.

Page 10: Krylov-Subspace Methods - II

If $\alpha_j \ne 0$ for all $j \le k$ in GCR, then:

1) $\mathrm{span}\{p^0, p^1, \ldots, p^k\} = \mathrm{span}\{r^0, M r^0, \ldots, M^k r^0\}$

2) $x^{k+1} = \wp_k(M)\, r^0$, where $\wp_k$ is the $k$th-order polynomial which minimizes $\|r^{k+1}\|_2^2$

3) $r^{k+1} = b - M x^{k+1} = r^0 - M\,\wp_k(M)\, r^0 = \big(I - M\,\wp_k(M)\big)\, r^0 = \tilde\wp_{k+1}(M)\, r^0$,
where $\tilde\wp_{k+1}$ is the $(k+1)$th-order polynomial minimizing $\|r^{k+1}\|_2^2$ subject to $\tilde\wp_{k+1}(0) = 1$.

Krylov Methods Convergence Analysis: Basic properties

Page 11: Krylov-Subspace Methods - II

GCR Optimality Property

$$\|r^{k+1}\| \le \|\tilde\wp_{k+1}(M)\, r^0\|$$
where $\tilde\wp_{k+1}$ is any $(k+1)$th-order polynomial such that $\tilde\wp_{k+1}(0) = 1$.

Therefore, any polynomial which satisfies the constraint can be used to get an upper bound on
$$\frac{\|r^{k+1}\|}{\|r^0\|}.$$

Krylov Methods Convergence Analysis: Optimality of GCR poly

Page 12: Krylov-Subspace Methods - II

Theorem: Any induced norm is a bound on the spectral radius:
$$\|M\| = \max_{\|x\| = 1} \|M x\| \ \ge\ \max_l |\lambda_l|$$

Proof: Pick $x = u_l$, an eigenvector with $\|u_l\| = 1$. Then
$$\|M u_l\| = \|\lambda_l u_l\| = |\lambda_l|\, \|u_l\| = |\lambda_l| \le \|M\|.$$

Eigenvalues and eigenvectors review: Induced norms
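A quick numerical check of the theorem (my example, not the lecture's): every induced norm of a matrix bounds its spectral radius.

```python
import numpy as np

# Spectral radius vs. induced 1-, 2-, and infinity-norms.
rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
rho = np.abs(np.linalg.eigvals(M)).max()    # spectral radius max|lambda_i|
for ord in (1, 2, np.inf):                  # all induced matrix norms
    assert np.linalg.norm(M, ord) >= rho - 1e-12
```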

Page 13: Krylov-Subspace Methods - II

Given a polynomial
$$f(x) = a_0 + a_1 x + \cdots + a_p x^p,$$
apply the polynomial to a matrix:
$$f(M) = a_0 I + a_1 M + \cdots + a_p M^p.$$
Then
$$\mathrm{spectrum}\big(f(M)\big) = f\big(\mathrm{spectrum}(M)\big).$$

Useful Eigenproperties: Spectral Mapping Theorem
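A numerical illustration of the theorem (the polynomial $f(x) = 2 + 3x + x^2$ is an arbitrary example of mine):

```python
import numpy as np

# Eigenvalues of f(M) equal f applied to the eigenvalues of M.
rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4))
fM = 2 * np.eye(4) + 3 * M + M @ M          # f(M) = 2I + 3M + M^2
lam = np.linalg.eigvals(M)
print(np.sort_complex(np.linalg.eigvals(fM)))   # spectrum(f(M))
print(np.sort_complex(2 + 3 * lam + lam**2))    # f(spectrum(M)): same values
```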

Page 14: Krylov-Subspace Methods - II

Krylov Methods Convergence Analysis: Overview

$$\|r^{k+1}\| = \big\|\big(I - M\,\wp_k(M)\big)\, r^0\big\| = \big\|\tilde\wp_{k+1}(M)\, r^0\big\| \ \le\ \big\|\tilde\wp_{k+1}(M)\big\|\ \|r^0\|$$
(GCR optimality property, then the matrix norm property)

So $\|\tilde\wp_{k+1}(M)\|$ may be used to get an upper bound on $\|r^{k+1}\| / \|r^0\|$, where $\tilde\wp_{k+1}$ is any $(k+1)$-th order polynomial subject to $\tilde\wp_{k+1}(0) = 1$.

Page 15: Krylov-Subspace Methods - II

• Review of eigenvalues and eigenvectors
  – Induced norms: relate matrix eigenvalues to matrix norms
  – Spectral mapping theorem: relates matrix eigenvalues to matrix polynomials
• Now ready to relate the convergence properties of Krylov subspace methods to the eigenvalues of M

Krylov Methods Convergence Analysis: Overview

Page 16: Krylov-Subspace Methods - II

Let $V = [v_1\ v_2\ \cdots\ v_n]$ be the matrix of eigenvectors of $M$, so that
$$\wp_k(M) = V\, \wp_k(\Lambda)\, V^{-1},$$
and let
$$\mathrm{cond}(V) = \|V\|\,\|V^{-1}\|$$
be the condition number of $M$'s eigenspace.

Krylov Methods Convergence Analysis: Norm of matrix polynomials

Page 17: Krylov-Subspace Methods - II

$$\|\wp_k(\Lambda)\|_2 = \max_{\|x\|_2 = 1} \|\wp_k(\Lambda)\, x\|_2 = \max_i |\wp_k(\lambda_i)|$$

$$\|\wp_k(M)\|_2 \ \le\ \mathrm{cond}(V)\ \max_i |\wp_k(\lambda_i)|$$

Krylov Methods Convergence Analysis: Norm of matrix polynomials
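A numerical check of this bound (my example; the polynomial $\wp(x) = x^2 - x$ is arbitrary):

```python
import numpy as np

# Verify ||p(M)||_2 <= cond(V) * max_i |p(lambda_i)| for p(x) = x^2 - x.
rng = np.random.default_rng(4)
M = rng.standard_normal((6, 6))
lam, V = np.linalg.eig(M)                   # M = V diag(lam) V^{-1}
pM = M @ M - M                              # p(M)
lhs = np.linalg.norm(pM, 2)
rhs = np.linalg.cond(V) * np.abs(lam**2 - lam).max()
assert lhs <= rhs + 1e-9
```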

Page 18: Krylov-Subspace Methods - II

1) The GCR algorithm converges to the exact solution in at most $n$ steps.
Proof: Let $\tilde\wp_n(x) = (1 - x/\lambda_1)(1 - x/\lambda_2)\cdots(1 - x/\lambda_n)$, where $\lambda_i \in \mathrm{spectrum}(M)$. Then $\max_i |\tilde\wp_n(\lambda_i)| = 0$, and therefore $\|\tilde\wp_n(M)\| = 0$ and $r^n = 0$.

2) If $M$ has only $q$ distinct eigenvalues, the GCR algorithm converges in at most $q$ steps.
Proof: Let $\tilde\wp_q(x) = (1 - x/\lambda_1)(1 - x/\lambda_2)\cdots(1 - x/\lambda_q)$.

Krylov Methods Convergence Analysis: Important observations
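A self-contained numerical check of observation 2 (my construction, not from the lecture): for a symmetric matrix whose only eigenvalues are 1 and 10, the 2nd-order residual polynomial $\tilde\wp_2(x) = (1 - x)(1 - x/10)$ annihilates $r^0$, so the optimal (GCR) residual after 2 steps can only be smaller.

```python
import numpy as np

# 50x50 symmetric matrix with exactly q = 2 distinct eigenvalues: 1 and 10.
rng = np.random.default_rng(3)
n = 50
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))   # random orthogonal basis
lam = np.where(np.arange(n) < n // 2, 1.0, 10.0)
M = Q @ np.diag(lam) @ Q.T
r0 = rng.standard_normal(n)
I = np.eye(n)
r2 = (I - M) @ (I - M / 10.0) @ r0   # tilde-p_2(M) r^0, with tilde-p_2(0) = 1
print(np.linalg.norm(r2))            # ~1e-13: exact convergence in q = 2 steps
```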

Page 19: Krylov-Subspace Methods - II

If $M = M^T$, then:

1) $M$ has orthonormal eigenvectors:
$$\mathrm{cond}(V) = \|[v_1\ \cdots\ v_n]\|\,\|[v_1\ \cdots\ v_n]^{-1}\| = 1,$$
so
$$\|\wp_k(M)\| \le \max_i |\wp_k(\lambda_i)|$$

2) $M$ has real eigenvalues.

If $M$ is positive definite, then $\lambda_i > 0$.

Krylov Methods Convergence Analysis: Convergence for $M^T = M$ – Residual Polynomial

Page 20: Krylov-Subspace Methods - II

[Plot: * = eigenvalues of $M$; curves show a 5th-order and an 8th-order residual polynomial.]

Krylov Methods Convergence Analysis: Residual Polynomial Picture (n = 10)

Page 21: Krylov-Subspace Methods - II

Keep $|\tilde\wp_k(\lambda_i)|$ as small as possible: strategically place the zeros of the polynomial.

Krylov Methods Convergence Analysis: Residual Polynomial Picture (n = 10)

Page 22: Krylov-Subspace Methods - II

Consider $\lambda \in [\lambda_{\min}, \lambda_{\max}]$, $\lambda_{\min} > 0$. Then a good polynomial (one for which $|\wp_k(\lambda)|$ is small) can be found by solving the min-max problem
$$\min_{\substack{k\text{th-order polys } \wp_k \\ \text{s.t. } \wp_k(0) = 1}}\ \ \max_{x \in [\lambda_{\min}, \lambda_{\max}]} |\wp_k(x)|$$

The min-max problem is exactly solved by Chebyshev polynomials.

Krylov Methods Convergence Analysis: Convergence for $M^T = M$ – Polynomial min-max problem

Page 23: Krylov-Subspace Methods - II

The Chebyshev polynomial:
$$C_k(x) = \cos\!\big(k \cos^{-1} x\big), \qquad x \in [-1, 1]$$

$$\min_{\substack{k\text{th-order polys } \wp_k \\ \text{s.t. } \wp_k(0) = 1}}\ \ \max_{x \in [\lambda_{\min}, \lambda_{\max}]} |\wp_k(x)|
\quad \text{is achieved by} \quad
\wp_k(x) = \frac{C_k\!\left(1 - 2\,\dfrac{x - \lambda_{\min}}{\lambda_{\max} - \lambda_{\min}}\right)}{C_k\!\left(1 + 2\,\dfrac{\lambda_{\min}}{\lambda_{\max} - \lambda_{\min}}\right)}$$

Krylov Methods Convergence Analysis: Convergence for $M^T = M$ – Chebyshev solves min-max

Page 24: Krylov-Subspace Methods - II

[Plot: Chebyshev polynomials minimizing over $[1, 10]$.]

Page 25: Krylov-Subspace Methods - II

$$\min_{\substack{k\text{th-order polys } \wp_k \\ \text{s.t. } \wp_k(0) = 1}}\ \ \max_{x \in [\lambda_{\min}, \lambda_{\max}]} |\wp_k(x)|
= \frac{1}{C_k\!\left(1 + 2\,\dfrac{\lambda_{\min}}{\lambda_{\max} - \lambda_{\min}}\right)}
\ \le\ 2 \left( \frac{\sqrt{\lambda_{\max}/\lambda_{\min}} - 1}{\sqrt{\lambda_{\max}/\lambda_{\min}} + 1} \right)^{k}$$

Krylov Methods Convergence Analysis: Convergence for $M^T = M$ – Chebyshev bounds
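A sketch evaluating this bound numerically (the eigenvalue interval $[1, 10]$ below is a made-up example). For $x \ge 1$, $C_k(x) = \cosh(k \cosh^{-1} x)$, which lets us compare the exact min-max value against the simpler upper bound.

```python
import numpy as np

# Exact Chebyshev min-max value vs. the 2*((sqrt(k)-1)/(sqrt(k)+1))^k bound.
def cheb_bound(lmin, lmax, k):
    # C_k evaluated outside [-1,1] via cosh(k * arccosh(x)).
    exact = 1.0 / np.cosh(k * np.arccosh(1 + 2 * lmin / (lmax - lmin)))
    s = np.sqrt(lmax / lmin)                 # sqrt of the condition number
    simple = 2 * ((s - 1) / (s + 1))**k
    return exact, simple

for k in (1, 5, 10, 20):
    print(k, cheb_bound(1.0, 10.0, k))       # exact value <= upper bound
```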

Page 26: Krylov-Subspace Methods - II

If $\lambda(M) \in [\lambda_{\min}, \lambda_{\max}]$, $\lambda_{\min} > 0$, then
$$\|r^{k}\| \ \le\ 2 \left( \frac{\sqrt{\lambda_{\max}/\lambda_{\min}} - 1}{\sqrt{\lambda_{\max}/\lambda_{\min}} + 1} \right)^{k} \|r^{0}\|$$

Krylov Methods Convergence Analysis: Convergence for $M^T = M$ – Chebyshev result

Page 27: Krylov-Subspace Methods - II

$$M_1 = \begin{bmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & & \vdots \\ \vdots & & \ddots & 0 \\ 0 & \cdots & 0 & 1 \end{bmatrix}
\qquad
M_2 = \begin{bmatrix} 1 & 0 & \cdots & 0 \\ 0 & 2 & & \vdots \\ \vdots & & \ddots & 0 \\ 0 & \cdots & 0 & N \end{bmatrix}$$

For which problem will GCR converge faster?

Krylov Methods Convergence Analysis: Examples

Page 28: Krylov-Subspace Methods - II

[Plot: $\|r^k\| / \|r^0\|$ versus iteration, for several convergence curves.]

Which convergence curve is GCR?

Page 29: Krylov-Subspace Methods - II

The GCR algorithm can eliminate outlying eigenvalues by placing polynomial zeros directly on them.

Krylov Methods Convergence Analysis: Chebyshev is a bound

Page 30: Krylov-Subspace Methods - II

Iterative Methods - CG

$$x^{(k+1)} = x^{(k)} + \alpha_k\, d^{(k)}$$
$$r^{(k+1)} = r^{(k)} - \alpha_k\, M d^{(k)}$$
$$\alpha_k = \frac{r^{(k)T}\, r^{(k)}}{d^{(k)T}\, M d^{(k)}}$$
$$d^{(k+1)} = r^{(k+1)} + \beta_k\, d^{(k)}$$
$$\beta_k = \frac{r^{(k+1)T}\, r^{(k+1)}}{r^{(k)T}\, r^{(k)}}$$

Convergence is related to:
– the number of distinct eigenvalues
– the ratio between the max and min eigenvalues

Why? How? Now we know.
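A minimal CG sketch matching the update formulas above (illustrative only; it assumes $M$ is symmetric positive definite, as CG requires). Unlike GCR, each step needs only one matrix-vector product and two inner products, with no growing list of stored directions.

```python
import numpy as np

# Conjugate gradient for M x = b, M symmetric positive definite.
def cg(M, b, tol=1e-10, max_iter=None):
    n = len(b)
    x = np.zeros(n)
    r = b.astype(float)               # r^(0) = b - M x^(0), with x^(0) = 0
    d = r.copy()                      # first search direction d^(0) = r^(0)
    rho = r @ r                       # r^(k)T r^(k)
    for _ in range(max_iter if max_iter is not None else n):
        Md = M @ d
        alpha = rho / (d @ Md)        # step length alpha_k
        x += alpha * d                # update solution
        r -= alpha * Md               # update residual
        rho_new = r @ r
        if np.sqrt(rho_new) <= tol * np.linalg.norm(b):
            break
        beta = rho_new / rho          # beta_k from consecutive residual norms
        d = r + beta * d              # new M-conjugate search direction
        rho = rho_new
    return x
```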

Page 31: Krylov-Subspace Methods - II

• Reminder about GCR
  – Residual minimizing solution
  – Krylov subspace
  – Polynomial connection
• Review of eigenvalues
  – Induced norms bound the spectral radius
  – Spectral mapping theorem
• Estimating the convergence rate
  – Chebyshev polynomials

Summary

