Post on 06-Feb-2016
MA2213 Lecture 8
Eigenvectors
Application of Eigenvectors
Vufoil 18, lecture 7: The Fibonacci sequence satisfies
$$s_{n+1} = s_n + s_{n-1}, \qquad s_1 = 1,\ s_2 = 1,\ s_3 = 2,\ s_4 = 3,\ s_5 = 5,\ \ldots$$
equivalently
$$\begin{pmatrix} s_{n+1} \\ s_n \end{pmatrix} = \begin{pmatrix} 1 & 1 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} s_n \\ s_{n-1} \end{pmatrix}$$
so $s_n = c_1 \lambda_1^n + c_2 \lambda_2^n$, where $\lambda_{1,2} = \frac{1 \pm \sqrt{5}}{2}$ are the eigenvalues of the matrix above.

Fibonacci Ratio Sequence
$$\lim_{n\to\infty} \frac{s_{n+1}}{s_n} = \lim_{n\to\infty} \frac{c_1 \lambda_1^{n+1} + c_2 \lambda_2^{n+1}}{c_1 \lambda_1^n + c_2 \lambda_2^n} = \lambda_1 = \frac{1+\sqrt{5}}{2}$$
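The Fibonacci recurrence and the golden-ratio limit can be checked numerically. A minimal sketch in Python/numpy (the lecture's own snippets use MATLAB; this translation and the name `fib` are mine):

```python
import numpy as np

# Fibonacci via the matrix recurrence (s_{n+1}, s_n) = [[1,1],[1,0]] (s_n, s_{n-1})
F = np.array([[1.0, 1.0],
              [1.0, 0.0]])

def fib(n):
    """Return s_n with s_1 = s_2 = 1, by repeated multiplication with F."""
    v = np.array([1.0, 1.0])      # (s_2, s_1)
    for _ in range(n - 2):
        v = F @ v                 # (s_{k+1}, s_k) -> (s_{k+2}, s_{k+1})
    return v[0] if n >= 2 else 1.0

# The ratio s_{n+1}/s_n approaches the dominant eigenvalue (1 + sqrt(5))/2
golden = (1 + np.sqrt(5)) / 2
ratio = fib(21) / fib(20)
```
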
Another Biomathematics Application
Leonardo da Pisa, better known as Fibonacci, invented his famous sequence to compute the reproductive success of rabbits.* Similar sequences describe the frequencies in males and females of a sex-linked gene. For genes (2 alleles) carried in the X chromosome,**
$$u_{n+1} = \tfrac{1}{2}(u_n + v_n), \qquad v_{n+1} = u_n, \qquad n = 0, 1, 2, \ldots$$
The solution has the form
$$u_n = c_1 + c_2 \left(-\tfrac{1}{2}\right)^n, \qquad v_n = c_1 - 2 c_2 \left(-\tfrac{1}{2}\right)^n$$
where $c_1 = (2u_0 + v_0)/3$ and $c_2 = (u_0 - v_0)/3$.

*page i, **pages 10-12 in The Theory of Evolution and Dynamical Systems, J. Hofbauer and K. Sigmund, 1984.
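The closed form can be verified against direct iteration of the recurrence. A minimal sketch in Python/numpy (the lecture uses MATLAB; this translation and the function names are mine): with $u_n, v_n$ the gene frequencies in females and males, iterate $u_{n+1} = (u_n + v_n)/2$, $v_{n+1} = u_n$ and compare with $u_n = c_1 + c_2(-1/2)^n$, $v_n = c_1 - 2c_2(-1/2)^n$.

```python
import numpy as np

def iterate(u0, v0, n):
    """Iterate the sex-linked gene recurrence n times."""
    u, v = u0, v0
    for _ in range(n):
        u, v = (u + v) / 2.0, u
    return u, v

def closed_form(u0, v0, n):
    """Closed-form solution via the eigenvalues 1 and -1/2 of [[1/2,1/2],[1,0]]."""
    c1 = (2 * u0 + v0) / 3.0
    c2 = (u0 - v0) / 3.0
    lam = -0.5
    return c1 + c2 * lam**n, c1 - 2 * c2 * lam**n

u7, v7 = iterate(0.9, 0.3, 7)
u7c, v7c = closed_form(0.9, 0.3, 7)
```

Both frequencies converge to the equilibrium $c_1 = (2u_0 + v_0)/3$, since the $(-1/2)^n$ term dies out.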
Eigenvector Problem (pages 333-351)
Recall that if $A$ is a square matrix, then a nonzero vector $v$ is an eigenvector corresponding to the eigenvalue $\lambda$ if $Av = \lambda v$.
Eigenvectors and eigenvalues arise in biomathematics, where they describe growth and population genetics.
They arise in physical problems, especially those that involve vibrations, in which eigenvalues are related to vibration frequencies.
They arise in the numerical solution of linear equations because they determine convergence properties.
Example 7.2.1, pages 333-334
For
$$A = \begin{pmatrix} 1.25 & 0.75 \\ 0.75 & 1.25 \end{pmatrix}$$
the eigenvalue-eigenvector pairs are
$$\lambda_1 = 2,\ v^{(1)} = \begin{pmatrix} 1 \\ 1 \end{pmatrix} \qquad\text{and}\qquad \lambda_2 = 0.5,\ v^{(2)} = \begin{pmatrix} -1 \\ 1 \end{pmatrix}$$
We observe that every (column) vector $x$ satisfies $x = c_1 v^{(1)} + c_2 v^{(2)}$, where $c_1 = (x_1 + x_2)/2$ and $c_2 = (x_2 - x_1)/2$.
Example 7.2.1, pages 333-334
Therefore, since $x \mapsto Ax$ is a linear transformation,
$$Ax = A(c_1 v^{(1)} + c_2 v^{(2)}) = c_1 A v^{(1)} + c_2 A v^{(2)}$$
and since $v^{(1)}, v^{(2)}$ are eigenvectors,
$$c_1 A v^{(1)} + c_2 A v^{(2)} = c_1 \lambda_1 v^{(1)} + c_2 \lambda_2 v^{(2)}$$
We can repeat this process to obtain
$$A^2 x = A(c_1 \lambda_1 v^{(1)} + c_2 \lambda_2 v^{(2)}) = c_1 \lambda_1^2 v^{(1)} + c_2 \lambda_2^2 v^{(2)}$$
$$\vdots$$
$$A^n x = c_1 \lambda_1^n v^{(1)} + c_2 \lambda_2^n v^{(2)} = 2^n c_1 v^{(1)} + (0.5)^n c_2 v^{(2)}$$
Question: What happens as $n \to \infty$?
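As $n$ grows, the $2^n$ term dominates, so $A^n x$ (after normalization) lines up with the eigenvector $(1,1)^T$. A minimal numerical check in Python/numpy (my translation; the lecture uses MATLAB):

```python
import numpy as np

A = np.array([[1.25, 0.75],
              [0.75, 1.25]])

# A^n x = 2^n c1 (1,1)^T + 0.5^n c2 (-1,1)^T: the 2^n term dominates,
# so repeatedly multiplying by A and normalizing converges to (1,1)/sqrt(2).
x = np.array([3.0, -1.0])         # here c1 = 1, c2 = -2
for _ in range(30):
    x = A @ x
x = x / np.linalg.norm(x)
```
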
Example 7.2.1, pages 333-334
General Principle: If a vector $v$ can be expressed as a linear combination of eigenvectors of a matrix $A$, then it is very easy to compute $Av$.

It is possible to express every vector as a linear combination of eigenvectors of an $n \times n$ matrix $A$ iff either of the following equivalent conditions is satisfied:
(i) there exists a basis consisting of eigenvectors of $A$
(ii) the sum of the dimensions of the eigenspaces of $A$ equals $n$

Question: Does this condition hold for $J = \begin{pmatrix} 5 & 1 \\ 0 & 5 \end{pmatrix}$?
Question: What special form does this matrix have?
Example 7.2.1, pages 333-334
The characteristic polynomial of
$$J = \begin{pmatrix} 5 & 1 \\ 0 & 5 \end{pmatrix}$$
is
$$\det(zI - J) = \det \begin{pmatrix} z-5 & -1 \\ 0 & z-5 \end{pmatrix} = (z-5)^2$$
so 5 is the (only) eigenvalue; it has algebraic multiplicity 2.
$$Jv = 5v \;\Rightarrow\; \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} \;\Rightarrow\; v_2 = 0$$
so the eigenspace for eigenvalue 5 has dimension 1; the eigenvalue 5 is said to have geometric multiplicity 1.
Question: What are the algebraic and geometric multiplicities in Example 7.2.7?
Characteristic Polynomials, pp. 335-337
Example 7.2.2 (p. 335) The eigenvalue-eigenvector pairs of the matrix
$$A = \begin{pmatrix} 1.25 & 0.75 \\ 0.75 & 1.25 \end{pmatrix}$$
in Example 7.2.1 are obtained from the characteristic polynomial
$$\det(zI - A) = \det \begin{pmatrix} z-1.25 & -0.75 \\ -0.75 & z-1.25 \end{pmatrix} = z^2 - 2.5z + 1$$
whose roots give the eigenvalues $\lambda(A) = \{2, 0.5\}$; the corresponding eigenvectors $v^{(1)}, v^{(2)}$ satisfy
$$(2I - A)\,v^{(1)} = \begin{pmatrix} 0.75 & -0.75 \\ -0.75 & 0.75 \end{pmatrix} v^{(1)} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}, \qquad v^{(1)} = \alpha \begin{pmatrix} 1 \\ 1 \end{pmatrix},\ \alpha \in R$$
Question: What is the equation for $v^{(2)}$?
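The roots of the characteristic polynomial should agree with the eigenvalues computed directly. A small sketch in Python/numpy (my translation; the lecture itself uses MATLAB's `eig`):

```python
import numpy as np

A = np.array([[1.25, 0.75],
              [0.75, 1.25]])

# Characteristic polynomial z^2 - 2.5 z + 1 of A; its roots are the eigenvalues.
coeffs = [1.0, -2.5, 1.0]
roots = np.sort(np.roots(coeffs).real)       # ascending: [0.5, 2.0]
eigs = np.linalg.eigvalsh(A)                 # eigvalsh: symmetric case, ascending
```
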
Eigenvalues of Symmetric Matrices
The real symmetric matrices that we studied,
$$\begin{pmatrix} 1.25 & 0.75 \\ 0.75 & 1.25 \end{pmatrix}, \qquad \begin{pmatrix} 1 & 1 \\ 1 & 0 \end{pmatrix}$$
have real eigenvalues, and their eigenvectors corresponding to distinct eigenvalues are orthogonal.
Question: What are the eigenvalues of these matrices?
Question: What are the corresponding eigenvectors?
Question: Compute their scalar products $(u, v) = u^T v$.
Eigenvalues of Symmetric Matrices
Theorem 1. All eigenvalues of real symmetric matrices are real valued.
Proof. For a matrix $M$ with complex (or real) entries, let $\overline{M}$ denote the matrix whose entries are the complex conjugates of the entries of $M$.
Question: Prove that $M$ is real (all entries are real) iff $\overline{M} = M$.
Question: Prove that $v \in C^n,\ v \neq 0 \Rightarrow \overline{v}^T v > 0$.
Assume that $A \in R^{n \times n}$, $A^T = A$, $0 \neq v \in C^n$, $\lambda \in C$, $Av = \lambda v$, and observe that $A\overline{v} = \overline{\lambda}\,\overline{v}$, therefore
$$\lambda\, \overline{v}^T v = \overline{v}^T (A v) = (A \overline{v})^T v = \overline{\lambda}\, \overline{v}^T v$$
and $\overline{v}^T v > 0 \Rightarrow \lambda = \overline{\lambda} \Rightarrow \lambda \in R$.
Eigenvalues of Symmetric Matrices
Theorem 2. Eigenvectors of a real symmetric matrix that correspond to distinct eigenvalues are orthogonal.
Proof. Assume that $A \in R^{n \times n}$, $A^T = A$, $\lambda, \mu \in R$, $\lambda \neq \mu$, $v, w \in R^n$, $Av = \lambda v$, $Aw = \mu w$.
Then compute
$$\lambda\, w^T v = w^T (A v) = (A w)^T v = \mu\, w^T v$$
and observe that $\lambda \neq \mu \Rightarrow (w, v) = w^T v = 0$.
Orthogonal Matrices
Definition. A matrix $U \in R^{n \times n}$ is orthogonal if $U^T U = I$.
If $U$ is orthogonal, then
$$1 = \det(I) = \det(U^T U) = \det(U^T)\det(U) = [\det(U)]^2$$
therefore either $\det(U) = 1$ or $\det(U) = -1$, so $U$ is nonsingular and has an inverse $U^{-1}$, hence
$$U^T = U^T (U U^{-1}) = (U^T U) U^{-1} = U^{-1}$$
so $U U^T = U U^{-1} = I$.
Examples
$$\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}, \qquad \begin{pmatrix} \cos 2\theta & \sin 2\theta \\ \sin 2\theta & -\cos 2\theta \end{pmatrix}$$
Permutation Matrices
Definition. A matrix $M \in R^{n \times n}$ is called a permutation matrix if there exists a function (called a permutation)
$$p : \{1, 2, \ldots, n\} \to \{1, 2, \ldots, n\}$$
that is 1-to-1 (and therefore onto) such that
$$M_{i,\,p(i)} = 1, \qquad M_{i,j} = 0 \text{ if } j \neq p(i)$$
Examples
$$\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \quad \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \quad \begin{pmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{pmatrix}, \quad \begin{pmatrix} 0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix}$$
Question: Why is every permutation matrix orthogonal?
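The orthogonality of permutation matrices is easy to check numerically: each row has a single 1, so the rows form an orthonormal set. A small sketch in Python/numpy (the helper `perm_matrix` is my own name, not from the text):

```python
import numpy as np

def perm_matrix(p):
    """Permutation matrix M with M[i, p[i]] = 1, for a 0-based permutation p."""
    n = len(p)
    M = np.zeros((n, n))
    M[np.arange(n), p] = 1.0
    return M

P = perm_matrix([1, 2, 0])
# Rows (and columns) are distinct standard basis vectors, so P P^T = P^T P = I.
ok = np.allclose(P @ P.T, np.eye(3)) and np.allclose(P.T @ P, np.eye(3))
```
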
Eigenvalues of Symmetric Matrices
Theorem 7.2.4, pages 337-338. If $A \in R^{n \times n}$ is symmetric, then there exists a set $\{\lambda_i, v^{(i)}\},\ 1 \leq i \leq n$, of eigenvalue-eigenvector pairs:
$$A v^{(i)} = \lambda_i v^{(i)}, \qquad 1 \leq i \leq n$$
Proof. Uses Theorems 1 and 2 and a little linear algebra.
Choose the eigenvectors so that $(v^{(i)})^T v^{(i)} = 1,\ 1 \leq i \leq n$, construct the matrices
$$U = \begin{pmatrix} v^{(1)} & v^{(2)} & \cdots & v^{(n)} \end{pmatrix} \in R^{n \times n}, \qquad D = \begin{pmatrix} \lambda_1 & & 0 \\ & \ddots & \\ 0 & & \lambda_n \end{pmatrix} \in R^{n \times n}$$
and observe that $U^T U = I$ and $AU = UD$, so
$$A = U D U^{-1} = U D U^T$$
MATLAB EIG Command
>> help eig
 EIG    Eigenvalues and eigenvectors.
    E = EIG(X) is a vector containing the eigenvalues of a square matrix X.
    [V,D] = EIG(X) produces a diagonal matrix D of eigenvalues and a full matrix V whose columns are the corresponding eigenvectors so that X*V = V*D.
    [V,D] = EIG(X,'nobalance') performs the computation with balancing disabled, which sometimes gives more accurate results for certain problems with unusual scaling. If X is symmetric, EIG(X,'nobalance') is ignored since X is already balanced.
    E = EIG(A,B) is a vector containing the generalized eigenvalues of square matrices A and B.
    [V,D] = EIG(A,B) produces a diagonal matrix D of generalized eigenvalues and a full matrix V whose columns are the corresponding eigenvectors so that A*V = B*V*D.
    EIG(A,B,'chol') is the same as EIG(A,B) for symmetric A and symmetric positive definite B. It computes the generalized eigenvalues of A and B using the Cholesky factorization of B.
    EIG(A,B,'qz') ignores the symmetry of A and B and uses the QZ algorithm. In general, the two algorithms return the same result, however using the QZ algorithm may be more stable for certain problems. The flag is ignored when A and B are not symmetric.
    See also CONDEIG, EIGS.
MATLAB EIG Command
>> A = [-7 13 -16; 13 -10 13; -16 13 -7]
A =
    -7    13   -16
    13   -10    13
   -16    13    -7
>> [U,D] = eig(A);
>> U
U =
   -0.5774    0.4082    0.7071
    0.5774    0.8165   -0.0000
   -0.5774    0.4082   -0.7071
>> D
D =
  -36.0000         0         0
         0    3.0000         0
         0         0    9.0000
Example 7.2.3 page 336
>> A*U
ans =
   20.7846    1.2247    6.3640
  -20.7846    2.4495   -0.0000
   20.7846    1.2247   -6.3640
>> U*D
ans =
   20.7846    1.2247    6.3640
  -20.7846    2.4495   -0.0000
   20.7846    1.2247   -6.3640
Positive Definite Symmetric Matrices
Theorem 4. A symmetric matrix $A \in R^{n \times n}$ is (semi) positive definite iff all of its eigenvalues are positive (nonnegative). [lec4, slide 24]
Proof. Let $U, D \in R^{n \times n}$ be the orthogonal and diagonal matrices on the previous page that satisfy $A = U D U^T$. Then for every $w \in R^n$,
$$w^T A w = u^T D u = \sum_{i=1}^{n} \lambda_i u_i^2$$
where $u = U^T w$. Since $U^T$ is nonsingular, $w \neq 0 \Leftrightarrow u \neq 0$, therefore $A$ is (semi) positive definite iff
$$u \neq 0 \;\Rightarrow\; \sum_{i=1}^{n} \lambda_i u_i^2 > 0 \ (\geq 0)$$
Clearly this condition holds iff $\lambda_i > 0 \ (\geq 0),\ 1 \leq i \leq n$.
Singular Value Decomposition
Theorem 3. If $M \in R^{m \times n}$, then there exist orthogonal matrices $U \in R^{m \times m}$, $V \in R^{n \times n}$ such that
$$M = U S V^T$$
where $S \in R^{m \times n}$ has the form
$$S = \begin{pmatrix} \sigma_1 & & & 0 \\ & \ddots & & \\ & & \sigma_r & \\ 0 & & & 0 \end{pmatrix}, \qquad r = \operatorname{rank} M$$
The singular values satisfy $\sigma_j = \sqrt{\text{eigenvalue of } M^T M}$.
Proof Outline. Choose orthogonal $V \in R^{n \times n}$ and $U \in R^{m \times m}$ so that
$$D = V^T (M^T M) V \qquad\text{and}\qquad E = U^T (M M^T) U$$
are diagonal; then $S = U^T M V$ satisfies
$$S^T S = (U^T M V)^T (U^T M V) = V^T M^T M V = D$$
(try to finish).
MATLAB SVD Command
>> help svd
 SVD    Singular value decomposition.
    [U,S,V] = SVD(X) produces a diagonal matrix S, of the same dimension as X and with nonnegative diagonal elements in decreasing order, and unitary matrices U and V so that X = U*S*V'.
    S = SVD(X) returns a vector containing the singular values.
    [U,S,V] = SVD(X,0) produces the "economy size" decomposition. If X is m-by-n with m > n, then only the first n columns of U are computed and S is n-by-n.
    See also SVDS, GSVD.
MATLAB SVD Command
>> M = [ 0 1; 0.5 0.5 ]
M =
         0    1.0000
    0.5000    0.5000
>> [U,S,V] = svd(M)
U =
   -0.8507   -0.5257
   -0.5257    0.8507
S =
    1.1441         0
         0    0.4370
V =
   -0.2298    0.9732
   -0.9732   -0.2298
>> U*S*V'
ans =
    0.0000    1.0000
    0.5000    0.5000
SVD Algebra
$$V = \begin{pmatrix} v_1 & v_2 \end{pmatrix}, \quad U = \begin{pmatrix} u_1 & u_2 \end{pmatrix}, \quad S = \begin{pmatrix} \sigma_1 & 0 \\ 0 & \sigma_2 \end{pmatrix}, \quad M = U S V^T$$
so
$$M v_1 = U S V^T v_1 = U S \begin{pmatrix} 1 \\ 0 \end{pmatrix} = U \begin{pmatrix} \sigma_1 \\ 0 \end{pmatrix} = \sigma_1 u_1$$
$$M v_2 = U S V^T v_2 = U S \begin{pmatrix} 0 \\ 1 \end{pmatrix} = U \begin{pmatrix} 0 \\ \sigma_2 \end{pmatrix} = \sigma_2 u_2$$
SVD Geometry
$M$ maps the unit circle $= \{x = x_1 v_1 + x_2 v_2 : x_1^2 + x_2^2 = 1\}$ onto the ellipse
$$M(\text{circle}) = \text{ellipse} = \{y = y_1 u_1 + y_2 u_2 : y_1^2/\sigma_1^2 + y_2^2/\sigma_2^2 = 1\}$$
whose semi-axes are $\sigma_1 u_1$ and $\sigma_2 u_2$.
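This circle-to-ellipse picture can be checked numerically: writing $y = Mx$ in the coordinates of the left singular vectors, $(y_1/\sigma_1)^2 + (y_2/\sigma_2)^2 = 1$ for every unit vector $x$. A minimal sketch in Python/numpy (my translation of the MATLAB example matrix):

```python
import numpy as np

M = np.array([[0.0, 1.0],
              [0.5, 0.5]])
U, s, Vt = np.linalg.svd(M)          # M = U @ diag(s) @ Vt

# Sample points on the unit circle and map them by M.
theta = np.linspace(0.0, 2 * np.pi, 100)
X = np.vstack([np.cos(theta), np.sin(theta)])   # columns are unit vectors
Y = M @ X

# Coordinates of the images in the u1, u2 basis: y_coords = S Vt X,
# so (y1/s1, y2/s2) = Vt X is again a unit vector -- the ellipse equation.
y_coords = U.T @ Y
ellipse_eq = (y_coords[0] / s[0])**2 + (y_coords[1] / s[1])**2
```
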
Square Roots
Theorem 5. A symmetric positive definite matrix $A \in R^{n \times n}$ has a symmetric positive definite 'square root'.
Proof. Let $U, D \in R^{n \times n}$ be the orthogonal and diagonal matrices on the previous page that satisfy $A = U D U^T$. Then construct the matrix
$$S = \begin{pmatrix} \sqrt{\lambda_1} & & 0 \\ & \ddots & \\ 0 & & \sqrt{\lambda_n} \end{pmatrix}$$
and observe that $B = U S U^T$ is symmetric positive definite and satisfies
$$B^2 = (U S U^T)(U S U^T) = U S^2 U^T = U D U^T = A$$
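The proof is constructive, and the construction is short in code. A minimal sketch in Python/numpy (the function name `sym_sqrt` is mine; the lecture's computations are in MATLAB):

```python
import numpy as np

def sym_sqrt(A):
    """Symmetric positive definite square root B = U sqrt(D) U^T of A = U D U^T."""
    lam, U = np.linalg.eigh(A)       # eigh: eigendecomposition of a symmetric matrix
    return U @ np.diag(np.sqrt(lam)) @ U.T

A = np.array([[1.25, 0.75],
              [0.75, 1.25]])
B = sym_sqrt(A)
```
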
Polar Decomposition
Theorem 6. Every nonsingular matrix $M \in R^{n \times n}$ can be factored as $M = BU$, where $B$ is symmetric and positive definite and $U$ is orthogonal.
Proof. Construct $A = M M^T$ and observe that $A$ is symmetric and positive definite. Let $B$ be symmetric positive definite and satisfy $B^2 = A$, and construct $U = B^{-1} M$. Then
$$U^T U = M^T B^{-2} M = M^T (M M^T)^{-1} M = I$$
and clearly $BU = B B^{-1} M = M$.
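The proof again gives an algorithm: take the symmetric square root of $M M^T$ and divide it out. A minimal sketch in Python/numpy (the function name `polar` is mine):

```python
import numpy as np

def polar(M):
    """Factor nonsingular M as M = B U, with B symmetric positive definite and
    U orthogonal, via B = sqrt(M M^T) and U = B^{-1} M (as in the proof)."""
    A = M @ M.T
    lam, W = np.linalg.eigh(A)                 # A = W diag(lam) W^T
    B = W @ np.diag(np.sqrt(lam)) @ W.T        # symmetric square root of A
    U = np.linalg.solve(B, M)                  # U = B^{-1} M
    return B, U

M = np.array([[0.0, 1.0],
              [0.5, 0.5]])                     # nonsingular: det = -0.5
B, U = polar(M)
```
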
Löwdin Orthonormalization
(1) Per-Olov Löwdin, On the Non-Orthogonality Problem Connected with the use of Atomic Wave Functions in the Theory of Molecules and Crystals, J. Chem. Phys. 18, 367-370 (1950).
http://www.quantum-chemistry-history.com/Lowdin1.htm
Proof. Start with $v_1, v_2, \ldots, v_n$ (assumed to be linearly independent) in an inner product space and compute the Gram matrix
$$G_{ij} = (v_i, v_j), \qquad 1 \leq i, j \leq n$$
Since $G$ is symmetric and positive definite, Theorem 5 gives (and provides a method to compute) a matrix $B$ that is symmetric and positive definite with $B^2 = G$. Then
$$u_i = \sum_{j=1}^{n} (B^{-1})_{ji}\, v_j, \qquad 1 \leq i \leq n$$
are orthonormal.
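For vectors stored as the columns of a matrix $V$, the construction reads $U = V B^{-1}$ with $B = G^{1/2}$ and $G = V^T V$, since then $U^T U = B^{-1} G B^{-1} = I$. A minimal sketch in Python/numpy (the name `lowdin` is mine):

```python
import numpy as np

def lowdin(V):
    """Löwdin (symmetric) orthonormalization of the columns of V:
    return U = V G^{-1/2}, where G = V^T V is the Gram matrix."""
    G = V.T @ V
    lam, W = np.linalg.eigh(G)                       # G = W diag(lam) W^T
    B_inv = W @ np.diag(1.0 / np.sqrt(lam)) @ W.T    # G^{-1/2}
    return V @ B_inv

V = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [1.0, 2.0]])     # two linearly independent columns in R^3
U = lowdin(V)
```

Unlike Gram-Schmidt, this treats all the input vectors symmetrically, which is why it is preferred for atomic basis functions.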
The Power Method, pages 340-345
Finds the eigenvalue with largest absolute value of a matrix $A \in R^{n \times n}$ whose eigenvalues satisfy
$$|\lambda_1| > |\lambda_2| \geq \cdots \geq |\lambda_n|$$
Step 1: Compute a vector $z^{(0)}$ with random entries.
Step 2: Compute $w^{(1)} = A z^{(0)}$ and $\lambda_1^{(1)} = w_k^{(1)} / z_k^{(0)}$, where $k = \arg\max_{1 \leq i \leq n} |z_i^{(0)}|$.
Step 3: Compute $z^{(1)} = w^{(1)} / \|w^{(1)}\|$ (recall that $\|w^{(1)}\| = \max_{1 \leq i \leq n} |w_i^{(1)}|$).
Step 4: Compute $w^{(2)} = A z^{(1)}$ and $\lambda_1^{(2)} = w_k^{(2)} / z_k^{(1)}$, where $k = \arg\max_{1 \leq i \leq n} |z_i^{(1)}|$. Repeat.
Then $(\lambda_1^{(m)}, z^{(m)}) \to (\lambda_1, v^{(1)})$ with $A v^{(1)} = \lambda_1 v^{(1)}$.
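The steps above can be sketched compactly. A Python/numpy version (my translation; the lecture's implementation would be in MATLAB), using the same infinity-norm scaling:

```python
import numpy as np

def power_method(A, steps=50, seed=0):
    """Power method with infinity-norm scaling, following Steps 1-4 above."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(A.shape[0])        # Step 1: random start
    lam = 0.0
    for _ in range(steps):
        k = np.argmax(np.abs(z))               # index of largest entry of z
        w = A @ z                              # Steps 2/4: multiply by A
        lam = w[k] / z[k]                      # eigenvalue estimate
        z = w / np.max(np.abs(w))              # Step 3: rescale
    return lam, z

A = np.array([[1.25, 0.75],
              [0.75, 1.25]])                   # eigenvalues 2 and 0.5
lam, z = power_method(A)
```

Convergence is linear with ratio $|\lambda_2/\lambda_1|$, here $0.25$ per step.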
The Inverse Power Method
Result. If $v$ is an eigenvector of $A \in R^{n \times n}$ corresponding to eigenvalue $\lambda \in R$, then $v$ is an eigenvector of $A - \mu I$ corresponding to eigenvalue $\lambda - \mu$. Furthermore, if $\lambda - \mu \neq 0$, then $v$ is an eigenvector of $(A - \mu I)^{-1}$ corresponding to eigenvalue $(\lambda - \mu)^{-1}$.
Definition. The inverse power method is the power method applied to the matrix $A^{-1}$. It can find the eigenvalue-eigenvector pair if there is one eigenvalue that has smallest absolute value.
Inverse Power Method With Shifts
Computes the eigenvalue $\lambda$ of $A$ closest to $\mu_1 \in R$ and a corresponding eigenvector $v$.
Step 1: Apply 1 or more iterations of the power method using the matrix $(A - \mu_1 I)^{-1}$ to estimate an eigenvalue-eigenvector pair $((\lambda - \mu_1)^{-1}, v_1)$.
Step 2: Compute $\mu_2$, a better estimate of $\lambda$, from the pair in Step 1.
Step 3: Apply 1 or more iterations of the power method using the matrix $(A - \mu_2 I)^{-1}$ to estimate an eigenvalue-eigenvector pair $((\lambda - \mu_2)^{-1}, v_2)$, and iterate. Then $\mu_k \to \lambda$ and $v_k \to v$
with cubic rate of convergence!
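One way to realize these steps: since the power method on $(A - \mu I)^{-1}$ estimates $\theta \approx 1/(\lambda - \mu)$, the updated shift is $\mu + 1/\theta$. A Python/numpy sketch under those assumptions (the function name and the particular update rule are mine; they are one concrete reading of Steps 1-3):

```python
import numpy as np

def shifted_inverse_power(A, mu, sweeps=3, steps=2):
    """Inverse power method with shifts: each sweep runs a few power-method
    steps on (A - mu I)^{-1}, then moves mu toward the eigenvalue of A
    closest to the current shift."""
    n = A.shape[0]
    v = np.arange(1.0, n + 1)                          # fixed start vector
    for _ in range(sweeps):
        theta = 1.0
        for _ in range(steps):
            w = np.linalg.solve(A - mu * np.eye(n), v)  # apply (A - mu I)^{-1}
            k = np.argmax(np.abs(v))
            theta = w[k] / v[k]                         # estimates 1/(lambda - mu)
            v = w / np.max(np.abs(w))
        mu = mu + 1.0 / theta                           # improved eigenvalue estimate
    return mu, v

A = np.array([[1.25, 0.75],
              [0.75, 1.25]])        # eigenvalues 2 and 0.5
mu, v = shifted_inverse_power(A, 0.4)   # converges to the eigenvalue nearest 0.4
```
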
Unitary and Hermitian Matrices
Definition. The adjoint of a matrix $M \in C^{m \times n}$ is the matrix $M^* = \overline{M}^T$.
Example
$$M = \begin{pmatrix} 0 & 4 - 2i & 3 \\ 7i & 5 & 1 \end{pmatrix}, \qquad M^* = \begin{pmatrix} 0 & -7i \\ 4 + 2i & 5 \\ 3 & 1 \end{pmatrix}$$
Definition. A matrix $U \in C^{n \times n}$ is unitary if $U^* U = I$ (equivalently, $U^{-1} = U^*$).
Definition. A matrix $H \in C^{n \times n}$ is hermitian (or self-adjoint) if $H^* = H$.
Definition. A matrix $P \in C^{n \times n}$ is (semi) positive definite if $v \neq 0 \Rightarrow v^* P v > 0\ (\geq 0)$.
Super Theorem: All previous theorems remain true for complex matrices if orthogonal is replaced by unitary, symmetric by hermitian, and the old (semi) positive definite by the new one.
Homework Due Tutorial 5 (Week 11, 29 Oct – 2 Nov)
1. Do Problem 1 on page 348.
2. Read Convergence of the Power Method (pages 342-346) and do Problem 16 on page 350.
3. Do Problem 19 on pages 350-351.
4. Estimate the eigenvalue-eigenvector pairs of the matrix
$$M = \begin{pmatrix} 0 & 1 \\ 0.5 & 0.5 \end{pmatrix}$$
using the power and inverse power methods – use 4 iterations and compute errors.
5. Compute the eigenvalue-eigenvector pairs of the orthogonal matrix
$$O = \begin{pmatrix} \cos 2\theta & \sin 2\theta \\ \sin 2\theta & -\cos 2\theta \end{pmatrix}$$
6. Prove that the vectors $u_i,\ 1 \leq i \leq n$, defined at the bottom of slide 29 are orthonormal by computing their inner products $(u_i, u_j)$.
Extra Fun and Adventure
We have discussed several matrix decompositions :
LU, Eigenvector, Polar, Singular Value
Find out about other matrix decompositions. How are they derived / computed? What are their applications?