Date posted: 26-Jun-2015
Category: Technology
Uploaded by: matthew-leingang
Lesson 31: First Order, Higher Dimensional Difference Equations
Math 20
April 30, 2007
Announcements
- PS 12 due Wednesday, May 2
- MT III Friday, May 4 in SC Hall A
- Final Exam: Friday, May 25 at 9:15am, Boylston 110 (Fong Auditorium)
Outline
- Recap
- Higher dimensional linear systems
  - Examples: Markov Chains, Population Dynamics
  - Solution
  - Qualitative Analysis: Diagonal systems; Examples
- Higher dimensional nonlinear systems

One-dimensional linear difference equations
Fact
The solution to the inhomogeneous difference equation

    y_{k+1} = a y_k + b

(with a ≠ 1) is

    y_k = a^k ( y_0 − b/(1−a) ) + b/(1−a)
Please try not to memorize this. When a and b have actual values, it's easier to follow this process:
1. Start with a^k times an undetermined parameter c (this satisfies the homogenized equation).
2. Find the equilibrium value y*.
3. Add the two and pick c to match y_0 when k = 0.
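The three-step process above can be sketched in a few lines. This is a minimal illustration with example values a = 0.5, b = 3, y_0 = 10 (not from the lecture), checked against direct iteration of the recurrence:

```python
# Three-step process for y_{k+1} = a*y_k + b, with illustrative values.
a, b, y0 = 0.5, 3.0, 10.0

# Step 2: the equilibrium y* solves y* = a*y* + b
y_star = b / (1 - a)      # here 6.0

# Step 3: pick c so that c*a^0 + y* = y0
c = y0 - y_star           # here 4.0

def y(k):
    """Step 1 + 2 + 3 assembled: c*a^k plus the equilibrium."""
    return c * a**k + y_star

# Sanity check: the assembled formula matches the recurrence itself.
yk = y0
for k in range(1, 11):
    yk = a * yk + b
    assert abs(yk - y(k)) < 1e-12
```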
Nonlinear equations
Fact
The equilibrium point y* of the nonlinear difference equation y_{k+1} = g(y_k) is stable if |g'(y*)| < 1.

[Cobweb diagram: iterates y_0, y_1, y_2 plotted against the lines of slope 1 and −1; stability depends on whether the slope g'(y*) lies between −1 and 1.]
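The stability criterion can be seen numerically. A small sketch using the illustrative map g(y) = cos(y) (not from the lecture), whose equilibrium is the solution of cos(y*) = y*:

```python
import math

# Illustrative nonlinear map: g(y) = cos(y).
# Its equilibrium y* solves cos(y*) = y* (approximately 0.739).
g = math.cos
dg = lambda y: -math.sin(y)   # derivative of g

# Iterate the map; since |g'(y*)| < 1, the iterates converge to y*.
y = 1.0
for _ in range(200):
    y = g(y)

assert abs(g(y) - y) < 1e-10   # y is (numerically) an equilibrium
assert abs(dg(y)) < 1          # |g'(y*)| < 1, so the equilibrium is stable
```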
Let's kick it up a notch and look at the multivariable, linear, homogeneous difference equation

    y(k+1) = A y(k)

(we move the index into parentheses to allow y(k) to have coordinates and to avoid writing y_{k,i}).
Skipping class

Example
This example was a Markov chain with transition matrix

    A = [ 0.7  0.8 ]
        [ 0.3  0.2 ]

Then the probability of going or skipping on day k satisfies the equation

    p(k+1) = A p(k)
Example
Female lobsters have more eggs each season the longer they live. For this reason, it is illegal to keep a lobster that has laid eggs.
Let y_i be the number of lobsters in a fishery which are i years old. Then the difference equation might have the simplified form

    y(k+1) = [  0   100  400  700 ]
             [ 0.1   0    0    0  ]  y(k)
             [  0   0.3   0    0  ]
             [  0    0   0.9   0  ]
Mmmm... Lobster
Formal solution

    y(1) = A y(0)
    y(2) = A y(1) = A² y(0)
    y(3) = A y(2) = A³ y(0)

So

Fact
The solution to the homogeneous system of linear difference equations y(k+1) = A y(k) is

    y(k) = A^k y(0)
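The formal solution is easy to verify numerically. A minimal check (the initial vector is an arbitrary illustrative choice) comparing the matrix-power formula against step-by-step iteration:

```python
import numpy as np

# Check y(k) = A^k y(0) against iterating y(k+1) = A y(k).
A = np.array([[0.7, 0.8],
              [0.3, 0.2]])
y0 = np.array([2.0, -1.0])   # arbitrary initial vector for illustration

k = 7
y_iter = y0.copy()
for _ in range(k):
    y_iter = A @ y_iter      # one step of the difference equation

y_formula = np.linalg.matrix_power(A, k) @ y0
assert np.allclose(y_iter, y_formula)
```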
Flop count
- Multiplying two n×n matrices takes n³ multiplications and n²(n−1) additions, about 2n³ flops in all (flop = floating-point operation).
- So finding A^k by repeated multiplication takes about 2n³k flops!
Now what?
Suppose v is an eigenvector of A with eigenvalue λ. Then the solution to the problem

    y(k+1) = A y(k),   y(0) = v

is

    y(k) = λ^k v

Suppose

    y(0) = c_1 v_1 + c_2 v_2 + ··· + c_m v_m

Then

    A y(0) = c_1 λ_1 v_1 + c_2 λ_2 v_2 + ··· + c_m λ_m v_m
    A² y(0) = c_1 λ_1² v_1 + c_2 λ_2² v_2 + ··· + c_m λ_m² v_m

If A is diagonalizable, we can take m = n and write any initial vector as a linear combination of eigenvectors.
The big picture

Fact
Let A have a complete system of eigenvalues and eigenvectors λ_1, λ_2, ..., λ_n and v_1, v_2, ..., v_n. Then the solution to the difference equation y(k+1) = A y(k) is

    y(k) = A^k y(0) = c_1 λ_1^k v_1 + c_2 λ_2^k v_2 + ··· + c_n λ_n^k v_n

where c_1, c_2, ..., c_n are chosen to make

    y(0) = c_1 v_1 + c_2 v_2 + ··· + c_n v_n
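This recipe translates directly into code. A minimal sketch (the matrix and initial vector are illustrative): solve for the c_i, assemble the eigenvalue formula, and check it against A^k y(0):

```python
import numpy as np

# Solve y(k+1) = A y(k) via eigenvalues/eigenvectors.
A = np.array([[0.7, 0.8],
              [0.3, 0.2]])
y0 = np.array([1.0, 0.0])

lam, V = np.linalg.eig(A)      # columns of V are the eigenvectors v_i
c = np.linalg.solve(V, y0)     # coefficients with y(0) = sum_i c_i v_i

k = 10
y_eig = V @ (c * lam**k)       # sum_i c_i * lam_i^k * v_i
y_pow = np.linalg.matrix_power(A, k) @ y0
assert np.allclose(y_eig, y_pow)
```

Note the payoff: once the c_i are known, each new k costs only scalar powers, not another matrix multiplication.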
Iterating diagonal systems

Consider a 2×2 matrix of the form

    D = [ λ_1   0  ]
        [  0   λ_2 ]

Then the λ's determine the behavior of the system.

Picture in terms of eigenvalues
- λ_1 > λ_2 > 1: repulsion away from the origin
- 1 > λ_1 > λ_2 > 0: attraction to the origin
- λ_1 > 1 > λ_2: saddle point
For negative eigenvalues, just square them and use the above results.
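The saddle case is easy to see numerically. A minimal sketch with example values λ_1 = 2 and λ_2 = 0.5 (chosen for illustration): the iterates blow up along one axis and collapse along the other:

```python
import numpy as np

# Iterating a diagonal system: each lambda acts independently on its axis.
# Saddle case: lambda_1 = 2 > 1 > lambda_2 = 0.5.
D = np.diag([2.0, 0.5])
y = np.array([1.0, 1.0])

for _ in range(20):
    y = D @ y

assert y[0] == 2.0**20      # repelled along the first axis
assert abs(y[1]) < 1e-5     # attracted toward 0 along the second
```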
Back to skipping class

Example
If

    A = [ 0.7  0.8 ]
        [ 0.3  0.2 ]

the eigenvectors (in decreasing order of eigenvalue absolute value) are [8/11, 3/11]^T with eigenvalue 1 and [−1/2, 1/2]^T with eigenvalue −1/10. So the system converges to a multiple of [8/11, 3/11]^T.
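These eigenvalues and eigenvectors can be confirmed numerically, sorting by decreasing absolute value and rescaling the dominant eigenvector so its entries sum to 1 (as a probability vector):

```python
import numpy as np

# Numerically confirm the eigen-data for the skipping-class matrix.
A = np.array([[0.7, 0.8],
              [0.3, 0.2]])
lam, V = np.linalg.eig(A)
order = np.argsort(-np.abs(lam))    # decreasing |lambda|
lam, V = lam[order], V[:, order]

assert np.allclose(lam, [1.0, -0.1])

# Eigenvector for lambda = 1, scaled so its entries sum to 1:
v1 = V[:, 0] / V[:, 0].sum()
assert np.allclose(v1, [8/11, 3/11])
```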
Back to the lobsters

We had

    A = [  0   100  400  700 ]
        [ 0.1   0    0    0  ]
        [  0   0.3   0    0  ]
        [  0    0   0.9   0  ]

The eigenvalues are 3.80293, −2.84895, −0.476993 + 1.23164i, and −0.476993 − 1.23164i, and the first eigenvector is [0.999716, 0.0233099, 0.00489153]^T.
The population will grow despite the increased harvesting!
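The conclusion rests on the dominant eigenvalue exceeding 1, which is quick to check numerically:

```python
import numpy as np

# Dominant eigenvalue of the lobster Leslie matrix: > 1 means growth.
A = np.array([[0.0, 100.0, 400.0, 700.0],
              [0.1,   0.0,   0.0,   0.0],
              [0.0,   0.3,   0.0,   0.0],
              [0.0,   0.0,   0.9,   0.0]])
lam = np.linalg.eigvals(A)
growth = max(abs(lam))

assert abs(growth - 3.80293) < 1e-4   # matches the quoted dominant eigenvalue
assert growth > 1                     # so the population grows
```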
The nonlinear case

Consider now the nonlinear system

    y(k+1) = g(y(k)).

The process is as it was in the one-dimensional nonlinear case:
1. Look for equilibria y* with g(y*) = y*.
2. Linearize about the equilibrium using the matrix

    A = Dg(y*) = ( ∂g_i / ∂y_j )

3. The eigenvalues of A determine the stability of y*: it is stable if every eigenvalue of A has absolute value less than 1.
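The three steps can be sketched for an illustrative 2-D map (the map g below is an assumption, not from the lecture), approximating the Jacobian by central finite differences:

```python
import numpy as np

# Linearized stability for y(k+1) = g(y(k)) at an equilibrium y*.
# Illustrative map: g(y) = (0.5*y1 + 0.1*y2^2, 0.3*y1*y2).
def g(y):
    return np.array([0.5*y[0] + 0.1*y[1]**2, 0.3*y[0]*y[1]])

# Step 1: the origin is an equilibrium, since g(0) = 0.
y_star = np.zeros(2)
assert np.allclose(g(y_star), y_star)

# Step 2: Jacobian A = (dg_i/dy_j), here by central finite differences.
def jacobian(f, y, h=1e-6):
    n = len(y)
    J = np.zeros((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = h
        J[:, j] = (f(y + e) - f(y - e)) / (2 * h)
    return J

A = jacobian(g, y_star)

# Step 3: all eigenvalues inside the unit circle => y* is stable.
assert max(abs(np.linalg.eigvals(A))) < 1
```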