Markov Chains
Page 1: Markov Chains

X(t) is a Markov process if, for arbitrary times t1 < t2 < . . . < tk < tk+1 :

If X(t) is discrete-valued:
P[ X(tk+1) = xk+1 | X(tk) = xk , . . . , X(t1) = x1 ] = P[ X(tk+1) = xk+1 | X(tk) = xk ]

If X(t) is continuous-valued:
P[ a < X(tk+1) ≤ b | X(tk) = xk , . . . , X(t1) = x1 ] = P[ a < X(tk+1) ≤ b | X(tk) = xk ]

i.e. the future of the process depends only on the present and not on the past.

Page 2: Examples

1) Poisson process: N(t) = number of events in t seconds.
P[ N(tk+1) = xk+1 | N(tk) = xk , . . . , N(t1) = x1 ] = P[ N(tk+1) = xk+1 | N(tk) = xk ]
= P[ xk+1 - xk events in tk+1 - tk seconds ]
Clearly Markov.

2) Sum process: Sn = X1 + X2 + . . . + Xn , i.e. Sn = Sn-1 + Xn .
P[ Sn+1 = sn+1 | Sn = sn , . . . , S1 = s1 ] = P[ Sn+1 = sn+1 | Sn = sn ]
Clearly Markov.

Page 3:

3) Yn = Xn + Xn-1 , where the Xn are i.i.d. ~ Bernoulli(1/2). Is Yn Markov?

Clearly Yn ∈ { 0, 1, 2 } with
P[ Yn = 0 ] = P[ Xn = 0, Xn-1 = 0 ] = 1/4
P[ Yn = 1 ] = P[ Xn = 0, Xn-1 = 1 ] + P[ Xn = 1, Xn-1 = 0 ] = 1/2
P[ Yn = 2 ] = P[ Xn = 1, Xn-1 = 1 ] = 1/4

Since Yn = 2 needs Xn = Xn-1 = 1 , and Yn-1 = 1 given Xn-1 = 1 needs Xn-2 = 0 :
P[ Yn = 2 | Yn-1 = 1 ] = P[ Yn = 2, Yn-1 = 1 ] / P[ Yn-1 = 1 ]
= P[ Xn = 1, Xn-1 = 1, Xn-2 = 0 ] / P[ Yn-1 = 1 ]
= (1/8) / (1/2) = 1/4

Page 4:

But
P[ Yn = 2 | Yn-1 = 1, Yn-2 = 2 ] = P[ Yn = 2, Yn-1 = 1, Yn-2 = 2 ] / P[ Yn-1 = 1, Yn-2 = 2 ] = 0 ,
since the sequence (Yn-2, Yn-1, Yn) = (2, 1, 2) is impossible: Yn-2 = 2 forces Xn-2 = 1 , then Yn-1 = 1 forces Xn-1 = 0 , so Yn ≤ 1.

Hence P[ Yn = 2 | Yn-1 = 1, Yn-2 = 2 ] ≠ P[ Yn = 2 | Yn-1 = 1 ] = 1/4 ,
so Yn is not Markov.
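As a quick numerical check (not part of the original slides), a short simulation of Yn = Xn + Xn-1 estimates both conditional probabilities; the sample size and seed are arbitrary choices:

```python
import random

random.seed(1)

# Simulate Y_n = X_n + X_(n-1) with X_n i.i.d. Bernoulli(1/2)
N = 200_000
X = [random.randint(0, 1) for _ in range(N)]
Y = [X[n] + X[n - 1] for n in range(1, N)]

# Estimate P[Y_n = 2 | Y_(n-1) = 1] -- should be near 1/4
num = sum(1 for n in range(1, len(Y)) if Y[n] == 2 and Y[n - 1] == 1)
den = sum(1 for n in range(1, len(Y)) if Y[n - 1] == 1)
p_one_step = num / den

# Estimate P[Y_n = 2 | Y_(n-1) = 1, Y_(n-2) = 2] -- the sequence 2,1,2 never occurs
num2 = sum(1 for n in range(2, len(Y))
           if Y[n] == 2 and Y[n - 1] == 1 and Y[n - 2] == 2)
den2 = sum(1 for n in range(2, len(Y)) if Y[n - 1] == 1 and Y[n - 2] == 2)
p_two_step = num2 / den2

print(p_one_step, p_two_step)
```

The one-step estimate comes out near 1/4 while the two-step estimate is exactly 0, matching the argument that Yn is not Markov.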

Page 5: Markov Chains

Integer-valued Markov processes are called Markov chains. For a Markov chain the joint PMF factors as

P[ X(t1) = x1 , . . . , X(tk+1) = xk+1 ] = P[ X(t1) = x1 ] · Π (i = 1 to k) P[ X(ti+1) = xi+1 | X(ti) = xi ]

Examples:
• Sum process
• Counting process
• Random walk
• Poisson process

Markov chains can be discrete-time or continuous-time.

Page 6: Discrete-Time Markov Chains

Initial PMF : pj(0) = P( X0 = j ) ; j = 0, 1, . . .

Transition Probability Matrix: P = [ pij ] , where
pij = P[ Xn+1 = j | Xn = i ]  ∀ n
(homogeneous transition probabilities: independent of n)

Clearly Σj pij = 1.

For any state sequence i0, i1, . . . , ik :
P[ X0 = i0 , X1 = i1 , . . . , Xk = ik ] = pi0(0) · pi0,i1 · pi1,i2 · . . . · pik-1,ik
(initial PMF × state transitions)

Page 7:

e.g. Binomial counting process : Sn = Sn-1 + Xn , Xn ~ Bernoulli(p)

[State diagram: states 0, 1, 2, . . . , k, k+1 ; each state k moves to k+1 with probability p and stays with probability 1-p]

P =
[ 1-p   p    0    0   . . . ]
[  0   1-p   p    0   . . . ]
[  0    0   1-p   p   . . . ]
[ . . .                     ]
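A small sketch (my own illustration, with a truncation choice the slides do not make) builds a finite version of this TPM; since the real chain has infinitely many states, the last state is made absorbing so every row still sums to 1:

```python
import numpy as np

def binomial_counting_tpm(p, n_states):
    """Truncated TPM for S_n = S_(n-1) + X_n, X_n ~ Bernoulli(p):
    stay in k with prob 1-p, move k -> k+1 with prob p."""
    P = np.zeros((n_states, n_states))
    for k in range(n_states - 1):
        P[k, k] = 1 - p
        P[k, k + 1] = p
    # Truncation (an assumption for this sketch): make the last state
    # absorbing so each row remains a valid PMF.
    P[-1, -1] = 1.0
    return P

P = binomial_counting_tpm(p=0.3, n_states=5)
print(P)
```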

Page 8: n-step Transition Probabilities

pij(n) = P[ Xn = j | X0 = i ]

2-step probabilities ( i at time 0 → k at time 1 → j at time 2 ):
P[ X2 = j, X1 = k | X0 = i ] = P[ X2 = j | X1 = k ] · P[ X1 = k | X0 = i ] = pik pkj

Summing through all possible k's:
pij(2) = Σk pik pkj , i.e. the 2-step TPM is P(2) = P · P = P^2 , where P = TPM.

In general, since P is time-invariant, P(n) = P^n.

Page 9: State Probabilities

Denote the state PMF at step n by the row vector p(n) = [ p0(n) p1(n) . . . ].

pj(n) = P[ Xn = j ] = Σi P[ Xn = j | Xn-1 = i ] P[ Xn-1 = i ] = Σi pij pi(n-1)
( = prob. of being in state j at step n )

Then p(n) = p(n-1) P = p(0) P^n.

The PMF at any time can be obtained from the initial PMF and the TPM.
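The recursion p(n) = p(n-1) P and the closed form p(n) = p(0) P^n can be checked against each other numerically; the two-state matrix below is a made-up example, not from the slides:

```python
import numpy as np

# A small two-state chain (hypothetical numbers) to illustrate p(n) = p(0) P^n
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
p0 = np.array([1.0, 0.0])   # start in state 0 with probability 1

# Step-by-step recursion p(n) = p(n-1) P ...
p = p0.copy()
for _ in range(10):
    p = p @ P

# ... agrees with the closed form p(10) = p(0) P^10
p_direct = p0 @ np.linalg.matrix_power(P, 10)
print(p, p_direct)
```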

Page 10: Steady State Probabilities

In some cases, the probabilities pj(n) approach a fixed point as n → ∞ : pj(n) → πj.
The PMF π = [ π1 π2 . . . ] is called the stationary state PMF (SSP) of the chain.

Letting n → ∞ in p(n) = p(n-1) P gives
π = π P , i.e. πj = Σi πi pij , and of course Σi πi = 1.

If p(0) = π ( i.e. the initial state PMF is π ), then p(n) = π ∀ n , so the resulting process is stationary ; πj is independent of the initial time.

Note: If p(0) ≠ π , the process may not be stationary.

Not all Markov chains settle in to steady state, e.g. binomial counting.
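The fixed-point equations π = πP, Σ πi = 1 form a linear system, so π can be solved for directly; the matrix is the same hypothetical two-state example used above, not one from the slides:

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Solve pi = pi P together with sum(pi) = 1:
# stack (P^T - I) pi^T = 0 with the normalization row of ones.
n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.zeros(n + 1)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print(pi)      # stationary state PMF
print(pi @ P)  # equals pi again
```

For this P, the hand calculation 0.1 π0 = 0.4 π1 with π0 + π1 = 1 gives π = [0.8, 0.2], which the solver reproduces.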

Page 11: Classification of States (Discrete-Time Markov Chains)

* State j is accessible from state i if pij(n) > 0 for some n ≥ 0.

* States i and j communicate if i is accessible from j and j from i. This is denoted i ↔ j.

* i ↔ i

* If i ↔ j and j ↔ k , then i ↔ k.

Class: States i and j belong to the same class if i ↔ j.

If S = set of states, then for any Markov chain
S = Class1 ∪ Class2 ∪ . . . and Classk ∩ Classl = ∅ if k ≠ l.

If a Markov chain has only one class, it is called irreducible.

Page 12: Recurrence Properties

Let fi ≜ P( Xn ever returns to i | X0 = i ).

If fi = 1 , i is termed recurrent. If fi < 1 , i is termed transient.

If i is recurrent , X0 = i ⇒ infinite # of returns to i. If i is transient , X0 = i ⇒ finite # of returns to i.

Page 13:

Define In = 1 if Xn = i , 0 else.
Then the number of returns to i is Σ (n=1 to ∞) In , and

E[ Σn In | X0 = i ] = Σn E[ In | X0 = i ] = Σn P[ Xn = i | X0 = i ] = Σ (n=1 to ∞) pii(n)

i is recurrent iff Σn pii(n) = ∞
i is transient iff Σn pii(n) < ∞
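The Σn pii(n) criterion can be seen numerically on a tiny made-up chain (my example, not from the slides) in which state 0 leaks into an absorbing state 1, so 0 is transient and 1 is recurrent:

```python
import numpy as np

# State 0 is transient (it leaks to absorbing state 1); state 1 is recurrent.
P = np.array([[0.5, 0.5],
              [0.0, 1.0]])

# Partial sums of p_ii(n): bounded for a transient state,
# growing without bound (with the horizon) for a recurrent one.
total0 = sum(np.linalg.matrix_power(P, n)[0, 0] for n in range(1, 200))
total1 = sum(np.linalg.matrix_power(P, n)[1, 1] for n in range(1, 200))

print(total0, total1)  # total0 stays near 1; total1 equals the number of terms
```

Here p00(n) = 0.5^n, so the partial sums of the transient state converge to 1, while p11(n) = 1 for every n and the recurrent state's sum diverges.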

Page 14:

• If i is recurrent and i ∈ Classk , then all j ∈ Classk are recurrent. If i is transient , all j ∈ Classk are transient. i.e. recurrence and transience are class properties.

States of an irreducible Markov chain are either all transient or all recurrent.

• If the # of states < ∞ , all states cannot be transient ⇒ all states in a finite-state irreducible Markov chain are recurrent.

Periodicity:

If for state i , pii(n) = 0 except when n is a multiple of d , where d is the largest such integer , i is said to have period d. Period is also a class property.

An irreducible Markov chain is aperiodic if all of its states have period 1.

Page 15:

[Diagram: a 5-state chain partitioned into Class 1 (transient) and Class 2 (recurrent) — a non-irreducible Markov chain]

[Diagram: a 5-state chain in which all states communicate — an irreducible Markov chain]

[Diagram: states 0, 1, 2, 3, . . . , k, k+1 with only forward transitions — a non-irreducible Markov chain]

Page 16:

A typical periodic MC:

[Diagram: states 0, 1, 2, 3 with transition probabilities 1, 1, 1 and 1/2, 1/2]

Recurrence times for states 0, 1 = { 2, 4, 6, 8, . . . }
Recurrence times for states 2, 3 = { 4, 6, 8, . . . }

⇒ period = 2

Page 17:

Let X0 = i where i is a recurrent state.

Define Ti(k) ≜ interval between the (k-1)th and kth returns to i.

fraction of time spent in i after k returns = k / Σ (j=1 to k) Ti(j)

As k → ∞ , (1/k) Σj Ti(j) → E[ Ti ] (by the law of large numbers), so

fraction of time in i → 1 / E[ Ti ] ≜ πi ,

where πi is the long-term fraction of time spent in state i.

i Positive Recurrent: E( Ti ) < ∞ , πi > 0
i Null Recurrent: E( Ti ) = ∞ , πi = 0 (e.g. all states in a random walk with p = 0.5)

i is Ergodic if it is positive recurrent and aperiodic.

Ergodic Markov Chain: an irreducible, aperiodic, positive recurrent MC.

Page 18: Limiting Probabilities

The πj 's satisfy the rule for the stationary state PMF :

πj = Σi πi pij , Σj πj = 1   . . . (A)

This is because

long-term proportion of time in which j follows i
= (long-term proportion of time in i) · P( i → j ) = πi pij

and

long-term proportion of time in j
= Σi (long-term proportion of time in which j follows i)
= Σi πi pij = πj

Page 19:

Theorem: For an irreducible, aperiodic and positive recurrent Markov chain,

lim (n→∞) pij(n) = πj  ∀ i, j

where πj is the unique non-negative solution of (A).

i.e. steady state prob. of j = stationary state pmf = long-term fraction of time in j ⇒ Ergodicity.
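The theorem's claim that pij(n) loses its dependence on the starting state i can be seen by raising a TPM to a high power: every row of P^n converges to π. The two-state matrix is the same hypothetical example used earlier, not one from the slides:

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Rows of P^n are the PMFs p_ij(n) for each starting state i.
Pn = np.linalg.matrix_power(P, 50)
print(Pn)  # both rows are (numerically) the stationary PMF [0.8, 0.2]
```

The second eigenvalue of this P is 0.5, so the rows agree to machine precision well before n = 50.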

Page 20: Continuous-Time Markov Chains

Transition Probabilities:

P( X(s+t) = j | X(s) = i ) = P( X(t) = j | X(0) = i ) ≜ pij(t) , t ≥ 0

i.e. the transition probabilities depend only on t , not on s (time-invariant transition probabilities ⇒ homogeneous).

P(t) = TPM = matrix of pij(t) ∀ i, j

Clearly P(0) = I (identity matrix).

Page 21:

Ex 8.12 : Poisson Process (events at rate λ)

pij(t) = P[ N(s+t) = j | N(s) = i ]
= P[ j - i events in t seconds ]
= (λt)^(j-i) e^(-λt) / (j-i)!   for j ≥ i ; 0 otherwise

P(t) =
[ e^(-λt)   λt e^(-λt)   (λt)^2 e^(-λt)/2!   . . . ]
[    0       e^(-λt)       λt e^(-λt)        . . . ]
[    0          0            e^(-λt)         . . . ]
[ . . .                                            ]
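A direct translation of pij(t) into code (my sketch; the function name is mine) also lets us confirm that each row of P(t) sums to 1:

```python
import math

def poisson_pij(lam, t, i, j):
    """p_ij(t) = P[N(s+t) = j | N(s) = i] for a Poisson process of rate lam:
    the probability of j - i events in t seconds."""
    if j < i:
        return 0.0
    k = j - i
    return (lam * t) ** k * math.exp(-lam * t) / math.factorial(k)

lam, t = 2.0, 0.5
row = [poisson_pij(lam, t, 0, j) for j in range(30)]
print(sum(row))  # a row of P(t) sums to (essentially) 1
```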

Page 22:

Since e^(-λδ) = 1 - λδ + (λδ)^2/2! - . . . ≈ 1 - λδ for a very small interval δ , and λδ e^(-λδ) ≈ λδ :

P(δ) ≈
[ 1-λδ    λδ      0    . . . ]
[  0     1-λδ    λδ    . . . ]
[  0      0     1-λδ   . . . ]
[ . . .                      ]

(and P(δ) → I as δ → 0)

The process can only transition from j to j+1 or remain in j , because δ is too small for 2 transitions.

Page 23: State Occupancy Times

Ti ≜ time spent in state i.

From the Markov property, P[ Ti > t+s | Ti > s ] = P[ Ti > t ] , i.e. Ti is memoryless
⇒ Ti ~ exponential(νi)   [ since the exponential r.v. is the only memoryless r.v. ] :
P[ Ti > t ] = e^(-νi t)

Mean state occupancy time: E[ Ti ] = 1/νi

Note that νi may be different for different i.

Page 24: Embedded Markov Chains

Consider a continuous-time Markov chain with state occupancy times Ti , P[ Ti > t ] = e^(-νi t).

The corresponding embedded Markov chain is a discrete-time MC with the same states as the original MC. Each time state i is entered, a Ti ~ exponential(νi) is chosen. After Ti has elapsed, a new state is transitioned to with probability qij , which depends on the original MC as:

qij = pij / Σ (k≠i) pik   if j ≠ i
    = 0                   else

This is very useful in generating Markov chains in simulations.
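The simulation recipe the slide describes — hold in state i for an exponential(νi) time, then jump according to qij — can be sketched as follows; the two-state rates and jump matrix are made-up values for illustration:

```python
import random

random.seed(0)

# Hypothetical 2-state continuous-time chain: leave rates nu[i] and
# embedded-chain transition probabilities q[i][j] (with q[i][i] = 0).
nu = [1.0, 3.0]
q = [[0.0, 1.0],
     [1.0, 0.0]]

def simulate(t_end):
    """Generate one path: hold in state i for T_i ~ exponential(nu[i]),
    then jump to the next state drawn from the row q[i]."""
    t, state = 0.0, 0
    path = [(t, state)]
    while t < t_end:
        t += random.expovariate(nu[state])   # occupancy time T_i
        r, acc = random.random(), 0.0
        for j, qij in enumerate(q[state]):   # sample next state from q_ij
            acc += qij
            if r <= acc:
                state = j
                break
        path.append((t, state))
    return path

path = simulate(10.0)
print(len(path), path[0])
```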

Page 25: Transition Rates

P[ Ti > δ ] = e^(-νi δ) = 1 - νi δ + (νi δ)^2/2! - . . . ≈ 1 - νi δ if δ is small

pii(δ) = P[ Ti > δ ] + o(δ) = 1 - νi δ + o(δ)

or 1 - pii(δ) = νi δ + o(δ) :
νi ≜ the rate at which the process leaves state i  ( pii gives the rate of staying, 1 - pii the rate of leaving ).

For j ≠ i , let the process enter j upon leaving i with probability qij ( Σ (j≠i) qij = 1 ). Then

pij(δ) = [ 1 - pii(δ) ] qij + o(δ) = νi qij δ + o(δ)

and define γij ≜ νi qij = the rate at which the process enters state j from i.

Page 26: State Probabilities

pj(t) ≜ P[ X(t) = j ]

pj(t+δ) = Σi P[ X(t+δ) = j | X(t) = i ] P[ X(t) = i ] = Σi pij(δ) pi(t)
= Σ (i≠j) pij(δ) pi(t) + pjj(δ) pj(t)

Subtract pj(t) on both sides:

pj(t+δ) - pj(t) = Σ (i≠j) pij(δ) pi(t) - [ 1 - pjj(δ) ] pj(t)

Since lim (δ→0) o(δ)/δ = 0 by definition,

lim (δ→0) pij(δ)/δ = γij  ( i ≠ j )  and  lim (δ→0) [ 1 - pjj(δ) ]/δ = νj

Page 27:

Divide by δ and take the limit δ → 0 :

lim (δ→0) [ pj(t+δ) - pj(t) ] / δ = d pj(t)/dt ,

so

d/dt pj(t) = Σ (i≠j) γij pi(t) - νj pj(t) ,  j = 0, 1, 2, . . .

This is a system of Chapman – Kolmogorov equations. These are solved for each pj(t) using the initial PMF p(0) = [ p0(0) p1(0) p2(0) . . . ].

Note: If we start with pi(0) = 1 and pj(0) = 0 ∀ j ≠ i , then pj(t) = pij(t) , so the C-K equations can be used to find the TPM P(t).
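The C-K system d/dt pj(t) = Σ γij pi(t) - νj pj(t) is an ordinary differential equation that can be integrated numerically; below is a minimal Euler sketch on a made-up two-state chain (rates a, b are my choice), checked against the known equilibrium:

```python
import numpy as np

# Two-state chain with rates gamma_01 = a, gamma_10 = b (so nu_0 = a, nu_1 = b).
a, b = 2.0, 1.0
Q = np.array([[-a,  a],
              [ b, -b]])   # off-diagonal gamma_ij, diagonal -nu_i: dp/dt = p Q

# Euler-integrate the C-K equations from p(0) = [1, 0]
p = np.array([1.0, 0.0])
dt, t_end = 1e-4, 10.0
for _ in range(int(t_end / dt)):
    p = p + dt * (p @ Q)

print(p)  # should be near the equilibrium [b/(a+b), a/(a+b)]
```

Because each row of Q sums to 0, the Euler step conserves Σ pj(t) = 1 exactly, and by t = 10 the PMF has settled to [1/3, 2/3].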

Page 28: Steady State Probabilities

If pj(t) → pj ∀ j as t → ∞ , the system reaches equilibrium (SS). Then d pj(t)/dt → 0 and

νj pj = Σ (i≠j) γij pi   — the Global Balance Equation (GBE)

but νj = Σ (i≠j) γji = the rate at which j is left , so

Σ (i≠j) γji pj = Σ (i≠j) γij pi

Solve these equations ∀ j , together with Σj pj = 1 , to obtain the pj 's — the equilibrium PMF. The GBE states that, at equilibrium , the rate of probability flow out of j (LHS) = the rate of probability flow in to j (RHS).

Then, if p(0) = [ p0 p1 . . . ] , pj(t) = pj ∀ t ⇒ the process is stationary.

Page 29:

Example: M/M/1 queue ( Poisson arrivals / exponential service times / 1 server )
arrival rate = λ , service rate = μ
γi,i+1 = λ , i = 0, 1, 2, . . .  ( i customers → i+1 customers )
γi,i-1 = μ , i = 1, 2, 3, . . .  ( i customers → i-1 customers )

[State diagram: 0 ↔ 1 ↔ 2 ↔ . . . ↔ j ↔ j+1 ↔ . . . with rate λ forward and μ backward]

GBE ( rate of probability flow out = rate of probability flow in ):
1) for j = 0 :  λ p0 = μ p1
2) for j = 1, 2, . . . :  (λ + μ) pj = λ pj-1 + μ pj+1

3) From 2), λ pj - μ pj+1 = λ pj-1 - μ pj , i.e. λ pj - μ pj+1 is constant ∀ j ≥ 1 ; by 1) the constant is 0 , so
pj+1 = (λ/μ) pj ∀ j ≥ 0 ⇒ pj = (λ/μ)^j p0

With ρ ≜ λ/μ :
Σj pj = p0 ( 1 + ρ + ρ^2 + . . . ) = p0 / (1-ρ) = 1 if ρ < 1 (the series converges)
⇒ p0 = 1 - ρ and pj = (1-ρ) ρ^j , j = 0, 1, 2, . . .

For steady state to exist: ρ = λ/μ < 1.
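A quick check (my own, with arbitrary rates λ < μ) that the geometric PMF pj = (1-ρ) ρ^j really satisfies both balance equations:

```python
lam, mu = 1.0, 2.0       # arrival and service rates (lam < mu, so rho < 1)
rho = lam / mu
p = [(1 - rho) * rho**j for j in range(200)]

# GBE for j = 0: lam*p0 = mu*p1
assert abs(lam * p[0] - mu * p[1]) < 1e-12
# GBE for j >= 1: (lam+mu)*pj = lam*p(j-1) + mu*p(j+1)
for j in range(1, 100):
    assert abs((lam + mu) * p[j] - (lam * p[j - 1] + mu * p[j + 1])) < 1e-12

print(sum(p))  # close to 1 (truncated geometric series)
```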

Page 30:

Ex: Birth-Death processes

[State diagram: 0 ↔ 1 ↔ 2 ↔ . . . ↔ j ↔ j+1 ↔ . . . with birth rates λ0, λ1, λ2, . . . forward and death rates μ1, μ2, μ3, . . . backward]

λj = birth rate at state j , μj = death rate at state j

Since from j the process can transition only to j-1 or j+1 , the GBE gives:
λ0 p0 = μ1 p1 , and
( λj + μj ) pj = λj-1 pj-1 + μj+1 pj+1 , j = 1, 2, . . .

As before, λj pj - μj+1 pj+1 is constant , and when j = 0 , λ0 p0 - μ1 p1 = 0 ⇒ constant = 0 :

pj+1 = ( λj / μj+1 ) pj , j = 0, 1, 2, . . .

Page 31:

Define rj ≜ λj / μj+1 and R0 ≜ 1 , Rj ≜ r0 r1 . . . rj-1 for j ≥ 1. Then

pj = rj-1 pj-1 = rj-1 rj-2 . . . r1 r0 p0 = Rj p0

Now Σj pj = p0 Σ (k=0 to ∞) Rk = 1

⇒ p0 = 1 / ( Σk Rk ) if the series converges , and pj = Rj / ( Σk Rk ).

If Σk Rk does not converge, a stationary PMF does not exist.
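As an illustration (my example, not from the slides), take state-dependent rates λj = λ and μj = jμ, i.e. an infinite-server-style queue. Then Rj = (λ/μ)^j / j!, the series converges for every λ, μ > 0, and pj comes out as the Poisson(λ/μ) PMF:

```python
import math

lam, mu = 2.0, 1.0   # birth rate lam_j = lam; death rate mu_j = j*mu

def R(j):
    """R_j = prod_{i=0}^{j-1} lam_i / mu_(i+1) = (lam/mu)^j / j! here."""
    r = 1.0
    for i in range(j):
        r *= lam / ((i + 1) * mu)
    return r

n_max = 60  # the tail beyond this is negligible
p0 = 1.0 / sum(R(k) for k in range(n_max))   # normalization
p = [R(j) * p0 for j in range(n_max)]

print(p[0], math.exp(-lam / mu))  # p0 matches e^(-lam/mu), the Poisson PMF at 0
```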

Page 32:

Theorem: Given a CT MC X(t) with associated embedded MC [ qij ] with SS PMF πj , if [ qij ] is irreducible and positive recurrent , the long-term fraction of time spent by X(t) in state i is

pi = ( πi / νi ) / ( Σj πj / νj )

which is also the unique solution to the GBE's.
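The formula weights each embedded-chain probability πi by the mean holding time 1/νi and renormalizes. A tiny check on the made-up two-state chain used earlier (embedded chain flips deterministically, so π = [1/2, 1/2]):

```python
# Hypothetical two-state CT chain: leave rates nu; the embedded chain
# alternates 0 -> 1 -> 0, so its SS PMF is pi = [1/2, 1/2].
nu = [1.0, 3.0]
pi = [0.5, 0.5]

w = [pi[i] / nu[i] for i in range(2)]      # pi_i weighted by mean holding time
p = [wi / sum(w) for wi in w]              # long-term fraction of time in each state

print(p)  # state 0 holds 3x longer on average, so it gets 3/4 of the time
```

The result p = [0.75, 0.25] also satisfies the GBE ν0 p0 = γ10 p1, i.e. 1 · 0.75 = 3 · 0.25.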
