Page 1: Markov Chains

Markov Models

Prof C S P Rao
Dept. of Mechanical Engg

N I T, Warangal

Page 2: Markov Chains

Who was Markov?

Andrei A. Markov graduated from Saint Petersburg University in 1878 and subsequently became a professor there.

His early work dealt mainly with number theory and analysis.

Markov is particularly remembered for his study of Markov chains.

Page 3: Markov Chains

Markov Process

A Markov process is a stochastic process in which the distribution of future states depends only on the present state, not on how the process arrived at the present state.

A Markov chain is a discrete-time stochastic process with the Markov property.

Page 4: Markov Chains

Terminology

A Markov chain is irreducible if every state can be reached from every other state.

Each state of a Markov chain is either transient or recurrent.

A state is called absorbing if, once the chain reaches that state, it stays there forever.

A Markov chain is acyclic if, once it leaves any state, it never returns to that state.

Page 5: Markov Chains

Markov Property

Many systems have the property that, given the present state, the past states have no influence on the future. This property is called the Markov property.

A Markov process is a stochastic process that satisfies the Markov property.

Page 6: Markov Chains

Markov Chain

Let { X_t : t ∈ T } be a stochastic process with discrete state space S and discrete time set T. The process is called a Markov chain if

P( X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, · · · , X_0 = i_0 ) = P( X_{n+1} = j | X_n = i )

for any set of states i_0, i_1, · · · , i_{n-1}, i, j in S and any n ≥ 0.

Page 7: Markov Chains

Markov Model

The term Markov model sometimes refers only to Markov chains with stationary transition probabilities, although some authors avoid this usage to prevent confusion.

Markov model is also used to refer to any Markov process that satisfies the Markov property.

Page 8: Markov Chains

Components of Stochastic Processes

The state space of a stochastic process is the set of all values that the X_t's can take. (We will be concerned with stochastic processes with a finite number of states.)

Time: t = 0, 1, 2, . . .

State: an m-dimensional vector, s = (s_1, s_2, . . . , s_m) or (s_0, s_1, . . . , s_{m-1})

The sequence X_t: each X_t takes one of the m values, so X_t ∈ S.

Page 9: Markov Chains

Markov Chain Definition

A stochastic process { X_t } is called a Markov chain if

Pr{ X_{t+1} = j | X_0 = k_0, . . . , X_{t-1} = k_{t-1}, X_t = i } = Pr{ X_{t+1} = j | X_t = i }   (transition probabilities)

for every i, j, k_0, . . . , k_{t-1} and for every t.

The future behavior of the system depends only on the current state i and not on any of the previous states.

Page 10: Markov Chains


Discrete-Time Markov Chains

Page 11: Markov Chains

Discrete-Time Markov Chain

A stochastic process { X_n }, where n ∈ N = { 0, 1, 2, . . . }, is called a discrete-time Markov chain if

Pr{ X_{n+1} = j | X_0 = k_0, . . . , X_{n-1} = k_{n-1}, X_n = i } = Pr{ X_{n+1} = j | X_n = i }   (transition probabilities)

for every i, j, k_0, . . . , k_{n-1} and for every n.

The future behavior of the system depends only on the current state i and not on any of the previous states.

Page 12: Markov Chains

Stationary Transition Probabilities

Pr{ X_{n+1} = j | X_n = i } = Pr{ X_1 = j | X_0 = i } for all n

(The transition probabilities don't change over time.) We will only consider stationary Markov chains.

The one-step transition matrix for a Markov chain with states S = { 0, 1, 2 } is

    | p_00  p_01  p_02 |
P = | p_10  p_11  p_12 |
    | p_20  p_21  p_22 |

where p_ij = Pr{ X_1 = j | X_0 = i }.
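To make the idea concrete, here is a minimal Python sketch of simulating such a chain (the matrix values below are made up for illustration; any row-stochastic matrix works):

```python
import numpy as np

# Hypothetical one-step transition matrix for S = {0, 1, 2};
# the entries are illustrative, and each row must sum to 1.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

def simulate(P, start, n_steps, seed=0):
    """Sample a path X_0, X_1, ..., X_n of a stationary DTMC."""
    rng = np.random.default_rng(seed)
    path = [start]
    for _ in range(n_steps):
        # The next state is drawn from row P[current]; only the
        # current state matters (Markov property).
        path.append(int(rng.choice(len(P), p=P[path[-1]])))
    return path

print(simulate(P, start=0, n_steps=10))
```

Because the chain is stationary, the same matrix P is used at every step.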

Page 13: Markov Chains

Example

Consider a manufacturing facility comprising a single machine that is prone to failures.

At any instant of time the machine is either working properly (state '0') or undergoing repair (state '1').

Consider that the state of the machine is examined once every hour.

Let 'a' be the probability that the machine fails in a given hour, and 'b' the probability that the failed machine gets repaired in a given hour.

Page 14: Markov Chains

More specifically, 'a' is the probability that the machine is in the failed condition at the next observation given that it is working at the current observation.

Similarly, 'b' is the probability that the machine gets repaired by the next observation given that it is in the failed state at the current observation.

Assume that these probabilities remain the same regardless of the history of the working of the machine.

We also assume that these probabilities are the same at every observation epoch.

Page 15: Markov Chains

Formulation of the system

The above system can be formulated as a homogeneous DTMC {X_n ∈ S : n ∈ N}, where S = {0, 1} and the time instants t_0, t_1, t_2, ... correspond to 0, 1h, 2h, ... respectively.

The TPM is given by

P = | 1-a   a  |        0 ≤ a, b ≤ 1
    |  b   1-b |
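A short sketch of how the TPM is used, with assumed values a = 0.1 and b = 0.6 (illustrative numbers, not from the example): the state distribution after n hours is the initial distribution multiplied by the n-th power of P.

```python
import numpy as np

a, b = 0.1, 0.6   # assumed failure and repair probabilities (illustrative)
P = np.array([[1 - a, a],
              [b, 1 - b]])

# Distribution over {working, under repair} after n hours,
# starting from the working state.
p0 = np.array([1.0, 0.0])
for n in (1, 2, 10, 50):
    print(n, p0 @ np.linalg.matrix_power(P, n))
```

For any 0 < a, b < 1 the printed rows converge to the stationary distribution (b/(a+b), a/(a+b)); for these assumed values, about (0.857, 0.143).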

Page 16: Markov Chains


DTMC models for a single machine system

Page 17: Markov Chains

Example

Consider an NC machine processing parts that may belong to one of three part types.

Assume that raw parts of each type are always available and wait in separate queues.

We model the evolution of the system as a homogeneous Markov chain with state space S = {1, 2, 3}, where each state corresponds to the type of part being processed.

The observation epochs t_0, t_1, t_2, ... are the time instants at which the i-th part (i = 1, 2, 3, ...) is taken up for processing.

Page 18: Markov Chains


An NC machine processing three types of parts

Page 19: Markov Chains

Suppose that the NC machine processes the parts in a cyclic way, i.e. a type 1 part followed by a type 2 part followed by a type 3 part, with the same sequence repeated again and again.

The state transition diagram is shown below. The TPM of the system is

    | 0  1  0 |
P = | 0  0  1 |
    | 1  0  0 |

Page 20: Markov Chains

Suppose instead that the type of the next part to be processed is selected probabilistically: part type i is chosen next with probability q_i (i = 1, 2, 3), where 0 < q_i < 1 and q_1 + q_2 + q_3 = 1.

In this case, the state transition diagram and TPM are as shown below.

    | q_1  q_2  q_3 |
P = | q_1  q_2  q_3 |
    | q_1  q_2  q_3 |

Page 21: Markov Chains

Example 3

Consider the queuing network shown in the figure below.

Closed central server queuing network model

After transportation by the AGV, the part leaves the system with probability q_0 or undergoes one more operation on one of the machines with probabilities q_1, q_2, ..., q_m.

Page 22: Markov Chains

Let there be a single job in this system, and let us look at the progress of this job through the system.

It is easy to note that the job can be in (m+1) states:

0 (if being served by the AGV)
i (if being served by machine M_i, i = 1, 2, ..., m)

The flow of the job can be described by a DTMC model. The TPM of this model is

    | q_0  q_1  q_2  ...  q_m |
    |  1    0    0   ...   0  |
P = |  1    0    0   ...   0  |
    |  .    .    .   ...   .  |
    |  1    0    0   ...   0  |

Page 23: Markov Chains


DTMC model for central server network

Page 24: Markov Chains

An Example Problem -- Gambler's Ruin

At time zero, I have X_0 = $2, and each day I make a $1 bet. I win with probability p and lose with probability 1 - p. I'll quit if I ever obtain $4 or if I lose all my money.

X_t = amount of money I have after the bet on day t.

The possible values of X_t form S = { 0, 1, 2, 3, 4 }.

So, X_1 = 3 with probability p, or 1 with probability 1 - p.

If X_t = 4, then X_{t+1} = X_{t+2} = · · · = 4.
If X_t = 0, then X_{t+1} = X_{t+2} = · · · = 0.

Page 25: Markov Chains

Property of Transition Matrix

If the state space is S = { 0, 1, . . . , m-1 }, then we have

Σ_j p_ij = 1 for all i    (we must go somewhere)
p_ij ≥ 0 for all i, j     (each transition has probability ≥ 0)

The stationarity property assumes that these values do not change with time.

Page 26: Markov Chains

Transition Matrix of Gambler's Ruin Problem

At time zero I have X_0 = $2, and each day I make a $1 bet. I win with probability p and lose with probability 1 - p. I'll quit if I ever obtain $4 or if I lose all my money.

X_t = amount of money I have after the bet on day t.

         0     1     2     3     4
    0    1     0     0     0     0
    1   1-p    0     p     0     0
P = 2    0    1-p    0     p     0
    3    0     0    1-p    0     p
    4    0     0     0     0     1
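The absorption behavior can be checked by simulation before any matrix analysis; a small Monte Carlo sketch (parameter values are illustrative):

```python
import numpy as np

def ruin_or_riches(p=0.5, start=2, target=4, n_runs=100_000, seed=0):
    """Estimate Pr{reach `target` before going broke} by simulation."""
    rng = np.random.default_rng(seed)
    wins = 0
    for _ in range(n_runs):
        x = start
        while 0 < x < target:              # states 0 and target are absorbing
            x += 1 if rng.random() < p else -1
        wins += (x == target)
    return wins / n_runs

# For a fair bet (p = 0.5) starting at $2, the exact answer is 2/4 = 0.5.
print(ruin_or_riches())
```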

Page 27: Markov Chains

State Transition Diagram

A state transition diagram has a node for each state, and an arc from node i to node j if p_ij > 0.

(The diagram for the Gambler's Ruin problem shows nodes 0, 1, 2, 3, 4, with arcs labeled p for winning bets, 1 - p for losing bets, and self-loops of 1 at nodes 0 and 4.)

Notice that nodes 0 and 4 are "trapping" (absorbing) nodes.

Page 28: Markov Chains

Printer Repair Problem

• Two printers are used for Russ Center.

• When both are working in the morning, there is a 30% chance that one will fail by evening and a 10% chance that both will fail.

• If only one printer is serving at the beginning of the day, there is a 20% chance that it will fail by the close of business.

• If neither is working in the morning, the office sends all work to a printing service.

• If a printer fails during the day, it can be repaired overnight and returned early the next day.

Page 29: Markov Chains

States for Printer Repair Example

Index  s         State definition
0      s0 = (0)  No printers have failed. The office starts the day with both printers functioning properly.
1      s1 = (1)  One printer has failed. The office starts the day with one working printer and the other in the shop until the next morning.
2      s2 = (2)  Both printers have failed. All work must be sent out for the day.

Page 30: Markov Chains

Events and Probabilities for Printer Repair Example

Index  Current state  Event                                             Probability  Next state
0      s0 = (0)       Neither printer fails.                            0.6          s = (0)
                      One printer fails.                                0.3          s = (1)
                      Both printers fail.                               0.1          s = (2)
1      s1 = (1)       The printer does not fail and the other returns.  0.8          s = (0)
                      The printer fails and the other returns.          0.2          s = (1)
2      s2 = (2)       Both printers are returned.                       1.0          s = (0)

Page 31: Markov Chains

State-Transition Matrix and Network

The major properties of a Markov chain can be described by the m × m matrix P = (p_ij).

For the printer repair example:

    | 0.6  0.3  0.1 |
P = | 0.8  0.2   0  |
    |  1    0    0  |

State-Transition Network: a node for each state, and an arc from node i to node j if p_ij > 0.

(The network for the printer repair example shows nodes 0, 1, 2 with arcs labeled (0.6), (0.3), (0.1), (0.8), (0.2) and (1).)

Page 32: Markov Chains

Market Share/Brand Switching Problem

You are given the original market for three companies. The following table gives the number of consumers who switched from brand i to brand j in two consecutive weeks:

        Brand (j)
(i)       1     2     3   Total
1        90     7     3     100
2         5   205    40     250
3        30    18   102     150
Total   125   230   145     500

How do we model the problem as a stochastic process?

Page 33: Markov Chains

Empirical Transition Probabilities for Brand Switching, p_ij

      Brand (j)
(i)    1               2                3
1      90/100 = 0.90   7/100 = 0.07     3/100 = 0.03
2      5/250 = 0.02    205/250 = 0.82   40/250 = 0.16
3      30/150 = 0.20   18/150 = 0.12    102/150 = 0.68
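The row normalization that produces this table is a one-liner; a sketch reproducing it from the count table on the previous slide:

```python
import numpy as np

# Switching counts (row i = from brand, column j = to brand).
counts = np.array([[90,   7,   3],
                   [ 5, 205,  40],
                   [30,  18, 102]])

# Each row of the empirical transition matrix is that row of counts
# divided by its row total (100, 250, 150).
P = counts / counts.sum(axis=1, keepdims=True)
print(P.round(2))
```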

Page 34: Markov Chains

Assumptions Revisited

• Markov Property

Pr{ X_{t+1} = j | X_0 = k_0, . . . , X_{t-1} = k_{t-1}, X_t = i } = Pr{ X_{t+1} = j | X_t = i }   (transition probabilities)

for every i, j, k_0, . . . , k_{t-1} and for every t.

• Stationarity Property

Pr{ X_{t+1} = j | X_t = i } = Pr{ X_1 = j | X_0 = i } for all t

(The transition probabilities don't change over time.) We will only consider stationary Markov chains.

Page 35: Markov Chains

Transforming a Process into a Markov Chain

Sometimes a non-Markovian stochastic process can be transformed into a Markov chain by expanding the state space.

Example: Suppose that the chance of rain tomorrow depends on the weather conditions for the previous two days (yesterday and today). Specifically,

P{ rain tomorrow | rain last 2 days (RR) }             = .7
P{ rain tomorrow | rain today but not yesterday (NR) } = .5
P{ rain tomorrow | rain yesterday but not today (RN) } = .4
P{ rain tomorrow | no rain in last 2 days (NN) }       = .2

Does the Markov property hold?

Page 36: Markov Chains

The Weather Prediction Problem

How do we model this problem as a Markov process?

The state space: 0 (RR), 1 (NR), 2 (RN), 3 (NN)

The transition matrix:

             0 (RR)  1 (NR)  2 (RN)  3 (NN)
    0 (RR)    .7       0      .3       0
P = 1 (NR)    .5       0      .5       0
    2 (RN)     0      .4       0      .6
    3 (NN)     0      .2       0      .8

This is a discrete-time Markov process.
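As a numerical check of the expanded model, the following sketch solves π P = π (steady-state probabilities are developed in detail a few slides later) to find the long-run fraction of days spent in each state, and hence the long-run fraction of rainy days:

```python
import numpy as np

# Transition matrix in the order RR, NR, RN, NN, from the slide above.
P = np.array([[0.7, 0.0, 0.3, 0.0],
              [0.5, 0.0, 0.5, 0.0],
              [0.0, 0.4, 0.0, 0.6],
              [0.0, 0.2, 0.0, 0.8]])

# Solve pi P = pi, replacing one redundant balance equation
# with the normalization sum(pi) = 1.
A = P.T - np.eye(4)
A[-1, :] = 1.0
pi = np.linalg.solve(A, np.array([0.0, 0.0, 0.0, 1.0]))

# It is raining "today" in states RR and NR.
print(pi, "long-run fraction of rainy days:", pi[0] + pi[1])
```

For this matrix the stationary distribution works out to (0.25, 0.15, 0.15, 0.45), so it rains on about 40% of days in the long run.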

Page 37: Markov Chains

Choosing Balls from an Urn

An urn contains two unpainted balls at present. We choose a ball at random and flip a coin.

If the chosen ball is unpainted and the coin comes up heads, we paint the chosen unpainted ball red.

If the chosen ball is unpainted and the coin comes up tails, we paint the chosen unpainted ball blue.

If the ball has already been painted, then (whether heads or tails has been tossed) we change the color of the ball (from red to blue or from blue to red).

Model this problem as a discrete-time Markov chain (represent it using a state diagram and transition matrix).

Page 38: Markov Chains

Insurance Company Example

An insurance company charges customers annual premiums based on their accident history in the following fashion:

No accident in last 2 years:        $250 annual premium
Accidents in each of last 2 years:  $800 annual premium
Accident in only 1 of last 2 years: $400 annual premium

Historical statistics:

1. If a customer had an accident last year, then they have a 10% chance of having one this year;

2. If they had no accident last year, then they have a 3% chance of having one this year.

Page 39: Markov Chains

Find the steady-state probabilities and the long-run average annual premium paid by a customer.

Solution approach: Construct a Markov chain with four states: (N, N), (N, Y), (Y, N), (Y, Y), where these indicate (accident last year, accident this year).

           (N, N)  (N, Y)  (Y, N)  (Y, Y)
    (N, N)   .97     .03     0       0
P = (N, Y)    0       0     .90     .10
    (Y, N)   .97     .03     0       0
    (Y, Y)    0       0     .90     .10

Page 40: Markov Chains

State-Transition Network for Insurance Company

(The network shows the four states (N,N), (N,Y), (Y,N), (Y,Y), with arcs labeled .97, .03, .90 and .10 as in the matrix above.)

This is an ergodic Markov chain:

All states communicate (irreducible);
Each state is recurrent (you will return, eventually);
Each state is aperiodic.

Page 41: Markov Chains

Solving the steady-state equations:

π_(N,N) = 0.97 π_(N,N) + 0.97 π_(Y,N)
π_(N,Y) = 0.03 π_(N,N) + 0.03 π_(Y,N)
π_(Y,N) = 0.90 π_(N,Y) + 0.90 π_(Y,Y)
π_(N,N) + π_(N,Y) + π_(Y,N) + π_(Y,Y) = 1

Solution:

π_(N,N) = 0.939, π_(N,Y) = 0.029, π_(Y,N) = 0.029, π_(Y,Y) = 0.003

and the long-run average annual premium is

0.939 × 250 + 0.029 × 400 + 0.029 × 400 + 0.003 × 800 ≈ $260.5
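The same equations can be solved mechanically with numpy; a sketch, replacing one redundant balance equation with the normalization condition:

```python
import numpy as np

# Transition matrix in the order (N,N), (N,Y), (Y,N), (Y,Y).
P = np.array([[0.97, 0.03, 0.00, 0.00],
              [0.00, 0.00, 0.90, 0.10],
              [0.97, 0.03, 0.00, 0.00],
              [0.00, 0.00, 0.90, 0.10]])

# pi P = pi together with sum(pi) = 1.
A = P.T - np.eye(4)
A[-1, :] = 1.0
pi = np.linalg.solve(A, np.array([0.0, 0.0, 0.0, 1.0]))

premiums = np.array([250, 400, 400, 800])
print(pi.round(3), "long-run average premium:", round(pi @ premiums, 2))
```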

Page 42: Markov Chains

First Passage Times

Let μ_ij = expected number of steps to go from state i to state j.

If the probability that we will eventually visit state j, given that we start in i, is less than one, then μ_ij = +∞.

For example, in the Gambler's Ruin problem, μ_20 = +∞ because there is a positive probability that we will be absorbed in state 4, given that we start in state 2 (and hence never visit state 0).

Page 43: Markov Chains

Computations for All States Recurrent

If the probability of eventually visiting state j, given that we start in i, is 1, then the expected number of steps until we first visit j is given by

μ_ij = 1 + Σ_{r≠j} p_ir μ_rj,   for i = 0, 1, . . . , m-1

(It always takes at least one step; we go from i to r in the first step with probability p_ir, and it then takes μ_rj steps to get from r to j.)

For fixed j, this is a linear system of m equations in the m unknowns μ_ij, i = 0, 1, . . . , m-1.

Page 44: Markov Chains

First-Passage Analysis for Insurance Company

Suppose that we start in state (N,N) and want to find the expected number of years until we have accidents in two consecutive years, i.e. until we reach (Y,Y). This transition will occur with probability 1, eventually.

For convenience, number the states

0 = (N,N),  1 = (N,Y),  2 = (Y,N),  3 = (Y,Y)

Then

μ_03 = 1 + p_00 μ_03 + p_01 μ_13 + p_02 μ_23
μ_13 = 1 + p_10 μ_03 + p_11 μ_13 + p_12 μ_23
μ_23 = 1 + p_20 μ_03 + p_21 μ_13 + p_22 μ_23

Page 45: Markov Chains

Using

           (N, N)  (N, Y)  (Y, N)  (Y, Y)
    (N, N)   .97     .03     0       0
P = (N, Y)    0       0     .90     .10
    (Y, N)   .97     .03     0       0
    (Y, Y)    0       0     .90     .10

the equations become

μ_03 = 1 + 0.97 μ_03 + 0.03 μ_13
μ_13 = 1 + 0.90 μ_23
μ_23 = 1 + 0.97 μ_03 + 0.03 μ_13

Solution: μ_03 = 343.3, μ_13 = 310, μ_23 = 343.3

So, on average it takes 343.3 years to go from (N,N) to (Y,Y).
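A sketch verifying these first-passage values: deleting the target state's row and column from P leaves a matrix Q, and the three equations above rearrange to the linear system (I - Q) μ = 1.

```python
import numpy as np

P = np.array([[0.97, 0.03, 0.00, 0.00],
              [0.00, 0.00, 0.90, 0.10],
              [0.97, 0.03, 0.00, 0.00],
              [0.00, 0.00, 0.90, 0.10]])

# mu_i3 = 1 + sum_{r != 3} p_ir * mu_r3 for i = 0, 1, 2
# rearranges to (I - Q) mu = 1 with Q = P without state 3's row/column.
Q = P[:3, :3]
mu = np.linalg.solve(np.eye(3) - Q, np.ones(3))
print(mu)   # [mu_03, mu_13, mu_23] ~ [343.3, 310.0, 343.3]
```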

Page 46: Markov Chains

Continuous Time Markov Chains

Page 47: Markov Chains

Continuous Time Markov Chains

A stochastic process is a sequence of random variables indexed by an ordered set T. Generally, T records time.

A discrete-time stochastic process is a sequence of random variables X_0, X_1, X_2, . . . , typically denoted by { X_n }, where the index n takes countable discrete values.

Discrete-time Markov chains:
1) discrete: we do not care how long each step takes;
2) Markov property: what happens next depends only on the current state, not on how you got there.

Page 48: Markov Chains

Continuous Time Markov Chain

A continuous-time Markov chain is a family of random variables X_t, t ≥ 0, satisfying the following Markov property: at time s, the future behavior of the system t time units later, X_{t+s}, depends only on the current state X_s and not on the past X_u, 0 ≤ u < s:

Pr{ X_{t+s} = j | X_u, 0 ≤ u ≤ s } = Pr{ X_{t+s} = j | X_s }

Stationarity property:

Pr{ X_{t+s} = j | X_s } = Pr{ X_t = j | X_0 }

Page 49: Markov Chains

Property of Continuous Time Markov Chain

Let the state space be S = { 0, 1, . . . , m-1 }. Suppose we start in state i and have been in state i for about s minutes. What is the probability that a transition will not occur in the next t time units?

Look at the inter-transition (holding) time T_i. According to the Markov property,

Pr{ T_i > t + s | T_i > s } = Pr{ T_i > t }

There is only one continuous random variable that satisfies this memoryless property: the exponential.

Page 50: Markov Chains

Continuous Time Markov Chain Examples

The Poisson process
The birth/death processes in general
The M/M/s queue
Brownian motion

Page 51: Markov Chains

An ATM Example (M/M/1/5 Queue)

Consider an ATM located in the foyer of a bank. Only one person can use the machine, so a queue forms when two or more customers are present.

The foyer is limited in size and can hold only five people; arriving customers balk when the foyer is full.

Statistics indicate that the average time between arrivals is 30 seconds (2 customers per minute), whereas the service time averages 24 seconds (2.5 customers per minute). Both times follow exponential distributions.

Try to help the manager of the bank answer the following questions.

Page 52: Markov Chains

An ATM Example (M/M/1/5 Queue)

Manager's questions:

a) What proportion of time is the ATM idle?
b) What is the efficiency of the ATM?
c) What is the throughput rate of the system?

Customer's questions:

d) What proportion of customers obtain immediate service?
e) What proportion of customers arrive to find the system full?
f) What is the average time in the system?

Page 53: Markov Chains

CTMC Model for the ATM Example

What is the state space of the system?

The number of customers in the system: {0}, {1}, {2}, {3}, {4}, {5}

For a DTMC we use the transition matrix P (by the way, can you model this problem as a DTMC?). For a CTMC we use the rate matrix R.

(The state-transition network shows states 0 through 5, with arrival rate λ = 2 from each state to the next and service rate μ = 2.5 back down.)

Page 54: Markov Chains

Steady State Probability

We will investigate steady-state (not transient) results for CTMCs based on the flow balance principle (rate in = rate out).

Let π_n = steady-state probability of being in state n.

What is the long-term steady-state probability?

π_0 + π_1 + π_2 + π_3 + π_4 + π_5 = 1

Page 55: Markov Chains

Balance Equations

Flow into 0:   μ π_1 = λ π_0                 (flow out of 0)
Flow into 1:   λ π_0 + μ π_2 = (λ + μ) π_1   (flow out of 1)
Flow into 2:   λ π_1 + μ π_3 = (λ + μ) π_2   (flow out of 2)
   ...
Flow into 5:   λ π_4 = μ π_5                 (flow out of 5)

Page 56: Markov Chains

Rate Matrix & Solution for M/M/1/5

Rate matrix:

States   0     1     2     3     4     5
0        0     2     0     0     0     0
1       2.5    0     2     0     0     0
2        0    2.5    0     2     0     0
3        0     0    2.5    0     2     0
4        0     0     0    2.5    0     2
5        0     0     0     0    2.5    0

Solution:

State 0  State 1  State 2  State 3  State 4  State 5
0.271    0.217    0.173    0.139    0.111    0.0888
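A sketch of how such a solution is computed: build the generator Q from the rate matrix (each diagonal entry set so the row sums to zero) and solve the balance equations π Q = 0 together with Σ_n π_n = 1.

```python
import numpy as np

lam, mu = 2.0, 2.5   # arrivals and services per minute, from the example
n = 6                # states 0..5 customers in the system

# Off-diagonal transition rates of the M/M/1/5 birth-death chain.
R = np.zeros((n, n))
for i in range(n - 1):
    R[i, i + 1] = lam   # an arrival moves the system up one state
    R[i + 1, i] = mu    # a service completion moves it down one state

# Generator: rows sum to zero. "Rate in = rate out" is pi Q = 0.
Q = R - np.diag(R.sum(axis=1))
A = Q.T.copy()
A[-1, :] = 1.0          # replace one redundant equation by sum(pi) = 1
pi = np.linalg.solve(A, np.array([0.0, 0, 0, 0, 0, 1.0]))
print(pi.round(3))      # ~ [0.271, 0.217, 0.173, 0.139, 0.111, 0.089]
```

Changing the downward rates (5 where two ATMs are busy, 7.5 where three are) reproduces the M/M/2/5 and M/M/3/5 solutions given later.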

Page 57: Markov Chains

Solution Analysis

Manager's questions:

a) The proportion of time that the ATM is idle: π_0 = 27%
b) The efficiency of the ATM: 1 - π_0 = 73%
c) The throughput rate of the system: λ(1 - π_5) = 1.822 customers per minute

Customer's questions:

d) The proportion of customers who obtain immediate service: π_0 = 27%
e) The proportion of customers who find the system full: π_5 ≈ 9%
f) The average time in the system: by Little's Law (see queuing theory)

What is the average number of customers in the system?

L = 1π_1 + 2π_2 + 3π_3 + 4π_4 + 5π_5 = 1.868

Page 58: Markov Chains

Additional Questions

What if we add a new ATM machine? How will the system perform? (M/M/2/5)

What if we add a teller, with service times exponentially distributed at 1 minute per customer? This is not a standard queue.

What if we add two new ATM machines? (M/M/3/5)

What if we add more space so that 8 customers can wait? (M/M/1/8)

What if we add more space so that 12 customers can wait? (M/M/1/12)

Page 59: Markov Chains

M/M/2/5

What if we add a new ATM machine? How will the system perform? (M/M/2/5) What will the state-transition network look like?

(The network shows states 0 through 5, with arrival rate λ = 2 throughout, service rate μ = 2.5 out of state 1, and 2μ = 5 out of states 2 through 5.)

Page 60: Markov Chains

Rate Matrix & Solution for M/M/2/5

Rate matrix:

States   0     1     2     3     4     5
0        0     2     0     0     0     0
1       2.5    0     2     0     0     0
2        0     5     0     2     0     0
3        0     0     5     0     2     0
4        0     0     0     5     0     2
5        0     0     0     0     5     0

Solution:

State 0  State 1  State 2  State 3  State 4  State 5
0.431    0.345    0.137    0.055    0.022    0.009

Page 61: Markov Chains

M/M/3/5

What if we add two new ATM machines? How will the system perform? (M/M/3/5) What will the state-transition network look like?

(The network shows states 0 through 5, with arrival rate λ = 2 throughout and service rates μ = 2.5, 2μ = 5, and 3μ = 7.5 out of states 1, 2, and 3 through 5 respectively.)

Page 62: Markov Chains

Rate Matrix & Solution for M/M/3/5

Rate matrix:

States   0     1     2     3     4     5
0        0     2     0     0     0     0
1       2.5    0     2     0     0     0
2        0     5     0     2     0     0
3        0     0    7.5    0     2     0
4        0     0     0    7.5    0     2
5        0     0     0     0    7.5    0

Solution:

State 0  State 1  State 2  State 3  State 4  State 5
0.448    0.359    0.143    0.038    0.010    0.0027

Page 63: Markov Chains

Comparison of Different Alternatives: M/M/1/5, M/M/2/5 and M/M/3/5

System Measure                      1 ATM    2 ATM    3 ATM
Efficiency                          0.729    0.396    0.265
Throughput                          1.822    1.982    1.995
Average # in queue                  1.14     0.1258   0.016
Average time in queue               0.625    0.064    0.008
Proportion of customers who wait    0.64     0.2152   0.0484
Proportion of customers lost        0.0888   0.0088   0.0027

Page 64: Markov Chains

The Addition of Spaces to the Foyer

What if we add more space so that 8 customers can wait? How will the system perform? (M/M/1/8)

What if we add more space so that 12 customers can wait? (M/M/1/12)

(The state-transition networks are the same birth-death chains as before, extended to states 0, 1, 2, 3, ..., 8 and 0, 1, 2, 3, ..., 12.)

Page 65: Markov Chains

Comparison of Different Alternatives: M/M/1/5, M/M/1/8 and M/M/1/12

System Measure                      M/M/1/5   M/M/1/8   M/M/1/12
Efficiency                          0.729     0.769     0.7884
Throughput                          1.822     1.923     1.971
Average # in queue                  1.14      1.84      2.46
Average time in queue               0.625     0.955     1.246
Proportion of customers who wait    0.64      0.73      0.77
Proportion of customers lost        0.0888    0.038     0.015

Page 66: Markov Chains

Adding a Human Server

The manager decides to add a human teller with exponentially distributed service times averaging 1 minute per customer.

However, when a customer enters the system, he/she prefers to go to the human server first if that server is available.

In this case, how will the system perform?

Approach: Notice that you now have to differentiate between the two servers. Let us use (HS, MS, # waiting) to represent the system, and suppose the foyer can hold at most 5 people. The states are

(0,0,0), (0,1,0), (1,0,0), (1,1,0), (1,1,1), (1,1,2), (1,1,3)

Page 67: Markov Chains

Addition of a Human Server

State: (HS, MS, # waiting)

μ_m = 2.5, the ATM service rate
μ_h = 1, the human service rate

(The state-transition network shows the seven states above, with arrivals at rate λ = 2 going to the human server first, service completions at rates μ_h and μ_m, and combined rate μ_m + μ_h out of the states where both servers are busy.)

Page 68: Markov Chains

Addition of a Human Server

Rate matrix (states ordered (0,0,0), (0,1,0), (1,0,0), (1,1,0), (1,1,1), (1,1,2), (1,1,3)):

States  0,0,0  0,1,0  1,0,0  1,1,0  1,1,1  1,1,2  1,1,3
0,0,0     0      0      2      0      0      0      0
0,1,0    2.5     0      0      2      0      0      0
1,0,0     1      0      0      2      0      0      0
1,1,0     0      1     2.5     0      2      0      0
1,1,1     0      0      0     3.5     0      2      0
1,1,2     0      0      0      0     3.5     0      2
1,1,3     0      0      0      0      0     3.5     0

Solution:

States  0,0,0  0,1,0  1,0,0  1,1,0  1,1,1  1,1,2  1,1,3
        0.214  0.046  0.313  0.205  0.117  0.067  0.038

Page 69: Markov Chains

A Queue With Finite Input Sources

A taxi company has a fleet of 6 cabs and a repair shop to handle breakdowns.

Assume that the taxis are identical and that their times to breakdown are exponentially distributed with a rate of 1/3 per month.

The company is thinking of setting up several service bays; the estimated repair time is exponentially distributed with a rate of 4 per month.

Do an analysis to help the company figure out how many service bays to set up.

Page 70: Markov Chains

A Queue With Finite Input Sources

One bay: states 0, 1, 2, ..., 6 (number of broken taxis), with breakdown rates 6(1/3), 5(1/3), 4(1/3), 3(1/3), 2(1/3), 1(1/3) moving up from states 0 through 5, and repair rate 4 moving down from every state.

Two bays: the same breakdown rates, with repair rate 4 from state 1 and 2(4) = 8 from states 2 through 6.

Page 71: Markov Chains

A Queue With Finite Input Sources (1 Bay)

Rate matrix:

States   0     1      2      3      4      5      6
0        0     2      0      0      0      0      0
1        4     0    1.667    0      0      0      0
2        0     4      0    1.333    0      0      0
3        0     0      4      0      1      0      0
4        0     0      0      4      0    0.667    0
5        0     0      0      0      4      0    0.333
6        0     0      0      0      0      4      0

Solution:

States   0      1      2      3      4      5      6
        0.556  0.278  0.116  0.039  0.010  0.002   0

Page 72: Markov Chains

A Queue With Finite Input Sources (2 Bays)

Rate matrix:

States   0     1      2      3      4      5      6
0        0     2      0      0      0      0      0
1        4     0    1.667    0      0      0      0
2        0     8      0    1.333    0      0      0
3        0     0      8      0      1      0      0
4        0     0      0      8      0    0.667    0
5        0     0      0      0      8      0    0.333
6        0     0      0      0      0      8      0

Solution:

States   0      1      2      3      4      5      6
        0.616  0.308  0.064  0.011  0.001   0      0

Page 73: Markov Chains

Economics With These Results

Suppose each taxi on average can bring in revenue of $1200 a day. What would be the expected revenue for each configuration?

If it costs $300 a day to operate a bay, would the second bay be beneficial to the company?

One Bay:  7200 π_0 + 6000 π_1 + 4800 π_2 + 3600 π_3 + 2400 π_4 + 1200 π_5 + 0 π_6 ≈ $6392
Two Bays: 7200 π_0 + 6000 π_1 + 4800 π_2 + 3600 π_3 + 2400 π_4 + 1200 π_5 + 0 π_6 ≈ $6632

The second bay adds about $240 a day in expected revenue, less than its $300 daily cost, so it is not worthwhile.

Page 74: Markov Chains

Probability Transitions: Service with Rework

Consider a machine operation in which there is a 0.4 probability that, on completion, a processed part will not be within tolerance.

If the part is unacceptable, the operation is repeated immediately. This is called rework.

Assume that the second try is always successful.

What will the system look like if
a) arrivals can occur only when the machine is idle?
b) arrivals can occur at any time?

Page 75: Markov Chains

Probability Transitions: Service with Rework

a) Arrivals can occur only when the machine is idle:

(The diagram shows states i (idle), w (working), and rw (rework), with arcs labeled a from i to w, 0.6d_1 from w back to i, 0.4d_1 from w to rw, and 0.6d_2 from rw back to i.)

b) Arrivals can occur at any time:

(The diagram repeats this structure for each number of waiting parts: states (0,i), (n,w), (n,r) for n = 0, 1, 2, ..., with arrival arcs labeled a moving up one level and the same service arcs 0.6d_1, 0.4d_1, 0.6d_2 within each level.)

Page 76: Markov Chains

An ATM with a Human Server

Consider an ATM located together with a human server in the foyer of a bank. The foyer is limited in size, and when there are more than five people, arriving customers balk.

Statistics indicate that the time between arrivals is exponentially distributed with an average of 30 seconds, i.e. 2 customers per minute.

The service time of the human server is exponentially distributed with an average of 1 minute per customer.

The service time of the ATM is exponentially distributed with an average of 24 seconds, i.e. 2.5 customers per minute.

It is further assumed that when a customer enters the system, he/she prefers to go to the human server first if that server is available.

Page 77: Markov Chains

CTMC Model for ATM and Human Server

State: (HS, MS, # waiting)

μ_m = 2.5, the ATM service rate
μ_h = 1, the human service rate

(The state-transition network is the same as before: arrivals at rate λ = 2 go to the human server first; service completions occur at rates μ_h and μ_m, with combined rate μ_m + μ_h out of the states where both servers are busy.)

Page 78: Markov Chains

CTMC Model for ATM and Human Server

Rate matrix (states ordered (0,0,0), (0,1,0), (1,0,0), (1,1,0), (1,1,1), (1,1,2), (1,1,3)):

States  0,0,0  0,1,0  1,0,0  1,1,0  1,1,1  1,1,2  1,1,3
0,0,0     0      0      2      0      0      0      0
0,1,0    2.5     0      0      2      0      0      0
1,0,0     1      0      0      2      0      0      0
1,1,0     0      1     2.5     0      2      0      0
1,1,1     0      0      0     3.5     0      2      0
1,1,2     0      0      0      0     3.5     0      2
1,1,3     0      0      0      0      0     3.5     0

Solution:

States  0,0,0  0,1,0  1,0,0  1,1,0  1,1,1  1,1,2  1,1,3
        0.214  0.046  0.313  0.205  0.117  0.067  0.038

Page 79: Markov Chains

Birth & Death Processes

Pure birth process (e.g., hurricanes): states 0, 1, 2, 3, 4, ... with birth rates a_0, a_1, a_2, ... moving up; no transitions down.

Pure death process (e.g., delivery of a truckload of parcels): states 4, 3, 2, 1, 0 with death rates d_4, d_3, d_2, d_1 moving down; no transitions up.

Birth-death process (M/M/s/k): states 0, 1, 2, 3, 4, ... with birth rates a_0, a_1, a_2, a_3 up and death rates d_1, d_2, d_3, d_4 down. This will be picked up in queuing theory.

Page 80: Markov Chains

Pure Birth Process – Poisson Process

A pure birth process whose birth rate is the same constant λ in every state is a Poisson process: from each state n the only transition is to n + 1, at rate λ.

Poisson process rate matrix:

States   0    1    2    3    ...
0        0    λ    0    0    ...
1        0    0    λ    0    ...
2        0    0    0    λ    ...
3        0    0    0    0    ...
...

Page 81: Markov Chains

Poisson Process

There is NO steady-state probability: the embedded Markov chain is not ergodic, and the number in the system increases with time.

Transient probability: the number of events within time t is a random variable with a Poisson distribution.

A general scheme of this kind is what we call a counting process. (See: exponential random variables and the Poisson process.)

Page 82: Markov Chains

Pure Death Process

(States 4, 3, 2, 1, 0 with death rate d moving down from each state.)

There is NO steady-state probability: the embedded Markov chain is not ergodic, and the number in the system decreases with time.

Transient probability: the number of events within time t is again a random variable with a Poisson distribution.

Page 83: Markov Chains

An Example

Suppose that you arrive at a single-teller bank to find five other customers in the bank, one being served and the other four waiting in line. You join the end of the line. The service times are all exponential with a mean of 5 minutes.

What is the probability that you will be served within 10 minutes?

What is the probability that you will be served within 20 minutes?

What is the expected waiting time before you are served?
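A quick numerical check, assuming scipy is available: by memorylessness the customer currently in service effectively starts over, so your waiting time is the sum of 5 independent exponential service times, i.e. an Erlang (gamma) random variable with shape 5 and mean 25 minutes.

```python
from scipy.stats import gamma

mean_service = 5.0   # minutes per customer
n_ahead = 5          # 1 in service (restarts, by memorylessness) + 4 waiting

# Waiting time ~ Erlang(5) = Gamma(shape=5, scale=5 minutes).
wait = gamma(a=n_ahead, scale=mean_service)
print(wait.cdf(10.0))   # P(served within 10 minutes) ~ 0.053
print(wait.cdf(20.0))   # P(served within 20 minutes) ~ 0.371
print(wait.mean())      # expected wait = 25 minutes
```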

Page 84: Markov Chains

Assumptions Revisited

Markov property: arrival and service times have to be exponentially distributed (exponential inter-arrival times).

Steady-state probability: flow balance (rate in = rate out); solve a set of linear equations.

Are arrival times exponential? Consider a large population of n people, each with a small probability p of entering the store. When n is large and p is small, the exponential is a good approximation.

Page 85: Markov Chains

Assumptions Revisited

Are service times exponential?

Grocery store: the assumption might still be valid.
Haircut: the service time might not be exponential.

What if the times are not exponentially distributed? The Markov property no longer holds, and we might not be able to use the rate in = rate out principle. It becomes much more difficult to get analytical results; a lot of the time, simulation has to be used.

Page 86: Markov Chains

A Machine Repair Example

A factory contains two major machines that fail independently, with exponentially distributed times to failure with a mean of 10 hours.

The repair of a machine takes an average of 8 hours, and the repair time is also exponentially distributed.

Model the problem as a CTMC or a queuing model and give analytic results.

Page 87: Markov Chains

λ = rate at which a single machine breaks down = 1/10 per hr
μ = rate at which machines are repaired = 1/8 per hr

State of the system = # of broken machines.

State-transition diagram: states 0, 1, 2, with breakdown rate 2λ from 0 to 1 and λ from 1 to 2, and repair rate μ from 1 to 0 and from 2 to 1.

Page 88: Markov Chains

Balance Equations for Repair Example

μ π_1 = 2λ π_0
2λ π_0 + μ π_2 = (λ + μ) π_1
λ π_1 = μ π_2

We can solve these balance equations for π_0, π_1 and π_2, but in this case we can simply use the formulas that solve general birth-and-death equations:

C_n = (λ_{n-1} · · · λ_0) / (μ_n · · · μ_1),   π_n = C_n π_0,   π_0 = 1 / Σ_{n≥0} C_n
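A sketch applying these birth-and-death formulas to the repair example (λ = 1/10 and μ = 1/8, from the previous slide):

```python
import numpy as np

lam, mu = 1/10, 1/8    # breakdown and repair rates (per hour)
birth = [2 * lam, lam] # lambda_0, lambda_1: two machines can fail, then one
death = [mu, mu]       # mu_1, mu_2: a single repairman works at rate mu

# C_0 = 1, C_n = (lambda_{n-1} ... lambda_0) / (mu_n ... mu_1)
C = [1.0]
for n in range(len(birth)):
    C.append(C[-1] * birth[n] / death[n])

pi = np.array(C) / sum(C)
print(pi.round(3))   # ~ [0.258, 0.412, 0.330], matching the next slide
```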

Page 89: Markov Chains

Here λ_0 = 2λ, λ_1 = λ, λ_2 = 0 and μ_1 = μ_2 = μ, with C_0 = 1 (by definition). Thus

C_1 = 2λ/μ,   C_2 = (λ/μ)(2λ/μ) = 2λ²/μ²

π_0 = 1 / (1 + 2λ/μ + 2λ²/μ²) = 0.258
π_1 = (2λ/μ) π_0 = 0.412
π_2 = (2λ²/μ²) π_0 = 0.330

L  = 0·π_0 + 1·π_1 + 2·π_2 = 1.072   (avg # machines in system)
Lq = 0·π_1 + 1·π_2 = 0.330           (avg # waiting for repair)

Page 90: Markov Chains

Average arrival rate:

λ̄ = Σ_n λ_n π_n = (2λ) π_0 + λ π_1 = 0.0928

W  = L / λ̄  = 1.072 / 0.0928 = 11.55 hours
(average amount of time that a machine has to wait to be repaired, including the time until the repairman initiates the work)

Wq = Lq / λ̄ = 0.33 / 0.0928 = 3.56 hours
(average amount of time that a machine has to wait until the repairman initiates the work)

Proportion of time that machine #1 is working = π_0 + (1/2) π_1 = 0.258 + (1/2)(0.412) = 0.464

Proportion of time the repairman is busy = π_1 + π_2 = 0.742

Page 91: Markov Chains
