
Probability and Stochastic Processes

Page 1: Probability and Stochastic Processes


Probability and Stochastic Processes

References:

Wolff, Stochastic Modeling and the Theory of Queues, Chapter 1

Altiok, Performance Analysis of Manufacturing Systems, Chapter 2

Page 2: Probability and Stochastic Processes


Random Variables

• Discrete vs. Continuous

• Cumulative distribution function

• Density function

• Probability distribution (mass) function

• Joint distributions

• Conditional distributions

• Functions of random variables

• Moments of random variables

• Transforms and generating functions

Page 3: Probability and Stochastic Processes


Functions of Random Variables

• Often we’re interested in some combination of r.v.’s
– Sum of the first k interarrival times = time of the kth arrival
– Minimum of service times for parallel servers = time until next departure

• If X = min(Y, Z), then X > x if and only if Y > x and Z > x; therefore

$\Pr\{X > x\} = 1 - F_X(x) = \Pr\{Y > x, Z > x\}$

– and if Y and Z are independent,

$1 - F_X(x) = \Pr\{Y > x\}\Pr\{Z > x\} = [1 - F_Y(x)][1 - F_Z(x)]$

• If X = max(Y, Z), then

$\Pr\{X \le x\} = \Pr\{Y \le x, Z \le x\}$

• If X = Y + Z, its distribution is the convolution of the distributions of Y and Z. Find it by conditioning.
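
As a sanity check, here is a minimal simulation sketch of the min formula (numpy is assumed available; the exponential rates are illustrative choices, not from the slides):

```python
# Check Pr(min(Y, Z) > x) = Pr(Y > x) Pr(Z > x) for independent Y and Z.
import numpy as np

rng = np.random.default_rng(0)
y = rng.exponential(scale=1/2.0, size=1_000_000)   # Y ~ exp(rate 2), illustrative
z = rng.exponential(scale=1/3.0, size=1_000_000)   # Z ~ exp(rate 3), illustrative
x = 0.25

print((np.minimum(y, z) > x).mean())     # simulated Pr(min(Y, Z) > x)
print(np.exp(-2.0*x) * np.exp(-3.0*x))   # tail product: e^{-5x} ≈ 0.2865
```

This also previews a fact from the "More Exponential Distribution Facts" slide: the minimum of independent exponentials is exponential with the summed rate.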

Page 4: Probability and Stochastic Processes


Conditioning (Wolff)

• Frequently, the conditional distribution of Y given X is easier to find than the distribution of Y alone. If so, evaluate probabilities about Y using the conditional distribution along with the marginal distribution of X:

$\Pr\{Y \in A\} = \int \Pr\{Y \in A \mid X = x\}\, f_X(x)\, dx$

– Example: Draw 2 balls simultaneously from an urn containing four balls numbered 1, 2, 3 and 4. X = number on the first ball, Y = number on the second ball, Z = XY. What is Pr(Z > 5)?

– Key: it may be easier to evaluate Z if X is known:

$\Pr\{Z > 5\} = \sum_{x=1}^{4} \Pr\{Z > 5 \mid X = x\} \Pr\{X = x\}$
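
The urn example is small enough to verify by enumeration; a sketch using only the standard library:

```python
# Enumerate the urn example: draw two balls without replacement from
# {1, 2, 3, 4}; Z = X*Y. Conditioning on X gives the same answer.
from itertools import permutations

outcomes = list(permutations([1, 2, 3, 4], 2))   # 12 equally likely ordered pairs
p_direct = sum(1 for x, y in outcomes if x*y > 5) / len(outcomes)

# Pr(Z > 5) = sum_x Pr(Z > 5 | X = x) Pr(X = x)
p_cond = 0.0
for x in [1, 2, 3, 4]:
    others = [y for y in [1, 2, 3, 4] if y != x]
    p_cond += (sum(1 for y in others if x*y > 5) / 3) * (1/4)

print(p_direct, p_cond)   # both 0.5
```

Both routes give Pr(Z > 5) = 1/2.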

Page 5: Probability and Stochastic Processes


Convolution

• Let X = Y + Z. Conditioning on Z,

$F_X(x) = \Pr\{Y + Z \le x\} = \int \Pr\{Y \le x - z \mid Z = z\}\, f_Z(z)\, dz = \int \int_{-\infty}^{x-z} f_{Y|Z}(y \mid z)\, f_Z(z)\, dy\, dz$

• If Y and Z are independent,

$F_X(x) = \int \int_{-\infty}^{x-z} f_Y(y)\, f_Z(z)\, dy\, dz$

– Example: Poisson (the sum of independent Poisson random variables is again Poisson)
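
For the Poisson example, the convolution can be checked numerically; a sketch assuming scipy is available (the rates a and b are illustrative):

```python
# Convolving Poisson(a) and Poisson(b) pmfs should reproduce Poisson(a+b).
import numpy as np
from scipy.stats import poisson

a, b, n = 2.0, 3.0, 40
pY = poisson.pmf(np.arange(n), a)
pZ = poisson.pmf(np.arange(n), b)

pX = np.convolve(pY, pZ)[:n]     # pmf of X = Y + Z by discrete convolution
print(np.max(np.abs(pX - poisson.pmf(np.arange(n), a + b))))   # ~ 0
```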

Page 6: Probability and Stochastic Processes


Moments of Random Variables

• Expectation = “average”

• Variance = “volatility”

• Standard Deviation

• Coefficient of Variation

$E[X] = \int x f_X(x)\, dx \quad \text{or} \quad \sum_x x \Pr\{X = x\}$

$E[g(X)] = \int g(x) f_X(x)\, dx \quad \text{or} \quad \sum_x g(x) \Pr\{X = x\}$

$\mathrm{Var}[X] = E\left[(X - E[X])^2\right] = E[X^2] - (E[X])^2, \quad \sigma_X = \sqrt{\mathrm{Var}[X]}$

$Cv_X = \dfrac{\sigma_X}{E[X]}, \quad Cv_X^2 \text{ (s.c.v.)} = \dfrac{\mathrm{Var}[X]}{(E[X])^2}$
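
A quick numerical illustration of these moment formulas on exponential samples (numpy assumed available; the rate is an illustrative choice); the s.c.v. should come out near 1, as stated on the exponential slide later:

```python
# Sample moments of an exponential r.v. with rate lambda = 2.
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1/2.0, size=1_000_000)

mean = x.mean()
var = ((x - mean)**2).mean()      # E[(X - E[X])^2]
scv = var / mean**2               # squared coefficient of variation

print(mean, var, scv)             # ≈ 0.5, 0.25, 1.0
```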

Page 7: Probability and Stochastic Processes


Linear Functions of Random Variables

$E[X + Y] = E[X] + E[Y], \quad E[aX] = aE[X], \quad \mathrm{Var}[aX] = a^2 \mathrm{Var}[X]$

$\mathrm{Var}[X + Y] = \mathrm{Var}[X] + \mathrm{Var}[Y] + 2\,\mathrm{Cov}[X, Y]$

• Covariance

$\mathrm{Cov}[X, Y] = E\left[(X - E[X])(Y - E[Y])\right] = E[XY] - E[X]E[Y]$

• Correlation

$\rho_{XY} = \dfrac{\mathrm{Cov}[X, Y]}{\sqrt{\mathrm{Var}[X]\,\mathrm{Var}[Y]}}$

If X and Y are independent, then $\mathrm{Cov}[X, Y] = 0$ and $\rho_{XY} = 0$.
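
A small check of the variance-of-a-sum identity on simulated correlated data (numpy assumed available; constructing Y as X plus independent noise is just an illustrative way to get nonzero covariance):

```python
# Verify Var[X + Y] = Var[X] + Var[Y] + 2 Cov[X, Y] on correlated samples.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1_000_000)
y = x + 0.5 * rng.normal(size=1_000_000)   # correlated with x by construction

cov = ((x - x.mean()) * (y - y.mean())).mean()   # E[XY] - E[X]E[Y]
print(np.var(x + y))
print(np.var(x) + np.var(y) + 2*cov)             # should match
```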

Page 8: Probability and Stochastic Processes


Transforms and Generating Functions

• Moment-generating function

$M_X(s) = E[e^{sX}] = \int e^{sx} f_X(x)\, dx$

$E[X^k] = \left.\dfrac{d^k M_X(s)}{ds^k}\right|_{s=0}$

• Laplace-Stieltjes transform ($s \ge 0$)

$M_X^*(s) = E[e^{-sX}] = \int_0^\infty e^{-sx} f_X(x)\, dx$

$E[X^k] = (-1)^k \left.\dfrac{d^k M_X^*(s)}{ds^k}\right|_{s=0}$

• Generating function (z-transform): Let N be a nonnegative integer random variable with $P_n = \Pr\{N = n\},\ n = 0, 1, 2, \ldots$ Then

$G(z) = E[z^N] = \sum_{n=0}^{\infty} P_n z^n, \quad |z| \le 1$

$E[N] = \left.\dfrac{dG(z)}{dz}\right|_{z=1}, \quad E[N^2] - E[N] = \left.\dfrac{d^2 G(z)}{dz^2}\right|_{z=1}$

Page 9: Probability and Stochastic Processes


Special Distributions

• Discrete
– Bernoulli

– Binomial

– Geometric

– Poisson

• Continuous
– Uniform

– Exponential

– Gamma

– Normal

Page 10: Probability and Stochastic Processes


Bernoulli Distribution

“Single coin flip” p = Pr(success)

N = 1 if success, 0 otherwise

$\Pr\{N = n\} = \begin{cases} p, & n = 1 \\ 1 - p, & n = 0 \end{cases}$

$E[N] = p, \quad \mathrm{Var}[N] = p(1-p), \quad Cv_N^2 = \dfrac{1-p}{p}$

$M_N^*(s) = 1 - p + p e^{-s}$

Page 11: Probability and Stochastic Processes


Binomial Distribution

“n independent coin flips” p = Pr(success)

N = # of successes

$\Pr\{N = k\} = \binom{n}{k} p^k (1-p)^{n-k}, \quad k = 0, 1, \ldots, n$

$E[N] = np, \quad \mathrm{Var}[N] = np(1-p), \quad Cv_N^2 = \dfrac{1-p}{np}$

$M_N^*(s) = \left(1 - p + p e^{-s}\right)^n$

Page 12: Probability and Stochastic Processes


Geometric Distribution

“independent coin flips” p = Pr(success)

N = # of flips until (including) first success

$\Pr\{N = k\} = (1-p)^{k-1} p, \quad k = 1, 2, \ldots$

$E[N] = \dfrac{1}{p}, \quad \mathrm{Var}[N] = \dfrac{1-p}{p^2}, \quad Cv_N^2 = 1 - p$

Memoryless property: Have flipped k times without success; then

$\Pr\{N = k + n \mid N > k\} = (1-p)^{n-1} p, \quad n = 1, 2, \ldots$ (still geometric)

Page 13: Probability and Stochastic Processes


z-Transform for Geometric Distribution

Given $P_n = (1-p)^{n-1} p,\ n = 1, 2, \ldots$, find

$G(z) = \sum_{n=1}^{\infty} P_n z^n$

$G(z) = \sum_{n=1}^{\infty} (1-p)^{n-1} p\, z^n = pz \sum_{n=1}^{\infty} \left[(1-p)z\right]^{n-1} = pz \sum_{n=0}^{\infty} \left[(1-p)z\right]^{n} = \dfrac{pz}{1 - (1-p)z},$

using the geometric series $\sum_{n=0}^{\infty} a^n = \dfrac{1}{1-a}$ for $|a| < 1$.

Then,

$E[N] = \left.\dfrac{dG(z)}{dz}\right|_{z=1} = \left.\dfrac{p}{\left[1 - (1-p)z\right]^2}\right|_{z=1} = \dfrac{1}{p}$

$\left.\dfrac{d^2 G(z)}{dz^2}\right|_{z=1} = E[N^2] - E[N] = \dfrac{2(1-p)}{p^2}$, so $E[N^2] = \dfrac{2(1-p)}{p^2} + \dfrac{1}{p} = \dfrac{2-p}{p^2}$ and

$\mathrm{Var}[N] = E[N^2] - (E[N])^2 = \dfrac{2-p}{p^2} - \dfrac{1}{p^2} = \dfrac{1-p}{p^2}$
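
These derivatives are easy to verify symbolically; a sketch assuming sympy is available:

```python
# Symbolic check of the geometric z-transform moments.
import sympy as sp

z, p = sp.symbols('z p', positive=True)
G = p*z / (1 - (1 - p)*z)          # G(z) = sum_{n>=1} (1-p)^(n-1) p z^n

EN = sp.simplify(G.diff(z).subs(z, 1))            # E[N] = G'(1)
EN2 = sp.simplify(G.diff(z, 2).subs(z, 1) + EN)   # E[N^2] = G''(1) + E[N]
Var = sp.simplify(EN2 - EN**2)

print(EN)    # expect 1/p
print(Var)   # expect (1 - p)/p**2
```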

Page 14: Probability and Stochastic Processes


Poisson Distribution

“Occurrence of rare events”; $\lambda$ = average rate of occurrence per period

N = # of events in an arbitrary period

$\Pr\{N = k\} = \dfrac{e^{-\lambda} \lambda^k}{k!}, \quad k = 0, 1, 2, \ldots$

$E[N] = \lambda, \quad \mathrm{Var}[N] = \lambda, \quad Cv_N^2 = \dfrac{1}{\lambda}$

Page 15: Probability and Stochastic Processes


Uniform Distribution

X is equally likely to fall anywhere within interval (a,b)

$f_X(x) = \dfrac{1}{b-a}, \quad a \le x \le b$

$E[X] = \dfrac{a+b}{2}, \quad \mathrm{Var}[X] = \dfrac{(b-a)^2}{12}, \quad Cv_X^2 = \dfrac{(b-a)^2}{3(a+b)^2}$

Page 16: Probability and Stochastic Processes


Exponential Distribution

X is nonnegative and it is most likely to fall near 0.

Also memoryless; more on this later…

$f_X(x) = \lambda e^{-\lambda x}, \quad x \ge 0$

$F_X(x) = 1 - e^{-\lambda x}, \quad x \ge 0$

$E[X] = \dfrac{1}{\lambda}, \quad \mathrm{Var}[X] = \dfrac{1}{\lambda^2}, \quad Cv_X^2 = 1$
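
The memoryless property is easy to see in simulation; a minimal sketch (numpy assumed available; the rate and the time points s and t are illustrative):

```python
# Check Pr(X > s + t | X > s) = Pr(X > t) for an exponential r.v.
import numpy as np

rng = np.random.default_rng(0)
rate, s, t = 2.0, 0.5, 1.0
x = rng.exponential(scale=1/rate, size=1_000_000)

residual = x[x > s] - s              # remaining life, given survival past s
print((residual > t).mean())         # simulated Pr(X > s + t | X > s)
print(np.exp(-rate * t))             # exact Pr(X > t) = e^{-lambda t} ≈ 0.1353
```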

Page 17: Probability and Stochastic Processes


Gamma Distribution

X is nonnegative; by varying parameter b we get a variety of shapes.

When b is an integer, k, this is called the Erlang-k distribution, and Erlang-1 is the same as exponential.

$f_X(x) = \dfrac{\lambda (\lambda x)^{b-1} e^{-\lambda x}}{\Gamma(b)}, \quad x \ge 0,\ b > 0, \quad \text{where } \Gamma(b) = \int_0^\infty x^{b-1} e^{-x}\, dx \text{ and } \Gamma(k) = (k-1)! \text{ for integer } k > 0$

$E[X] = \dfrac{b}{\lambda}, \quad \mathrm{Var}[X] = \dfrac{b}{\lambda^2}, \quad Cv_X^2 = \dfrac{1}{b}$

Page 18: Probability and Stochastic Processes


Normal Distribution

X follows a “bell-shaped” density function.

From the central limit theorem, the distribution of the sum of independent and identically distributed random variables approaches a normal distribution as the number of summed random variables goes to infinity.

$f_X(x) = \dfrac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(x-\mu)^2 / 2\sigma^2}, \quad -\infty < x < \infty$

$E[X] = \mu, \quad \mathrm{Var}[X] = \sigma^2$

Page 19: Probability and Stochastic Processes


m.g.f.’s of Exponential and Erlang

If X is exponential and Y is Erlang-k,

$M_X^*(s) = \dfrac{\lambda}{\lambda + s} \quad \text{and} \quad M_Y^*(s) = \left(\dfrac{\lambda}{\lambda + s}\right)^k$

Fact: The m.g.f. of a sum of independent r.v.’s equals the product of the individual m.g.f.’s.

Therefore, the sum of k independent exponential r.v.’s (with the same rate $\lambda$) follows an Erlang-k distribution.
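
A simulation sketch of this fact (numpy and scipy assumed available; k and the rate are illustrative), comparing the sum of k exponentials with the Erlang-k, i.e. gamma with integer shape:

```python
# Sum of k i.i.d. exponential(lambda) r.v.'s should be Erlang-k distributed.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
k, rate = 3, 2.0
w = rng.exponential(scale=1/rate, size=(100_000, k)).sum(axis=1)

print(w.mean(), k/rate)       # E[W] = k/lambda
print(w.var(), k/rate**2)     # Var[W] = k/lambda^2
# KS test against gamma(shape=k, scale=1/lambda); statistic should be near 0
print(stats.kstest(w, 'gamma', args=(k, 0, 1/rate)))
```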

Page 20: Probability and Stochastic Processes


Stochastic Processes

• Poisson process

• Markov chains

• Regenerative processes

• Residual life

• Applications
– Machine repair model

– M/M/1 queue

– Inventory

Page 21: Probability and Stochastic Processes


Stochastic Processes

Set of random variables, or observations of the same random variable over time:

$\{X(t),\ t \ge 0\}$ (continuous-parameter) or $\{X_n,\ n = 0, 1, \ldots\}$ (discrete-parameter)

$X_t$ may be either discrete-valued or continuous-valued.

A counting process is a discrete-valued, continuous-parameter stochastic process that increases by one each time some event occurs. The value of the process at time t is the number of events that have occurred up to (and including) time t.

Page 22: Probability and Stochastic Processes


Poisson Process

Let $\{X(t),\ t \ge 0\}$ be a stochastic process where X(t) is the number of events (arrivals) up to time t. Assume X(0) = 0 and

(i) Pr(arrival occurs between $t$ and $t + \Delta t$) = $\lambda \Delta t + o(\Delta t)$, where $o(\Delta t)$ is some quantity such that $\lim_{\Delta t \to 0} o(\Delta t)/\Delta t = 0$

(ii) Pr(more than one arrival between $t$ and $t + \Delta t$) = $o(\Delta t)$

(iii) If $t < u < v < w$, then $X(w) - X(v)$ is independent of $X(u) - X(t)$.

Let $p_n(t) = \Pr(n$ arrivals occur during the interval $(0, t))$. Then

$p_n(t) = \dfrac{e^{-\lambda t} (\lambda t)^n}{n!}, \quad n = 0, 1, 2, \ldots$

Page 23: Probability and Stochastic Processes


Poisson Process and Exponential Dist’n

Let T be the time between arrivals. Then Pr(T > t) = Pr(there are no arrivals in (0, t)) = $p_0(t) = e^{-\lambda t}$.

Therefore,

$F_T(t) = \Pr\{T \le t\} = 1 - e^{-\lambda t}, \quad t \ge 0, \quad \text{and} \quad f_T(t) = \lambda e^{-\lambda t}, \quad t \ge 0;$

that is, the time between arrivals follows an exponential distribution with parameter $\lambda$ = the arrival rate.

The converse is also true; if interarrival times are exponential, then the number of arrivals up to time t follows a Poisson distribution with mean and variance equal to $\lambda t$.
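
A sketch tying the two views together (numpy assumed available; the rate, horizon, and replication count are illustrative): build arrival times from exponential gaps and check that the count in (0, t] behaves like Poisson($\lambda t$):

```python
# Generate a Poisson process from exponential interarrival times and check
# that the count in (0, t] has mean and variance both equal to lambda * t.
import numpy as np

rng = np.random.default_rng(2)
rate, t, reps = 1.5, 10.0, 20_000

counts = np.empty(reps, dtype=int)
for i in range(reps):
    # enough exponential gaps that the arrival times comfortably pass t
    arrivals = np.cumsum(rng.exponential(scale=1/rate, size=int(5*rate*t) + 50))
    counts[i] = np.searchsorted(arrivals, t)   # number of arrivals in (0, t]

print(counts.mean(), rate*t)    # both ≈ 15
print(counts.var(), rate*t)     # Poisson: variance equals mean
```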

Page 24: Probability and Stochastic Processes


When are Poisson arrivals reasonable?

1. The Poisson distribution can be seen as a limit of the binomial distribution, as $n \to \infty$, $p \to 0$ with $\lambda = np$ held constant.

- many potential customers deciding independently about arriving (arrival = “success”),

- each has small probability of arriving in any particular time interval

2. Conditions given above: probability of arrival in a small interval is approximately proportional to the length of the interval – no bulk arrivals

3. Amount of time since last arrival gives no indication of amount of time until the next arrival (exponential – memoryless)

Page 25: Probability and Stochastic Processes


More Exponential Distribution Facts

1. Suppose $T_1$ and $T_2$ are independent with $T_1 \sim \exp(\lambda_1)$, $T_2 \sim \exp(\lambda_2)$. Then

$\Pr\{T_1 < T_2\} = \dfrac{\lambda_1}{\lambda_1 + \lambda_2}$

2. Suppose $(T_1, T_2, \ldots, T_n)$ are independent with $T_i \sim \exp(\lambda_i)$. Let $Y = \min(T_1, T_2, \ldots, T_n)$. Then

$Y \sim \exp(\lambda_1 + \lambda_2 + \cdots + \lambda_n)$

3. Suppose $(T_1, T_2, \ldots, T_k)$ are independent with $T_i \sim \exp(\lambda)$. Let $W = T_1 + T_2 + \cdots + T_k$. Then W has an Erlang-k distribution with density function

$f_W(w) = \dfrac{\lambda^k w^{k-1} e^{-\lambda w}}{(k-1)!}, \quad w \ge 0, \quad \text{with} \quad E[W] = \dfrac{k}{\lambda} \ \text{and} \ \mathrm{Var}[W] = \dfrac{k}{\lambda^2}$

Page 26: Probability and Stochastic Processes


Continuous Time Markov Chains

A stochastic process $\{X(t),\ t \ge 0\}$ with possible values (state space) S = {0, 1, 2, …} is a CTMC if

$\Pr\{X(u + t) = j \mid X(s),\ s \le u\} = \Pr\{X(u + t) = j \mid X(u)\}$

“The future is independent of the past given the present”

Define

$p_{ij}(t) = \Pr\{X(u + t) = j \mid X(u) = i\}$ (note: independent of $u$)

Then

$0 \le p_{ij}(t) \le 1, \quad \sum_j p_{ij}(t) = 1$

Page 27: Probability and Stochastic Processes


CTMC Another Way

1. Each time X(t) enters state j, the sojourn time is exponentially distributed with mean $1/q_j$

2. When the process leaves state i, it goes to state $j \ne i$ with probability $p_{ij}$, where

$p_{ii} = 0, \quad 0 \le p_{ij} \le 1, \quad \sum_j p_{ij} = 1$

Let

$\mathbf{P}(t) = \left[\, p_{ij}(t) \,\right], \quad \text{where } \mathbf{P}(0) = \mathbf{I}$

Then

$\pi_j(t) = \Pr\{X(t) = j\} = \sum_i \pi_i(0)\, p_{ij}(t)$

Page 28: Probability and Stochastic Processes


CTMC Infinitesimal Generator

The time it takes the process to go from state i to state j is $T_{ij} \sim \exp(q_{ij})$.

Then $q_{ij}$ is the rate of transition from state i to state j, with $q_{ij} = q_i\, p_{ij}$ and $q_i = \sum_j q_{ij}$.

The infinitesimal generator is

$\mathbf{Q} = \begin{bmatrix} -q_0 & q_{01} & q_{02} & \cdots \\ q_{10} & -q_1 & q_{12} & \cdots \\ q_{20} & q_{21} & -q_2 & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{bmatrix} = \begin{bmatrix} -q_0 & q_0 p_{01} & q_0 p_{02} & \cdots \\ q_1 p_{10} & -q_1 & q_1 p_{12} & \cdots \\ q_2 p_{20} & q_2 p_{21} & -q_2 & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{bmatrix}$

Page 29: Probability and Stochastic Processes


Long Run (Steady State) Probabilities

Let

$\pi_j = \lim_{t \to \infty} p_{ij}(t)$

• Under certain conditions these limiting probabilities can be shown to exist and are independent of the starting state;

• They represent the long run proportions of time that the process spends in each state,

• Also the steady-state probabilities that the process will be found in each state.

Then

$\boldsymbol{\pi} \mathbf{Q} = \mathbf{0} \quad \text{with} \quad \sum_i \pi_i = 1$

or, equivalently, for all $j = 0, 1, 2, \ldots$ (rate out of $j$ = rate into $j$):

$\pi_j q_j = \sum_{i \ne j} \pi_i q_i p_{ij}$
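
In practice these balance equations are solved numerically; a minimal sketch (numpy assumed available; the 3-state generator below is a made-up illustration, not from the slides):

```python
# Solve pi Q = 0 with sum(pi) = 1 for a small CTMC generator.
import numpy as np

Q = np.array([[-3.0,  2.0,  1.0],
              [ 1.0, -4.0,  3.0],
              [ 2.0,  2.0, -4.0]])    # hypothetical generator; rows sum to 0

# Replace one balance equation with the normalization sum(pi) = 1:
A = np.vstack([Q.T[:-1], np.ones(len(Q))])
b = np.zeros(len(Q)); b[-1] = 1.0

pi = np.linalg.solve(A, b)
print(pi, pi @ Q)    # pi @ Q should be ~ 0
```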

Page 30: Probability and Stochastic Processes


Phase-Type Distributions

• Erlang distribution

• Hyperexponential distribution

• Coxian (mixture of generalized Erlang) distributions

