Random Processes: Introduction (2)

Professor Ke-Sheng Cheng
Department of Bioenvironmental Systems Engineering



E-mail: rslab@ntu.edu.tw

Stochastic continuity

Stochastic Convergence

A random sequence, or discrete-time random process, is a sequence of random variables {X1(ω), X2(ω), …, Xn(ω), …} = {Xn(ω)}, ω ∈ Ω.

For a specific ω, {Xn(ω)} is a sequence of numbers that might or might not converge. The notion of convergence of a random sequence can be given several interpretations.

Sure convergence (convergence everywhere)

The sequence of random variables {Xn(ω)} converges surely to the random variable X(ω) if the sequence of functions Xn(ω) converges to X(ω) as n → ∞ for all ω ∈ Ω, i.e.,

Xn(ω) → X(ω) as n → ∞ for all ω ∈ Ω.

Almost-sure convergence (convergence with probability 1)

{Xn(ω)} converges almost surely to X(ω) if P[ Xn(ω) → X(ω) as n → ∞ ] = 1.

Mean-square convergence

{Xn} converges to X in the mean-square sense if E[ (Xn − X)² ] → 0 as n → ∞.

Convergence in probability

{Xn} converges to X in probability if, for every ε > 0, P[ |Xn − X| > ε ] → 0 as n → ∞.

Convergence in distribution

{Xn} converges to X in distribution if F_Xn(x) → F_X(x) as n → ∞ at every point x where F_X is continuous.

Remarks

Convergence with probability one applies to the individual realizations of the random process; convergence in probability does not.

The weak law of large numbers is an example of convergence in probability.

The strong law of large numbers is an example of convergence with probability 1.

The central limit theorem is an example of convergence in distribution.

Weak Law of Large Numbers (WLLN)

Strong Law of Large Numbers (SLLN)

The Central Limit Theorem
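The WLLN and CLT can be illustrated numerically. A minimal sketch, assuming only the Python standard library; the fair-coin (Bernoulli(0.5)) example is hypothetical, not from the slides:

```python
import random
import statistics

# Hypothetical illustration: iid Bernoulli(0.5) trials.
# WLLN: the sample mean approaches the true mean 0.5 as n grows.
# CLT: standardized sums have spread close to a standard normal (std ~ 1).

def sample_mean(n, rng):
    """Average of n fair coin flips (0/1)."""
    return sum(rng.random() < 0.5 for _ in range(n)) / n

rng = random.Random(42)

# WLLN: sample means for increasing n drift toward 0.5.
means = {n: sample_mean(n, rng) for n in (10, 1_000, 100_000)}
print(means)

# CLT: Z_n = (S_n - n*mu) / (sigma*sqrt(n)) over many replications.
n, mu, sigma = 1_000, 0.5, 0.5
zs = [(sum(rng.random() < 0.5 for _ in range(n)) - n * mu) / (sigma * n ** 0.5)
      for _ in range(2_000)]
print(statistics.pstdev(zs))  # should be close to 1
```

The convergence in the first print is in probability (WLLN); the second print reflects convergence in distribution (CLT).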

Venn diagram of relation of types of convergence

Note that even sure convergence may not imply mean square convergence.

Example

Ergodic Theorem

The Mean-Square Ergodic Theorem

The above theorem shows that one can expect a sample average to converge to a constant in the mean-square sense if and only if the average of the means converges and the memory dies out asymptotically, that is, if the covariance decreases as the lag increases.
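A numerical sketch of this idea, using a hypothetical stationary AR(1) process (not from the slides) whose covariance at lag k is proportional to φ^k and so dies out as the lag increases:

```python
import random

# Hypothetical illustration: X_t = phi*X_{t-1} + e_t with Gaussian noise has
# covariance decaying like phi**k, so its time average should settle near the
# ensemble mean (0), as the mean-square ergodic theorem suggests.

def ar1_time_average(n, phi=0.6, seed=1):
    rng = random.Random(seed)
    x, total = 0.0, 0.0
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, 1.0)
        total += x
    return total / n

print(ar1_time_average(200_000))  # should be close to 0
```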

Mean-Ergodic Processes

Strong or Individual Ergodic Theorem

Examples of Stochastic Processes

iid random process

A discrete-time random process {X(t), t = 1, 2, …} is said to be independent and identically distributed (iid) if any finite number, say k, of the random variables X(t1), X(t2), …, X(tk) are mutually independent and have a common cumulative distribution function FX(·).

The joint cdf for X(t1), X(t2), …, X(tk) is given by

F_{X1,X2,…,Xk}(x1, x2, …, xk) = P[X1 ≤ x1, X2 ≤ x2, …, Xk ≤ xk]
= FX(x1) FX(x2) ⋯ FX(xk).

It also yields

p_{X1,X2,…,Xk}(x1, x2, …, xk) = pX(x1) pX(x2) ⋯ pX(xk),

where p(x) represents the common probability mass function.
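The product form of the joint pmf can be checked empirically. A minimal sketch with a hypothetical iid Bernoulli(0.3) process (the value of p is an arbitrary choice for illustration):

```python
import random
from collections import Counter

# Hypothetical check: for an iid process, the joint pmf factors into a product
# of the common marginal pmf. Estimate P[X(1)=1, X(2)=1] for iid Bernoulli(p)
# and compare it with p*p.

rng = random.Random(0)
p = 0.3
trials = 200_000
pairs = Counter((rng.random() < p, rng.random() < p) for _ in range(trials))
joint_11 = pairs[(True, True)] / trials
print(joint_11, p * p)  # the two values should nearly agree
```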

Random walk process

Let p0 denote the probability mass function of X0. The joint probability of X0, X1, …, Xn is

P[X0 = x0, X1 = x1, …, Xn = xn]
= P[X0 = x0, X1 − X0 = x1 − x0, …, Xn − Xn−1 = xn − xn−1]
= p0(x0) f(x1 − x0) ⋯ f(xn − xn−1)
= p0(x0) P[X1 = x1 | X0 = x0] ⋯ P[Xn = xn | Xn−1 = xn−1].

It follows that

P[Xn+1 = xn+1 | X0 = x0, X1 = x1, …, Xn = xn]
= P[X0 = x0, …, Xn = xn, Xn+1 = xn+1] / P[X0 = x0, …, Xn = xn]
= p0(x0) P[x1 | x0] ⋯ P[xn | xn−1] P[xn+1 | xn] / ( p0(x0) P[x1 | x0] ⋯ P[xn | xn−1] )
= P[Xn+1 = xn+1 | Xn = xn].

The property

P[Xn+1 = xn+1 | X0 = x0, X1 = x1, …, Xn = xn] = P[Xn+1 = xn+1 | Xn = xn]

is known as the Markov property.

A special case of random walk: the Brownian motion.
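A simple random walk with ±1 steps can be simulated directly; a minimal sketch (the step distribution and seed are illustrative choices):

```python
import random

# Hypothetical sketch: a simple random walk X_n = X_0 + Y_1 + ... + Y_n with
# iid steps Y_i = +1 or -1, each with probability 1/2. The next position
# depends on the past only through the current position (Markov property).

def random_walk(n, seed=7):
    rng = random.Random(seed)
    path = [0]  # start at the origin, X_0 = 0
    for _ in range(n):
        path.append(path[-1] + rng.choice((-1, 1)))
    return path

path = random_walk(10)
print(path)
```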

Gaussian process

A random process {X(t)} is said to be a Gaussian random process if all finite collections of the random process, X1 = X(t1), X2 = X(t2), …, Xk = X(tk), are jointly Gaussian random variables for all k and all choices of t1, t2, …, tk.

Joint pdf of jointly Gaussian random variables X1, X2, …, Xk, with mean vector μ and covariance matrix Σ:

f_{X1,…,Xk}(x1, …, xk) = (2π)^(−k/2) |Σ|^(−1/2) exp( −(1/2) (x − μ)ᵀ Σ⁻¹ (x − μ) ).

Time series – AR random process
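A minimal AR(1) sketch (a common first example of an AR random process; the coefficient φ = 0.8 and Gaussian noise are illustrative assumptions, not taken from the slides):

```python
import random

# Hypothetical sketch of an AR(1) time series X_t = phi*X_{t-1} + e_t with
# Gaussian noise; its lag-k autocorrelation is phi**k.

def simulate_ar1(n, phi, seed=3):
    rng = random.Random(seed)
    x, xs = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, 1.0)
        xs.append(x)
    return xs

def autocorr(xs, k):
    """Sample lag-k autocorrelation."""
    m = sum(xs) / len(xs)
    num = sum((xs[i] - m) * (xs[i + k] - m) for i in range(len(xs) - k))
    den = sum((v - m) ** 2 for v in xs)
    return num / den

xs = simulate_ar1(100_000, phi=0.8)
print(autocorr(xs, 1))  # should be close to phi = 0.8
```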

The Brownian motion (one-dimensional, also known as random walk)

Consider a particle that moves randomly on the real line.

Suppose that at small time intervals Δt the particle jumps a small distance Δx, randomly and equally likely to the left or to the right.

Let X(t) be the position of the particle on the real line at time t.

Assume the initial position of the particle is at the origin, i.e. X(0) = 0.

The position of the particle at time t can be expressed as

X(t) = Δx (Y1 + Y2 + ⋯ + Y_[t/Δt]),

where Y1, Y2, … are independent random variables, each having probability 1/2 of equaling +1 and probability 1/2 of equaling −1. ([t/Δt] represents the largest integer not exceeding t/Δt.)

Distribution of X(t)

Let the step length Δx equal √Δt; then

X(t) = √Δt (Y1 + Y2 + ⋯ + Y_[t/Δt]).

For fixed t, if Δt is small then the distribution of X(t) is approximately normal with mean 0 and variance t, i.e., X(t) ~ N(0, t).
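This scaling can be checked by simulation; a minimal sketch, with illustrative choices of t = 2 and Δt = 0.01:

```python
import random
import statistics

# Hypothetical check of the scaling limit: with step length sqrt(dt), the
# scaled walk X(t) = sqrt(dt) * (Y_1 + ... + Y_[t/dt]) should have mean
# near 0 and variance near t for small dt.

def x_at(t, dt, rng):
    steps = int(t / dt)  # [t/dt], the number of jumps taken by time t
    return (dt ** 0.5) * sum(rng.choice((-1, 1)) for _ in range(steps))

rng = random.Random(11)
t, dt = 2.0, 0.01
samples = [x_at(t, dt, rng) for _ in range(5_000)]
print(statistics.mean(samples), statistics.pvariance(samples))
```

The sample variance should land near t = 2, consistent with X(t) ~ N(0, t).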

Graphical illustration of the distribution of X(t)

[Figure: pdf of X(t) plotted against X(t) at several times t; the density spreads out as t increases.]

If t and h are fixed and Δt is sufficiently small, then

X(t + h) − X(t)
= √Δt (Y1 + Y2 + ⋯ + Y_[(t+h)/Δt]) − √Δt (Y1 + Y2 + ⋯ + Y_[t/Δt])
= √Δt (Y_[t/Δt]+1 + Y_[t/Δt]+2 + ⋯ + Y_[(t+h)/Δt]).

Distribution of the displacement

The random variable X(t + h) − X(t) is normally distributed with mean 0 and variance h, i.e.

P[ X(t + h) − X(t) ≤ x ] = (1 / √(2πh)) ∫_{−∞}^{x} exp( −u² / (2h) ) du.

Variance of X(t) is dependent on t, while variance of X(t + h) − X(t) is not.

If t0 ≤ t1 ≤ t2 ≤ ⋯ ≤ t2m, then

X(t2) − X(t1), X(t4) − X(t3), …, X(t2m) − X(t2m−1)

are independent random variables.

Covariance and correlation functions of X(t)

Cov[X(t), X(t + h)] = E[X(t) X(t + h)]
= E[ Δt (Y1 + Y2 + ⋯ + Y_[t/Δt]) (Y1 + Y2 + ⋯ + Y_[(t+h)/Δt]) ]
= Δt E[ (Y1 + Y2 + ⋯ + Y_[t/Δt])² + (Y1 + ⋯ + Y_[t/Δt]) (Y_[t/Δt]+1 + ⋯ + Y_[(t+h)/Δt]) ]
= Δt [t/Δt] ≈ t.

Correl[X(t), X(t + h)] = Cov[X(t), X(t + h)] / √( Var[X(t)] Var[X(t + h)] )
= t / √( t (t + h) ) = √( t / (t + h) ).
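The covariance Cov[X(t), X(t + h)] ≈ t can also be checked by simulation; a minimal sketch, with illustrative choices t = 1, h = 1, Δt = 0.01:

```python
import random

# Hypothetical check: for the scaled walk, Cov[X(t), X(t+h)] is approximately
# t (the earlier of the two times), so Correl[X(t), X(t+h)] ~ sqrt(t/(t+h)).

def walk_pair(t, h, dt, rng):
    """One realization of (X(t), X(t+h)) along a single walk."""
    n_t, n_th = int(t / dt), int((t + h) / dt)
    s = 0.0
    xt = None
    for i in range(1, n_th + 1):
        s += rng.choice((-1, 1))
        if i == n_t:
            xt = (dt ** 0.5) * s  # record the position at time t
    return xt, (dt ** 0.5) * s

rng = random.Random(5)
t, h, dt = 1.0, 1.0, 0.01
pairs = [walk_pair(t, h, dt, rng) for _ in range(5_000)]
cov = sum(a * b for a, b in pairs) / len(pairs)  # E[X(t)X(t+h)], means are 0
print(cov)  # should be close to t = 1.0
```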