Discrete Signal Processing


DISCRETE RANDOM SIGNAL PROCESSING

ADSP UNIT-I

Contents: definition of a discrete-time process (DTP), Bernoulli process, moments and ensemble averages, stationary processes and WSS, matrix forms, Parseval's theorem, Wiener-Khinchine relation, PSD, filtering of random processes, spectral factorization, bias and consistency, special types of random processes, and the Yule-Walker equation.

Discrete Time Random Process: A random variable may be thought of as a mapping from the sample space of an experiment into a set of real or complex values.

A discrete-time random process may be thought of as a mapping from the sample space Ω into a set of discrete-time signals.

It is nothing but an indexed sequence of random variables.

Example: Tossing a coin, Rolling a die

Bernoulli Process: If the outcome of one trial does not affect the outcome of any other trial at any time, the process is called a Bernoulli process. It is an indexed sequence of independent, identically distributed binary random variables (e.g. repeated coin tosses).
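As a quick illustration (a minimal sketch, not part of the original slides), one realization of a Bernoulli process such as repeated coin tossing can be simulated as an indexed sequence of independent binary random variables; the probability p below is an assumed parameter.

```python
import numpy as np

rng = np.random.default_rng(0)

p = 0.5            # assumed probability of "success" (e.g. heads)
n = 20             # number of trials, i.e. length of the indexed sequence

# One realization of the Bernoulli process: x(0), x(1), ..., x(n-1)
x = rng.binomial(1, p, size=n)
print(x)           # e.g. [1 0 1 1 0 ...]
print(x.mean())    # sample mean approaches p as n grows
```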

The moments are:

Mean: the average of the outcomes, µ = (1/n) ∑ x(i), i = 1 to n

Variance: how far the random values lie from the mean, σ² = (1/n) ∑ (x(i) − µ)², i = 1 to n

Skewness: the asymmetry of the values about the mean, S = (1/n) ∑ (x(i) − µ)³ / σ³, i = 1 to n

Kurtosis: the flatness (or peakedness) of the distribution, K = (1/n) ∑ (x(i) − µ)⁴ / σ⁴, i = 1 to n
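A short sketch of these moment formulas applied to a sample sequence (the Gaussian data below is assumed purely for illustration):

```python
import numpy as np

x = np.random.default_rng(1).normal(size=1000)      # assumed sample data x(1)..x(n)

mu = x.mean()                                # mean:     (1/n) * sum(x(i))
var = ((x - mu) ** 2).mean()                 # variance: (1/n) * sum((x(i) - mu)^2)
sigma = np.sqrt(var)
skew = ((x - mu) ** 3).mean() / sigma ** 3   # skewness: asymmetry about the mean
kurt = ((x - mu) ** 4).mean() / sigma ** 4   # kurtosis: about 3 for Gaussian data

print(mu, var, skew, kurt)
```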

ERGODICITY: When the time average of the process is equal to the ensemble average, the process is said to be “ergodic”, i.e. E[X] = ⟨X⟩, where ⟨X⟩ denotes the time average of X.
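A quick sketch contrasting the two averages for an assumed i.i.d. (hence ergodic) process: the time average of a single realization and the ensemble average across realizations both approach E[X].

```python
import numpy as np

rng = np.random.default_rng(10)
X = 3.0 + rng.normal(size=(2000, 2000))   # rows: realizations of a process with E[X] = 3

time_avg = X[0].mean()                    # average of one realization over time
ensemble_avg = X[:, 0].mean()             # average across realizations at one time instant

print(time_avg, ensemble_avg)             # both are close to 3 for this ergodic process
```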

ENSEMBLE AVERAGES:

Mean: Mx(n) = E[x(n)]

Variance: σ²x(n) = E[|x(n) − Mx(n)|²]

Auto-correlation (the relationship between random variables within the same process): rx(k,l) = E[x(k) x*(l)]

Auto-covariance: Cx(k,l) = E[(x(k) − Mx(k)) (x(l) − Mx(l))*]

Cross-correlation: rxy(k,l) = E[x(k) y*(l)]

Cross-covariance: Cxy(k,l) = E[(x(k) − Mx(k)) (y(l) − My(l))*]
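A minimal sketch of estimating the ensemble mean and autocorrelation from a set of realizations (the white Gaussian process below is an assumed example):

```python
import numpy as np

rng = np.random.default_rng(2)
M, N = 5000, 64                       # M realizations, each of length N
X = rng.normal(size=(M, N))           # row m is one realization x(n) of the process

# Ensemble mean Mx(n): average over the realizations at each time n
mean_n = X.mean(axis=0)

# Ensemble autocorrelation rx(k, l) = E[x(k) x*(l)], estimated by averaging
# x(k) x*(l) over the M realizations
R = (X.T @ X.conj()) / M              # R[k, l] ≈ rx(k, l)
```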

RELATIONS

Relation between rx and Cx: Cx(k,l) = rx(k,l) − Mx(k) Mx*(l); when the mean is zero, Cx(k,l) = rx(k,l).

Relation between rxy and Cxy: Cxy(k,l) = rxy(k,l) − Mx(k) My*(l); when the means are zero, Cxy(k,l) = rxy(k,l).

• If the two random processes x(n) and y(n) are uncorrelated, then Cxy(k,l) = 0.
• If the two random processes x(n) and y(n) are orthogonal, then rxy(k,l) = 0.
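A quick numerical check of the first relation on an assumed nonzero-mean process (sketch only):

```python
import numpy as np

rng = np.random.default_rng(3)
M, N = 5000, 16
X = 2.0 + rng.normal(size=(M, N))                # process with mean Mx(n) = 2

mx = X.mean(axis=0)                              # ensemble mean
R = (X.T @ X.conj()) / M                         # rx(k, l) = E[x(k) x*(l)]
C_direct = ((X - mx).T @ (X - mx).conj()) / M    # Cx(k, l) estimated directly

# Relation: Cx(k, l) = rx(k, l) - Mx(k) Mx*(l)
C_from_relation = R - np.outer(mx, mx.conj())
print(np.allclose(C_direct, C_from_relation))    # True
```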

STATIONARY PROCESS

A process is said to be stationary when all of its statistical averages (mean, variance, etc.) are independent of time, i.e.

For first order: Mx(n) = Mx, σ²x(n) = σ²x

For second order: rx(k,l) = rx(k−l, 0) = rx(k−l)

Example: quantization error

WIDE SENSE STATIONARY PROCESS:

Case 1: A process x(n) is WSS if its mean is a constant Mx, its autocorrelation depends only on the difference k−l, and its variance is finite.

Case 2: x(n) and y(n) are said to be jointly WSS if each is WSS and their cross-correlation depends only on the lag: rxy(k−l) = E[x(k) y*(l)].

PROPERTIES OF THE WSS AUTOCORRELATION:

1. Symmetry: rx(k) = rx*(−k)
2. Mean-square value: rx(0) = E[|x(n)|²] ≥ 0
3. Maximum value: |rx(k)| ≤ rx(0)
4. Periodicity: if rx(k0) = rx(0) for some k0 ≠ 0, then x(n) is mean-square periodic, i.e. E[|x(n) − x(n−k0)|²] = 0
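A small numerical check of properties 1-3 on an estimated autocorrelation sequence; the first-order recursion below is just an assumed example of a WSS process.

```python
import numpy as np

rng = np.random.default_rng(11)
v = rng.normal(size=50_000)            # white noise v(n)

# Assumed example of a WSS process: x(n) = 0.9 x(n-1) + v(n)
x = np.zeros_like(v)
for n in range(1, len(v)):
    x[n] = 0.9 * x[n - 1] + v[n]

# Autocorrelation estimate rx(k) for k = 0..20
r = np.array([np.mean(x[k:] * x[:len(x) - k]) for k in range(21)])

print(r[0] >= 0)                       # property 2: rx(0) = E[|x(n)|^2] >= 0
print(np.all(np.abs(r) <= r[0]))       # property 3: |rx(k)| <= rx(0)
# Property 1 (symmetry) holds automatically here: for real x(n), rx(-k) = rx(k)
```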

MATRIX AND ITS PROPERTIES

The autocorrelation and the autocovariance can be expressed in the form of a matrix.

PROPERTIES:
The autocorrelation matrix of a WSS process x(n) is a Hermitian Toeplitz matrix.
It is non-negative definite.
Its eigenvalues λk are real and non-negative.

IMPORTANT MATRIX FORMS

Orthogonal matrix: A^T = A^(-1)
Hermitian matrix: A = A^H, where A^H = (A*)^T = (A^T)*
Skew-Hermitian matrix: A = -A^H
Toeplitz matrix: all elements along each diagonal are the same.
Hankel matrix: all elements along each anti-diagonal are the same (an M×N Hankel matrix is determined by M+N−1 values).
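A sketch of building the autocorrelation matrix of a WSS process as a Hermitian Toeplitz matrix and checking that its eigenvalues are real and non-negative; the values of rx(k) below are assumed for illustration.

```python
import numpy as np
from scipy.linalg import toeplitz

# Assumed autocorrelation values rx(0), rx(1), rx(2), rx(3) of a WSS process
r = np.array([2.0, 1.2, 0.6, 0.2])

R = toeplitz(r)                        # Toeplitz: constant along each diagonal
print(np.allclose(R, R.conj().T))      # Hermitian (here symmetric, since r is real)

eigvals = np.linalg.eigvalsh(R)
print(np.all(eigvals >= -1e-12))       # eigenvalues are real and non-negative
```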

PARSEVAL’S THEOREM (OR) RAYLEIGH ENERGY FORMULA

The sum (or integral) of the square of a function is equal to the sum (or integral) of the square of its transform; that is, the energy E = <x, x> is preserved. For a discrete-time signal, ∑ |x(n)|² = (1/2π) ∫ |X(e^jw)|² dw, with the integral taken over one period.
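A quick numerical check of the discrete form of the theorem, using NumPy's unnormalized DFT convention (hence the 1/N factor):

```python
import numpy as np

x = np.random.default_rng(4).normal(size=256)   # assumed test signal
X = np.fft.fft(x)

energy_time = np.sum(np.abs(x) ** 2)            # sum of the squared signal
energy_freq = np.sum(np.abs(X) ** 2) / len(x)   # sum of the squared transform / N

print(np.allclose(energy_time, energy_freq))    # True
```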

WIENER-KHINCHINE RELATION

For a well-behaved stationary random process, the power spectrum is equal to the Fourier transform of the autocorrelation function.

POWER SPECTRAL DENSITY

The PSD of the process is written as Px(e^jw) = ∑ rx(k) e^(-jwk), k = -∞ to ∞

The power spectrum of x(n) is Px(z) = ∑ rx(k) z^(-k), k = -∞ to ∞
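A minimal sketch of the Wiener-Khinchine relation in practice: estimate rx(k) from data and take its Fourier transform to obtain the PSD. The white-noise process and the lag range K are assumed choices.

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(size=4096)              # assumed WSS process (unit-variance white noise)

# Autocorrelation estimate rx(k) for lags 0..K, extended to -K..K by symmetry
K = 64
r = np.array([np.mean(x[k:] * x[:len(x) - k]) for k in range(K + 1)])
r_full = np.concatenate([r[:0:-1], r])          # rx(-K), ..., rx(K)

# PSD = Fourier transform of the autocorrelation: Px(e^jw) = sum_k rx(k) e^(-jwk)
lags = np.arange(-K, K + 1)
w = np.linspace(-np.pi, np.pi, 512)
Px = np.real(np.exp(-1j * np.outer(w, lags)) @ r_full)

print(Px.mean())                       # close to 1, the PSD of unit-variance white noise
```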

FILTERING OF RANDOM PROCESS

When a WSS random process x(n) is applied to a linear shift-invariant (LSI) system (or filter) with unit sample response h(n), the output y(n) is also WSS, and its power spectrum is related to that of the input by

Py(z) = Px(z) H(z) H*(1/z*)

On the unit circle this becomes Py(e^jw) = Px(e^jw) |H(e^jw)|².
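A sketch of this relation using SciPy; the filter H(z) = 1/(1 - 0.8 z^-1), the Welch parameters and the one-sided 1/π normalization are assumptions made for the example, not part of the slides.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(6)
x = rng.normal(size=200_000)          # white WSS input with variance 1, so Px(e^jw) = 1

# Example LSI filter H(z) = 1 / (1 - 0.8 z^-1)
b, a = [1.0], [1.0, -0.8]
y = signal.lfilter(b, a, x)

# Welch estimate of the output PSD (one-sided, frequencies in rad/sample)
w, Py_est = signal.welch(y, fs=2 * np.pi, nperseg=1024, scaling='density')

# Theory: Py(e^jw) = Px(e^jw) |H(e^jw)|^2 ; the 1/pi factor maps the two-sided
# white-noise level 1/(2*pi) onto Welch's one-sided density convention
_, H = signal.freqz(b, a, worN=w)
Py_theory = np.abs(H) ** 2 / np.pi

# Compare estimate and theory away from the DC/Nyquist bins (different one-sided scaling)
ratio = Py_est[1:-1] / Py_theory[1:-1]
print(ratio.min(), ratio.max())       # both close to 1, within estimation error
```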

SPECTRAL FACTORIZATION

The power spectrum of a regular process can be factored as

Px(z) = σ0² H(z) H*(1/z*)

where H(z) is a causal, stable, minimum-phase filter and σ0² is the variance of the white noise that drives it.

Wold Decomposition Theorem:

A general random process can be written as the sum of a regular random process xr(n) and a predictable process xp(n):

x(n) = xr(n) + xp(n)

Bias-Consistency

The difference between the expected value of the estimate and the actual value is called the bias B:

B = ϴ − E[ϴ^N]

ϴ — actual value
ϴ^N — estimate based on N samples

An estimate is asymptotically unbiased if the bias vanishes in the limit:

Lt E[ϴ^N] = ϴ as N -> ∞ (equivalently, B -> 0)

An estimate is consistent if it converges to the actual value in the mean-square sense:

Lt E[|ϴ^N − ϴ|²] = 0 as N -> ∞
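As an illustration (not from the slides), the sample-variance estimator that divides by N is biased but asymptotically unbiased and consistent; a Monte-Carlo sketch with an assumed true variance of 4:

```python
import numpy as np

rng = np.random.default_rng(7)
true_var = 4.0                                    # actual value ϴ

for N in (10, 100, 10_000):
    estimates = []
    for _ in range(2000):
        x = rng.normal(0.0, 2.0, N)
        estimates.append(np.mean(x ** 2) - np.mean(x) ** 2)   # 1/N variance estimator ϴ^N
    bias = true_var - np.mean(estimates)          # B = ϴ - E[ϴ^N], roughly 4/N
    mse = np.mean((np.array(estimates) - true_var) ** 2)
    print(N, round(bias, 4), round(mse, 4))       # both shrink as N grows: consistent
```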

SPECIAL TYPES OF RP

The types are:
ARMA process: ARMA(p,q)
AR process (auto-regressive): ARMA(p,0)
MA process (moving average): ARMA(0,q)
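A brief sketch of generating realizations of each type by filtering white noise (the coefficient values are assumed, chosen only to give stable models):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(8)
v = rng.normal(size=10_000)              # white noise v(n) driving the models

a = [1.0, -0.75, 0.5]                    # denominator [1, a(1), a(2)]  (assumed)
b = [1.0, 0.9, 0.4]                      # numerator  [1, b(1), b(2)]  (assumed)

# AR(2) = ARMA(2, 0):  x(n) = 0.75 x(n-1) - 0.5 x(n-2) + v(n)
x_ar = signal.lfilter([1.0], a, v)

# MA(2) = ARMA(0, 2):  x(n) = v(n) + 0.9 v(n-1) + 0.4 v(n-2)
x_ma = signal.lfilter(b, [1.0], v)

# ARMA(2, 2): both autoregressive and moving-average parts
x_arma = signal.lfilter(b, a, v)
```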

YULE-WALKER EQUATION

rx(k) + ∑ ap(l) rx(k−l) = σv² cq(k), for 0 ≤ k ≤ q
                        = 0,          for k > q

where the sum runs over l = 1 to p. For an AR(p) process (q = 0) these reduce to rx(k) + ∑ ap(l) rx(k−l) = σv² δ(k), k ≥ 0.
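A minimal sketch of the AR(p) special case in practice: generate an AR(2) process with assumed coefficients, estimate its autocorrelation, and solve the Yule-Walker equations for the model parameters.

```python
import numpy as np
from scipy import signal
from scipy.linalg import solve_toeplitz

rng = np.random.default_rng(9)
a_true = [1.0, -0.75, 0.5]             # AR(2): x(n) = 0.75 x(n-1) - 0.5 x(n-2) + v(n)
x = signal.lfilter([1.0], a_true, rng.normal(size=200_000))

# Autocorrelation estimates rx(0) .. rx(p)
p = 2
r = np.array([np.mean(x[k:] * x[:len(x) - k]) for k in range(p + 1)])

# Yule-Walker for AR(p): solve the Toeplitz system for a(1)..a(p),
# then sigma_v^2 = rx(0) + sum_l a(l) rx(l)
a_hat = solve_toeplitz(r[:p], -r[1:])
sigma2_hat = r[0] + a_hat @ r[1:]

print(a_hat)         # close to [-0.75, 0.5]
print(sigma2_hat)    # close to 1.0
```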