Signals 7

Transcript
  Slide 1/32

    INTRODUCTION TO SIGNAL PROCESSING

    Lecture 7

    Iasonas Kokkinos, Ecole Centrale Paris

    Introduction to Random Signals

  Slide 2/32

    •  Inherent in the signal generation

    •  Noise due to imaging

    [Figure: prostate MRI and its denoised version]

    •  Thermal noise:

      •  Movement of electrons inside a resistor in equilibrium

    Sources of randomness 

  Slide 3/32

    •  Doppler Radar

      •  Send out a wave of frequency ω₀

      •  Listen to its reflection

      •  The reflected frequency will depend on the target’s speed

    •  Noise:

      •  Birds, insects (‘angels’)

      •  Sea motion

      •  Wind on trees

      •  …

    •  How can we decide whether a target is present?

    Sources of randomness 

  Slide 4/32

    •  Speech generation: deterministic system

      •  Complex mechanical models

      •  Turbulence for fricatives

      •  For engineering: simple model + ‘noise’

    •  Texture modeling in images

      •  Same feel as natural images

      •  Sufficient for computer vision, maybe not for graphics

    Sources of randomness 

  Slide 5/32

    Speech signal:

    •  Scatter plots of successive samples

    Random Signals 

  Slide 6/32

    •  What can we learn from one signal about the other?

    •  How can we tell whether signals are regular?

    Random Signals 

  Slide 7/32

    •  How can we identify when the signals’ behavior changes?

    Random Signals 

  Slide 8/32

    •  How can we remove daily fluctuations?

    •  How can we predict the signal?

    Random Signals 

  Slide 9/32

    •  Extract information for understanding & classification

      •  Spectral Estimation

      •  Parametric Modelling

    •  Applications:

      •  Signal Compression/Transmission

      •  Pattern recognition (e.g. speech recognition)

      •  Signal Detection (e.g. presence of sinusoid/target)

      •  Signal Estimation (e.g. frequency of sinusoid/speed)

    Random Signal Analysis 

  Slide 10/32

    •  Denoising

      •  Signal corrupted by noise

      •  Goal: recover signal

      •  Signal: possibly random

    •  Tracking

      •  Dynamical system

      •  Noisy observations of state

      •  Goal: recover actual state

    Random Signal Processing 

  Slide 11/32

    7th Lecture Layout

    •  Randomness in Signals

    •  Introduction to Stochastic Processes

    •  First- and second-order statistics

    •  Power Spectrum

    •  LTI Systems & Stochastic Processes

  Slide 12/32

    Stochastic Process examples

    •  Motion of a particle in a liquid

      •  ζ: the particle we chose to observe

    •  Voltage of an AC generator with unknown phase

      •  ζ: the signal we get by plugging in

    •  Also known as: random process, random signal

  Slide 13/32

    Stochastic Process

    •  Family of signals

    •  Second-order distribution:

    •  Density:

    •  N-th order distribution: joint distribution of x(t_1), …, x(t_N) (see the sketch below)
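    The distribution formulas on this slide were images in the source; a hedged reconstruction of the standard definitions being referenced, in the usual textbook notation, is:

      F_x(x_1, x_2; t_1, t_2) = P\{ x(t_1) \le x_1,\ x(t_2) \le x_2 \}

      f_x(x_1, x_2; t_1, t_2) = \frac{\partial^2 F_x(x_1, x_2; t_1, t_2)}{\partial x_1\, \partial x_2}

      F_x(x_1, \dots, x_N; t_1, \dots, t_N) = P\{ x(t_1) \le x_1, \dots, x(t_N) \le x_N \}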

  Slide 14/32

     

    •  ζ: realization of the random process

    •  For any ζ: x is a discrete-time signal

    •  At any time n: x[n] is a random variable

    Discrete-time Stochastic Process

  Slide 15/32

    Description of Stochastic Processes

    •  In general: joint distribution for any order and any set of times.

    •  In practice: only a few statistics are used

    •  Mean at time t: expected value of x(t):

    •  Autocorrelation:
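    The expressions themselves were images in the source; the standard definitions they point to are (hedged reconstruction):

      \mu_x(t) = E\{ x(t) \}

      r_x(t_1, t_2) = E\{ x(t_1)\, x^*(t_2) \}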

  Slide 16/32

    7th Lecture Layout

    •  Randomness in Signals

    •  Introduction to Stochastic Processes

    •  First- and second-order statistics

    •  Power Spectrum

    •  LTI Systems & Stochastic Processes

    •  Spectral Factorization

    •  Normal & Predictable Processes

  Slide 17/32

    Discrete-time stochastic process x[n]

    •  Expected Value

    •  Variance

    •  Autocorrelation

    •  Autocovariance
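    The formulas were images in the source; the standard discrete-time definitions being listed are (hedged reconstruction):

      \mu_x[n] = E\{ x[n] \}

      \sigma_x^2[n] = E\{ |x[n] - \mu_x[n]|^2 \}

      r_x[n_1, n_2] = E\{ x[n_1]\, x^*[n_2] \}

      c_x[n_1, n_2] = r_x[n_1, n_2] - \mu_x[n_1]\, \mu_x^*[n_2]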

  Slide 18/32

    •  Strict-sense stationary (SSS) process:

      •  Joint distribution of any k observations does not change with time shifts

    •  Wide-sense stationary (WSS) process:

      •  Constant mean

      •  Autocorrelation is a function of the lag l = n_1 - n_2 only

      •  Bounded variance

    Stationary Processes
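    The WSS conditions appeared as formulas (images in the source); a hedged reconstruction:

      \mu_x[n] = \mu_x \ \text{(constant)}, \qquad r_x[n_1, n_2] = r_x[n_1 - n_2] = r_x[l], \qquad c_x[0] < \infty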


  Slide 19/32

    Autocorrelation of WSS processes

    •  Properties

    •  Hermitian Symmetry:

    •  Maximum:

    •  Average Power:

    •  Positive Semidefinite:
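    The property formulas were images in the source; the standard statements for a WSS process are (hedged reconstruction):

      r_x[-l] = r_x^*[l], \qquad |r_x[l]| \le r_x[0], \qquad r_x[0] = E\{ |x[n]|^2 \} \ge 0

      \sum_k \sum_m a_k\, r_x[k - m]\, a_m^* \ge 0 \quad \text{for any coefficients } \{a_k\}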

  Slide 20/32

    •  Consider a vector of consecutive samples of x[n]

    •  Autocorrelation matrix:

    •  Properties:

    •  Hermitian:

    •  Toeplitz:

    •  Positive Semidefinite:

    •  For any vector a (see the sketch below)

    •  Autocovariance Matrix: obtained from the autocorrelation matrix by removing the mean (see the sketch below)

    Autocorrelation Matrix
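    The matrix expressions on this slide were images in the source; a hedged reconstruction, assuming the vector x = [x[0], x[1], ..., x[M]]^T of a WSS process:

      R_x = E\{ \mathbf{x} \mathbf{x}^H \} =
      \begin{bmatrix}
        r_x[0]  & r_x^*[1]  & \cdots & r_x^*[M]   \\
        r_x[1]  & r_x[0]    & \cdots & r_x^*[M-1] \\
        \vdots  &           & \ddots & \vdots     \\
        r_x[M]  & r_x[M-1]  & \cdots & r_x[0]
      \end{bmatrix}

      R_x = R_x^H, \qquad \mathbf{a}^H R_x \mathbf{a} \ge 0 \ \text{for any vector } \mathbf{a}, \qquad C_x = R_x - \boldsymbol{\mu}_x \boldsymbol{\mu}_x^H, \quad \boldsymbol{\mu}_x = E\{ \mathbf{x} \}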


  Slide 21/32

    WSS White Noise Process v(n)

    •  WSS White Noise:

    •  Zero mean:

    •  Autocorrelation:

    •  Autocorrelation matrix: diagonal

      •  Observations are uncorrelated

    •  White Gaussian Noise (WGN)

    •  Each sample is independent of the other samples

    •  Each sample follows a zero-mean Gaussian distribution
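    The defining formulas were images in the source; for WSS white noise with variance σ_v², the standard statements are (hedged reconstruction):

      E\{ v[n] \} = 0, \qquad r_v[l] = \sigma_v^2\, \delta[l], \qquad R_v = \sigma_v^2 I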


  Slide 22/32

    White Gaussian Noise

    •  White noise: a sequence of uncorrelated random variables

    •  WGN: the variables are Gaussian

    [Figure: sample realization of WGN. Figure credit: Manolakis, Ingle & Kogon]
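    As a quick numerical illustration (not from the slides; all names here are mine), a short NumPy sketch that draws WGN and checks that its sample autocorrelation is roughly σ_v² at lag 0 and near zero at other lags:

      import numpy as np

      rng = np.random.default_rng(0)
      sigma_v = 1.0
      N = 100_000

      # White Gaussian noise: independent zero-mean Gaussian samples.
      v = rng.normal(loc=0.0, scale=sigma_v, size=N)

      # Biased sample autocorrelation estimate r_v[l] for small lags.
      def sample_autocorr(x, max_lag):
          x = x - x.mean()
          return np.array([np.dot(x[:x.size - l], x[l:]) / x.size
                           for l in range(max_lag + 1)])

      print(sample_autocorr(v, max_lag=5))  # approx [sigma_v**2, 0, 0, 0, 0, 0]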


  Slide 23/32

    Autocorrelation matrix of sinusoid + WGN

    • Sinusoid signal:

    •  Autocorrelation of noise:

    •  Autocorrelation of x[n]:

    •  Autocorrelation matrix will have the form sketched below
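    The expressions on this slide were images in the source; a hedged reconstruction, assuming the common version of this example with a complex sinusoid of random phase in WGN:

      x[n] = A\, e^{j(\omega_0 n + \varphi)} + v[n], \qquad \varphi \sim \mathrm{U}[0, 2\pi), \qquad r_v[l] = \sigma_v^2\, \delta[l]

      r_x[l] = |A|^2 e^{j \omega_0 l} + \sigma_v^2\, \delta[l]

      R_x = |A|^2\, \mathbf{e}\, \mathbf{e}^H + \sigma_v^2 I, \qquad \mathbf{e} = [\, 1,\ e^{j\omega_0},\ \dots,\ e^{j M \omega_0} \,]^T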

  Slide 24/32

    DT-processes x[n], y[n]

    •  x[n] = rainfall on day n

    •  y[n] = number of umbrellas sold on day n

    •  Cross-Correlation:

    •  Cross-Covariance:

    •  Relation between Covariance & Correlation:

    •  Consider instead: y[n] = temperature on day n.
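    The definitions themselves were images in the source; the standard cross-statistics being referenced are (hedged reconstruction):

      r_{xy}[n_1, n_2] = E\{ x[n_1]\, y^*[n_2] \}

      c_{xy}[n_1, n_2] = E\{ (x[n_1] - \mu_x[n_1])\,(y[n_2] - \mu_y[n_2])^* \} = r_{xy}[n_1, n_2] - \mu_x[n_1]\, \mu_y^*[n_2]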


  Slide 25/32

    7th Lecture Layout

    • Randomness in Signals

    •  Introduction to Stochastic Processes

    •  First- and second-order statistics

    •  Power Spectrum

    •  LTI Systems & Stochastic Processes


  Slide 26/32

    Power Spectrum

    •  Definition of the power spectrum of a WSS stochastic process x[n]:

    •  Wiener-Khintchine theorem:

      •  The power spectrum of a WSS stochastic process equals the DTFT of its autocorrelation
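    The formula was an image in the source; the relation stated by the theorem is (hedged reconstruction):

      P_x(e^{j\omega}) = \sum_{l=-\infty}^{\infty} r_x[l]\, e^{-j\omega l}, \qquad r_x[l] = \frac{1}{2\pi} \int_{-\pi}^{\pi} P_x(e^{j\omega})\, e^{j\omega l}\, d\omega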


  Slide 27/32

    Power Spectrum of white noise

    • Autocorrelation function:

    •  Power Spectrum:

    •  Equal power on all frequencies

    •  Just like white light

    •  Red, pink, violet noises…
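    The formulas were images in the source; for white noise of variance σ_v² the standard pair is (hedged reconstruction):

      r_v[l] = \sigma_v^2\, \delta[l] \;\;\Longrightarrow\;\; P_v(e^{j\omega}) = \sigma_v^2 \quad \text{for all } \omega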


  Slide 28/32

    7th Lecture Layout

    • Randomness in Signals

    •  Introduction to Stochastic Processes

    •  First- and second-order statistics

    •  Power Spectrum

    •  LTI Systems & Stochastic Processes


  Slide 29/32

    Stochastic Processes & LTI Systems


  Slide 30/32

    Stationary Processes and LTI systems

    •  A WSS process x[n] drives an LTI system:

    •  Mean:

    •  Input-output cross-correlation:
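    The expressions were images in the source; assuming the system has impulse response h[n] and frequency response H(e^{jω}), the standard relations are (hedged reconstruction):

      y[n] = \sum_k h[k]\, x[n-k]

      \mu_y = \mu_x \sum_k h[k] = \mu_x\, H(e^{j0})

      r_{yx}[l] = h[l] * r_x[l] = \sum_k h[k]\, r_x[l-k]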


  Slide 31/32

    Stationary Processes and LTI systems

    •  Output autocorrelation:

    •  Power Spectrum:
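    Again the formulas were images; the standard results, in the same notation as above (hedged reconstruction):

      r_y[l] = r_x[l] * h[l] * h^*[-l]

      P_y(e^{j\omega}) = |H(e^{j\omega})|^2\, P_x(e^{j\omega})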


  Slide 32/32

    Systems & Random Signals

    •  System: Deterministic

    •  Input: Stochastic Process

    •  Output: Stochastic Process

    •  System's effect on the output statistics: see the sketch below
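    As a rough numerical check of P_y = |H|² P_x (not from the slides; the filter choice and all names are mine), a short NumPy/SciPy sketch:

      import numpy as np
      from scipy import signal

      rng = np.random.default_rng(0)
      sigma_v = 1.0
      v = rng.normal(0.0, sigma_v, size=400_000)   # WGN input: flat spectrum sigma_v**2

      # Illustrative LTI system: y[n] = 0.9*y[n-1] + v[n], i.e. H(z) = 1 / (1 - 0.9 z^-1)
      y = signal.lfilter([1.0], [1.0, -0.9], v)

      # Two-sided PSD estimate of the output (fs = 1 sample per second).
      f, P_y = signal.welch(y, fs=1.0, nperseg=2048, return_onesided=False)

      # Theory: P_y(f) = |H(e^{j 2 pi f})|^2 * sigma_v**2
      P_theory = sigma_v**2 / np.abs(1.0 - 0.9 * np.exp(-2j * np.pi * f))**2
      print(np.median(np.abs(P_y - P_theory) / P_theory))  # small: estimate tracks theory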

