§4 Continuous source and Gaussian channel
Page 1:

§4 Continuous source and Gaussian channel

§4.1 Continuous source

§4.2 Gaussian channel


Page 2:

§4.1 Continuous source

1. Differential entropy

Definition:

Let X be a random variable with cumulative distribution function $F(x) = \Pr(X \le x)$. If $F(x)$ is continuous, the random variable is said to be continuous. Let $p(x) = F'(x)$ when the derivative is defined. If $\int p(x)\,dx = 1$, then $p(x)$ is called the probability density function of X. The set where $p(x) > 0$ is called the support set of X.

Page 3:

§4.1 Continuous source

1. Differential entropy

Definition:

The differential entropy h(X) of a continuous random variable X with a density p(x) is defined as

$$h(X) = \int_S p(x)\log\frac{1}{p(x)}\,dx$$

where S is the support set of the random variable.
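As a quick numerical check, the definition can be evaluated directly by integration. Below is a minimal Python sketch; the helper name `differential_entropy` and the exponential-density example are our own, not from the slides:

```python
import numpy as np
from scipy.integrate import quad

def differential_entropy(pdf, support, base=2.0):
    """Evaluate h(X) = integral over S of p(x) * log(1/p(x)) dx numerically."""
    def integrand(x):
        p = pdf(x)
        return -p * np.log(p) if p > 0 else 0.0
    nats, _ = quad(integrand, *support)
    return nats / np.log(base)  # convert nats to the requested log base

# Exponential density p(x) = lam * exp(-lam * x) on [0, inf);
# closed form: h(X) = log2(e / lam) bits.
lam = 2.0
h_num = differential_entropy(lambda x: lam * np.exp(-lam * x), (0.0, np.inf))
print(h_num, np.log2(np.e / lam))  # both ≈ 0.4427 bits
```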

Page 4:

§4.1 Continuous source

1. Differential entropy

Example 4.1.1

Let $X \sim N(m, \sigma^2)$, i.e.

$$p(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\exp\left(-\frac{(x-m)^2}{2\sigma^2}\right);$$

please calculate the differential entropy.

$$h(X) = \frac{1}{2}\log\left(2\pi e\sigma^2\right)\ \text{(bits)}$$
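The closed form can be confirmed numerically, reusing the hypothetical `differential_entropy` helper sketched above:

```python
import numpy as np
from scipy.stats import norm

# Assumes differential_entropy() from the earlier sketch is in scope.
# X ~ N(m, sigma^2): h(X) = 0.5 * log2(2*pi*e*sigma^2), independent of the mean m.
m, sigma = 1.0, 0.5
closed_form = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)
h_num = differential_entropy(norm(m, sigma).pdf, (-np.inf, np.inf))
print(closed_form, h_num)  # both ≈ 1.047 bits
```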

Page 5:

§4.1 Continuous source

1. Differential entropy

Definition:

The differential entropy of a pair X, Y of random variables with joint density $p(xy)$ is defined as

$$h(XY) = \int_{S_X}\int_{S_Y} p(xy)\log\frac{1}{p(xy)}\,dx\,dy$$

If X, Y have a joint density $p(xy)$, we can define the conditional differential entropy $h(X|Y)$ as

$$h(X|Y) = \int_{S_X}\int_{S_Y} p(xy)\log\frac{1}{p(x|y)}\,dx\,dy$$

Page 6:

§4.1 Continuous source

2. Properties of differential entropy

1) Chain rule: $h(XY) = h(X) + h(Y|X) = h(Y) + h(X|Y)$

Conditioning reduces entropy: $h(X|Y) \le h(X)$, $h(Y|X) \le h(Y)$

Independence bound: $h(XY) \le h(X) + h(Y)$
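These identities are easy to check in closed form for a correlated Gaussian pair, using $h = \frac{1}{2}\log\left((2\pi e)^k\det\Sigma\right)$ for a k-dimensional normal (a standard fact, though not stated on these slides); the helper below is our own sketch:

```python
import numpy as np

def gaussian_h(cov, base=2.0):
    """h of a k-dimensional normal: 0.5 * log((2*pi*e)^k * det(cov))."""
    cov = np.atleast_2d(cov)
    k = cov.shape[0]
    return 0.5 * np.log((2 * np.pi * np.e) ** k * np.linalg.det(cov)) / np.log(base)

rho = 0.8  # correlation between X and Y, both unit variance
cov = np.array([[1.0, rho], [rho, 1.0]])
h_xy, h_x, h_y = gaussian_h(cov), gaussian_h(1.0), gaussian_h(1.0)
h_y_given_x = h_xy - h_x           # chain rule: h(XY) = h(X) + h(Y|X)
print(h_y_given_x <= h_y)          # conditioning reduces entropy: True
print(h_xy <= h_x + h_y)           # independence bound: True
```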

Page 7:

§4.1 Continuous source

2. Properties of differential entropy

2) h(X) can be negative.

Example 4.1.2

Consider a random variable distributed uniformly from a to b:

$$p(x) = \begin{cases}\dfrac{1}{b-a}, & a \le x \le b\\[4pt] 0, & \text{else}\end{cases}$$

$$h(X) = \int_a^b \frac{1}{b-a}\log(b-a)\,dx = \log(b-a)$$

If $(b-a) < 1$, then $h(X) < 0$.
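A two-line check of the sign (our own illustration):

```python
import numpy as np

def uniform_entropy_bits(a, b):
    """h(X) = log2(b - a) for X ~ Uniform(a, b)."""
    return np.log2(b - a)

print(uniform_entropy_bits(0.0, 0.5))  # -1.0: differential entropy can be negative
print(uniform_entropy_bits(0.0, 2.0))  # +1.0
```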

Page 8:

§4.1 Continuous source

2. Properties of differential entropy

3) h(X) is a concave (∩-convex) function of the density p(x), so it attains a maximum under suitable constraints.

Theorem 4.1

If the peak power of the random variable X is restricted, the maximizing distribution is the uniform distribution.

If the average power of the random variable X is restricted, the maximizing distribution is the normal distribution.
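To illustrate the average-power case, compare differential entropies at a common variance; the Gaussian comes out on top (the Laplace and uniform comparators are our own choices, not from the slides):

```python
import numpy as np

sigma2 = 1.0  # common variance (average power about the mean)
h_gauss = 0.5 * np.log2(2 * np.pi * np.e * sigma2)   # ≈ 2.047 bits
h_uniform = np.log2(np.sqrt(12 * sigma2))            # width sqrt(12)*sigma, ≈ 1.792 bits
h_laplace = np.log2(2 * np.e * np.sqrt(sigma2 / 2))  # scale b = sigma/sqrt(2), ≈ 1.943 bits
print(h_gauss > h_laplace > h_uniform)  # True: the normal maximizes h at fixed variance
```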

Page 9:

§4.1 Continuous source

2. Properties of differential entropy

4) Let Y = g(X); the differential entropy of Y may differ from h(X).

Example 4.1.3 Let X be a random variable distributed uniformly from -1 to 1, and Y = 2X. h(X) = ? h(Y) = ?

Theorem 4.2 $h(aX) = h(X) + \log|a|$

Theorem 4.3 $h(X + a) = h(X)$
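Example 4.1.3 worked out with Theorem 4.2 (our own check):

```python
import numpy as np

# X ~ Uniform(-1, 1) so h(X) = log2(2) = 1 bit; Y = 2X ~ Uniform(-2, 2).
a = 2.0
h_x = np.log2(1.0 - (-1.0))   # 1 bit
h_y = np.log2(2.0 - (-2.0))   # 2 bits
print(np.isclose(h_y, h_x + np.log2(abs(a))))  # Theorem 4.2 holds: True
```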

Page 10:

Review

• Keywords:

Differential entropy $h(X) = \int_S p(x)\log\frac{1}{p(x)}\,dx$

Chain rule of differential entropy

Conditioning reduces entropy

Independence bound of differential entropy

Properties: may be negative; concave function; changes under transformations

Page 11:

Homework

1. Prove the following conclusions:

a) h(XY) = h(X) + h(Y|X) = h(Y) + h(X|Y)

b) h(a + X) = h(X)

Page 12:

§4 Continuous source and Gaussian channel

§4.1 Continuous source

§4.2 Gaussian channel

Page 13:

§4.2 Gaussian channel

1. The model of the Gaussian channel

$$Y = X + Z, \qquad Z \sim N(0, \sigma_z^2)$$

X and Z are independent.
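A minimal simulation of the model (names and parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_channel(x, noise_var, rng):
    """Y = X + Z with Z ~ N(0, noise_var), Z drawn independently of x."""
    z = rng.normal(0.0, np.sqrt(noise_var), size=x.shape)
    return x + z

x = rng.normal(0.0, 1.0, size=100_000)        # unit-power Gaussian input
y = gaussian_channel(x, noise_var=0.25, rng=rng)
print(np.var(y))  # ≈ 1.25: output power = input power + noise power
```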

Page 14:

2. Average mutual information

§4.2 Gaussian channel

I(X;Y) = h(Y) - h(Y|X) = h(Y) - h(Z|X) = h(Y) - h(Z) (since Y = X + Z and Z is independent of X)

Example 4.2.1

Let $X \sim N(0, \sigma_x^2)$; then $Y \sim N(0, \sigma_x^2 + \sigma_z^2)$ and

$$h(Y) = \frac{1}{2}\log\left(2\pi e(\sigma_x^2 + \sigma_z^2)\right)$$

$$I(X;Y) = \frac{1}{2}\log\frac{\sigma_x^2 + \sigma_z^2}{\sigma_z^2} = \frac{1}{2}\log\frac{P+N}{N}$$

where $P = \sigma_x^2$ is the signal power and $N = \sigma_z^2$ is the noise power.
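In code (the function name is our own):

```python
import numpy as np

def gaussian_mi_bits(P, N):
    """I(X;Y) = 0.5 * log2((P + N) / N) for Gaussian input over the Gaussian channel."""
    return 0.5 * np.log2((P + N) / N)

print(gaussian_mi_bits(P=1.0, N=1.0))   # 0.5 bit/use at 0 dB SNR
print(gaussian_mi_bits(P=15.0, N=1.0))  # 2.0 bits/use
```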

Page 15:

§4.2 Gaussian channel

3. The channel capacity

Definition: The information capacity of the Gaussian channel with power constraint P is

$$C = \max_{p(x):\,E[X^2] \le P} I(X;Y)$$

$$I(X;Y) = h(Y) - h(Z), \qquad h(Z) = \frac{1}{2}\log 2\pi e N$$

Page 16:

§4.2 Gaussian channel

3. The channel capacity

Since $E[Y^2] = P + N$ when $E[X^2] = P$, $h(Y)$ is maximized by a normal Y (Theorem 4.1), so

$$C = \max_{p(x)}\left[h(Y) - \frac{1}{2}\log 2\pi eN\right] = \frac{1}{2}\log 2\pi e(P+N) - \frac{1}{2}\log 2\pi eN = \frac{1}{2}\log\frac{P+N}{N} = \frac{1}{2}\log\left(1 + \frac{P}{N}\right)$$

Page 17:

§4.2 Gaussian channel

3. The channel capacity

$$C = \frac{1}{2}\log\left(1 + \frac{P}{N}\right)$$

Thinking about band-limited channels: if the transmission bandwidth is W, the noise power is $N = \sigma_z^2 = N_0 W$, so

$$C = \frac{1}{2}\log\left(1 + \frac{P}{N_0 W}\right)\ \text{(bits/sample)}$$

There are 2W samples per second, so

$$C_t = 2WC = W\log\left(1 + \frac{P}{N_0 W}\right)$$

Page 18:

4. Shannon’s formula

§4.2 Gaussian channel

$$C_t = W\log\left(1 + \frac{P}{N_0 W}\right)\ \text{(bits/sec)}$$

Shannon’s famous expression for the capacity of a band-limited, power-limited Gaussian channel.

Page 19:

§4.2 Gaussian channel

4. Shannon’s formula

Remarks:

1) $C_t$, W and SNR can be traded against one another.

2) When SNR ≪ 1, $C_t \to 0$.

$$C_t = W\log\left(1 + \frac{P}{N_0 W}\right)$$

Page 20:

§4.2 Gaussian channel

4. Shannon’s formula

3) Shannon limit: for infinite-bandwidth channels,

$$\lim_{W\to\infty} C_t = \lim_{W\to\infty} W\log\left(1 + \frac{P}{N_0 W}\right) = \frac{P}{N_0}\log_2 e \approx 1.4427\,\frac{P}{N_0}$$

using $\lim_{x\to 0}\frac{1}{x}\ln(1+x) = 1$.

[Figure: $C_t$ (bps) versus W, rising toward the horizontal asymptote $1.4427\,P/N_0$.]
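Numerically, $C_t$ saturates at $1.4427\,P/N_0$ as W grows (our own check):

```python
import numpy as np

P, N0 = 1.0, 1.0
for W in (1e2, 1e4, 1e6):
    Ct = W * np.log2(1.0 + P / (N0 * W))
    print(f"W = {W:.0e} Hz -> Ct = {Ct:.6f} bps")
print("asymptote:", np.log2(np.e) * P / N0)  # ≈ 1.4427 bps
```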

Page 21:

§4.2 Gaussian channel

4. Shannon’s formula

Let $E_b$ be the energy per bit; then $P = E_b C_t$ and

$$\frac{C_t}{W} = \log\left(1 + \frac{E_b}{N_0}\cdot\frac{C_t}{W}\right)$$

so

$$\frac{E_b}{N_0} = \frac{2^{C_t/W} - 1}{C_t/W}$$

As $W \to \infty$ (so $C_t/W \to 0$),

$$\left(\frac{E_b}{N_0}\right)_{\min} = \lim_{C_t/W\to 0}\frac{2^{C_t/W} - 1}{C_t/W} = \ln 2 \approx 0.693 = -1.6\ \text{dB}$$
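The required $E_b/N_0$ as a function of spectral efficiency $C_t/W$, approaching the -1.6 dB Shannon limit (helper name ours):

```python
import numpy as np

def eb_n0_required_db(r):
    """Minimum Eb/N0 (dB) at spectral efficiency r = Ct/W bits/s/Hz."""
    return 10 * np.log10((2.0**r - 1.0) / r)

for r in (2.0, 1.0, 0.1, 0.001):
    print(f"Ct/W = {r}: Eb/N0 >= {eb_n0_required_db(r):.2f} dB")
print("Shannon limit:", 10 * np.log10(np.log(2)))  # ≈ -1.59 dB
```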

Page 22:

Review

• Keywords:

Capacity of Gaussian channel

(band-limited, power-limited) Shannon's formula

Shannon limit

Information rate of Gaussian channel

Page 23:

Homework

1. In image transmission there are $2.25 \times 10^6$ pixels per frame. Reproducing an image needs 4 bits per pixel (assume each bit is equally likely to be '0' or '1'). Compute the channel bandwidth needed to transmit 30 image frames per second. (P/N = 30 dB)

2. Consider a power-limited Gaussian channel with bandwidth 3 kHz and (P + N)/N = 10 dB. (1) Compute the maximum rate of this channel (bps). (2) If the SNR decreases to 5 dB, give the channel bandwidth that achieves the same maximum rate.

