Chapter 9: Gaussian channel

University of Illinois at Chicago ECE 534, Natasha Devroye
Page 1:

Chapter 9: Gaussian channel

Page 2:

Chapter 9 outline

• Definitions

• Capacity of Gaussian noise channels: achievability and converse

• Bandlimited channels

• Parallel Gaussian channels

• Colored Gaussian noise channels

• Gaussian channels with feedback

Page 3:

Motivation

• Our goal is to determine the capacity of an AWGN channel

Y = hX + N,   N: Gaussian noise ~ N(0, P_N)

[Figure: wireless channel with fading; channel gain and received signal shown over time]

Page 4:

Motivation

• Our goal is to determine the capacity of an AWGN channel

Y = hX + N,   N: Gaussian noise ~ N(0, P_N)

[Figure: wireless channel with fading; channel gain and received signal shown over time]

C = (1/2) log( (|h|^2 P + P_N) / P_N ) = (1/2) log(1 + SNR)   (bits/channel use)
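As a quick numerical illustration (my own sketch, not from the slides; the function name and example values are illustrative), the formula can be evaluated directly:

```python
import math

def awgn_capacity(h: float, P: float, P_N: float) -> float:
    """Capacity of Y = hX + N with N ~ N(0, P_N) and input power constraint P,
    in bits per channel use: C = (1/2) log2(1 + |h|^2 P / P_N)."""
    snr = (abs(h) ** 2) * P / P_N
    return 0.5 * math.log2(1.0 + snr)

print(awgn_capacity(h=1.0, P=10.0, P_N=1.0))  # about 1.73 bits per channel use
```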

Page 5:

Definitions

[Figure: Gaussian channel Y_i = X_i + Z_i, with Z_i ~ N(0, N)]

Can capacity be infinite?

Page 6:

Definitions


Page 7:

Thought experiment: 1 bit over AWGN channel

• Send 1 bit with power constraint P. How would you do it, and what is the associated probability of error Pe?

Input power constraint P

Gaussian noise variance N

• Turn a Gaussian channel into a discrete binary symmetric channel with crossover probability Pe!

[Figure: binary symmetric channel with inputs/outputs 0 and 1, crossover probability f, probability 1 − f of correct reception]
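One natural scheme (an assumption on my part; the slide leaves the construction to the reader) is antipodal signaling: send +sqrt(P) for one message and −sqrt(P) for the other, then decide by the sign of Y. The crossover probability is f = Q(sqrt(P/N)), and the induced BSC has capacity 1 − H(f). A minimal sketch:

```python
import math

def q_function(x: float) -> float:
    """Gaussian tail probability Q(x) = P(Z > x) for Z ~ N(0, 1)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def binary_entropy(p: float) -> float:
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

P, N = 1.0, 0.25                       # example power constraint and noise variance
f = q_function(math.sqrt(P / N))       # crossover probability of the induced BSC
print(f, 1.0 - binary_entropy(f))      # f ~ 0.023, BSC capacity ~ 0.84 bits/use
```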

Page 8:

Definitions - information capacity


Input power constraint P

Gaussian noise variance N

Page 9:

Definitions - Gaussian code

[Figure: W ∈ {1, ..., M} → Encoder → X^n → + (noise Z^n) → Y^n → Decoder → Ŵ ∈ {1, ..., M}]
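To make the definition concrete, here is a small simulation sketch (my own, with illustrative block length, rate, and power; it uses nearest-neighbor rather than joint-typicality decoding): a random codebook of M = 2^{nR} i.i.d. N(0, P) codewords, which meets the power constraint in expectation.

```python
import numpy as np

rng = np.random.default_rng(0)
n, R, P, N = 20, 0.4, 1.0, 1.0                  # R < C = 0.5*log2(1 + P/N) = 0.5
M = int(2 ** round(n * R))                      # number of messages, 2^{nR}
codebook = rng.normal(0.0, np.sqrt(P), (M, n))  # i.i.d. N(0, P) codewords

errors, trials = 0, 500
for _ in range(trials):
    w = rng.integers(M)                                          # message W, uniform
    y = codebook[w] + rng.normal(0.0, np.sqrt(N), n)             # Y^n = X^n(W) + Z^n
    w_hat = int(np.argmin(np.sum((codebook - y) ** 2, axis=1)))  # nearest-neighbor decoder
    errors += (w_hat != w)

print(f"empirical error rate: {errors / trials:.2f}")
```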

Page 10:

Definitions: achievable rate and capacity

Page 11:

Intuition about why it works - sphere packing

Page 12:

Intuition about why it works - sphere packing

Page 13:

Channel coding: achievability

• We will prove achievability, then the converse

• Need concepts of typical sets

• Need idea that Gaussians maximize entropy for a given variance constraint (a quick numerical check follows below)
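A quick numerical check of the last point (my own, illustrative): compare the differential entropy of a Gaussian and of a uniform distribution with the same variance.

```python
import math

sigma2 = 1.0
h_gauss = 0.5 * math.log2(2.0 * math.pi * math.e * sigma2)  # (1/2) log2(2*pi*e*sigma^2) ~ 2.05 bits
a = math.sqrt(3.0 * sigma2)                                 # Uniform[-a, a] has variance a^2/3
h_uniform = math.log2(2.0 * a)                              # ~ 1.79 bits
print(h_gauss, h_uniform)                                   # the Gaussian entropy is larger
```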

Page 14:

Typical sets

Page 15:

Properties of jointly typical sets

Page 16:

Achievability

Page 17:

Converse

[Figure: W ∈ {1, ..., M} → Encoder → X^n → + (noise Z^n) → Y^n → Decoder → Ŵ ∈ {1, ..., M}]

Page 18:

Bandlimited Gaussian Channels

[Figure: bandlimited channel; impulse response h(t) and frequency response H(ω) supported on the band from −W to W]

Page 19:

Bandlimited Gaussian Channels

Page 20:

Bandlimited Gaussian channels

Page 21:

Bandlimited Gaussian Channel

Page 22:

Example: telephone channel

• Telephone signals bandlimited to 3300 Hz.

• SNR is 33 dB.

• What is capacity of a telephone line?
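A quick worked computation (my own), using the bandlimited AWGN formula C = W log2(1 + SNR) from the chapter summary:

```python
import math

W = 3300.0                      # bandwidth in Hz
snr = 10 ** (33.0 / 10.0)       # 33 dB, about 1995 in linear scale
C = W * math.log2(1.0 + snr)
print(f"{C:.0f} bits/s")        # roughly 36,000 bits per second
```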

Page 23:

Parallel Gaussian channels

9.4 PARALLEL GAUSSIAN CHANNELS

[Figure 9.3: Parallel Gaussian channels; Y_i = X_i + Z_i for i = 1, ..., k.]

We calculate the distribution that achieves the information capacity for this channel. The fact that the information capacity is the supremum of achievable rates can be proved by methods identical to those in the proof of the capacity theorem for single Gaussian channels and will be omitted.

Since Z_1, Z_2, ..., Z_k are independent,

I(X_1, X_2, ..., X_k; Y_1, Y_2, ..., Y_k)
  = h(Y_1, Y_2, ..., Y_k) − h(Y_1, Y_2, ..., Y_k | X_1, X_2, ..., X_k)
  = h(Y_1, Y_2, ..., Y_k) − h(Z_1, Z_2, ..., Z_k | X_1, X_2, ..., X_k)
  = h(Y_1, Y_2, ..., Y_k) − h(Z_1, Z_2, ..., Z_k)        (9.68)
  = h(Y_1, Y_2, ..., Y_k) − Σ_i h(Z_i)                   (9.69)
  ≤ Σ_i [ h(Y_i) − h(Z_i) ]                              (9.70)
  ≤ Σ_i (1/2) log(1 + P_i / N_i),                        (9.71)

Question: how to distribute power across the parallel channels?


Page 26:

Parallel Gaussian channels


[Figure 9.4: Water-filling for parallel channels; power allocations P_1, P_2, ... fill above the noise levels N_1, N_2, N_3 up to a common water level.]

When the available power is increased still further, some of the power is put into noisier channels. The process by which the power is distributed among the various bins is identical to the way in which water distributes itself in a vessel, hence this process is sometimes referred to as water-filling.
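A minimal water-filling sketch (my own, not from the text): given noise levels N_i and a total power budget P, find the water level ν with Σ_i (ν − N_i)^+ = P by bisection and allocate P_i = (ν − N_i)^+.

```python
import math

def water_fill(noise, P):
    """Water-filling over parallel Gaussian channels: returns the power
    allocation P_i = max(nu - N_i, 0) and the resulting sum capacity
    sum_i 0.5*log2(1 + P_i/N_i), with the water level nu found by bisection."""
    lo, hi = min(noise), max(noise) + P
    for _ in range(100):
        nu = 0.5 * (lo + hi)
        if sum(max(nu - n, 0.0) for n in noise) > P:
            hi = nu
        else:
            lo = nu
    powers = [max(nu - n, 0.0) for n in noise]
    capacity = sum(0.5 * math.log2(1.0 + p / n) for p, n in zip(powers, noise))
    return powers, capacity

print(water_fill([1.0, 2.0, 4.0], P=3.0))  # most power goes to the quietest channel
```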

9.5 CHANNELS WITH COLORED GAUSSIAN NOISE

In Section 9.4, we considered the case of a set of parallel independent Gaussian channels in which the noise samples from different channels were independent. Now we will consider the case when the noise is dependent. This represents not only the case of parallel channels, but also the case when the channel has Gaussian noise with memory. For channels with memory, we can consider a block of n consecutive uses of the channel as n channels in parallel with dependent noise. As in Section 9.4, we will calculate only the information capacity for this channel.

Let K_Z be the covariance matrix of the noise, and let K_X be the input covariance matrix. The power constraint on the input can then be written as

(1/n) Σ_i E[X_i^2] ≤ P,        (9.79)

or equivalently,

(1/n) tr(K_X) ≤ P.             (9.80)

Page 27:

Waterfilling


C = ?

Page 28:

Colored Gaussian noise


What does white noise correspond to?


Page 31:

Colored Gaussian noise - optimal powers



Page 33:

Gaussian channels with feedback

9.6 GAUSSIAN CHANNELS WITH FEEDBACK

[Figure 9.6: Gaussian channel with feedback; Y_i = X_i + Z_i, with the output fed back to the encoder.]

The feedback allows the input of the channel to depend on the past values of the output.

A (2^{nR}, n) code for the Gaussian channel with feedback consists of a sequence of mappings x_i(W, Y^{i−1}), where W ∈ {1, 2, ..., 2^{nR}} is the input message and Y^{i−1} is the sequence of past values of the output. Thus, x(W, ·) is a code function rather than a codeword. In addition, we require that the code satisfy a power constraint,

E[ (1/n) Σ_{i=1}^{n} x_i^2(w, Y^{i−1}) ] ≤ P,   w ∈ {1, 2, ..., 2^{nR}},        (9.99)

where the expectation is over all possible noise sequences. We characterize the capacity of the Gaussian channel in terms of the covariance matrices of the input X and the noise Z. Because of the feedback, X^n and Z^n are not independent; X_i depends causally on the past values of Z. In the next section we prove a converse for the Gaussian channel with feedback and show that we achieve capacity if we take X to be Gaussian.

We now state an informal characterization of the capacity of the channel with and without feedback.

1. With feedback. The capacity C_{n,FB} in bits per transmission of the time-varying Gaussian channel with feedback is

C_{n,FB} = max_{(1/n) tr(K_X^{(n)}) ≤ P}  (1/2n) log( |K_{X+Z}^{(n)}| / |K_Z^{(n)}| ),        (9.100)

Time-varying = channel with memory!


Page 37:

Gaussian channels with feedback


• But how much does feedback really give you?

• Let’s find bounds which relate the capacity with feedback to the capacity without feedback

• To do this we’ll need some technical lemmas.....


Page 41:

Does feedback increase capacity?


• In a discrete memoryless channel?

• In an additive white Gaussian noise channel?

• In a colored Gaussian noise channel?

Page 42:

Thus, we have shown that Gaussian channel capacity is not increased by more than half a bit or by more than a factor of 2 when we have feedback; feedback helps, but not by much.

SUMMARY

Maximum entropy. max_{E[X^2] = α} h(X) = (1/2) log(2πeα).

Gaussian channel. Y_i = X_i + Z_i; Z_i ~ N(0, N); power constraint (1/n) Σ_{i=1}^{n} x_i^2 ≤ P; and

C = (1/2) log(1 + P/N)   bits per transmission.        (9.163)

Bandlimited additive white Gaussian noise channel. Bandwidth W; two-sided power spectral density N_0/2; signal power P; and

C = W log(1 + P/(N_0 W))   bits per second.        (9.164)

Water-filling (k parallel Gaussian channels). Y_j = X_j + Z_j, j = 1, 2, ..., k; Z_j ~ N(0, N_j); Σ_{j=1}^{k} X_j^2 ≤ P; and

C = Σ_{i=1}^{k} (1/2) log(1 + (ν − N_i)^+ / N_i),        (9.165)

where ν is chosen so that Σ (ν − N_i)^+ = nP.

Additive nonwhite Gaussian noise channel. Y_i = X_i + Z_i; Z^n ~ N(0, K_Z); and

C = (1/n) Σ_{i=1}^{n} (1/2) log(1 + (ν − λ_i)^+ / λ_i),        (9.166)

where λ_1, λ_2, ..., λ_n are the eigenvalues of K_Z and ν is chosen so that Σ_i (ν − λ_i)^+ = P.

Capacity without feedback

C_n = max_{tr(K_X) ≤ nP}  (1/2n) log( |K_X + K_Z| / |K_Z| ).        (9.167)

Page 43:


Capacity with feedback

C_{n,FB} = max_{tr(K_X) ≤ nP}  (1/2n) log( |K_{X+Z}| / |K_Z| ).        (9.168)

Feedback bounds

C_{n,FB} ≤ C_n + 1/2.        (9.169)

C_{n,FB} ≤ 2 C_n.        (9.170)

PROBLEMS

9.1 Channel with two independent looks at Y. Let Y_1 and Y_2 be conditionally independent and conditionally identically distributed given X.

(a) Show that I(X; Y_1, Y_2) = 2 I(X; Y_1) − I(Y_1; Y_2).

(b) Conclude that the capacity of the channel X → (Y_1, Y_2) is less than twice the capacity of the channel X → Y_1.

9.2 Two-look Gaussian channel, X → (Y_1, Y_2). Consider the ordinary Gaussian channel with two correlated looks at X, that is, Y = (Y_1, Y_2), where

Y_1 = X + Z_1        (9.171)
Y_2 = X + Z_2        (9.172)

