
Postacademic Course on Telecommunications

Module-3 Transmission, Marc Moonen, Lecture-2: Limits of Communication, K.U.Leuven-ESAT/SISTA

20/4/00, p. 1

Lecture-2: Limits of Communication

• Problem Statement:

Given a communication channel (bandwidth B) and an amount of transmit power, what is the maximum achievable transmission bit-rate (bits/sec) for which the bit-error-rate is (can be made) arbitrarily small?

- Shannon theory (1948)

- Recent topic: MIMO-transmission

(e.g. V-BLAST 1998, see also Lecture-1)


Overview

• `Just enough information about entropy’

(Lee & Messerschmitt 1994)

self-information, entropy, mutual information,…

• Channel Capacity (frequency-flat channel)

• Channel Capacity (frequency-selective channel)
  example: multicarrier transmission

• MIMO Channel Capacity
  example: wireless MIMO


`Just enough information about entropy’(I)

• Consider a random variable X with sample space (`alphabet') Ω = {ω_1, ω_2, ω_3, ..., ω_M}

• Self-information in an outcome ω_K is defined as

  h(ω_K) = -log₂ p_X(ω_K)

  where p_X(ω_K) is the probability for ω_K (Hartley 1928)

• `rare events (low probability) carry more information than common events'

  `self-information is the amount of uncertainty removed after observing ω_K.'


`Just enough information about entropy’(II)

• Consider a random variable X with sample space (`alphabet') Ω = {ω_1, ω_2, ω_3, ..., ω_M}

• Average information or entropy in X is defined as

  H(X) = -Σ_K p_X(ω_K)·log₂ p_X(ω_K)

• because of the log₂, information is measured in bits
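As a numeric illustration of the entropy definition (a sketch; the `entropy` helper is ours, not from the course material):

```python
import math

def entropy(probs):
    """H(X) = -sum_K p_X(omega_K) * log2 p_X(omega_K), in bits.
    Zero-probability outcomes contribute nothing (0 * log 0 -> 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# 4 = 2^2 equiprobable symbols carry 2 bits per symbol
print(entropy([0.25] * 4))   # -> 2.0
```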


`Just enough information about entropy’ (III)

• Example: sample space (`alphabet') is {0,1} with p_X(1) = q, p_X(0) = 1-q

  entropy = 1 bit if q = 1/2 (`equiprobable symbols')

  entropy = 0 bits if q = 0 or q = 1 (`no info in certain events')

  [Figure: H(X) as a function of q, rising from 0 at q=0 to a maximum of 1 bit at q=1/2 and back to 0 at q=1]
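The binary example can be checked directly (sketch; the `binary_entropy` helper name is ours):

```python
import math

def binary_entropy(q):
    """H(X) for the binary alphabet {0,1} with p_X(1) = q, p_X(0) = 1 - q."""
    if q in (0.0, 1.0):
        return 0.0                     # no info in certain events
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

print(binary_entropy(0.5))   # -> 1.0 (equiprobable symbols)
print(binary_entropy(1.0))   # -> 0.0 (certain event)
```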


`Just enough information about entropy’ (IV)

• `Bits' as a measure for entropy is slightly confusing (e.g. H(X)=0.456 bits??), but the theory leads to results that agree with our intuition (with a `bit' again being something that is either a `0' or a `1'), and to a spectacular theorem

• Example:

alphabet with M=2^n equiprobable symbols :

-> entropy = n bits

i.e. every symbol carries n bits


`Just enough information about entropy’ (V)

• Consider a second random variable Y with sample space (`alphabet') Ω' = {ψ_1, ψ_2, ψ_3, ..., ψ_N}

• Y is viewed as a `channel output', when X is the `channel input'.

• Observing Y tells something about X:

  p_X|Y(ω_K | ψ_J) is the probability for ω_K after observing ψ_J


`Just enough information about entropy’ (VI)

• Example-1:

  X ∈ {00,01,10,11} → [+ noise] → [decision device] → Y ∈ {00,01,10,11}

• Example-2: (infinitely large alphabet size for Y)

  X ∈ {00,01,10,11} → [+ noise] → Y


`Just enough information about entropy’(VII)

• Average information or entropy in X is defined as

  H(X) = -Σ_K p_X(ω_K)·log₂ p_X(ω_K)

• Conditional entropy in X is defined as

  H(X|Y) = -Σ_J p_Y(ψ_J)·Σ_K p_X|Y(ω_K|ψ_J)·log₂ p_X|Y(ω_K|ψ_J)

• Conditional entropy is a measure of the average uncertainty about the channel input X after observing the output Y


`Just enough information about entropy’(VIII)

• Average information or entropy in X: H(X) = -Σ_K p_X(ω_K)·log₂ p_X(ω_K)

• Conditional entropy in X: H(X|Y) = -Σ_J p_Y(ψ_J)·Σ_K p_X|Y(ω_K|ψ_J)·log₂ p_X|Y(ω_K|ψ_J)

• Average mutual information is defined as

  I(X|Y) = H(X) - H(X|Y)

• I(X|Y) is the uncertainty about X that is removed by observing Y
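These definitions can be exercised on a small example (sketch; the binary symmetric channel and the helper names are ours, used only for illustration):

```python
import math

def H(probs):
    """Entropy in bits; zero-probability terms contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_mutual_info(q, eps):
    """I(X|Y) = H(X) - H(X|Y) for a binary symmetric channel with
    input distribution P(X=1) = q and crossover probability eps."""
    joint = {(0, 0): (1 - q) * (1 - eps), (0, 1): (1 - q) * eps,
             (1, 0): q * eps,            (1, 1): q * (1 - eps)}
    p_y = {y: joint[0, y] + joint[1, y] for y in (0, 1)}
    h_x_given_y = sum(p_y[y] * H([joint[x, y] / p_y[y] for x in (0, 1)])
                      for y in (0, 1) if p_y[y] > 0)
    return H([q, 1 - q]) - h_x_given_y

print(bsc_mutual_info(0.5, 0.0))   # -> 1.0: noiseless channel removes all uncertainty
print(bsc_mutual_info(0.5, 0.5))   # -> 0.0: useless channel removes none
```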


Channel Capacity (I)

• Average mutual information is defined by
  - the channel, i.e. the transition probabilities p_Y|X(ψ_J|ω_K)
  - but also by the input probabilities p_X(ω_K)

• Channel capacity (`per symbol' or `per channel use') is defined as the maximum I(X|Y) over all possible choices of p_X(ω_K)

• A remarkably simple result: for a real-valued additive Gaussian noise channel, and an infinitely large alphabet for X (and Y), the channel capacity is

  C = (1/2)·log₂(1 + σ_x²/σ_n²)

  with σ_x², σ_n² the signal (noise) variances
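As a sanity check on this formula (sketch; the helper name is ours):

```python
import math

def capacity_real(snr):
    """Capacity per channel use of the real-valued additive Gaussian
    noise channel: (1/2) * log2(1 + sigma_x^2 / sigma_n^2)."""
    return 0.5 * math.log2(1 + snr)

print(capacity_real(15))   # -> 2.0 bits per channel use (log2(16)/2)
print(capacity_real(0))    # -> 0.0: no signal power, no information
```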


Channel Capacity (II)

• A remarkable theorem (Shannon 1948):

With R channel uses per second, and channel capacity C (bits per channel use), a bit stream with bit-rate C·R (= capacity in bits/sec) can be transmitted with arbitrarily low probability of error

= upper bound for system performance!


Channel Capacity (II)

• For a real-valued additive Gaussian noise channel, and an infinitely large alphabet for X (and Y), the channel capacity is

  C = (1/2)·log₂(1 + σ_x²/σ_n²)

• For a complex-valued additive Gaussian noise channel, and an infinitely large alphabet for X (and Y), the channel capacity is

  C = log₂(1 + σ_x²/σ_n²)


Channel Capacity (III)

Information I(X|Y) conveyed by a real-valued channel with additive white Gaussian noise, for different input alphabets, with all symbols in the alphabet equally likely

(Ungerboeck 1982)

[Figure: I(X|Y) versus SNR = σ_x²/σ_n² for different input alphabets]


Channel Capacity (IV)

Information I(X|Y) conveyed by a complex-valued channel with additive white Gaussian noise, for different input alphabets, with all symbols in the alphabet equally likely (Ungerboeck 1982)


Channel Capacity (V)

This shows that, as long as the alphabet is sufficiently large, there is no significant loss in capacity from choosing a discrete input alphabet, which justifies the use of such alphabets!

The higher the SNR, the larger the alphabet required to approach channel capacity.


Channel Capacity (frequency-flat channels)

• Up till now we considered capacity `per symbol’ or `per channel use’

• A continuous-time channel with bandwidth B (Hz) allows 2B (per second) channel uses (*), i.e. 2B symbols being transmitted per second, hence capacity is

(*) This is Nyquist criterion `upside-down’ (see also Lecture-3)

  C = 2B·(1/2)·log₂(1 + σ_x²/σ_n²) = B·log₂(1 + σ_x²/σ_n²)  bits/second

  with σ_x², σ_n² the received signal (noise) power
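Numerically (sketch; the 3.4 kHz / 30 dB figures are hypothetical illustration values, not from the slides):

```python
import math

def awgn_capacity_bps(bandwidth_hz, snr):
    """C = 2B channel uses/sec x (1/2) log2(1+SNR) = B log2(1+SNR) bits/sec."""
    return bandwidth_hz * math.log2(1 + snr)

snr = 10 ** (30 / 10)                  # 30 dB -> SNR = 1000
print(awgn_capacity_bps(3400, snr))    # roughly 3.39e4 bits/second
```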


Channel Capacity (frequency-flat channels)

• Example: AWGN baseband channel (additive white Gaussian noise channel)

  s(t) → [channel H_0] → (+ n(t)) → r(t) = H_0·s(t) + n(t)

  H(f) = H_0 for -B ≤ f ≤ B

  C = B·log₂(1 + H_0²·σ_x²/σ_n²)  bits/second

  where σ_x² = E_s·2B and σ_n² = N_0·B


Channel Capacity (frequency-flat channels)

• Example: AWGN passband channel

  a passband channel with bandwidth B accommodates a complex baseband signal with bandwidth B/2 (see Lecture-3)

  s(t) → [channel H_0] → (+ n(t)) → r(t) = H_0·s(t) + n(t)

  H(f) = H_0 for f_x ≤ f ≤ f_x + B

  C = 2·(B/2)·log₂(1 + H_0²·σ_x²/σ_n²)  bits/second


Channel Capacity (frequency-selective channels)

• Example: frequency-selective AWGN-channel

  s(t) → [channel H(f)] → (+ n(t)) → R(f) = H(f)·S(f) + N(f)

  received SNR is frequency-dependent!

  [Figure: |H(f)| varying over -B ≤ f ≤ B]


Channel Capacity (frequency-selective channels)

• Divide bandwidth into small bins of width df, such that H(f) is approx. constant over df

• Capacity is

  C = ∫_0^B log₂(1 + |H(f)|²·σ_x²(f)/σ_n²(f)) df  bits/second

• optimal transmit power spectrum?
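The integral can be approximated by summing over the bins of width df (sketch; all bin values are hypothetical):

```python
import math

def selective_capacity(H2, Sx, Sn, df):
    """Approximate C = integral of log2(1 + |H(f)|^2 Sx(f)/Sn(f)) df
    by a sum over bins of width df in which |H(f)|^2 is ~constant."""
    return sum(math.log2(1 + h2 * sx / sn) * df
               for h2, sx, sn in zip(H2, Sx, Sn))

# two bins of width 0.5: gains |H|^2 = 1.0 and 0.25, flat spectra
print(selective_capacity([1.0, 0.25], [3.0, 3.0], [1.0, 1.0], 0.5))
```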


Channel Capacity (frequency-selective channels)

• Maximize

  ∫_0^B log₂(1 + |H(f)|²·σ_x²(f)/σ_n²(f)) df

  subject to

  ∫ σ_x²(f) df = available power

• solution is

  σ_x²(f) = max(0, L - σ_n²(f)/|H(f)|²)

  `Water-pouring spectrum'

  [Figure: the level L is the `water level' poured over the curve σ_n²(f)/|H(f)|² across the band B; the filled area equals the available power σ_x²]
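The water level L can be found by bisection (a minimal sketch of the water-pouring allocation over discrete bins; the bin values are hypothetical):

```python
def water_filling(noise_over_gain, total_power, tol=1e-9):
    """Water-pouring over bins: given g_k = sigma_n^2(f_k)/|H(f_k)|^2,
    find the level L with sum_k max(0, L - g_k) = total_power and
    return sigma_x^2(f_k) = max(0, L - g_k)."""
    lo, hi = min(noise_over_gain), min(noise_over_gain) + total_power
    while hi - lo > tol:
        L = (lo + hi) / 2
        if sum(max(0.0, L - g) for g in noise_over_gain) > total_power:
            hi = L          # too much water: lower the level
        else:
            lo = L
    L = (lo + hi) / 2
    return [max(0.0, L - g) for g in noise_over_gain]

# the worst bin (2.5) lies above the water level and gets no power
alloc = water_filling([0.5, 1.0, 2.5], 3.0)
print([round(p, 6) for p in alloc])   # -> [1.75, 1.25, 0.0]
```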


Channel Capacity (frequency-selective channels)

• Example: multicarrier modulation

  the available bandwidth is split up into different `tones'; every tone carries a QAM-modulated carrier (modulation/demodulation by means of IFFT/FFT).

  In ADSL, e.g., every tone is given (+/-) the same power, such that an upper bound for the capacity is (white noise case)

  C = ∫ log₂(1 + (σ_x²/σ_n²)·|H(f)|²) df  bits/second

  (see Lecture-7/8)


MIMO Channel Capacity (I)

• SISO = `single-input/single-output'
• MIMO = `multiple-inputs/multiple-outputs'
• Question: we usually think of channels with one transmitter and one receiver. Could there be any advantage in using multiple transmitters and/or receivers (e.g. multiple transmit/receive antennas in a wireless setting)???
• Answer: You bet...


MIMO Channel Capacity (II)

• 2-input/2-output example

  [Diagram: X1 and X2 feed a 2x2 channel with gains A, B, C, D; noises N1 and N2 are added to give Y1 and Y2]

  [Y1]   [A B] [X1]   [N1]
  [Y2] = [C D]·[X2] + [N2]


MIMO Channel Capacity (III)

Rules of the game:
• P transmitters means that the same total power is distributed over the available transmitters (no cheating):

  σ_x² = ∫ σ_x1²(f) df + ∫ σ_x2²(f) df + ...

• Q receivers means every receive signal is corrupted by the same amount of noise (no cheating):

  σ_n² = σ_n1² = σ_n2² = ...

• Noises on different receivers are often assumed to be uncorrelated (`spatially white'), for simplicity


MIMO Channel Capacity (IV)

2-in/2-out example, frequency-flat channels: first example/attempt

  [Diagram: X1 and X2 pass through a diagonal channel with gains H_0 and H_0; noises N1 and N2 are added to give Y1 and Y2]

  [Y1]   [H_0   0 ] [X1]   [N1]
  [Y2] = [ 0   H_0]·[X2] + [N2]


MIMO Channel Capacity (V)

2-in/2-out example, frequency-flat channels

  [Y1]   [H_0   0 ] [X1]   [N1]
  [Y2] = [ 0   H_0]·[X2] + [N2]

• corresponds to two separate channels, each with input power σ_x²/2 and additive noise σ_n²

• total capacity is

  C = 2·B·log₂(1 + H_0²·σ_x²/(2·σ_n²))  bits/second

• room for improvement...


MIMO Channel Capacity (VI)

2-in/2-out example, frequency-flat channels: second example/attempt

  [Diagram: X1 and X2 pass through a 2x2 channel with gains H_0, H_0, -H_0, H_0; noises N1 and N2 are added to give Y1 and Y2]

  [Y1]   [ H_0  H_0] [X1]   [N1]
  [Y2] = [-H_0  H_0]·[X2] + [N2]


MIMO Channel Capacity (VII)

A little linear algebra...

  [Y1]   [ H_0  H_0] [X1]   [N1]   [√2·H_0    0   ]    [X1]   [N1]
  [Y2] = [-H_0  H_0]·[X2] + [N2] = [  0    √2·H_0 ]·V'·[X2] + [N2]

  with V' (`Matrix V'') an orthogonal matrix


MIMO Channel Capacity (VIII)

A little linear algebra... (continued)

• Matrix V is `orthogonal' (V'·V = I), which means that it represents a transformation that conserves energy/power

• Use

  [X1]     [X̂1]
  [X2] = V·[X̂2]

  as a transmitter pre-transformation

• then (use V'·V = I):

  [Y1]   [√2·H_0    0   ] [X̂1]   [N1]
  [Y2] = [  0    √2·H_0 ]·[X̂2] + [N2]

  (Dig up your linear algebra course notes...)


MIMO Channel Capacity (IX)

• Then...

  [Diagram: the transmitter applies the pre-transformation V (entries V11, V12, V21, V22) to X̂1, X̂2; the channel (gains A, B, C, D) adds noises N1, N2; the receiver observes Y1, Y2]


MIMO Channel Capacity (X)

  [Y1]   [√2·H_0    0   ] [X̂1]   [N1]
  [Y2] = [  0    √2·H_0 ]·[X̂2] + [N2]

• corresponds to two separate channels, each with input power σ_x²/2, output power (√2·H_0)²·σ_x²/2 = H_0²·σ_x², and additive noise σ_n²

• total capacity is

  C = 2·B·log₂(1 + H_0²·σ_x²/σ_n²)  bits/second

  = 2x SISO-capacity!
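The gain of the second attempt over the first can be verified numerically (sketch; B, H_0 and the powers are hypothetical values, with the per-channel capacity B·log₂(1+SNR) used above):

```python
import math

def cap(B, snr):
    return B * math.log2(1 + snr)      # per-channel capacity, bits/second

B, H0, sx2, sn2 = 1.0, 1.0, 10.0, 1.0  # hypothetical values
siso   = cap(B, H0**2 * sx2 / sn2)
diag   = 2 * cap(B, H0**2 * (sx2 / 2) / sn2)      # first attempt (diagonal channel)
orthog = 2 * cap(B, 2 * H0**2 * (sx2 / 2) / sn2)  # second attempt ((√2·H0)² = 2·H0²)

assert orthog == 2 * siso   # exactly 2x the SISO capacity
assert diag < orthog        # the diagonal channel falls short
```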


MIMO Channel Capacity (XI)

• Conclusion: in general, with P transmitters and P receivers, capacity can be increased by a factor of up to P (!)

• But: you have to be `lucky' with the channel (cf. the two `attempts/examples')

• Example: V-BLAST (Lucent 1998)

  up to 40 bits/sec/Hz in a `rich scattering environment' (reflectors, ...)


MIMO Channel Capacity (XII)

• General I/O-model is:

  [Y_1 ... Y_Q]' = H·[X_1 ... X_P]'  with H a QxP matrix

• every H may be decomposed into

  H = U·S·V'

  with S a diagonal matrix, and U, V orthogonal matrices (U'·U = I, V'·V = I)

• this is called a `singular value decomposition', and works for every matrix (check your MatLab manuals)
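For the `lucky' 2x2 channel above, the decoupling can be checked directly (plain-Python sketch; the H_0 value is hypothetical):

```python
def matmul(A, B):
    """Multiply two small matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

H0 = 0.5                                  # hypothetical flat channel gain
H  = [[H0, H0], [-H0, H0]]                # the second 2x2 example
Ht = [[H[j][i] for j in range(2)] for i in range(2)]   # transpose H'

# H'·H = 2·H0²·I: both singular values equal √2·H0, so a V-type
# pre-transformation yields two decoupled (SISO) channels
print(matmul(Ht, H))   # -> [[0.5, 0.0], [0.0, 0.5]]
```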


MIMO Channel Capacity (XIII)

With H = U·S·V',
• V is used as transmitter pre-transformation (preserves transmit energy), and
• U' is used as receiver transformation (preserves noise energy on every channel)
• S = diagonal matrix, represents the resulting, effectively `decoupled' (SISO) channels
• Overall capacity is the sum of the SISO-capacities
• Power allocation over the SISO-channels (and as a function of frequency): water pouring


MIMO Channel Capacity (XIV)

Reference:

G.G. Raleigh & J.M. Cioffi

`Spatio-temporal coding for wireless communication'

IEEE Trans. on Communications, March 1998


Assignment 1 (I)

• 1. Self-study material

Dig up your favorite (?) signal processing textbook & refresh your knowledge on

-discrete-time & continuous time signals & systems

-signal transforms (s- and z-transforms, Fourier)

-convolution, correlation

-digital filters

...you will need this in the next lectures


Assignment 1 (II)

• 2. Exercise (MIMO channel capacity)

Investigate channel capacity for…

-SIMO-system with 1 transmitter, Q receivers

-MISO-system with P transmitters, 1 receiver

-MIMO-system with P transmitters, Q receivers

P=Q (see Lecture 2)

P>Q

P<Q
