
565

Lecture 25: Thu April 12, 2018

Reminders:

• HW8 posted• Quiz 3: one week from today

Lecture

• Hamming codes with r circles• Hat color problem• Intro to convolutional codes

566

The (32, 6) Reed-Muller Code

[Table: all 64 codewords of the (32, 6) Reed-Muller code, 32 bits each; the unformatted bit dump is omitted here.]
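The code above is the first-order Reed-Muller code RM(1, 5): each of the 2^6 = 64 codewords is the evaluation of an affine Boolean function a·x ⊕ b over the 32 points x ∈ F2^5. A small sketch (this standard construction is my gloss, not spelled out on the slide) that regenerates the codebook and confirms dH,min = 16:

```python
from itertools import product

def rm_1_5_codewords():
    """All 64 codewords of the first-order Reed-Muller code RM(1, 5).

    Message (a1..a5, b) maps to the length-32 evaluation of
    c(x) = a.x XOR b over every x in F_2^5.
    """
    words = []
    for a in product((0, 1), repeat=5):
        for b in (0, 1):
            word = tuple((sum(ai * xi for ai, xi in zip(a, x)) + b) % 2
                         for x in product((0, 1), repeat=5))
            words.append(word)
    return words

words = rm_1_5_codewords()
assert len(set(words)) == 64                  # (32, 6): 2^6 distinct codewords
# Minimum distance of a linear code = minimum nonzero Hamming weight.
d_min = min(sum(w) for w in words if any(w))
print(d_min)  # → 16
```

The nominal coding gain is then R dH,min = (6/32)(16) = 3, the 4.8 dB figure on the next slide.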

567

Power vs. Bandwidth

[Plot: Eb/N0 (dB) versus W/Rb, showing the UNCODED 2-PAM point and the CODED point, with W/Rb = 0.5/R for the coded system.]

g = R dH,min = 3 = 4.8 dB (actual gain is 3.7 dB, because K = 31)

568

Universal Equations for Coded 2-PAM

In terms of code “rate” R = k/n:

M = 2^k, N = n ⇒ Rb/W = log2(M)/(N/2) = 2R.

Eb/N0 = [Q^–1(Pb/K)]^2 / (2g) = 10.53 dB when K = 1 and g = 1

“coding gain”: g = dmin^2/(4Eb) = 4Eg dH,min/(4nEg/k) = R dH,min
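These relations are easy to evaluate numerically. A stdlib sketch (the reference error rate Pb = 10^–6 is my assumption; it is the operating point that reproduces the 10.53 dB uncoded figure):

```python
from math import log10
from statistics import NormalDist

def Q_inv(p):
    """Inverse of the Gaussian tail function Q(z) = P(Z > z)."""
    return -NormalDist().inv_cdf(p)

def ebno_db(Pb, K=1, g=1.0):
    """Required Eb/N0 in dB, from Pb ~= K * Q(sqrt(2 g Eb/N0))."""
    ebno = Q_inv(Pb / K) ** 2 / (2 * g)
    return 10 * log10(ebno)

# Uncoded 2-PAM baseline (K = 1, g = 1) at the assumed Pb = 1e-6
print(round(ebno_db(1e-6), 2))  # → 10.53
```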

569

3 Circles: The (7, 4) Hamming Code

c = [m0 m1 m2 m3 p0 p1 p2]
     (original message | parity bits)

[Venn diagram: three overlapping circles; message bits m0–m3 fill the four overlap regions, parity bits p0–p2 fill the three single-circle regions.]

570

The (7, 4) Hamming Code

0 0 0 0 0 0 0
0 0 0 1 0 1 1
0 0 1 0 1 1 0
0 0 1 1 1 0 1
0 1 0 0 1 1 1
0 1 0 1 1 0 0
0 1 1 0 0 0 1
0 1 1 1 0 1 0
1 0 0 0 1 0 1
1 0 0 1 1 1 0
1 0 1 0 0 1 1
1 0 1 1 0 0 0
1 1 0 0 0 1 0
1 1 0 1 0 0 1
1 1 1 0 1 0 0
1 1 1 1 1 1 1
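Reading the parity columns against the message columns, the table is generated by three even-parity rules: p0 = m0 ⊕ m1 ⊕ m2, p1 = m1 ⊕ m2 ⊕ m3, p2 = m0 ⊕ m1 ⊕ m3 (my reading of the table; the slide does not write the equations out). A sketch that rebuilds the 16 codewords and confirms dH,min = 3:

```python
from itertools import product

def encode74(m0, m1, m2, m3):
    """(7, 4) Hamming encoder; parity rules inferred from the codeword table."""
    p0 = m0 ^ m1 ^ m2
    p1 = m1 ^ m2 ^ m3
    p2 = m0 ^ m1 ^ m3
    return (m0, m1, m2, m3, p0, p1, p2)

codebook = [encode74(*m) for m in product((0, 1), repeat=4)]
assert len(set(codebook)) == 16
# Minimum distance of a linear code = minimum nonzero Hamming weight.
d_min = min(sum(c) for c in codebook if any(c))
print(d_min)  # → 3
```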

571

UBA Analysis of (7, 4) Hamming

dH,min = 3 ⇒ g = R dH,min = (4/7)(3) = 12/7 = 2.34 dB

[Plot: BIT-ERROR PROBABILITY (10^–5 to 1) versus Eb/N0 (dB) (–5 to 10); curves for 2-PAM and the (7, 4) HAMMING code, separated by 1.86 dB (actual) versus the nominal 2.34 dB.]

572

The (15, 11) Binary Hamming Code

Construction using the Venn diagram:

• Draw r = 4 “circles,” partially overlapping, creating 2^r – 1 = 15 regions
• Write one message bit into each of the 2^r – 1 – r = 11 regions involving more than one circle
• Write parity bits into the remaining r = 4 regions, chosen so that parity in each circle is even

[Venn diagram: four circles; the multi-circle regions are labeled A–K, the single-circle regions hold p0–p3.]

c = [m0 m1 m2 m3 … m9 m10 p0 p1 p2 p3]
     (original message | parity bits)
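Equivalently (a standard fact, not stated on the slide), the r = 4 Hamming code is the null space of a parity-check matrix H whose 15 columns are the nonzero 4-bit vectors. Distinct nonzero columns force dH,min ≥ 3, and three columns XOR-ing to zero show dH,min = 3:

```python
r = 4
n = 2**r - 1                      # 15
cols = list(range(1, n + 1))      # columns of H: all nonzero r-bit vectors

# d_min >= 3: no column is zero and no two columns are equal,
# so no weight-1 or weight-2 word can satisfy H c = 0.
assert 0 not in cols and len(set(cols)) == n

# d_min == 3: three columns XOR to zero (1 ^ 2 ^ 3 == 0), so the
# weight-3 word selecting those three positions is a codeword.
assert 1 ^ 2 ^ 3 == 0

print(n, n - r)  # (n, k) = (15, 11)
```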

573

Point in Plane for (15, 11) Hamming

dH,min = 3 for any Hamming code

⇒ coding gain of (15, 11) Hamming code ≈ R dH,min = (11/15)(3) = 11/5 = 3.4 dB

⇒ Eb/N0 ≈ 10.53 dB – 3.4 dB = 7.1 dB

Remark: Because K = (1/5)(35) = 7 is big, the actual coding gain is only 2.8 dB.

Bandwidth expansion (again relative to 2-PAM) is

⇒ W/Rb = 1/(2R) = 15/22 ≈ 0.68

574

Performance of (15, 11) Hamming

[Plot: BIT-ERROR PROBABILITY (10^–8 to 10^–2) versus Eb/N0 (dB) (0 to 12); the UNCODED 2-PAM curve and the (15, 11) Hamming union-bound approximation (UBA), with CODING GAIN ≈ 2.8 dB.]

Pb,UBA ≈ K Q(√(R dH,min 2Eb/N0)); K = (1/5)(35) = 7, R = 11/15, dH,min = 3

Uncoded: Pb = Q(√(2Eb/N0))
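Under the UBA, the “actual” gain can be checked numerically: solve each curve for the Eb/N0 that reaches a target bit-error rate and difference them. A stdlib sketch (the target Pb = 10^–6 is my choice of operating point):

```python
from math import log10
from statistics import NormalDist

def Q_inv(p):
    """Inverse Gaussian tail: Q(z) = p  =>  z = Q_inv(p)."""
    return -NormalDist().inv_cdf(p)

K, R, d = 7, 11 / 15, 3
Pb = 1e-6                         # assumed target bit-error rate

# Coded (UBA): Pb ~= K Q(sqrt(R d 2Eb/N0))  =>  solve for Eb/N0 in dB.
coded_db = 10 * log10(Q_inv(Pb / K) ** 2 / (2 * R * d))
# Uncoded 2-PAM: Pb = Q(sqrt(2 Eb/N0)).
uncoded_db = 10 * log10(Q_inv(Pb) ** 2 / 2)

gain = uncoded_db - coded_db
print(round(gain, 2))             # close to the 2.8 dB read off the plot
```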

575

Power vs Bandwidth

576

The Hat Color Game

Guess your own hat color? Team of n wins a $1B prize when both:

• at least one correct guess,
• no wrong guesses.

577

The Hat Color Game

Guess your own hat color? Team of n wins a $1B prize when both:

• at least one correct guess,
• no wrong guesses.

[Diagram: seven players; six pass, one guesses “RED.”]

578

The Hat Color Game

Guess your own hat color? Team of n wins a $1B prize when both:

• at least one correct guess,
• no wrong guesses.

[Diagram: seven players; six pass, one guesses “RED.”]

Rules

• Assign hat colors according to coin flips
• No communication after hats are assigned
• All players simultaneously either (i) guess or (ii) pass

579

The Hat Color Game

Guess your own hat color? Team of n wins a $1B prize when both:

• at least one correct guess,
• no wrong guesses.

[Diagram: seven players; six pass, one guesses “RED.”]

Rules

• Assign hat colors according to coin flips
• No communication after hats are assigned
• All players simultaneously either (i) guess or (ii) pass

The Question: What strategy maximizes the chances of winning?

580

Special Case: n = 3

581

Special Case: n = 3

Observation:

• 25% = 2/8 of the time the hat colors are identical
• 75% = 6/8 of the time the hat colors are mixed

582

Special Case: n = 3

Observation:

• 25% = 2/8 of the time the hat colors are identical
• 75% = 6/8 of the time the hat colors are mixed

Here’s a strategy that guarantees a WIN whenever the hat colors are mixed:

[Diagram: a mixed assignment ⇒ win]

583

Special Case: n = 3

Observation:

• 25% = 2/8 of the time the hat colors are identical
• 75% = 6/8 of the time the hat colors are mixed

Here’s a strategy that guarantees a WIN whenever the hat colors are mixed:

Each player uses a local rule, asking self:

“Do I see 2 hats of the same color?”

• No ⇒ pass.
• Yes ⇒ guess the opposite.

Wins with probability 75%.

[Examples: RRR ⇒ lose; BBB ⇒ lose; any mixed assignment ⇒ win.]
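The 75% figure can be confirmed by brute force over all 2^3 = 8 assignments (a sketch; the 'R'/'B' encoding is mine):

```python
from itertools import product

def play(hats):
    """One round of the n = 3 strategy: guess the opposite of a seen pair."""
    guesses = []
    for i, own in enumerate(hats):
        seen = [h for j, h in enumerate(hats) if j != i]
        if seen[0] == seen[1]:                        # "I see 2 hats of the same color"
            guesses.append((i, 'R' if seen[0] == 'B' else 'B'))
        # otherwise: pass
    correct = sum(1 for i, g in guesses if g == hats[i])
    wrong = sum(1 for i, g in guesses if g != hats[i])
    return correct >= 1 and wrong == 0                # win condition

wins = sum(play(h) for h in product('RB', repeat=3))
print(wins, "out of 8")  # wins exactly on the 6 mixed assignments
```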

584

Solution to Hat Problem

(When the number of players is n = 2^r – 1.)

Think of the Hamming code based on r circles.

View the hat assignment as a vector x = [x1, …, xn], using Red = 1, Blue = 0.

Because of the “perfect” property of Hamming codes, x must be either

• a valid codeword, with low probability 1/(n+1); or

• one flip away from a valid codeword, with high probability n/(n+1).

585

Winningest Strategy

(Bank on x not being a valid codeword.)

Each player uses a local rule, asking self:

“Can I complete a valid codeword?”

• No ⇒ pass.
• Yes ⇒ guess the opposite.

Wins whenever x is not a codeword:

Prob[Win] = n/(n+1)

[Diagram: the one player who can complete a codeword guesses “RED”; the rest pass.]
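For n = 7, the strategy can be checked exhaustively against the (7, 4) Hamming code over all 2^7 = 128 hat assignments (a sketch; the parity checks are my reading of the earlier codeword table):

```python
from itertools import product

def is_codeword(c):
    """(7, 4) Hamming membership via three even-parity checks."""
    m0, m1, m2, m3, p0, p1, p2 = c
    return (p0 == (m0 ^ m1 ^ m2) and
            p1 == (m1 ^ m2 ^ m3) and
            p2 == (m0 ^ m1 ^ m3))

def play(hats):
    """All 7 players apply the local rule simultaneously."""
    guesses = []
    for i in range(7):
        for b in (0, 1):
            trial = list(hats)
            trial[i] = b
            if is_codeword(trial):          # "I can complete a valid codeword"
                guesses.append((i, b ^ 1))  # guess the opposite color
    # Win: at least one guess, and no guess is wrong.
    return bool(guesses) and all(g == hats[i] for i, g in guesses)

wins_count = sum(play(h) for h in product((0, 1), repeat=7))
print(wins_count, "out of 128")  # wins whenever x is not a codeword: 7/8
```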

586

Local Rule: “Can I complete a valid codeword?”

If x happens to be a valid codeword, everyone will answer “Yes” ... everyone guesses wrong! (Catastrophic.)

If x is not a valid codeword, then it must be one bit flip away from one.

The one person standing in the error position will answer “Yes.”

All of the others will answer “No” ⇒ pass.

587

Example: Put a Point in the Plane for the (23, 12) Binary Golay Code + 2-PAM

Its minimum Hamming distance is dH,min = 7.
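Filling in the exercise with the earlier formulas g = R dH,min and W/Rb = 1/(2R) (a sketch of the arithmetic, not the slide’s worked answer):

```python
from math import log10

n, k, d = 23, 12, 7          # (23, 12) binary Golay code, dH,min = 7
R = k / n
g = R * d                    # nominal coding gain = R * dH,min
g_db = 10 * log10(g)
bw = 1 / (2 * R)             # bandwidth expansion relative to 2-PAM

print(round(g_db, 2), round(bw, 2))  # → 5.63 0.96
```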

588

Convolutional Codes

• Rate-k/n convolutional code: a k-input, n-output filter using mod-2 arithmetic; usually linear and time-invariant

Example: a rate-1/2 convolutional encoder:

[Diagram: shift register mk, mk–1, mk–2; MESSAGE BITS in, CODED BITS ck(1), ck(2) out.]

589

Convolutional Codes

• Rate-k/n convolutional code: a k-input, n-output filter using mod-2 arithmetic; usually linear and time-invariant

Example: a rate-1/2 convolutional encoder:

[Diagram: shift register mk, mk–1, mk–2; MESSAGE BITS in, CODED BITS ck(1), ck(2) out.]

Example: message = [10010] ⇒ codeword = ?

590

Convolutional Codes

• Rate-k/n convolutional code: a k-input, n-output filter using mod-2 arithmetic; usually linear and time-invariant

Example: a rate-1/2 convolutional encoder:

[Diagram: shift register mk, mk–1, mk–2; MESSAGE BITS in, CODED BITS ck(1), ck(2) out.]

Example: message = [10010]
codeword1 = [1011010]
codeword2 = [1111110]
⇒ codeword = [11 01 11 11 01 11 00]
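The two output streams are consistent with taps ck(1) = mk ⊕ mk–2 and ck(2) = mk ⊕ mk–1 ⊕ mk–2, i.e. generators 1 + D^2 and 1 + D + D^2 (inferred from the worked example; the slide gives only the diagram). A sketch reproducing the interleaved codeword:

```python
def encode_rate_half(msg):
    """Rate-1/2 convolutional encoder; taps inferred from the slide's example.

    c1_k = m_k XOR m_{k-2}              (1 + D^2)
    c2_k = m_k XOR m_{k-1} XOR m_{k-2}  (1 + D + D^2)
    Two zero flush bits return the encoder to the all-zero state.
    """
    m = list(msg) + [0, 0]                            # flush the memory
    bit = lambda k: m[k] if k >= 0 else 0
    out = []
    for k in range(len(m)):
        out.append(bit(k) ^ bit(k - 2))               # c(1)
        out.append(bit(k) ^ bit(k - 1) ^ bit(k - 2))  # c(2)
    return out

cw = encode_rate_half([1, 0, 0, 1, 0])
print(cw)  # pairs: 11 01 11 11 01 11 00
```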

591

Point in the Plane?

• Linear? If yes, use the zero message as reference
• What is the response to a message having only a single 1 bit?
• What is dH,min?

[Diagram: the same rate-1/2 encoder: mk, mk–1, mk–2; MESSAGE BITS in, CODED BITS ck(1), ck(2) out.]

592

Good Codes Tabulated

[Table of good convolutional-code generator polynomials (in octal) not captured in the transcript.]

593

A Good Rate-1/3 Encoder

From table:

g0(D) = 13 (octal) = 001 011 = 1 + D + D^3
g1(D) = 15 (octal) = 001 101 = 1 + D^2 + D^3
g2(D) = 17 (octal) = 001 111 = 1 + D + D^2 + D^3

[Diagram: length-4 shift register mk, mk–1, mk–2, mk–3, with output taps ck(1), ck(2), ck(3).]
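A generic encoder driven by octal generators can sanity-check the table entries: the impulse response of each output must equal that generator’s coefficient sequence. (A sketch; the MSB-equals-highest-power reading of the octal digits matches the slide’s expansion of 13 octal as 1 + D + D^3.)

```python
def taps(octal_str, K=4):
    """Generator taps g0..g_{K-1}; MSB of the octal value = highest power of D."""
    bits = bin(int(octal_str, 8))[2:].zfill(K)[-K:]   # '13' -> '1011'
    return [int(b) for b in reversed(bits)]           # delay order g0, g1, ...

def encode(msg, gens_octal, K=4):
    """Rate-1/len(gens_octal) convolutional encoder with K-1 flush zeros."""
    gs = [taps(g, K) for g in gens_octal]
    m = list(msg) + [0] * (K - 1)
    out = []
    for k in range(len(m)):
        for g in gs:
            out.append(sum(g[j] * (m[k - j] if k - j >= 0 else 0)
                           for j in range(K)) % 2)
    return out

# Impulse response: de-interleave and compare with the polynomial coefficients.
resp = encode([1], ['13', '15', '17'])
streams = [resp[i::3] for i in range(3)]
print(streams)  # each stream equals that generator's taps g0..g3
```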

594

Point in Plane?

[Plot: Eb/N0 (dB) versus W/Rb, showing the SHANNON limit and the 2-PAM point.]

595

Point in Plane?

[Plot: Eb/N0 (dB) versus W/Rb, showing the SHANNON limit and the 2-PAM point.]

596

Properties of a general FSM

• finite # of states
• the next state σk+1 depends only on the current state σk and the current input mk

We also have an output associated with each state transition:

ck = f(σk, σk+1)

[Diagram: FSM block with input mk, state σk, output ck.]

597

Finite State Machine

[Diagram: the rate-1/2 encoder redrawn as an FSM; MESSAGE bits mk in, state (mk–1, mk–2), outputs ck(1), ck(2).]

598

State transition diagram

σk = [mk–1, mk–2]

[State transition diagram: four states 00, 01, 10, 11; each state has two outgoing transitions, labeled with the input bit mk and the resulting output pair ck(1)ck(2).]
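The eight transitions can be tabulated from the tap equations ck(1) = mk ⊕ mk–2, ck(2) = mk ⊕ mk–1 ⊕ mk–2 (taps inferred from the earlier worked example):

```python
def step(state, m):
    """One transition of the rate-1/2 encoder FSM; state = (m_{k-1}, m_{k-2})."""
    m1, m2 = state
    c1 = m ^ m2           # ck(1) = mk XOR mk-2
    c2 = m ^ m1 ^ m2      # ck(2) = mk XOR mk-1 XOR mk-2
    return (m, m1), (c1, c2)   # (next state, output pair)

# Print the full state transition table.
for state in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    for m in (0, 1):
        nxt, out = step(state, m)
        print(f"{state} --m={m}/c={out}--> {nxt}")
```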

599

Repeat Example, without Convolution

[Diagram: the same rate-1/2 encoder; MESSAGE BITS in, CODED BITS ck(1), ck(2) out.]

Example: message = [10010]
codeword1 = [1011010]
codeword2 = [1111110]
⇒ codeword = [11 01 11 11 01 11 00]

600

A rate-1/2 convolutional encoder:

[Diagram: shift register mk, mk–1, mk–2, outputs ck(1), ck(2).]

L message bits [m0, m1, ..., mL – 1] ⇒ 2(L + 2) coded bits (including the two flush bits), so the effective rate is R = L/(2(L + 2)), below the nominal 1/2.

[Plot: effective rate R versus L (log scale, L = 1 to 10^3); R rises toward the nominal 0.5.]

601

Two Views of a Convolutional Encoder

• a pair of LTI filters that convolve message bits with impulse responses
• a finite-state machine

