
# 02 Convolutional Codes

Date post: 14-Apr-2018
Author: tin-nguyen

Transcript
• 7/30/2019 02 Convolutional Codes


Principles of Communications

By: Dang Quang Vinh

Faculty of Electronics and Telecommunications

Ho Chi Minh University of Natural Sciences

Convolutional codes

09/2008


Introduction

In block coding, the encoder accepts a k-bit message block and generates an n-bit codeword, on a block-by-block basis.

The encoder must buffer an entire message block before generating the codeword.

When the message bits come in serially rather than in large blocks, using such a buffer is undesirable. Convolutional coding addresses this case.


Definitions

A convolutional encoder is a finite-state machine that consists of an M-stage shift register and n modulo-2 adders.

An L-bit message sequence produces an output sequence of n(L+M) bits.

Code rate:

  r = L / (n(L + M))  bits/symbol

Since L >> M,

  r ≈ 1/n  bits/symbol


Definitions

Constraint length (K): the number of shifts over which a single message bit influences the output.

With an M-stage shift register, a message bit needs M+1 shifts to enter the shift register and come out, so K = M + 1.


Example: Convolutional code (2,1,2)

n=2: 2 modulo-2 adders, i.e. 2 outputs
k=1: 1 input
M=2: 2 stages of shift register (K = M + 1 = 2 + 1 = 3)

[Figure: the (2,1,2) encoder; the input feeds a 2-stage shift register, and two modulo-2 adders form Path 1 and Path 2 of the output]
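As a concrete check of the encoder just described, here is a minimal shift-register sketch in Python (the function name and bit-list interface are illustrative, not from the slides; the tap positions are the (1,1,1) and (1,0,1) connections shown above):

```python
def encode_212(msg):
    """Encode a bit list with the (2,1,2) encoder: path 1 taps (1,1,1),
    path 2 taps (1,0,1). K-1 = 2 flush zeros restore the zero state."""
    s1 = s2 = 0                      # the two shift-register stages
    out = []
    for b in msg + [0, 0]:           # append the flush bits
        out += [b ^ s1 ^ s2,         # path 1: modulo-2 sum of all taps
                b ^ s2]              # path 2: first and last tap only
        s1, s2 = b, s1               # shift the register
    return out

# message (11001) -> output pairs 11 01 01 11 11 10 11
print(encode_212([1, 1, 0, 0, 1]))
```

This reproduces the encoded sequence worked out by hand in the later examples.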


Example: Convolutional code (3,2,1)

n=3: 3 modulo-2 adders, i.e. 3 outputs
k=2: 2 inputs
M=1: 1 stage in each shift register (K = 2 for each input)

[Figure: the (3,2,1) encoder; each of the two inputs feeds a 1-stage shift register, and three modulo-2 adders form the outputs]


Generator polynomials

A convolutional code is a nonsystematic code.

Each path connecting the output to the input can be characterized by an impulse response or a generator polynomial.

  g^(i) = (g_0^(i), g_1^(i), g_2^(i), ..., g_M^(i)) denotes the impulse response of the ith path.

Generator polynomial of the ith path:

  g^(i)(D) = g_0^(i) + g_1^(i) D + g_2^(i) D^2 + ... + g_M^(i) D^M

D denotes the unit-delay variable (different from the X of cyclic codes).

A complete convolutional code is described by a set of polynomials {g^(1)(D), g^(2)(D), ..., g^(n)(D)}.


Example(1/8)

Consider the case of (2,1,2).

Impulse response of path 1 is (1,1,1). The corresponding generator polynomial is

  g^(1)(D) = D^2 + D + 1

Impulse response of path 2 is (1,0,1). The corresponding generator polynomial is

  g^(2)(D) = D^2 + 1

Message sequence (11001); writing the first bit as the highest power of D, the polynomial representation is

  m(D) = D^4 + D^3 + 1


Example(2/8)

Output polynomial of path 1:

  c^(1)(D) = m(D) g^(1)(D)
           = (D^4 + D^3 + 1)(D^2 + D + 1)
           = D^6 + D^5 + D^4 + D^5 + D^4 + D^3 + D^2 + D + 1
           = D^6 + D^3 + D^2 + D + 1

Output sequence of path 1: (1001111)

Output polynomial of path 2:

  c^(2)(D) = m(D) g^(2)(D)
           = (D^4 + D^3 + 1)(D^2 + 1)
           = D^6 + D^5 + D^4 + D^3 + D^2 + 1

Output sequence of path 2: (1111101)
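The polynomial products above are convolutions of the coefficient sequences over GF(2). A small sketch (function name illustrative) reproduces both path outputs; note that the result comes out the same whichever end of a sequence is taken as the highest power of D, since reversing both factors just reverses the product:

```python
def gf2_conv(a, b):
    """Multiply two GF(2) polynomials given as bit lists (convolution mod 2)."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj
    return out

m = [1, 1, 0, 0, 1]                     # message (11001)
print(gf2_conv(m, [1, 1, 1]))           # path 1 output (1001111)
print(gf2_conv(m, [1, 0, 1]))           # path 2 output (1111101)
```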


Example(3/8)

m = (11001)
c^(1) = (1001111)
c^(2) = (1111101)
Encoded sequence c = (11,01,01,11,11,10,11)

Message length L = 5 bits; output length n(L+K-1) = 14 bits.

A terminating sequence of K-1 = 2 zeros is appended to the last input bit so that the shift register is restored to its zero initial state.


Example(4/8)

Another way to calculate the output is by direct convolution: slide the impulse response of path 1, (111), along the message sequence and add the overlapping products modulo 2, one shift per output bit.

[Diagram: the message bits shifted against the taps 1 1 1, producing one output bit per shift]

c^(1) = (1001111)


Example(5/8)

The same sliding computation for path 2, with impulse response (101):

[Diagram: the message bits shifted against the taps 1 0 1, producing one output bit per shift]

c^(2) = (1111101)


Example(6/8)

Consider the case of (3,2,1).

  g_i^(j) = (g_{i,0}^(j), g_{i,1}^(j), ..., g_{i,M}^(j)) denotes the impulse response of the jth path corresponding to the ith input.

[Figure: the (3,2,1) encoder with its two inputs and three output paths]


Example(7/8)

Reading the impulse responses off the encoder diagram:

  g_1^(1) = (11)  →  g_1^(1)(D) = D + 1
  g_2^(1) = (01)  →  g_2^(1)(D) = 1
  g_1^(2) = (01)  →  g_1^(2)(D) = 1
  g_2^(2) = (10)  →  g_2^(2)(D) = D
  g_1^(3) = (11)  →  g_1^(3)(D) = D + 1
  g_2^(3) = (10)  →  g_2^(3)(D) = D

[Figure: the same (3,2,1) encoder with its inputs and output paths labeled]


Example(8/8)

Assume that:

  m^(1) = (101)  →  m^(1)(D) = D^2 + 1
  m^(2) = (011)  →  m^(2)(D) = D + 1

Outputs are:

  c^(1) = m^(1)(D) g_1^(1)(D) + m^(2)(D) g_2^(1)(D)
        = (D^2 + 1)(D + 1) + (D + 1)(1)
        = D^3 + D^2 + D + 1 + D + 1 = D^3 + D^2      →  c^(1) = (1100)

  c^(2) = m^(1)(D) g_1^(2)(D) + m^(2)(D) g_2^(2)(D)
        = (D^2 + 1)(1) + (D + 1)(D)
        = D^2 + 1 + D^2 + D = D + 1                  →  c^(2) = (0011)

  c^(3) = m^(1)(D) g_1^(3)(D) + m^(2)(D) g_2^(3)(D)
        = (D^2 + 1)(D + 1) + (D + 1)(D)
        = D^3 + D^2 + D + 1 + D^2 + D = D^3 + 1      →  c^(3) = (1001)

Multiplexed output: c = (101, 100, 010, 011)
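The same GF(2) convolution, applied per input and summed, reproduces the (3,2,1) outputs above. A sketch assuming the generator sequences from Example(7/8) (helper names are illustrative):

```python
def gf2_conv(a, b):
    """Multiply two GF(2) polynomials given as bit lists."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj
    return out

def gf2_add(a, b):
    """Add two equal-length GF(2) polynomials term by term."""
    return [x ^ y for x, y in zip(a, b)]

m1, m2 = [1, 0, 1], [0, 1, 1]        # m(1) = (101), m(2) = (011)
g = {                                 # (g_1^(j), g_2^(j)) per output path j
    1: ([1, 1], [0, 1]),
    2: ([0, 1], [1, 0]),
    3: ([1, 1], [1, 0]),
}
c = {j: gf2_add(gf2_conv(m1, g1), gf2_conv(m2, g2))
     for j, (g1, g2) in g.items()}
print(c[1], c[2], c[3])               # (1100), (0011), (1001)
# multiplex one bit per path per time step: 101, 100, 010, 011
print([(c[1][t], c[2][t], c[3][t]) for t in range(4)])
```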


State diagram

Consider convolutional code (2,1,2). There are 4 possible states; each node has 2 incoming branches and 2 outgoing branches.

A transition from one state to another on input 0 is represented by a solid line, and on input 1 by a dashed line. The output is labeled on the transition line as input/output.

  State | Binary description
    a   | 00
    b   | 10
    c   | 01
    d   | 11

Transitions (input/output): a→a 0/00, a→b 1/11, b→c 0/10, b→d 1/01, c→a 0/11, c→b 1/00, d→c 0/01, d→d 1/10.


Example

Message 11001. Start at state a and walk through the state diagram in accordance with the message sequence (two flush zeros appended):

  Input:  1    1    0    0    1    0    0
  State:  a→b  b→d  d→c  c→a  a→b  b→c  c→a
  Output: 11   01   01   11   11   10   11
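The walk above can be mechanized with a transition table; a minimal Python sketch (the dictionary layout and function name are illustrative, not from the slides):

```python
# next state and output for the (2,1,2) code, keyed by (state, input bit)
TRANS = {
    ('a', 0): ('a', '00'), ('a', 1): ('b', '11'),
    ('b', 0): ('c', '10'), ('b', 1): ('d', '01'),
    ('c', 0): ('a', '11'), ('c', 1): ('b', '00'),
    ('d', 0): ('c', '01'), ('d', 1): ('d', '10'),
}

def walk(bits, start='a'):
    """Follow the state diagram, collecting the output on each branch."""
    state, outputs = start, []
    for b in bits:
        state, out = TRANS[(state, b)]
        outputs.append(out)
    return outputs, state

outs, final = walk([1, 1, 0, 0, 1, 0, 0])   # message + 2 flush zeros
print(outs, final)   # the flush bits return the encoder to state 'a'
```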


Trellis(1/2)

[Figure: trellis diagram of the (2,1,2) code with states a=00, b=10, c=01, d=11 and levels j = 0, 1, 2, 3, 4, 5, ..., L-1, L, L+1, L+2; each branch is labeled input/output, e.g. 0/00 and 1/10]


Trellis(2/2)

The trellis contains L+K levels, labeled j = 0, 1, ..., L, ..., L+K-1.

The first K-1 levels correspond to the encoder's departure from the initial state a.

The last K-1 levels correspond to the encoder's return to state a.

For every level j in the range K-1 ≤ j ≤ L, all the states are reachable.


Example

Message 11001 (two flush zeros appended):

  Level j: 0  1  2  3  4  5  6  7
  Input:   1  1  0  0  1  0  0
  Output:  11 01 01 11 11 10 11

[Figure: the corresponding path through the trellis, from state a back to state a]


Maximum Likelihood Decoding of Convolutional codes

m denotes a message vector; c denotes the corresponding code vector; r denotes the received vector.

Given r, the decoder is required to make an estimate m̂ of the message vector, or equivalently to produce an estimate ĉ of the code vector. m̂ = m if and only if ĉ = c; otherwise a decoding error occurs.

The decoding rule is said to be optimum when the probability of decoding error is minimized.

The maximum likelihood decoder or decision rule is described as follows: choose the estimate ĉ for which the log-likelihood function log p(r|c) is maximum.


Maximum Likelihood Decoding of Convolutional codes

Binary symmetric channel: both c and r are binary sequences of length N, and r differs from c in d positions (d is the Hamming distance between r and c).

  p(r|c) = Π_{i=1}^{N} p(r_i|c_i)

  log p(r|c) = Σ_{i=1}^{N} log p(r_i|c_i)

with

  p(r_i|c_i) = p        if r_i ≠ c_i
  p(r_i|c_i) = 1 - p    if r_i = c_i

Hence

  log p(r|c) = d log p + (N - d) log(1 - p)
             = d log( p / (1 - p) ) + N log(1 - p)
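A quick numeric check of the last line: for p < 1/2 the term log(p/(1-p)) is negative, so log p(r|c) strictly decreases as d grows. A sketch with illustrative values p = 0.1 and N = 14 (the function name is not from the slides):

```python
from math import log

def loglik(d, N, p):
    """log p(r|c) on a BSC: d disagreements out of N bits."""
    return d * log(p) + (N - d) * log(1 - p)

p, N = 0.1, 14
for d in range(4):
    print(d, round(loglik(d, N, p), 3))
# each extra disagreement adds log(p) - log(1-p) < 0, so
# maximizing log p(r|c) is the same as minimizing the Hamming distance d
```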


Maximum Likelihood Decoding of Convolutional codes

The decoding rule is restated as follows: choose the estimate ĉ that minimizes the Hamming distance between the received vector r and the transmitted code vector c.

The received vector r is compared with each possible code vector c, and the one closest to r is chosen as the correct transmitted code vector.


The Viterbi algorithm

Choose a path in the trellis whose coded sequence differs from the received sequence in the fewest number of positions.


The Viterbi algorithm

The algorithm operates by computing a metric for every possible path in the trellis. The metric is the Hamming distance between the coded sequence represented by that path and the received sequence.

For each node, two paths enter the node; the path with the lower metric survives and the other is discarded.

The computation is repeated for every level j in the range K-1 ≤ j ≤ L. The number of survivors at each level is 2^(K-1) = 4.
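The procedure can be sketched for the (2,1,2) code as follows (a minimal textbook-style implementation, not an optimized decoder; the names are illustrative, and the traceback assumes the encoder was flushed back to state a):

```python
# Viterbi decoder for the (2,1,2) code; branch metric = Hamming distance.
TRANS = {  # (state, input bit) -> (next state, output pair)
    ('a', 0): ('a', (0, 0)), ('a', 1): ('b', (1, 1)),
    ('b', 0): ('c', (1, 0)), ('b', 1): ('d', (0, 1)),
    ('c', 0): ('a', (1, 1)), ('c', 1): ('b', (0, 0)),
    ('d', 0): ('c', (0, 1)), ('d', 1): ('d', (1, 0)),
}

def viterbi(received):
    """received: list of 2-bit tuples. Returns the ML input bit sequence."""
    metric = {'a': 0}                    # start in the all-zero state
    paths = {'a': []}
    for r in received:
        new_metric, new_paths = {}, {}
        for (state, bit), (nxt, out) in TRANS.items():
            if state not in metric:
                continue
            m = metric[state] + (out[0] ^ r[0]) + (out[1] ^ r[1])
            if nxt not in new_metric or m < new_metric[nxt]:
                new_metric[nxt] = m      # keep only the survivor per node
                new_paths[nxt] = paths[state] + [bit]
        metric, paths = new_metric, new_paths
    return paths['a']                    # a flushed encoder ends in state 'a'

r = [(1,1), (0,0), (0,1), (1,1), (1,0), (1,0), (1,1)]
print(viterbi(r))   # message 11001 followed by the two flush zeros
```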


The Viterbi algorithm

c = (11,01,01,11,11,10,11), r = (11,00,01,11,10,10,11)

[Figure: trellis with the accumulated survivor metric at each node and level; the winning path corresponds to the transmitted sequence]

  Received r:     11 00 01 11 10 10 11
  Decoded output: 1  1  0  0  1  0  0

The two transmission errors in r (in the second and fifth pairs) are corrected; the decoded sequence is the message 11001 followed by the two flush zeros.


Free distance of a conv. code

Performance of a convolutional code depends on the decoding algorithm and the distance properties of the code.

The free distance, denoted by d_free, is a measure of the code's ability to combat channel noise. Free distance: the minimum Hamming distance between any two codewords in the code. The code can correct t errors if d_free > 2t.

Since a convolutional code doesn't use blocks, processing instead a continuous bitstream, the value of t applies to a quantity of errors located relatively near to each other.


Free distance of a conv. code

A convolutional code is linear, so the free distance can also be defined as the minimum Hamming weight over all nonzero codewords:

  d_free = min { w(X) : X ≠ 000... }

d_free can be calculated by a generating function, which can be viewed as the transfer function of the encoder: just as the encoder relates input and output by convolution, the generating function relates the initial and final states by multiplication. It yields both the free distance and the decoding error probability.
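Before turning to the generating function, d_free for the (2,1,2) code can also be checked by brute force using this linearity: enumerate short terminated messages and take the minimum nonzero codeword weight. A sketch with an illustrative message length of 8 (the encoder function matches the taps used throughout these slides):

```python
from itertools import product

def encode_212(msg):
    """(2,1,2) encoder, taps (1,1,1) and (1,0,1), with K-1 = 2 flush zeros."""
    s1 = s2 = 0
    out = []
    for b in list(msg) + [0, 0]:
        out += [b ^ s1 ^ s2, b ^ s2]
        s1, s2 = b, s1
    return out

# Hamming weight of every nonzero terminated codeword of message length 8
weights = [sum(encode_212(m))
           for m in product([0, 1], repeat=8) if any(m)]
print(min(weights))   # 5, agreeing with d_free = 5 derived below
```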


Free distance of a conv. code

Modify the state diagram: split state a into an input node a_0 and an output node a_1 (the self-loop at a, which contributes nothing to the distance properties, is dropped), and label each branch with powers of D and L.

[Signal-flow graph: a_0 → b → {c, d} → a_1, with branch gains D^2 L (a_0→b), D (b→c), DL (b→d), DL (d→d), D (d→c), L (c→b), D^2 (c→a_1)]

Exponent of D: Hamming weight of the encoder output on that branch.
Exponent of L: number of nonzero message bits on that branch.


Free distance of a conv. code

State equations (a_0, b, c, d, a_1 are the node signals of the graph):

  b = D^2 L a_0 + L c
  c = D b + D d
  d = DL b + DL d
  a_1 = D^2 c

Solving the equation set for a_1/a_0 gives the generating function:

  T(D, L) = a_1/a_0 = D^5 L / (1 - 2DL)
          = D^5 L Σ_{i≥0} (2DL)^i
          = D^5 L + 2 D^6 L^2 + 4 D^7 L^3 + ... + 2^(d-5) D^d L^(d-4) + ...


Free distance of a conv. code

T(D, L) represents all possible transmitted sequences that terminate with a c → a_1 transition.

For any d ≥ 5, there are 2^(d-5) paths with weight w(X) = d that terminate with a c → a_1 transition; those paths are generated by messages containing d - 4 nonzero bits.

The free distance is the smallest value of w(X), so d_free = 5.


Systematic conv. code

The message elements appear explicitly in the output sequence, together with the redundant elements.

[Figure: systematic (2,1,2) encoder; Path 1 passes the input straight through, Path 2 forms the parity bits]


Systematic conv. code

Impulse response of path 1 is (1,0,0). The corresponding generator polynomial is

  g^(1)(D) = D^2

Impulse response of path 2 is (1,0,1). The corresponding generator polynomial is

  g^(2)(D) = D^2 + 1

Message sequence: (11001)


Systematic conv. code

Output sequence of path 1: (1100100)
Output sequence of path 2: (1111101)

m = (11001)
c^(1) = (1100100)
c^(2) = (1111101)
Encoded sequence c = (11,11,01,01,11,00,01)
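A sketch of the systematic encoder above (the function name is illustrative; path 1 passes the message bit straight through, path 2 applies the taps (1,0,1)):

```python
def encode_systematic_212(msg):
    """Systematic (2,1,2) encoder: path 1 is the message bit itself,
    path 2 is the parity bit from taps (1,0,1)."""
    s1 = s2 = 0
    out = []
    for b in msg + [0, 0]:          # two flush zeros
        out += [b, b ^ s2]          # message bit, then parity bit
        s1, s2 = b, s1              # shift the register
    return out

# message (11001) -> pairs 11 11 01 01 11 00 01, matching the slide
print(encode_systematic_212([1, 1, 0, 0, 1]))
```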


Systematic conv. code

Another example of a systematic convolutional code:

[Figure: a second systematic encoder, with Path 1 carrying the message bits and Path 2 the parity bits]


Systematic vs nonsystematic

The generating-function analysis assumes that T(D,L) is convergent. When T(D,L) is nonconvergent, a finite number of transmission errors can cause an infinite number of decoding errors; such a code is called a catastrophic code.

A systematic convolutional code cannot be catastrophic. But, for the same constraint length, the free distance of a systematic code is smaller than that of a nonsystematic code (see Table 10.8).


Systematic vs nonsystematic

Maximum free distance with systematic and nonsystematic convolutional codes of rate 1/2:

  K | Systematic | Nonsystematic
  2 |     3      |       3
  3 |     4      |       5
  4 |     4      |       6
  5 |     5      |       7
  6 |     6      |       8
  7 |     6      |      10
  8 |     7      |      10
