
Convolutional Codes

R-J Chen

p2. OUTLINE
[1] Shift registers and polynomials
[2] Encoding convolutional codes
[3] Decoding convolutional codes
[4] Truncated Viterbi decoding

p3. Convolutional Code

[1] Shift registers and polynomials

XOR (addition mod 2): 0+0=0, 0+1=1, 1+0=1, 1+1=0.

Register and clock: a register Xi holds a value Xi ∈ {0,1}; at each clock tick (…, t-1, t, t+1, …) the contents shift one position.

[Figure: three registers X0, X1, X2 feeding an XOR gate, Output = X0+X1+X2; as a bit stream shifts through the registers, the output at each tick is the mod-2 sum of the three register contents.]

p4. Shift register (SR)

Figure 8.1: A shift register with four registers X0 X1 X2 X3, a clocked input on the left, and output = X0+X1+X3.

Example 8.1.1:
At time t-1 the contents are X0 X1 X2 X3 = 0 1 1 0, so output = 0+1+0 = 1.
At time t the contents are X0 X1 X2 X3 = 1 1 0 1, so output = 1+1+1 = 1.

p5. Example 8.1.2:

Input: 1010000, initial value: 0000, output = X0+X1+X3.

time  input  X0 X1 X2 X3  output
 -1          0  0  0  0
  0    1     1  0  0  0    1  (= 1+0+0)
  1    0     0  1  0  0    1
  2    1     1  0  1  0    1
  3    0     0  1  0  1    0
  4    0     0  0  1  0    0
  5    0     0  0  0  1    1
  6    0     0  0  0  0    0

Output sequence: 1110010.
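The register trace above can be reproduced in a few lines. A minimal sketch (the function name `shift_register` and the list-of-bits interface are mine, not from the text):

```python
# Clock a bit stream through an s-stage shift register and XOR the tapped
# cells; taps (0, 1, 3) model output = X0 + X1 + X3 from Example 8.1.2.
def shift_register(bits, taps, stages=4):
    regs = [0] * stages
    out = []
    for b in bits:
        regs = [b] + regs[:-1]                      # shift the new bit into X0
        out.append(sum(regs[i] for i in taps) % 2)  # mod-2 sum of the taps
    return out

# Input 1010000 with taps {X0, X1, X3} reproduces the output 1110010.
print(shift_register([1, 0, 1, 0, 0, 0, 0], taps=(0, 1, 3)))
```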

p6. s-stage shift register

An s-stage shift register is a shift register with s registers. Xi(t) is the value of the contents of register Xi at time t; ct is the output at time t:

ct = g0X0(t) + g1X1(t) + …… + g_{s-1}X_{s-1}(t),  gi ∈ {0,1}.

[Figure: input feeds X0, X1, …, X_{s-1}; register Xi is tapped with coefficient gi, and the taps are summed mod 2 to give ct.]

Example: 3-stage shift register at time t with g0=1, g1=0, g2=1 and X0(t)=1, X1(t)=0, X2(t)=0:
ct = 1·1 + 0·0 + 1·0 = 1.

p7. Generator of shift register

g(x): generator of an s-stage shift register,
g(x) = g0 + g1x + … + g_{s-1}x^{s-1},
where the gi ∈ {0,1} are the coefficients of the shift register. Register Xi corresponds to the polynomial term x^i.

Example: 4-stage shift register with g0=1, g1=1, g2=0, g3=1:
g = g0g1…g_{s-1} = 1101
g(x) = 1 + x + 0·x² + x³ = 1 + x + x³.

p8. Example:

Generator: g = 1101, g(x) = 1 + x + x³.
Input: a = 10000, a(x) = 1.
Output: c = 11010, c1(x) = a(x)·g(x) = 1·g(x) = 1 + x + x³.

time  input  X0 X1 X2 X3  output
 -1          0  0  0  0
  0    1     1  0  0  0    1
  1    0     0  1  0  0    1
  2    0     0  0  1  0    0
  3    0     0  0  0  1    1
  4    0     0  0  0  0    0

p9. Example:

Generator: g = 1101, g(x) = 1 + x + x³.
Input: a = 01000, a(x) = x.
Output: c = 01101, c2(x) = a(x)·g(x) = x·g(x) = x + x² + x⁴.

time  input  X0 X1 X2 X3  output
 -1          0  0  0  0
  0    0     0  0  0  0    0
  1    1     1  0  0  0    1
  2    0     0  1  0  0    1
  3    0     0  0  1  0    0
  4    0     0  0  0  1    1

p10. Example:

Generator: g = 1101, g(x) = 1 + x + x³.
Input: a = 11000, a(x) = 1 + x.
Output: c = 10111,
c(x) = c1(x) + c2(x) = 1·g(x) + x·g(x) = (1+x)·g(x) = a(x)·g(x) = 1 + x² + x³ + x⁴.

By linearity the two impulse responses add bitwise:

time     0 1 2 3 4
a=10000: 1 1 0 1 0
a=01000: 0 1 1 0 1
sum:     1 0 1 1 1

For instance at t=1 the register holds X0 X1 X2 X3 = 0 0 1 1, wait — directly: with contents 1 1 0 0 the output is 1+1+0 = 0, matching the sum 1+1 = 0.

p11. Theorem 8.1.8

Let a(x) be the input sequence of an s-stage shift register, c(x) its output sequence, and g(x) its generator,
g(x) = g0 + g1x + … + g_{s-1}x^{s-1},  gi ∈ {0,1}.
Then c(x) = a(x)g(x).

Example 8.1.3:
Input sequence: 1010000, so a(x) = 1 + x².
Generator: g(x) = 1 + x + x³ (g = 1101).
c(x) = a(x)g(x) = (1+x²)(1+x+x³) = 1 + x + x² + x⁵,
so the output sequence is 1110010.
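Theorem 8.1.8 can be spot-checked by multiplying the polynomials over GF(2). A minimal sketch (coefficient lists, lowest degree first; the helper name `gf2_mul` is mine):

```python
# Product of two GF(2) polynomials given as coefficient lists.
def gf2_mul(a, g):
    c = [0] * (len(a) + len(g) - 1)
    for i, ai in enumerate(a):
        for j, gj in enumerate(g):
            c[i + j] ^= ai & gj          # coefficient arithmetic mod 2
    return c

a = [1, 0, 1]         # a(x) = 1 + x^2  (input 1010000)
g = [1, 1, 0, 1]      # g(x) = 1 + x + x^3
print(gf2_mul(a, g))  # [1, 1, 1, 0, 0, 1] = 1 + x + x^2 + x^5 -> output 1110010
```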

p12. Example 8.1.4:

Input sequence: a0, a1, a2, a3, 0, 0, 0. Generator: 1 + x + x³, output = X0+X1+X3.

time  input  X0 X1 X2 X3   output
 -1          0  0  0  0
  0    a0    a0 0  0  0    a0
  1    a1    a1 a0 0  0    a1+a0
  2    a2    a2 a1 a0 0    a2+a1
  3    a3    a3 a2 a1 a0   a3+a2+a0
  4    0     0  a3 a2 a1   a3+a1
  5    0     0  0  a3 a2   a2
  6    0     0  0  0  a3   a3

Output sequence:
c(x) = a(x)g(x) = (a0 + a1x + a2x² + a3x³)(1 + x + x³)
= a0 + (a1+a0)x + (a2+a1)x² + (a3+a2+a0)x³ + (a3+a1)x⁴ + a2x⁵ + a3x⁶.

p13. s-stage feedback shift register (FSR)

Xi(t): the value of the contents of register Xi at time t; ct: the output at time t. The output is fed back through the taps: ct = X_{s-1}(t), and at the next tick
X0 ← input + g0·ct,  Xi ← X_{i-1} + gi·ct.

[Figure: registers X0, X1, …, X_{s-1}; the output ct is fed back with coefficients g0, g1, …, g_{s-1} into the adders in front of each register; the output tap itself has coefficient 1.]

Example: g0=1, g1=1, g2=0, input 0. At time t the contents are X0 X1 X2 = 0 1 1, so ct = X2 = 1; at time t+1 the contents are 0+1, 0+1, 1 = 1 1 1.

p14. Example:

Input: 11000, initial value: 000, g0=1, g1=1, g2=0 (the feedback ct is added into X0 and X1).

time  input  X0   X1   X2   output ct
 -1          0    0    0
  0    1     1    0    0    0
  1    1     1    1    0    0
  2    0     0    1    1    0
  3    0     0+1  0+1  1    1
  4    0     0+1  1+1  1    1

Output: 00011.

p15. Generator of feedback shift register

g(x): generator of an s-stage FSR,
g(x) = g0 + g1x + … + g_{s-1}x^{s-1} + x^s,
where the gi ∈ {0,1} are the coefficients of the shift register.

Example: 3-stage feedback shift register with g0=1, g1=1, g2=0:
g = g0g1…g_{s-1}1 = 1101
g(x) = 1 + x + x³.

p16. Polynomial division

An s-stage FSR with generator g(x) = g0 + g1x + … + g_{s-1}x^{s-1} + x^s divides by g(x).

Input: a = a0a1a2…an; output: c = c0c1c2…cn. At time t (0 ≤ t ≤ n-1) let
at(x) = a0x^t + a1x^{t-1} + … + a_{t-1}x + at,
ct(x) = c0x^t + c1x^{t-1} + … + c_{t-1}x + ct,
rt(x) = X0(t) + X1(t)x + … + X_{s-1}(t)x^{s-1}.

Then ct(x) = at(x) div g(x) (the quotient) and rt(x) = at(x) mod g(x) (the remainder), i.e.
at(x) = ct(x)g(x) + rt(x).

p17. Example: 3-stage feedback shift register, g(x) = 1 + x + x³

Input: a = 100000.

time  input  X0   X1   X2   output ct
 -1          0    0    0
  0    1     1    0    0    0
  1    0     0    1    0    0
  2    0     0    0    1    0
  3    0     0+1  0+1  0    1
  4    0     0    1    1    0
  5    0     0+1  0+1  1    1

Output: 000101. Since at(x) = x·a_{t-1}(x) = x·c_{t-1}(x)g(x) + x·r_{t-1}(x) for t ≥ 1:

a0(x) = 1:   c0(x) = 0,      r0(x) = 1
a1(x) = x:   c1(x) = 0,      r1(x) = x
a2(x) = x²:  c2(x) = 0,      r2(x) = x²
a3(x) = x³:  c3(x) = 1,      r3(x) = 1+x
a4(x) = x⁴:  c4(x) = x,      r4(x) = x+x²
a5(x) = x⁵:  c5(x) = x²+1,   r5(x) = 1+x+x²
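The divider of p17 can be simulated directly. A sketch assuming the update rule stated on p13 (ct = X_{s-1}, then X0 ← input + g0·ct, Xi ← X_{i-1} + gi·ct; the function name `fsr_divide` is mine):

```python
# s-stage feedback shift register as a polynomial divider: the output stream
# is the quotient, the final register contents are the remainder.
def fsr_divide(bits, g):
    """g = [g0, ..., g_{s-1}]; the x^s coefficient of g(x) is implicit."""
    s = len(g)
    regs = [0] * s
    out = []
    for b in bits:
        c = regs[-1]                      # c_t is the old content of X_{s-1}
        new = [b ^ (g[0] & c)]            # X0 gets input + g0*c_t
        for i in range(1, s):
            new.append(regs[i - 1] ^ (g[i] & c))
        regs = new
        out.append(c)
    return out, regs                      # (quotient stream, remainder)

# g(x) = 1 + x + x^3, input a = 100000: quotient 000101, remainder 1 + x + x^2.
print(fsr_divide([1, 0, 0, 0, 0, 0], g=[1, 1, 0]))
```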

p18. Example 8.1.9: (n=5, k=2, d=3) cyclic linear code

Generator polynomial: g(x) = 1 + x + x³, implemented as a 3-stage feedback shift register.
rt(x) = at(x) mod g(x) for 0 ≤ t ≤ 4 (n-1 = 4).
Input: 10000 (length 5), so at(x) = x^t and rt(x) = x^t mod g(x):

r0(x) = 1        → 1 0 0
r1(x) = x        → 0 1 0
r2(x) = x²       → 0 0 1
r3(x) = 1 + x    → 1 1 0
r4(x) = x + x²   → 0 1 1

Parity check matrix H: the rows are the coefficient vectors of r0(x), …, r4(x):

    | 1 0 0 |
    | 0 1 0 |
H = | 0 0 1 |
    | 1 1 0 |
    | 0 1 1 |

p19. Example 8.1.10: g(x) = 1 + x + x³

Input: a = 10110, so a(x) = a0x⁴ + a1x³ + a2x² + a3x + a4 = x + x² + x⁴.

time  input  X0   X1   X2   output ct
 -1          0    0    0
  0    1     1    0    0    0
  1    0     0    1    0    0
  2    1     1    0    1    0
  3    1     1+1  1+1  0    1
  4    0     0    0    0    0

Output: 00010, so c(x) = x; remainder r = 000, r(x) = 0. Indeed a(x) = x + x² + x⁴ = x·g(x).

p20. [2] Encoding convolutional codes

(n,k,m) convolutional code ((n,k,m)CV):
m: each shift register has m+1 stages,
n: the number of (m+1)-stage shift registers ((m+1)-SRi),
k: shift k bits into each (m+1)-stage shift register per tick.

Generators: gi(x) = gi,0 + gi,1x + … + gi,mx^m ∈ K[x], 1 ≤ i ≤ n.
Input of (m+1)-SRi: m(x) = m0 + m1x + … ∈ K[x].
Output of (m+1)-SRi: ci(x) = m(x)gi(x).

(n,k,m)CV = {c(x) = (c1(x), c2(x), …, cn(x))}.

p21. [Figure: the input feeds registers X0, X1, …, Xm; tap set gi(x) = gi,0 + gi,1x + … + gi,mx^m of register (m+1)-SRi produces the stream ci = ci,0 ci,1 ci,2 ci,3 …, for i = 1, …, n.]

(n,k,m)CV = {c(x) = (c1(x), c2(x), …, cn(x))}.

p22. Example: (n=2, k=1, m=3) CV

g1(x) = 1 + x + x³, g2(x) = 1 + x² + x³. The message m(x) ∈ K[x] feeds both registers, producing c1(x) = m(x)g1(x) and c2(x) = m(x)g2(x).

(2,1,3) CV = {c(x) = (m(x)·(1+x+x³), m(x)·(1+x²+x³)) | m(x) ∈ K[x]}.

p23. Example 8.2.1: (2,1,3) CV with g1(x) = 1+x+x³, g2(x) = 1+x²+x³,
(2,1,3)CV = {c(x) = (m(x)·g1(x), m(x)·g2(x)) | m(x) ∈ K[x]}.

(a) The message m(x) = 1+x² is encoded to
c(x) = ((1+x²)g1(x), (1+x²)g2(x)) = (1+x+x²+x⁵, 1+x³+x⁴+x⁵),
i.e. (11100100…, 10011100…).

(b) The message m(x) = 1+x+x²+x³+… = Σ_{i≥0} x^i is encoded to
c(x) = ((Σ_{i≥0} x^i)(1+x+x³), (Σ_{i≥0} x^i)(1+x²+x³)) = (1+x³+x⁴+x⁵+…, 1+x+x³+x⁴+x⁵+…),
i.e. (100111…, 110111…).
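Example 8.2.1(a) can be checked by multiplying the message by each generator over GF(2). A minimal sketch (coefficient lists; the names `gf2_mul` and `encode_cv` are mine):

```python
# Encoder for a (2,1,m) CV: each output stream is m(x) times one generator.
def gf2_mul(a, g):
    c = [0] * (len(a) + len(g) - 1)
    for i, ai in enumerate(a):
        for j, gj in enumerate(g):
            c[i + j] ^= ai & gj
    return c

def encode_cv(m, gens):
    """m: message bits (low degree first); gens: generator coefficient lists."""
    return [gf2_mul(m, g) for g in gens]

g1 = [1, 1, 0, 1]     # g1(x) = 1 + x + x^3
g2 = [1, 0, 1, 1]     # g2(x) = 1 + x^2 + x^3
c1, c2 = encode_cv([1, 0, 1], [g1, g2])   # m(x) = 1 + x^2
print(c1, c2)         # 1 + x + x^2 + x^5 and 1 + x^3 + x^4 + x^5
```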

p24. Convolutional codes are linear codes

Take two codewords c(x), c'(x) of an (n,k,m) convolutional code (n,k,m)CV:

c(x) + c'(x) = (c1(x), c2(x), …, cn(x)) + (c1'(x), c2'(x), …, cn'(x))
= (m(x)g1(x), …, m(x)gn(x)) + (m'(x)g1(x), …, m'(x)gn(x))
= ((m(x)+m'(x))g1(x), …, (m(x)+m'(x))gn(x))
= (m''(x)g1(x), …, m''(x)gn(x))
= (c1''(x), c2''(x), …, cn''(x))
= c''(x) ∈ (n,k,m)CV.

p25. Interleaved form of an (n,k,m)CV

(n,k,m)CV = {c(x) = (c1(x), c2(x), …, cn(x))},
ci(x) = ci,0 + ci,1x + ci,2x² + ci,3x³ + …, ci,j ∈ K, 1 ≤ i ≤ n, 0 ≤ j.

Interleaved form:
c = c1,0 c2,0 … cn,0 , c1,1 c2,1 … cn,1 , c1,2 c2,2 … cn,2 , ……
c(x) = c1(x^n) + c2(x^n)·x + c3(x^n)·x² + … + cn(x^n)·x^{n-1}
(stream ci occupies the polynomial terms x^{i-1}, x^{n+i-1}, x^{2n+i-1}, …).

p26. Example 8.2.5: (2,1,3) CV of Example 8.2.1

The message m(x) = 1+x² is encoded to
c(x) = (c1(x), c2(x)) = ((1+x²)g1(x), (1+x²)g2(x)) = (1+x+x²+x⁵, 1+x³+x⁴+x⁵),
i.e. (11100100…, 10011100…).

The interleaved representations of c and c(x) are
c = 11 10 10 01 01 11 …
c(x) = c1(x²) + c2(x²)·x = 1+x²+x⁴+x¹⁰ + (1+x⁶+x⁸+x¹⁰)x
= 1+x+x²+x⁴+x⁷+x⁹+x¹⁰+x¹¹.
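The interleaving of Example 8.2.5 is just a column-wise read of the n streams. A sketch (the name `interleave` is mine; streams are padded with zeros to equal length):

```python
# Interleave n output streams: emit c_{1,j} c_{2,j} ... c_{n,j} for each j.
def interleave(streams):
    n = max(len(s) for s in streams)
    padded = [s + [0] * (n - len(s)) for s in streams]
    return [bit for col in zip(*padded) for bit in col]

c1 = [1, 1, 1, 0, 0, 1]   # 1 + x + x^2 + x^5
c2 = [1, 0, 0, 1, 1, 1]   # 1 + x^3 + x^4 + x^5
print(interleave([c1, c2]))   # 11 10 10 01 01 11
```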

p27. The rate of an (n,k,m) convolutional code

The rate of an (n,k,m)CV is defined to be k/n.

Convolutional code with k>1 — Example 8.2.7: (3,2,3) CV with
g1(x) = 1+x³, g2(x) = 1+x+x³, g3(x) = x+x²+x³.
Two message bits are shifted in per tick. Input: m = 10 01 01 11 00 00 ….

time  input  X0 X1 X2 X3  c1 c2 c3
 -1          0  0  0  0
  0    10    0  1  0  0   0  1  1
  1    01    1  0  0  1   0  0  1
  2    01    1  0  1  0   1  1  1
  3    11    1  1  1  0   1  0  0
  4    00    0  0  1  1   1  1  0
  5    00    0  0  0  0   0  0  0

p28. Property

Example: (1,2,3) CV with g(x) = 1+x+x³. Two message bits are shifted in per tick, so the single output stream is no longer simply a(x)g(x). For input a0a1a2a3:

time  X0  X1  X2  X3   output
 0    a1  a0  0   0    a1+a0
 1    a3  a2  a1  a0   a3+a2+a0
 2    0   0   a3  a2   a2

c(x) = (a0+a1) + (a0+a2+a3)x + a2x²,
which differs from the product (a0+a1x+a2x²+a3x³)(1+x+x³).

p29. The same (1,2,3) CV with g(x) = 1+x+x³ can be split into two shift registers, one per input phase: the odd-indexed bits a1, a3 (registers X0, X2) pass through g1(x) = 1, the even-indexed bits a0, a2 (registers X1, X3) pass through g2(x) = 1+x, and the two outputs are added:

c1(x) = (a1 + a3x)·1
c2(x) = (a0 + a2x)·(1+x) = a0 + (a2+a0)x + a2x²
c(x) = c1(x) + c2(x) = (a1+a0) + (a3+a2+a0)x + a2x².

p30. Generator matrix of a (2,1,m)CV

Generator matrix: G. Input sequence: m = m0m1m2…; output sequence (interleaved form): c = m·G = c1,0 c2,0 c1,1 c2,1 …, where
c1 = c1,0 c1,1 c1,2 c1,3 …,  g1(x) = g1,0 + g1,1x + … + g1,mx^m,
c2 = c2,0 c2,1 c2,2 c2,3 …,  g2(x) = g2,0 + g2,1x + … + g2,mx^m.

(2,1,m)CV = {c | c = m·G}.

p31. (1) Input: m = m0; output: c = c1,0 c2,0 with
c1,0 = m0·g1,0,  c2,0 = m0·g2,0,
so c = [c1,0 c2,0] = [m0·g1,0  m0·g2,0] = m0·[g1,0 g2,0].

p32. (2) Input: m = m0m1; output: c = c1,0 c2,0 c1,1 c2,1 with
c1,0 = m0·g1,0,  c2,0 = m0·g2,0,
c1,1 = m0·g1,1 + m1·g1,0,  c2,1 = m0·g2,1 + m1·g2,0,
so

c = [c1,0 c2,0 c1,1 c2,1] = [m0 m1] · | g1,0 g2,0 g1,1 g2,1 |
                                      |  0    0   g1,0 g2,0 |

p33. (3) Input: m = m0m1m2…me; output: c = c1,0 c2,0 c1,1 c2,1 c1,2 c2,2 …, with c = m·G, where G is the (e+1) × 2(e+1) band matrix whose first row is

g1,0 g2,0  g1,1 g2,1  g1,2 g2,2  …  g1,m-1 g2,m-1  g1,m g2,m  0 0 …

and each following row is the row above shifted two places to the right (zeros elsewhere).

p34. Example: (2,1,3) CV

Generators: g1(x) = 1+x+x³, g2(x) = 1+x²+x³, so the interleaved generator row is 11 10 01 11. Input: m = 101000.

G (6×12) =
1 1 1 0 0 1 1 1 0 0 0 0
0 0 1 1 1 0 0 1 1 1 0 0
0 0 0 0 1 1 1 0 0 1 1 1
0 0 0 0 0 0 1 1 1 0 0 1
0 0 0 0 0 0 0 0 1 1 1 0
0 0 0 0 0 0 0 0 0 0 1 1

c = m·G = [1 0 1 0 0 0]·G = [1 1 1 0 1 0 0 1 0 1 1 1].
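The banded matrix and the product c = m·G can be built mechanically. A sketch (the names `generator_matrix` and `encode` are mine):

```python
# Generator matrix of a (2,1,m) CV: row i is the interleaved generator row
# g1,0 g2,0 g1,1 g2,1 ... shifted right by 2i, truncated to the matrix width.
def generator_matrix(g1, g2, msg_len):
    row = [b for pair in zip(g1, g2) for b in pair]
    width = 2 * msg_len
    G = []
    for i in range(msg_len):
        r = [0] * width
        for j, b in enumerate(row):
            if 2 * i + j < width:
                r[2 * i + j] = b
        G.append(r)
    return G

def encode(m, G):
    """c = m.G over GF(2): mod-2 dot product of m with each column of G."""
    return [sum(mi & gij for mi, gij in zip(m, col)) % 2 for col in zip(*G)]

G = generator_matrix([1, 1, 0, 1], [1, 0, 1, 1], msg_len=6)
print(encode([1, 0, 1, 0, 0, 0], G))   # 11 10 10 01 01 11
```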

p35. State diagram of an (n,1,m) convolutional code

state: the contents of the first m registers in the shift register, s = s0s1…s_{m-1}. The number of states is #{s0s1…s_{m-1} | si ∈ K} = 2^m.
zero state: each of the first m registers contains 0.

Example: (n,1,m) CV with generator g(x) = g0 + g1x + … + gmx^m. If the state at time t is s0s1…s_{m-1}, then on input 0 or 1 the state at time t+1 is 0s0s1…s_{m-2} or 1s0s1…s_{m-2}. Conversely, the state at time t-1 was s1s2…s_{m-1}0 or s1s2…s_{m-1}1, and the input at time t was s0.

Example 8.2.9: (2,1,3)CV with g1(x) = 1+x+x³, g2(x) = 1+x²+x³.

[State diagram: eight states 000, 100, 010, 110, 001, 101, 011, 111; each state has one outgoing edge per input bit, labelled with the output pair c1c2.]

For instance, at time t in state 000 with input 1, the register X0 X1 X2 X3 becomes 1 0 0 0, the output is c1c2 = 11, and the state at time t+1 is 100; with input 0 the output is 00 and the state stays 000.

Example 8.2.9 (continued): encoding a convolutional code by state diagram. Input: m = 101. Starting at the zero state and following the edges of the state diagram:

time    -1     0     1     2
state   000    100   010   101
input          1     0     1
c1c2           11    10    10

Tabular form: a state diagram of an (n,1,m)CV can also be represented in tabular form.

Example: (2,1,3)CV

State X0X1X2   input 0 → next / output   input 1 → next / output
000            000 / 00                  100 / 11
100            010 / 10                  110 / 01
010            001 / 01                  101 / 10
110            011 / 11                  111 / 00
001            000 / 11                  100 / 00
101            010 / 01                  110 / 10
011            001 / 10                  101 / 01
111            011 / 00                  111 / 11
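The tabular form can be regenerated from the register description: state s = X0X1X2, and on input i the register becomes (i, s0, s1, s2), so c1 = i+s0+s2 and c2 = i+s1+s2 are the tap sums of g1 and g2. A sketch (the name `transitions` is mine):

```python
from itertools import product

# Build the (next state, output) table of a (2,1,3) CV from its generators.
def transitions(g1, g2, m=3):
    table = {}
    for s in product((0, 1), repeat=m):
        for i in (0, 1):
            reg = (i,) + s                            # (m+1)-stage contents
            c1 = sum(r & t for r, t in zip(reg, g1)) % 2
            c2 = sum(r & t for r, t in zip(reg, g2)) % 2
            table[s, i] = ((i,) + s[:-1], (c1, c2))   # (next state, output)
    return table

t = transitions(g1=(1, 1, 0, 1), g2=(1, 0, 1, 1))
print(t[(0, 0, 0), 1])   # state 000, input 1 -> state 100, output 11
```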

p39. [3] Decoding convolutional codes

Idea of decoding convolutional codes:

encoder of convolutional code → channel (noise) → w = w1w2… → decoder of convolutional code

Consider C1 in Example 8.2.1. Suppose that the received word is w = 11 10 00 00 00 … (w(x) = 1+x+x²). But there is no directed walk in the state diagram that would give an output of w. Therefore we are faced with finding a codeword that “most likely” (minimum Hamming distance) fits w.

p40. Window size τ

The window size τ is the amount of the received codeword w we “see” when making each decoding decision.

Notation: Hamming distance Hd(w1,w2); Hamming weight Hw(w1).

Convolutional Code

Exhaustive decoding algorithm (window size τ=1) input : received word w=11 10 10 01 output : correct decoding message 1 0 1 0

100

001

000 01000

11

11

10

01

00110

011

101 11111

10

01

00

00

10

01

01

10

11

Walk : 011101010100000 01101011

(1) (2) (3)

(4)

The length of the walk : 4

p42. Input: received word w = 11 00 00 00. Output: most likely decoded message 1 1 1 0 (*).

[State diagram of the (2,1,3)CV of Example 8.2.9.]

(1) Walk 000→100 (output 11, matching w0) and decode message digit 1.
(2) From 100: input 0 leads to 010 with output 10, Hd(00,10) = 1; input 1 leads to 110 with output 01, Hd(00,01) = 1. A tie: randomly choose one branch to decode the message digit (say, choose 110, decoding 1).
(3) From 110: walk 110→111 (output 00, Hd = 0) and decode message digit 1.
(4) From 111: walk 111→011 (output 00, Hd = 0) and decode message digit 0.

p43. Exhaustive decoding algorithm (window size τ=2)

Input: received word w = 11 00 00 00 …… and window size τ=2. Output: “most closely” matching message.

(tick 0) We start at state 000 of the Example 8.2.9 state diagram.
(tick 1) We see w = 11 00. The walks of length 2 from 000:

walk            output   distance from 11 00
000, 000, 000   00 00    2
000, 000, 100   00 11    4
000, 100, 010   11 10    1
000, 100, 110   11 01    1

We make the decoding decision to move to state 100 and decode the first message digit as 1.

p44. (tick 2) We see w = 00 00. The walks of length 2 from 100:

walk            output   distance from 00 00
100, 010, 001   10 01    2
100, 010, 101   10 10    2
100, 110, 011   01 11    3
100, 110, 111   01 00    1

We make the decoding decision to move to state 110 and decode the next message digit as 1.
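The exhaustive algorithm above can be sketched directly: at each tick enumerate all 2^τ walks of length τ from the current state, keep a closest one, and commit only its first step. This sketch hardcodes the (2,1,3) CV taps and breaks ties deterministically by preferring input 0 (the slides choose randomly); the names `step` and `exhaustive_decode` are mine:

```python
from itertools import product

def step(s, i):
    """One edge of the (2,1,3) CV: returns (next state, output pair)."""
    reg = (i,) + s
    out = (sum(reg[k] for k in (0, 1, 3)) % 2,    # taps of g1 = 1 + x + x^3
           sum(reg[k] for k in (0, 2, 3)) % 2)    # taps of g2 = 1 + x^2 + x^3
    return (i,) + s[:-1], out

def exhaustive_decode(pairs, tau):
    state, msg = (0, 0, 0), []
    for t in range(len(pairs) - tau + 1):
        best = None
        for walk in product((0, 1), repeat=tau):  # all 2^tau candidate walks
            s, dist = state, 0
            for k, i in enumerate(walk):
                s, out = step(s, i)
                dist += sum(a != b for a, b in zip(out, pairs[t + k]))
            if best is None or dist < best[0]:
                best = (dist, walk[0])
        msg.append(best[1])                       # commit the first step only
        state, _ = step(state, best[1])
    return msg

w = [(1, 1), (1, 0), (1, 0), (0, 1)]              # received 11 10 10 01
print(exhaustive_decode(w, tau=1))                # decodes 1 0 1 0
```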

p45. Catastrophic (n,1,m) CV

An (n,1,m) CV is catastrophic if its state diagram contains a zero-weight cycle different from the loop on the zero state. For n=2: gcd(g1(x), g2(x)) ≠ 1 if and only if the (2,1,m) CV is catastrophic.

Example:
g1(x) = 1+x³ = (1+x)(1+x+x²)
g2(x) = x+x²+x³ = x(1+x+x²)
gcd(g1(x), g2(x)) = 1+x+x², so this code is catastrophic.
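The gcd test can be run with Euclid's algorithm on GF(2) polynomials. A sketch (coefficient lists, low degree first; the names `gf2_mod` and `gf2_gcd` are mine):

```python
# Euclidean gcd of GF(2) polynomials, to test whether a (2,1,m) CV is
# catastrophic (gcd(g1, g2) != 1).
def trim(p):
    while p and p[-1] == 0:
        p.pop()                      # drop leading (high-degree) zeros
    return p

def gf2_mod(a, b):
    a, b = trim(a[:]), trim(b[:])
    while len(a) >= len(b):
        shift = len(a) - len(b)
        for i, bi in enumerate(b):
            a[shift + i] ^= bi       # subtract x^shift * b(x) (mod 2)
        trim(a)
    return a

def gf2_gcd(a, b):
    a, b = a[:], b[:]
    while any(b):
        a, b = b, gf2_mod(a, b)
    return trim(a)

g1 = [1, 0, 0, 1]       # 1 + x^3 = (1+x)(1+x+x^2)
g2 = [0, 1, 1, 1]       # x + x^2 + x^3 = x(1+x+x^2)
print(gf2_gcd(g1, g2))  # [1, 1, 1] = 1 + x + x^2 -> catastrophic
```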

p46. The minimum distance of a convolutional code

We are only considering non-catastrophic convolutional codes.

Example: for the (2,1,3) CV = C1, the minimum-weight nonzero codeword comes from the walk 000→100→010→001→000, so
d(C1) = Hw(11 10 01 11 00 00 …) = 6.

p47. τ(e)

Given a non-catastrophic convolutional code C, for 0 ≤ e ≤ ⌊(d−1)/2⌋ define τ(e) to be the least integer x such that all walks of length x in the state diagram that immediately leave the zero state have weight greater than 2e.

p48. Theorem 8.3.4

Let C be a non-catastrophic convolutional code and 0 ≤ e ≤ ⌊(d−1)/2⌋. If any error pattern containing at most e errors in any τ(e) consecutive steps occurs during transmission, then the exhaustive decoding algorithm using the window size τ(e) will decode the received word correctly.

Example: (2,1,3) CV = C1, e=1, so 2e = 2(1) = 2. The two walks of length 2 leaving the zero state have output weights
Hw(11 10) = 3 and Hw(11 01) = 3,
both greater than 2, so τ(1) = 2.

p49. e=2:

The length-6 walk 000→100→110→111→011→001→100 has output weight
Hw(11 01 00 00 10 00) = 4 (the Hamming weight of the 6 red edges),
which is not greater than 2e = 4, so τ(2) > 6. Choose τ(2) = 7,
since the Hamming weight of any walk of length 7 leaving the zero state is > 2e = 4.

p50.


How many errors can be corrected? Theorem 8.3.4 says that if we use the exhaustive

decoding algorithm with window size τ(1) , then all error patterns with at most e=1 error in any τ(1) =2 consecutive ticks will be corrected. So for example, the error pattern

e1 = 10 00 01 00 01 00 10 … will be corrected. Also if we use the exhaustive decoding algorithm with

window size τ(2), then all error patterns with at most e=2 errors in any

τ(2)=7 consecutive ticks will be corrected. So for example, the error pattern

e2 = 11 00 00 00 00 00 00 … will be corrected.

p51. Exhaustive decoding algorithm vs truncated Viterbi decoding algorithm

Notice that the exhaustive decoding algorithm with window size τ(e) requires that we consider all walks of length τ(e) from the current state for each message digit to be decoded. Constructing all 2^τ(e) such walks at each tick is very time consuming, so we will present a faster truncated Viterbi decoding algorithm (a dynamic-programming approach) in the next section.

p52. [4] Truncated Viterbi decoding

This algorithm only makes 2^m calculations and stores 2^m walks of length τ at each tick. The window size τ is chosen to be between 4m and 6m (a number at least τ(e)). For the first m ticks the decoder is still storing all walks from the zero state, each ending in a different state, so t=m is the first time at which we have exactly one walk ending in each state. For t>m, each state s saves an optimal walk W(s;t) and its corresponding distance d(s;t). Once t ≥ τ, a message digit is decoded at each tick.

W(s;t) = x0x1…x_{τ-1}: optimal walk from the current decoded state to state s at tick t (stored as a sequence of message digits, rather than a sequence of states).
d(s;t): distance between the outputs of W(s;t) and the corresponding received words.

p53. Algorithm 8.4.1: truncated Viterbi decoding of (n,1,m) convolutional codes with window size τ

Input: received word w = w0w1…, where each wi consists of n digits. Output: “most closely” matching message. States: s = s0s1…s_{m-1}.

(1) Initialization: t = 0. Define
(a) W(s;t) = s0s1…s_{m-1}﹡﹡﹡… (of length τ),
(b) d(s;t) = 0 if s is the zero state, ∞ otherwise.

p54. Example: (2,1,3) CV code, τ=5 (in Example 8.2.1)

W(000,0)=000**  d(000,0)=0
W(100,0)=100**  d(100,0)=∞
W(010,0)=010**  d(010,0)=∞
W(110,0)=110**  d(110,0)=∞
W(001,0)=001**  d(001,0)=∞
W(101,0)=101**  d(101,0)=∞
W(011,0)=011**  d(011,0)=∞
W(111,0)=111**  d(111,0)=∞

p55. (2) Distance calculation: t > 0. For each state s, define
(a) di(s): the distance between the received block w_{t-1} and the output on the directed edge from state (s1,…,s_{m-1},i) to s, for i = 0,1;
(b) d(s;t) = min{ d(s1,…,s_{m-1},0; t−1) + d0(s),  d(s1,…,s_{m-1},1; t−1) + d1(s) }.

p56. Example: (2,1,3) CV code, τ=5 (in Example 8.2.1), w = w0w1w2w3… = 00 11 01 11 ….

When t=1 and s = s0s1s2 = 011: the two predecessor states are 110 (edge 110→011, output 11) and 111 (edge 111→011, output 00). With w0 = 00:
d0(011) = Hd(00,11) = 2
d1(011) = Hd(00,00) = 0
d(011;1) = min{ d(110;0) + d0(011), d(111;0) + d1(011) } = min{ d(110;0)+2, d(111;0)+0 }.

p57. (3) Walk calculation: let s = s0s1…s_{m-1} and {i,j} = {0,1}, with
W(s1,…,s_{m-1},i; t−1) = x0^i x1^i … x_{τ-1}^i and W(s1,…,s_{m-1},j; t−1) = x0^j x1^j … x_{τ-1}^j.

(a) If d(s1,…,s_{m-1},i; t−1) + di(s) < d(s1,…,s_{m-1},j; t−1) + dj(s), form W(s;t) from W(s1,…,s_{m-1},i; t−1) by adding the leftmost digit of s to the left and then deleting the rightmost digit:
W(s;t) = s0 x0^i x1^i … x_{τ-2}^i.

p58. (b) If d(s1,…,s_{m-1},0; t−1) + d0(s) = d(s1,…,s_{m-1},1; t−1) + d1(s), form W(s;t) from W(s1,…,s_{m-1},0; t−1) by adding the leftmost digit of s to the left, replacing each digit that disagrees with W(s1,…,s_{m-1},1; t−1) by ﹡, and then deleting the rightmost digit:

W(s;t) = s0 y0 y1 … y_{τ-2}, where yk = xk^0 if xk^0 = xk^1, and yk = ﹡ if xk^0 ≠ xk^1, for k = 0, 1, …, τ−1.

p59. (4) Decoding:

For t ≥ τ, let S(t) = {s | d(s;t) ≤ d(s';t) for all states s'}. If the rightmost digit in W(s;t) is the same, say i, for all s ∈ S(t), then decode the message digit i; otherwise decode the message digit ﹡.

Artificial examples (τ=7):

(1)
s     d(s,t)  W(s,t)
000   2       0000000
100   2       1001000
010   3       0101000
110   3       1101000
001   4       0011000
101   4       1011000
011   3       0111000
111   3       1111000

S(t) = {000, 100}; both walks end in 0, so decode 0.

(2)
s     d(s,t)  W(s,t)
000   2       0000000
100   2       1001001
010   3       0101000
110   3       1101000
001   4       0011000
101   4       1011000
011   3       0111000
111   3       1111000

S(t) = {000, 100}; the rightmost digits disagree (0 and 1), so decode *.

p60. Example 8.4.2: consider the convolutional code C1 in Example 8.2.1.

Received word: w = w0w1w2… = 11 00 00 …, window size τ=7.

[State diagram of the (2,1,3) convolutional code in Example 8.2.1.]

p61. t=0: W(s;0) = s﹡﹡﹡﹡ for all states s; d(000;0) = 0 and d(s';0) = ∞ for all states s' other than the zero state.

s     d(s;0), W(s;0)
000   0, 000****
100   ∞, 100****
010   ∞, 010****
110   ∞, 110****
001   ∞, 001****
101   ∞, 101****
011   ∞, 011****
111   ∞, 111****

p62. t=1: w_{t-1} = w0 = 11.

s=000: the incoming edges are 000→000 (output 00) and 001→000 (output 11), so
d0(000) = Hd(11,00) = 2,  d1(000) = Hd(11,11) = 0,
d(000;1) = min{ d(000;0)+2, d(001;0)+0 } = min{0+2, ∞+0} = 2.
The minimum comes from predecessor 000: prepend s0 = 0 to W(000,0) = 000**** and drop the last digit, giving W(000,1) = 0000***.

s=010: the incoming edges are 100→010 (output 10) and 101→010 (output 01), so
d0(010) = Hd(11,10) = 1,  d1(010) = Hd(11,01) = 1,
d(010;1) = min{ d(100;0)+1, d(101;0)+1 } = ∞, a tie.
By rule (3b), merge W(100,0) = 100**** and W(101,0) = 101**** into 10*****, prepend s0 = 0 and drop the last digit: W(010,1) = 010****.

p63.
s     t=0           t=1
000   0, 000****    2, 0000***
100   ∞, 100****    0, 1000***
010   ∞, 010****    ∞, 010****
110   ∞, 110****    ∞, 110****
001   ∞, 001****    ∞, 001****
101   ∞, 101****    ∞, 101****
011   ∞, 011****    ∞, 011****
111   ∞, 111****    ∞, 111****

p64. t=2,3: w1 = 00, w2 = 00.

s     t=2           t=3
000   2, 00000**    2, 000000*
100   4, 10000**    4, 100000*
010   1, 01000**    5, 010000*
110   1, 11000**    5, 110000*
001   ∞, 001****    2, 001000*
101   ∞, 101****    2, 101000*
011   ∞, 011****    3, 011000*
111   ∞, 111****    1, 111000*

p65. t=4: w_{t-1} = w3 = 00.

s=100: the incoming edges are 000→100 (output 11) and 001→100 (output 00), so
d0(100) = Hd(00,11) = 2,  d1(100) = Hd(00,00) = 0,
d(100;4) = min{ d(000;3)+2, d(001;3)+0 } = min{2+2, 2+0} = 2.
The minimum comes from predecessor 001 with W(001,3) = 001000*: prepend s0 = 1 and drop the last digit, giving W(100,4) = 1001000.

s=010: the incoming edges are 100→010 (output 10) and 101→010 (output 01), so
d0(010) = Hd(00,10) = 1,  d1(010) = Hd(00,01) = 1,
d(010;4) = min{ d(100;3)+1, d(101;3)+1 } = min{4+1, 2+1} = 3.
The minimum comes from predecessor 101 with W(101,3) = 101000*: prepend s0 = 0, giving W(010,4) = 0101000.

p66.
s     t=3           t=4
000   2, 000000*    2, 0000000
100   4, 100000*    2, 1001000
010   5, 010000*    3, 0101000
110   5, 110000*    3, 1101000
001   2, 001000*    4, 0011000
101   2, 101000*    4, 1011000
011   3, 011000*    1, 0111000
111   1, 111000*    3, 1111000

p67. t=5: w_{t-1} = w4 = 00.

s=000: d0(000) = Hd(00,00) = 0,  d1(000) = Hd(00,11) = 2,
d(000;5) = min{ d(000;4)+0, d(001;4)+2 } = min{2+0, 4+2} = 2.
Predecessor 000, W(000,4) = 0000000: W(000,5) = 0000000.

s=100: d0(100) = Hd(00,11) = 2,  d1(100) = Hd(00,00) = 0,
d(100;5) = min{ d(000;4)+2, d(001;4)+0 } = min{2+2, 4+0} = 4, a tie.
By rule (3b), merge W(000,4) = 0000000 and W(001,4) = 0011000 into 00**000, prepend s0 = 1 and drop the last digit: W(100,5) = 100**00.

p68.
s     t=5           t=6           t=7
000   2, 0000000    2, 0000000    2, 0000000
100   4, 100**00    2, 1001110    4, 100****
010   3, 0100100    3, 0101110    3, 0100111
110   3, 1100100    3, 1101110    3, 1100111
001   2, 0011100    4, 001**10    4, 001*1*1
101   2, 1011100    4, 101**10    4, 101*1*1
011   3, 0111100    3, 0111010    3, 0111001
111   3, 1110100    3, 1110010    3, 1110111

t=7: w_{t-1} = w6 = 00. We have reached t=τ. Since d(000,7) = 2 < d(s,7) for every other state s (so S(7) = {000}), we decode the rightmost digit in W(000,7) = 0000000, namely 0.

p69.
s     t=8           t=9           t=10
000   2, 0000000    2, 0000000    2, 0000000
100   4, 100****    4, 100***0    4, 100****
010   5, 010****    5, 010****    5, 010****
110   5, 110****    5, 110****    5, 110****
001   4, 001****    4, 0011101    4, 0011100
101   4, 101****    4, 1011101    4, 1011100
011   3, 0111011    3, 0111001    5, 0111***
111   3, 1110011    5, 111****    5, 1110***

At t = 8, 9, 10 the unique closest state is again 000, so we decode to: 0 0 0.
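Algorithm 8.4.1 can be sketched end to end for the (2,1,3) CV of Example 8.2.1. This is a sketch of steps (1)-(4) under my reading of the rules (ties merge survivors with '*'; '*' vs any digit counts as a disagreement); the names `step` and `viterbi` are mine:

```python
from itertools import product

INF = float('inf')

def step(s, i):
    """Transition of the (2,1,3) CV: returns (next state, output pair)."""
    reg = (i,) + s                                # register X0 X1 X2 X3
    out = (sum(reg[k] for k in (0, 1, 3)) % 2,    # taps of g1 = 1 + x + x^3
           sum(reg[k] for k in (0, 2, 3)) % 2)    # taps of g2 = 1 + x^2 + x^3
    return (i,) + s[:-1], out

def viterbi(pairs, tau, m=3):
    states = list(product((0, 1), repeat=m))
    d = {s: 0 if s == (0,) * m else INF for s in states}   # step (1b)
    W = {s: [str(b) for b in s] + ['*'] * (tau - m) for s in states}  # (1a)
    decoded = []
    for t, w in enumerate(pairs):
        nd, nW = {}, {}
        for s in states:
            cands = []
            for i in (0, 1):                      # predecessor (s1..s_{m-1}, i)
                p = s[1:] + (i,)
                _, out = step(p, s[0])            # edge p -> s carries input s0
                dist = sum(a != b for a, b in zip(out, w))
                cands.append((d[p] + dist, W[p]))
            best = min(c for c, _ in cands)       # step (2b)
            walks = [wk for c, wk in cands if c == best]
            merged = [x if all(wk[k] == x for wk in walks) else '*'
                      for k, x in enumerate(walks[0])]      # step (3b) merge
            nd[s] = best
            nW[s] = [str(s[0])] + merged[:-1]     # prepend s0, drop last digit
        d, W = nd, nW
        if t + 1 >= tau:                          # step (4): window is full
            dmin = min(d.values())
            digits = {W[s][-1] for s in states if d[s] == dmin}
            decoded.append(digits.pop() if len(digits) == 1 else '*')
    return decoded

w = [(1, 1)] + [(0, 0)] * 9                       # received 11 00 00 00 ...
print(viterbi(w, tau=7))                          # ['0', '0', '0', '0']
```

With the received word of Example 8.4.2 this reproduces the decisions of p68-p69: the zero state stays the unique closest state once the window fills, so each tick decodes 0.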