Error Control Coding
Wireless Information Transmission System Lab., Institute of Communications Engineering, National Sun Yat-sen University
Introduction
◊ Error Detecting Codes: capability of detecting errors so that re-transmission or dropping can be done.
◊ Cyclic Redundancy Code (CRC)
◊ Error Correcting Codes: capability of detecting and correcting errors.
◊ Block codes: cyclic code, BCH code, RS code, etc.
◊ Convolutional code
◊ Turbo code
◊ Low Density Parity Check (LDPC) code
Introduction
◊ In general, if the channel quality is good (e.g. line transmission), error detection is preferred. In this case, most of the packets can be received correctly and the backward error correction (BEC) scheme can be adopted, i.e. the packet is re-transmitted once there is an error.
◊ If the channel quality is poor (e.g. wireless transmission), error correction is preferred. In this case, almost all the packets are received erroneously and the forward error correction (FEC) scheme can be adopted, i.e. the error correction scheme is applied to every packet.
Cyclic Redundancy Code (CRC)
◊ The sender and receiver must agree upon a generator polynomial, G(x), in advance.
Cyclic Redundancy Code (CRC)
◊ Examples of CRCs used in practice:
◊ A 16-bit checksum catches all single and double errors, all errors with an odd number of bits, all burst errors of length 16 or less, 99.997% of 17-bit error bursts, and 99.998% of 18-bit and longer bursts.
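As an illustration of the underlying polynomial division, here is a minimal bit-at-a-time CRC sketch in Python. It uses the well-known CRC-16-CCITT generator polynomial x^16 + x^12 + x^5 + 1 (0x1021) as an example; the function name and byte-oriented interface are our own choices, not from the slides.

```python
def crc_remainder(data: bytes, poly: int = 0x1021, width: int = 16) -> int:
    """CRC remainder of `data` by long division over GF(2).

    `poly` is the generator polynomial G(x) with its leading x^width
    term implied; 0x1021 is the CRC-16-CCITT polynomial.
    """
    mask = (1 << width) - 1
    crc = 0
    for byte in data:
        crc ^= byte << (width - 8)        # bring the next 8 message bits in
        for _ in range(8):
            if crc & (1 << (width - 1)):  # top bit set: subtract G(x) (XOR)
                crc = ((crc << 1) ^ poly) & mask
            else:
                crc = (crc << 1) & mask
    return crc
```

The receiver recomputes the remainder over the payload followed by the appended checksum; the result is zero exactly when no detectable error occurred.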
Linear Block Codes
◊ Encoder transforms block of k successive binary digits into longer block of n (n>k) binary digits.
◊ Called an (n,k) code.
◊ Redundancy = n − k; code rate = k/n.
◊ There are 2^k possible messages.
◊ There are 2^k possible code words corresponding to the messages.
◊ A code word (or code vector) is an n-tuple from the space Vn of all n-tuples.
◊ Storing the 2^k code vectors in a dictionary is prohibitive for large k.
Vector Spaces
◊ The set of all binary n-tuples, Vn, is called a vector space over GF (2).
◊ GF: Galois Field.
◊ Two operations are defined:
◊ Addition: V + U = (v1 + u1, v2 + u2, …, vn + un)
◊ Scalar multiplication: aV = (a·v1, a·v2, …, a·vn)
◊ Example: Vector Space V4
◊ 0000 0001 0010 0011 0100 0101 0110 0111 1000 1001 1010 1011 1100 1101 1110 1111
◊ (0101) + (1110) = (0+1, 1+1, 0+1, 1+0) = (1, 0, 1, 1)
◊ 1·(1010) = (1·1, 1·0, 1·1, 1·0) = (1, 0, 1, 0)
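The two operations can be sketched in a few lines of Python (a minimal illustration; the function names are our own):

```python
# GF(2) vector operations; vectors are tuples of 0/1 bits.
def gf2_add(u, v):
    """Component-wise addition mod 2 (i.e. bitwise XOR)."""
    return tuple((a + b) % 2 for a, b in zip(u, v))

def gf2_scale(a, v):
    """Scalar multiplication by a in {0, 1}."""
    return tuple((a * x) % 2 for x in v)

# Reproducing the V4 examples above:
gf2_add((0, 1, 0, 1), (1, 1, 1, 0))  # -> (1, 0, 1, 1)
gf2_scale(1, (1, 0, 1, 0))           # -> (1, 0, 1, 0)
```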
Subspaces
◊ A subset S of Vn is a subspace if:
◊ The all-zero vector is in S.
◊ The sum of any two vectors in S is also in S.
◊ Example of S: V0 = 0000, V1 = 0101, V2 = 1010, V3 = 1111
Reducing Encoding Complexity
◊ Key feature of linear block codes: the 2^k code vectors form a k-dimensional subspace of all n-tuples.
◊ Example: k = 3, 2^k = 8, n = 6: a (6,3) code.

Message    Code Word
0 0 0      0 0 0 0 0 0
1 0 0      1 1 0 1 0 0
0 1 0      0 1 1 0 1 0
1 1 0      1 0 1 1 1 0
0 0 1      1 0 1 0 0 1
1 0 1      0 1 1 1 0 1
0 1 1      1 1 0 0 1 1
1 1 1      0 0 0 1 1 1

The code words form a 3-dimensional subspace of the vector space of all 6-tuples.
Reducing Encoding Complexity
◊ It is possible to find a set of k linearly independent n-tuples v1, v2, …, vk such that each n-tuple of the subspace is a linear combination of v1, v2, …, vk.
◊ Code word: u = m1·v1 + m2·v2 + … + mk·vk, where mi = 0 or 1, i = 1, …, k.
Generator Matrix
◊ The 2^k code vectors can be described by a set of k linearly independent code vectors.
◊ Generator matrix (k × n):
G = [v1; v2; …; vk], where row vi = (vi1, vi2, …, vin).
◊ Let m = [m1, m2, …, mk] be a message.
◊ The code word corresponding to message m is obtained by:
u = m·G = [m1 m2 … mk]·[v1; v2; …; vk] = m1·v1 + m2·v2 + … + mk·vk
Generator Matrix
◊ Storage is greatly reduced: the encoder needs to store only the k rows of G instead of the 2^k code vectors of the code.
◊ For example, let
G = [v1; v2; v3] = [1 1 0 1 0 0; 0 1 1 0 1 0; 1 0 1 0 0 1] and m = [1 1 0].
Then
u = m·G = 1·v1 + 1·v2 + 0·v3
        = 1·[1 1 0 1 0 0] + 1·[0 1 1 0 1 0] + 0·[1 0 1 0 0 1]
        = [1 0 1 1 1 0]
= code vector for message m = [1 1 0].
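The encoding step u = m·G can be sketched directly in Python, using the (6,3) generator matrix from the slides (the function name and tuple interface are our own):

```python
def encode(m, G):
    """Encode message m with generator matrix G over GF(2):
    u = m.G = m1*v1 + ... + mk*vk, all arithmetic mod 2."""
    k, n = len(G), len(G[0])
    return tuple(sum(m[i] * G[i][j] for i in range(k)) % 2 for j in range(n))

# The (6,3) code from the slides: rows v1, v2, v3 of G.
G = [(1, 1, 0, 1, 0, 0),
     (0, 1, 1, 0, 1, 0),
     (1, 0, 1, 0, 0, 1)]

encode((1, 1, 0), G)  # -> (1, 0, 1, 1, 1, 0)
```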
Systematic Code
Parity Check Matrix
◊ For each generator matrix G, there exists a parity check matrix H such that the rows of G are orthogonal to the rows of H (u·hi = 0).
◊ H is an (n−k) × n matrix:
H = [h1; h2; …; h(n−k)], where row hi = (hi1, hi2, …, hin).
◊ For any code word u = (u1, u2, …, un):
u·H^T = [u·h1, u·h2, …, u·h(n−k)] = 0,
since u·hi = u1·hi1 + u2·hi2 + … + un·hin = 0 for i = 1, 2, …, n−k.
◊ u is a code word generated by matrix G if and only if u·H^T = 0.
Parity Check Matrix and Syndrome
◊ In a systematic code with G = [P(k×r) I(k×k)], the parity check matrix is H = [I(r×r) P^T(r×k)], where r = n − k.
◊ Received vector = code vector + error vector: r = u + e.
◊ Syndrome of r, used for error detection and correction:
s = r·H^T
◊ s = 0 if r is a code vector; s ≠ 0 otherwise.
Example of Syndrome Test
◊ For the (6,3) systematic code with G = [P I3] = [1 1 0 1 0 0; 0 1 1 0 1 0; 1 0 1 0 0 1], the parity check matrix is
H = [I(n−k) P^T] = [1 0 0 1 0 1; 0 1 0 1 1 0; 0 0 1 0 1 1]
◊ The 6-tuple 1 0 1 1 1 0 is the code vector corresponding to the message 1 1 0:
s = u·H^T = [1 0 1 1 1 0]·H^T = [0 0 0]
◊ Compute the syndrome for the non-code vector 0 0 1 1 1 0:
s = [0 0 1 1 1 0]·H^T = [1 0 0]
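The syndrome test above can be sketched in Python, using the H of the (6,3) example (function name and tuple interface are our own):

```python
def syndrome(r, H):
    """s = r.H^T over GF(2); s is all-zeros iff r is a code vector."""
    return tuple(sum(r[j] * row[j] for j in range(len(r))) % 2 for row in H)

# H = [I3 | P^T] for the slides' (6,3) systematic code.
H = [(1, 0, 0, 1, 0, 1),
     (0, 1, 0, 1, 1, 0),
     (0, 0, 1, 0, 1, 1)]

syndrome((1, 0, 1, 1, 1, 0), H)  # code vector     -> (0, 0, 0)
syndrome((0, 0, 1, 1, 1, 0), H)  # non-code vector -> (1, 0, 0)
```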
Weight and Distance of Binary Vectors
◊ Hamming weight of a vector: w(v) = number of non-zero bits in the vector.
◊ Hamming distance between 2 vectors: d(u,v) = number of bits in which they differ.
◊ For example: u = 10010110001, v = 11001010101, so d(u,v) = 5.
◊ d(u,v) = w(u+v): the Hamming distance between 2 vectors equals the Hamming weight of their vector sum.
Minimum Distance of a Linear Code
◊ The set of all code vectors of a linear code form a subspace of the n-tuple space.
◊ If u and v are 2 code vectors, then u + v must also be a code vector.
◊ Therefore, the distance d(u,v) between 2 code vectors equals the weight of a third code vector w: d(u,v) = w(u+v) = w(w).
◊ Thus, the minimum distance of a linear code equals the minimum weight of its code vectors.
◊ A code with minimum distance dmin can be shown to correct ⌊(dmin − 1)/2⌋ erroneous bits and detect (dmin − 1) erroneous bits.
Example of Minimum Distance
dmin = 3
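Because the minimum distance of a linear code equals the minimum weight of its non-zero code words, it can be found by brute force for small k. A minimal Python sketch for the (6,3) code of the earlier slides (function names are our own):

```python
from itertools import product

def hamming_weight(v):
    """Number of non-zero bits in v."""
    return sum(v)

def hamming_distance(u, v):
    """Number of bit positions in which u and v differ."""
    return sum(a != b for a, b in zip(u, v))

def min_distance(G):
    """d_min of a linear code = minimum weight over all non-zero
    codewords; enumerates the 2^k messages (fine for small k)."""
    k, n = len(G), len(G[0])
    best = n
    for m in product((0, 1), repeat=k):
        if any(m):
            u = tuple(sum(m[i] * G[i][j] for i in range(k)) % 2
                      for j in range(n))
            best = min(best, hamming_weight(u))
    return best

G = [(1, 1, 0, 1, 0, 0),
     (0, 1, 1, 0, 1, 0),
     (1, 0, 1, 0, 0, 1)]

min_distance(G)  # -> 3, matching dmin = 3 above
```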
Example of Error Correction and Detection Capability
Two code vectors u, v with dmin = d(u,v) = 7.
Error Correcting Strength: tmax = ⌊(dmin − 1)/2⌋ = 3
Error Detecting Strength: mmax = dmin − 1 = 6
Convolutional Code Structure
[Figure: convolutional encoder — a shift register of K stages of k bits each; input bits shift in k at a time, and n modulo-2 adders form output bits 1, 2, …, n.]
Convolutional Code
◊ Convolutional codes:
◊ k = number of bits shifted into the encoder at one time (k = 1 is usually used).
◊ n = number of encoder output bits corresponding to the k information bits.
◊ r = k/n = code rate.
◊ K = constraint length, encoder memory.
◊ Each encoded bit is a function of the present input bits and their past ones.
Generator Sequence
◊ [Encoder with input u, shift-register stages r0, r1, r2, and output v]
g0^(1) = 1, g1^(1) = 0, g2^(1) = 1, and g3^(1) = 1.
Generator sequence: g^(1) = (1 0 1 1)
◊ [Encoder with input u, shift-register stages r0, r1, r2, r3, and output v]
g0^(2) = 1, g1^(2) = 1, g2^(2) = 1, g3^(2) = 0, and g4^(2) = 1.
Generator sequence: g^(2) = (1 1 1 0 1)
Convolutional Codes: An Example (rate=1/2 with K=2)
G1(x) = 1 + x^2
G2(x) = 1 + x + x^2

State diagram, with each transition labeled input(output):

Present State | Input | Next State | Output
00            | 0     | 00         | 00
00            | 1     | 10         | 11
01            | 0     | 00         | 11
01            | 1     | 10         | 00
10            | 0     | 01         | 01
10            | 1     | 11         | 10
11            | 0     | 01         | 10
11            | 1     | 11         | 01
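The state table can be generated by simulating the shift register. A minimal Python sketch of this rate-1/2, memory-2 encoder (the function name and bit-list interface are our own):

```python
def conv_encode(bits, g1=(1, 0, 1), g2=(1, 1, 1)):
    """Rate-1/2 convolutional encoder for the example code:
    G1(x) = 1 + x^2 -> taps (1, 0, 1); G2(x) = 1 + x + x^2 -> taps (1, 1, 1).
    State (x1, x2) holds the two previous input bits, x1 most recent."""
    state = [0, 0]
    out = []
    for u in bits:
        window = [u] + state                        # current bit + memory
        o1 = sum(b * t for b, t in zip(window, g1)) % 2
        o2 = sum(b * t for b, t in zip(window, g2)) % 2
        out.extend([o1, o2])
        state = [u, state[0]]                       # shift the register
    return out

# Matches the encoding-process slide: input 1 0 1 1 1 0 0
conv_encode([1, 0, 1, 1, 1, 0, 0])  # -> 11 01 00 10 01 10 11
```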
Trellis Diagram Representation
Trellis termination: K tail bits with value 0 are usually added to the end of the code.
Encoding Process
Input: 1 0 1 1 1 0 0
Output: 11 01 00 10 01 10 11
Viterbi Decoding Algorithm
◊ Maximum Likelihood (ML) decoding rule: given the received sequence r, choose the detected sequence d that minimizes d(d, r).
◊ Viterbi Decoding Algorithm:
◊ An efficient search algorithm performing the ML decoding rule.
◊ Reduces the computational complexity.
Viterbi Decoding Algorithm
◊ Basic concept: generate the code trellis at the decoder.
◊ The decoder penetrates through the code trellis level by level in search of the transmitted code sequence.
◊ At each level of the trellis, the decoder computes and compares the metrics of all the partial paths entering a node.
◊ The decoder stores the partial path with the better metric and eliminates all the other partial paths. The stored partial path is called the survivor.
Viterbi Decoding Algorithm: Worked Example
Sent: 11 01 00 10 01 10 11
Received: 11 11 00 10 01 11 11 (two channel errors)

[Trellis diagrams omitted. The survivor metric (accumulated Hamming distance) at each state after each trellis level:]

Level:    1  2  3  4  5  6  7
State 00: 2  4  3  3  3  3  2
State 01: -  1  2  2  3  2  -
State 10: 0  2  1  3  3  -  -
State 11: -  1  2  1  1  -  -

Decision: 11 01 00 10 01 10 11
The survivor ending at state 00 has metric 2; tracing it back recovers the transmitted sequence, correcting both channel errors.
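The worked example can be reproduced with a short hard-decision Viterbi decoder (a minimal sketch, assuming zero tail bits force the final state to 00; the function name and interface are our own):

```python
def viterbi_decode(received):
    """Hard-decision Viterbi decoder for the rate-1/2, memory-2 example
    code (G1 = 1 + x^2, G2 = 1 + x + x^2). Metric = accumulated Hamming
    distance; the survivor with the SMALLER metric is kept at each state."""
    def step(state, u):          # state = (x1, x2); returns next state, output
        x1, x2 = state
        return (u, x1), ((u + x2) % 2, (u + x1 + x2) % 2)

    pairs = [tuple(received[i:i + 2]) for i in range(0, len(received), 2)]
    # survivor (metric, input history) per reachable state; start at 00
    paths = {(0, 0): (0, [])}
    for r in pairs:
        new = {}
        for state, (metric, hist) in paths.items():
            for u in (0, 1):
                nxt, out = step(state, u)
                m = metric + (out[0] != r[0]) + (out[1] != r[1])
                if nxt not in new or m < new[nxt][0]:
                    new[nxt] = (m, hist + [u])       # keep the survivor
        paths = new
    # zero tail bits mean the transmitted path ends in state 00
    return paths[(0, 0)][1]

# Received sequence from the slides, with two channel errors:
viterbi_decode([1, 1, 1, 1, 0, 0, 1, 0, 0, 1, 1, 1, 1, 1])
# -> [1, 0, 1, 1, 1, 0, 0], the original input including the tail bits
```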