Coding Theory
Prof. Dr.-Ing. Andreas Czylwik
Fachgebiet Nachrichtentechnische Systeme (NTS), Universität Duisburg-Essen
SS 2011
Organization

- Lecture: 2 hours per week
- Transparencies will be available at: http://nts.uni-duisburg.de
- Exercise: 1 hour per week, M.Sc. Bo Zhao
- New research areas at the Chair of Communication Systems (Nachrichtentechnische Systeme)
- Subjects for master theses
Textbooks

Textbooks for the lecture:
- M. Bossert: Channel coding for telecommunications, John Wiley
- R. Blahut: Theory and practice of error control codes, Addison-Wesley
- J. H. van Lint: Introduction to coding theory
- B. Friedrichs: Kanalcodierung, Springer-Verlag
- H. Schneider-Obermann: Kanalcodierung, Vieweg-Verlag

Textbooks in the field of digital communications including coding theory:
- S. Benedetto, E. Biglieri, V. Castellani: Digital transmission theory, Prentice-Hall
- J. G. Proakis: Digital communications, McGraw-Hill
- S. Haykin: Communication systems, John Wiley
Contents

- Introduction
- Information theory
- Channel coding in digital communication systems
- Algebraic foundations for coding
- Block codes
- Convolutional codes
- Coding techniques
- Outlook
Introduction

Block diagram of a digital communication system:

  Source → Source encoder → Channel encoder → Modulator → Transmission channel (+ noise) → Demodulator → Channel decoder → Source decoder → Sink

Source and source encoder form the digital source; source decoder and sink form the digital sink; modulator, transmission channel and demodulator together form the discrete channel.
Source coding:
- Compression of the communication signal to a minimum number of symbols without loss of information (reduction of redundancy)
- Further compression when loss of information is tolerated (e.g. video or audio transmission)

Coding for encryption:
- Security against eavesdropping
- Information recovery only with a secret key
Error control coding: deliberate addition of redundancy to improve transmission quality (FEC, forward error correction).

Simplified model:

  Data source → Channel encoder → Channel → Channel decoder → Data sink

- The channel quality determines the residual error rate after decoding
- The data rate does not depend on the channel quality
Channel coding for error detection (CRC, cyclic redundancy check), used for automatic repeat request (ARQ) methods:
- A backward channel is necessary
- Adaptive insertion of redundancy (additional redundancy only in case of errors)
- The residual error probability does not depend on the channel quality
- The net data rate (throughput) depends on the channel quality

  ARQ control → Channel encoder → Channel → Channel decoder (error detection) → ARQ control; the error information is fed back via the backward channel
Applications: reliable data transmission via waveguides or radio channels (especially mobile radio channels), reliable data storage.

Father of information theory, Claude E. Shannon: using channel coding, the error probability can be reduced to an arbitrarily small value as long as the data rate is smaller than the channel capacity. (Shannon did not present a construction method.)
Basic idea of channel coding: insertion of redundancy.
Goal: error detection or error correction.

Block encoding process:
- Input vector: u = (u1, ..., uk), length k
- Output vector: x = (x1, ..., xn), length n
- Code rate:

  RC = k / n    (1)
Code cube with n = 3, k = 3:
- Uncoded transmission: RC = 1
- Smallest distance between code words: dmin = 1
- Error detection or error correction is not possible
Code cube with n = 3, k = 2:
- Coded transmission: RC = 2/3
- Smallest distance between code words: dmin = 2
- Detection of a single error is possible; error correction is not possible
Code cube with n = 3, k = 1:
- Coded transmission: RC = 1/3
- Smallest distance between code words: dmin = 3
- Detection of two errors and correction of a single error is possible
Information theory

Information theory: mathematical description of information and of its transmission.

Central questions:
- Quantitative calculation of the information content of messages
- Calculation of the capacity of transmission channels
- Analysis and optimization of source and channel coding
Messages from the point of view of information theory

(Diagram: a message is split into a relevant and an irrelevant part, and into a redundant and a non-redundant part; the relevant, non-redundant part carries the information, corrupted parts are errors.)
Information content of a message: entropy.

Qualitative ordering of messages: a message is more important the more difficult it is to predict.

Example (increasing importance, decreasing probability):
- Tomorrow, the sun rises.
- Tomorrow, there will be bad weather.
- Tomorrow, there will be a heavy thunderstorm, so that the electric power network will break down.

Messages from a digital source: sequence of symbols.
Number of binary decisions needed to select a message (symbol) from a source with N symbols in total:

  H0 = ld(N) bit/symbol    (2)

with ld(x) = logarithm to base 2 (logarithmus dualis). Unit: bit = binary digit.

Example: English text as a source
- 26 · 2 characters, 14 special characters including „space“: ‘ ’ “ ” ( ) - . , ; : ! ?
- In total: 66 characters
- H0 = ld(66) = ln(66) / ln(2) = 6.044 bit/character
Example: one page of English text with 40 lines and 70 characters per line
- Number of different pages: N = 66^(40·70)
- Number of binary decisions to select a page: H0 = ld(66^(40·70)) = 40 · 70 · ld(66) = 16.92 kbit/page

The number of binary decisions H0 does not take into account the probabilities of the symbols!
Source alphabet: X ∈ {x1, ..., xN}; probabilities of the symbols: p(x1), ..., p(xN).

Desired properties of the information content I = f(p):
- I(xi) ≥ 0 for 0 ≤ p(xi) ≤ 1
- I(xi) → 0 for p(xi) → 1
- I(xi) > I(xj) for p(xi) < p(xj)
- For two subsequent statistically independent symbols xi and xj with p(xi, xj) = p(xi) · p(xj):  I(xi, xj) = I(xi) + I(xj)

General solution:

  I(xi) = −k · log_b( p(xi) )    (3)
Definition of the information content:

  I(xi) = ld( 1 / p(xi) )  bit/symbol    (4)

Entropy H(X) = average information content of a source:

  H(X) = Σ_{i=1}^{N} p(xi) · I(xi) = Σ_{i=1}^{N} p(xi) · ld( 1 / p(xi) )  bit/symbol    (5)
Example: binary source with X ∈ {x1, x2}
- Probabilities: p(x1) = p, p(x2) = 1 − p
- Entropy: H(X) = −p ld(p) − (1 − p) ld(1 − p)    (6)
- Shannon function: H(X) = S(p)

(Figure: S(p) in bit versus p; S(p) is zero at p = 0 and p = 1 and reaches its maximum of 1 bit at p = 0.5.)
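The Shannon function S(p) of (6) is easy to evaluate numerically; a minimal Python sketch (the function name S simply follows the slide's notation):

```python
from math import log2

def S(p: float) -> float:
    """Shannon function: entropy of a binary source with p(x1) = p."""
    if p in (0.0, 1.0):
        return 0.0                      # lim p·ld(1/p) = 0
    return -p * log2(p) - (1 - p) * log2(1 - p)

print(S(0.5))              # → 1.0  (maximum: equiprobable symbols)
print(round(S(0.2), 4))    # → 0.7219
```

The value S(0.2) = 0.7219 bit/symbol reappears later in the source coding example with p(x1) = 0.2.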
The entropy is maximum for equiprobable symbols: H(X) ≤ H0 = ld N.

Proof:

  H(X) − ld N = Σ_{i=1}^{N} p(xi) · ld( 1/p(xi) ) − Σ_{i=1}^{N} p(xi) · ld N
              = Σ_{i=1}^{N} p(xi) · ld( 1 / (N · p(xi)) )    (7)

Using the relation:

  ln x ≤ x − 1    (8)

(Figure: ln(x) lies below the line x − 1, touching it at x = 1.)
with

  ln x ≤ x − 1  ⇒  ld x ≤ (x − 1) / ln 2

it follows:

  H(X) − ld N ≤ (1/ln 2) · Σ_{i=1}^{N} p(xi) · ( 1 / (N · p(xi)) − 1 )
              = (1/ln 2) · ( Σ_{i=1}^{N} 1/N − Σ_{i=1}^{N} p(xi) )
              = (1/ln 2) · (1 − 1) = 0    (9)

Redundancy of a source: RS = H0 − H(X)
Transmission via a binary channel: design of binary coding schemes for a discrete source.

Tasks of source coding:
- Assignment of binary code words of length L(xi) to all symbols xi (e.g. x1 → 1001, x2 → 011, ..., xN → a code word of length L(xN))
- Minimizing the average length of the code words:

  ⟨L⟩ = ⟨L(xi)⟩ = Σ_{i=1}^{N} p(xi) · L(xi)    (10)
Examples for binary coding of symbols:
- ASCII code: fixed code word length L(xi) = 8 (block code)
- Morse code (alphabet with dots, dashes and pauses for the separation of code words): more frequently occurring characters are assigned shorter code words

Prefix condition for codes with variable length: no code word is the prefix of any other (longer) code word.

Example for a code without prefix condition:

  x1 → 0, x2 → 01, x3 → 10, x4 → 100

Unique decoding of a bit sequence is not possible! Possible decoding results for the sequence 010010:
x1x3x2x1, x2x1x1x3, x1x3x1x3, x2x1x2x1, x1x4x3
Example for a code with prefix condition:

  x1 → 0, x2 → 10, x3 → 110, x4 → 111

The code is uniquely and instantaneously decodable!
Decoding of the sequence 010010110111100: x1 x2 x1 x2 x3 x4 x2 x1
Synchronization: begin of the sequence
Decoding with a binary tree (x1 → 0, x2 → 10, x3 → 110, x4 → 111):
- Level 1: L(x1) = 1
- Level 2: L(x2) = 2
- Level 3: L(x3) = L(x4) = 3

At each node, a 0 branches to one child and a 1 to the other; the code words sit at the leaves of the tree.
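Descending the code tree can be sketched with a code word table; reaching a complete code word corresponds to reaching a leaf. A minimal Python sketch (the dictionary CODE mirrors the example code above):

```python
CODE = {"0": "x1", "10": "x2", "110": "x3", "111": "x4"}

def decode(bits: str) -> list:
    """Walk bit by bit; emitting a symbol corresponds to reaching a leaf."""
    symbols, word = [], ""
    for b in bits:
        word += b                      # descend one level in the tree
        if word in CODE:               # leaf reached: emit symbol, back to root
            symbols.append(CODE[word])
            word = ""
    if word:
        raise ValueError("bit sequence ends inside a code word")
    return symbols

print(decode("010010110111100"))
# → ['x1', 'x2', 'x1', 'x2', 'x3', 'x4', 'x2', 'x1']
```

This only works because of the prefix condition: no partial word can simultaneously be a complete code word.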
Kraft inequality: a binary code with the code word lengths L(x1), L(x2), ..., L(xN) fulfilling the prefix condition exists only if:

  Σ_{i=1}^{N} 2^(−L(xi)) ≤ 1    (11)

Proof: Length of the tree structure = maximum code word length

  Lmax = max( L(x1), L(x2), ..., L(xN) )

A code word in level L(xi) eliminates 2^(Lmax − L(xi)) of all possible code words in level Lmax. The sum over all eliminated code words must not exceed the maximum number of code words in level Lmax:

  Σ_{i=1}^{N} 2^(Lmax − L(xi)) ≤ 2^Lmax
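The left-hand side of (11) is easy to check directly; a small sketch:

```python
def kraft_sum(lengths):
    """Left-hand side of the Kraft inequality (11) for a binary code."""
    return sum(2.0 ** (-L) for L in lengths)

# The prefix code x1 → 0, x2 → 10, x3 → 110, x4 → 111 uses every leaf:
print(kraft_sum([1, 2, 3, 3]))            # → 1.0
# Lengths 1, 2, 2, 2 violate (11): no prefix code with these lengths exists.
print(kraft_sum([1, 2, 2, 2]) <= 1.0)     # → False
```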
The equality holds if all leaves of the code tree are used by code words.

General relation:

  ⟨L⟩ ≥ H(X)    (12)

Example for the special case that all probabilities are powers of 2:

  p(xi) = (1/2)^Ki    (13)

Assignment of code words corresponding to the relation L(xi) = Ki:

  xi   p(xi)  code word
  x1   1/2    1
  x2   1/4    00
  x3   1/8    010
  x4   1/16   0110
  x5   1/16   0111

Special case: ⟨L⟩ = H(X) = 15/8
Shannon's source coding theorem: for any source a binary coding can be found with:

  H(X) ≤ ⟨L⟩ ≤ H(X) + 1    (14)

Proof:
- Left-hand side: the average length of the code words is at least the average information content (entropy).
- Right-hand side: selection of code words with

  I(xi) ≤ L(xi) ≤ I(xi) + 1    (15)

Multiplication with p(xi) and summing over all i yields (14).
Proof that a code fulfilling the prefix condition (11) exists. From the left-hand side of (15):

  I(xi) = ld( 1/p(xi) ) ≤ L(xi)  ⇒  2^(−L(xi)) ≤ p(xi)

Inserting in (11):

  Σ_{i=1}^{N} 2^(−L(xi)) ≤ Σ_{i=1}^{N} p(xi) = 1

Shannon coding:
- Code word lengths corresponding to (15): I(xi) ≤ L(xi) ≤ I(xi) + 1
- Accumulated probabilities:

  Pi = Σ_{j=1}^{i−1} p(xj)
The code words are the binary fractional digits of Pi.

Example: 0.90 = 1·2^−1 + 1·2^−2 + 1·2^−3 + 0·2^−4 + 0·2^−5 + 1·2^−6 + 1·2^−7 + 0·2^−8 + 0·2^−9 + 1·2^−10 + ...

  i  p(xi)  I(xi)  L(xi)  Pi    code
  1  0.22   2.18   3      0.00  000
  2  0.19   2.40   3      0.22  001
  3  0.15   2.74   3      0.41  011
  4  0.12   3.06   4      0.56  1000
  5  0.08   3.64   4      0.68  1010
  6  0.07   3.84   4      0.76  1100
  7  0.07   3.84   4      0.83  1101
  8  0.06   4.06   5      0.90  11100
  9  0.04   4.64   5      0.96  11110

  H(X) = 2.97 bit,  ⟨L⟩ = 3.54 bit
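The construction can be sketched in a few lines: sort by decreasing probability, choose L(xi) according to (15), and read off L(xi) binary fractional digits of the accumulated probability Pi. A sketch that reproduces the table above:

```python
from math import ceil, log2

def shannon_code(probs):
    """Shannon coding sketch: L(xi) = ceil(I(xi)); the code word consists of
    the first L(xi) binary fractional digits of Pi."""
    probs = sorted(probs, reverse=True)
    code, P = [], 0.0
    for p in probs:
        L = ceil(log2(1 / p))            # I(xi) <= L(xi) <= I(xi) + 1
        word, frac = "", P
        for _ in range(L):               # binary fractional digits of Pi
            frac *= 2
            bit, frac = divmod(frac, 1)
            word += str(int(bit))
        code.append(word)
        P += p                           # accumulated probability
    return code

print(shannon_code([0.22, 0.19, 0.15, 0.12, 0.08, 0.07, 0.07, 0.06, 0.04]))
# → ['000', '001', '011', '1000', '1010', '1100', '1101', '11100', '11110']
```

For probabilities that are exact powers of two, ceil(I(xi)) equals I(xi), so (15) is satisfied in all cases.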
Tree representation of a Shannon code.

Disadvantage: not all leaves of the tree are used for code words.
The lengths of the code words can be reduced ⇒ the Shannon code is not optimum.
- Redundancy of a code:   RC = ⟨L⟩ − H(X)
- Redundancy of a source: RQ = H0 − H(X)

Huffman coding:
- Recursive procedure
- Starting point: the symbols with the smallest probabilities
- Same code word lengths for the two symbols with the smallest probabilities
- Huffman coding minimizes the average code word length
Algorithm:
- Step 1: Order the symbols according to their probabilities
- Step 2: Assign 0 and 1 to the two symbols with the lowest probabilities
- Step 3: Combine the two symbols with the lowest probabilities, xN−1 and xN, into a new symbol with probability p(xN−1) + p(xN)
- Step 4: Repeat steps 1 - 3 until only one symbol is left

Example:
Step-by-step merging (ordered probabilities; the two smallest are combined and labeled 0 and 1):

1. x1 x2 x3 x4 x5 x6 x7 x8 x9
   0.22 0.19 0.15 0.12 0.08 0.07 0.07 0.06 0.04   → combine x8 and x9
2. x1 x2 x3 x4 {x8x9} x5 x6 x7
   0.22 0.19 0.15 0.12 0.10 0.08 0.07 0.07        → combine x6 and x7
3. x1 x2 x3 {x6x7} x4 {x8x9} x5
   0.22 0.19 0.15 0.14 0.12 0.10 0.08             → combine {x8x9} and x5
4. x1 x2 {x8x9x5} x3 {x6x7} x4
   0.22 0.19 0.18 0.15 0.14 0.12                  → combine {x6x7} and x4
5. {x6x7x4} x1 x2 {x8x9x5} x3
   0.26 0.22 0.19 0.18 0.15                       → combine {x8x9x5} and x3
6. {x8x9x5x3} {x6x7x4} x1 x2
   0.33 0.26 0.22 0.19                            → combine x1 and x2
7. {x1x2} {x8x9x5x3} {x6x7x4}
   0.41 0.33 0.26                                 → combine {x8x9x5x3} and {x6x7x4}
8. {x8x9x5x3x6x7x4} {x1x2}
   0.59 0.41                                      → final 0/1 assignment

Resulting code words (reading the 0/1 assignments from the last merge backwards):

  x8 → 00000, x9 → 00001, x5 → 0001, x3 → 001, x6 → 0100, x7 → 0101, x4 → 011, x1 → 10, x2 → 11
Tree structure of the Huffman code example: average code word length ⟨L⟩ = 3.01 bit.
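A minimal Huffman sketch that tracks only the code word lengths (tie-breaking may differ from the tables above, but the average length is the same for every Huffman code of a given source):

```python
import heapq
from math import log2

def huffman_lengths(probs):
    """Combine the two least probable entries repeatedly; each merge adds
    one bit to every symbol contained in the two merged groups."""
    heap = [(p, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)     # two smallest probabilities
        p2, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return lengths

probs = [0.22, 0.19, 0.15, 0.12, 0.08, 0.07, 0.07, 0.06, 0.04]
lengths = huffman_lengths(probs)
avg = sum(p * l for p, l in zip(probs, lengths))
H = sum(p * log2(1 / p) for p in probs)
print(round(avg, 2), round(H, 2))   # → 3.01 2.97
```

The Huffman average of 3.01 bit improves on the Shannon code's 3.54 bit for the same source and stays above the entropy H(X) = 2.97 bit, in line with (14).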
Discrete source without memory

Joint entropy of sequences of symbols. For two independent symbols, p(xi, yk) = p(xi) · p(yk):

  H(X,Y) = − Σ_{i=1}^{N} Σ_{k=1}^{N} p(xi, yk) · ld( p(xi, yk) )    (16)
         = − Σ_{i=1}^{N} Σ_{k=1}^{N} p(xi) · p(yk) · [ ld( p(xi) ) + ld( p(yk) ) ]
         = − Σ_{i=1}^{N} p(xi) · ld( p(xi) ) − Σ_{k=1}^{N} p(yk) · ld( p(yk) )
         = H(X) + H(Y)    (17)
M independent symbols from the same source:

  H(X1, X2, ..., XM) = M · H(X)    (18)

More efficient coding by coding of symbol sequences. Shannon's coding theorem applied to blocks of M symbols:

  H(X1, ..., XM) ≤ ⟨L_M(X1, ..., XM)⟩ ≤ H(X1, ..., XM) + 1
  M · H(X) ≤ M · ⟨L⟩ ≤ M · H(X) + 1
  H(X) ≤ ⟨L⟩ ≤ H(X) + 1/M    (19)

Disadvantage of coding symbol sequences: rapidly increasing computational effort (the number of combined symbols grows exponentially with M).
Example: coding of symbol sequences
- Binary source with X ∈ {x1, x2}
- Probabilities: p(x1) = 0.2, p(x2) = 0.8
- Entropy: H(X) = 0.7219 bit/symbol

Coding of single symbols:

  symbol  p(xi)  code  L(xi)  p(xi)·L(xi)
  x1      0.2    0     1      0.2
  x2      0.8    1     1      0.8
                              Σ = 1

Average code word length: ⟨L⟩ = 1 bit/symbol
Coding of pairs of symbols:

  pair   p(xi)  code  L(xi)  p(xi)·L(xi)
  x1x1   0.04   101   3      0.12
  x1x2   0.16   11    2      0.32
  x2x1   0.16   100   3      0.48
  x2x2   0.64   0     1      0.64
                             Σ = 1.56

Average code word length: ⟨L⟩ = 1.56 / 2 = 0.78 bit/symbol
Coding of triplets of symbols:

  triplet  p(xi)   code   L(xi)  p(xi)·L(xi)
  x1x1x1   0.008   11111  5      0.040
  x1x1x2   0.032   11100  5      0.160
  x1x2x1   0.032   11101  5      0.160
  x1x2x2   0.128   100    3      0.384
  x2x1x1   0.032   11110  5      0.160
  x2x1x2   0.128   101    3      0.384
  x2x2x1   0.128   110    3      0.384
  x2x2x2   0.512   0      1      0.512
                                 Σ = 2.184

Average code word length: ⟨L⟩ = 2.184 / 3 = 0.728 bit/symbol
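The approach of ⟨L⟩ toward H(X) = 0.7219 bit/symbol can be reproduced with any Huffman coder; a sketch using the fact that the average Huffman code word length equals the sum of all merged (internal node) probabilities:

```python
import heapq
from itertools import product
from math import prod

def huffman_avg_length(probs):
    """Average Huffman code word length: each merge of the two smallest
    probabilities contributes its sum to the expected length."""
    heap = list(probs)
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        p = heapq.heappop(heap) + heapq.heappop(heap)
        total += p
        heapq.heappush(heap, p)
    return total

for M in (1, 2, 3):
    # probability of each length-M block of independent symbols
    block_probs = [prod(t) for t in product([0.2, 0.8], repeat=M)]
    print(M, round(huffman_avg_length(block_probs) / M, 3))
# → 1 1.0
#   2 0.78
#   3 0.728
```

This matches (19): the per-symbol average length is squeezed between H(X) and H(X) + 1/M.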
Discrete source with memory

Real sources: correlation between individual symbols. For two correlated symbols, p(xi, yk) = p(xi) · p(yk | xi) = p(yk) · p(xi | yk):

  H(X,Y) = − Σ_{i=1}^{N} Σ_{k=1}^{N} p(xi, yk) · ld( p(xi, yk) )
         = − Σ_{i=1}^{N} Σ_{k=1}^{N} p(xi) · p(yk | xi) · [ ld( p(xi) ) + ld( p(yk | xi) ) ]
         = − Σ_{i=1}^{N} p(xi) · ld( p(xi) ) − Σ_{i=1}^{N} Σ_{k=1}^{N} p(xi, yk) · ld( p(yk | xi) )
         = H(X) + H(Y|X)    (20)
H(Y|X) = conditional entropy:

  H(Y|X) = − Σ_{i=1}^{N} Σ_{k=1}^{N} p(xi, yk) · ld( p(yk | xi) )    (21)

Entropy of a symbol ≥ conditional entropy:

  H(Y) ≥ H(Y|X)    (22)

Proof:

  H(Y) = − Σ_{k=1}^{N} p(yk) · ld( p(yk) ) = − Σ_{i=1}^{N} Σ_{k=1}^{N} p(xi, yk) · ld( p(yk) )

  H(Y|X) − H(Y) = Σ_{i=1}^{N} Σ_{k=1}^{N} p(xi, yk) · ld( p(yk) / p(yk | xi) )
Using the inequality ln x ≤ x − 1 ⇒ ld x ≤ (x − 1) / ln 2:

  H(Y|X) − H(Y) ≤ (1/ln 2) · Σ_{i=1}^{N} Σ_{k=1}^{N} p(xi, yk) · ( p(yk) / p(yk | xi) − 1 )

With p(xi, yk) = p(xi) · p(yk | xi):

  H(Y|X) − H(Y) ≤ (1/ln 2) · ( Σ_{i=1}^{N} Σ_{k=1}^{N} p(xi) · p(yk) − Σ_{i=1}^{N} Σ_{k=1}^{N} p(xi, yk) ) = (1/ln 2) · (1 − 1) = 0
Entropy of a source with memory < entropy of a source without memory ⇒ very efficient source coding is possible for sequences of symbols with memory.

General description of a discrete source with memory: the Markov source. Markov processes:
- Sequence of random variables z0, z1, z2, ..., zn (n = time axis)
- If the zi are statistically independent:

  f_{zn | zn−1, zn−2, ..., z0}(zn | zn−1, ..., z0) = f_{zn}(zn)    (23)
If the zi are statistically dependent (Markov process of m-th order):

  f_{zn | zn−1, ..., z0}(zn | zn−1, ..., z0) = f_{zn | zn−1, ..., zn−m}(zn | zn−1, ..., zn−m)    (24)

Often a first-order Markov process (m = 1) is used:

  f_{zn | zn−1, ..., z0}(zn | zn−1, ..., z0) = f_{zn | zn−1}(zn | zn−1)    (25)

If the zi take values from a limited number of discrete values, zi ∈ {x1, ..., xN} ⇒ Markov chain. A Markov chain is completely described by its transition probabilities:

  p(zn = xj | zn−1 = xi1, ..., zn−m = xim)    (26)
Homogeneous Markov chain: the transition probabilities do not depend on the time index n:

  p(zn = xj | zn−1 = xi1, ..., zn−m = xim) = p(zk = xj | zk−1 = xi1, ..., zk−m = xim)  for all n, k    (27)

Stationary Markov chain: the steady state does not depend on the initial probabilities:

  lim_{n→∞} p(zn = xj | z0 = xi) = lim_{n→∞} p(zn = xj) = p(xj) = wj    (28)

Homogeneous and stationary Markov chain of first order (m = 1):

  p(zn = xj | zn−1 = xi) = pij    (29)
Transition matrix:

  P = [ p11  p12  ...  p1N
        p21  p22  ...  p2N
        ...
        pN1  pN2  ...  pNN ]    (30)

Property of the transition matrix (each row sums to one):

  Σ_{j=1}^{N} pij = 1    (31)

Probability vector:

  w = ( w1  w2  ...  wN ) = ( p(x1)  p(x2)  ...  p(xN) )    (32)

Calculation of w utilizing the steady-state condition:

  w = w · P    (33)
Stationary Markov source: first-order Markov chain. Entropy of a stationary Markov source = steady-state entropy:

  H∞(Z) = lim_{n→∞} H(zn | zn−1, zn−2, ..., z0) = H(zn | zn−1)    (34)

  H∞(Z) = Σ_{i=1}^{N} wi · H(zn | zn−1 = xi)    (35)

with the per-state entropies

  H(zn | zn−1 = xi) = − Σ_{j=1}^{N} p(zn = xj | zn−1 = xi) · ld( p(zn = xj | zn−1 = xi) )    (36)

so that

  H∞(Z) = − Σ_{i=1}^{N} Σ_{j=1}^{N} wi · p(zn = xj | zn−1 = xi) · ld( p(zn = xj | zn−1 = xi) )    (37)
⇒ The entropy H∞(Z) is a conditional entropy, since the symbols from the past are already known:

  H∞(Z) = H(zn | zn−1) ≤ H(zn) ≤ H0(zn)    (38)

Coding of a Markov source: take the memory into account, e.g. by a Huffman coding that depends on the instantaneous state of the source.

Fundamental problem of variable-length source codes: catastrophic error propagation.
Example of a Markov source: states = transmitted symbols, zn ∈ {x1, x2, x3}.

Transition probabilities p(zn = xj | zn−1 = xi) = pij:

  zn−1 \ zn    x1    x2    x3
  x1           0.2   0.4   0.4
  x2           0.3   0.5   0.2
  x3           0.6   0.1   0.3

(State diagram: three states x1, x2, x3 with transitions labeled by these probabilities.)
Calculation of the steady-state probabilities with w = w · P and Σ_{i=1}^{N} wi = 1:

  w1 = 0.2 w1 + 0.3 w2 + 0.6 w3
  w2 = 0.4 w1 + 0.5 w2 + 0.1 w3
  w3 = 0.4 w1 + 0.2 w2 + 0.3 w3
  1  = w1 + w2 + w3

The three steady-state equations are linearly dependent, so the normalization condition is needed. Solution:

  w1 = 33/93 ≈ 0.3548,  w2 = 32/93 ≈ 0.3441,  w3 = 28/93 ≈ 0.3011
Calculation of the entropy with (35):

  H∞(Z) = Σ_{i=1}^{N} wi · H(zn | zn−1 = xi) = − Σ_{i=1}^{N} Σ_{j=1}^{N} wi · pij · ld( pij )    (39)

Inserting the numbers:

  H(zn | zn−1 = x1) = 0.2 · ld(1/0.2) + 0.4 · ld(1/0.4) + 0.4 · ld(1/0.4) ≈ 1.5219 bit/symbol
  H(zn | zn−1 = x2) = 0.3 · ld(1/0.3) + 0.5 · ld(1/0.5) + 0.2 · ld(1/0.2) ≈ 1.4855 bit/symbol
  H(zn | zn−1 = x3) = 0.6 · ld(1/0.6) + 0.1 · ld(1/0.1) + 0.3 · ld(1/0.3) ≈ 1.2955 bit/symbol
Entropy:

  H∞(Z) = w1 · H(zn | zn−1 = x1) + w2 · H(zn | zn−1 = x2) + w3 · H(zn | zn−1 = x3) ≈ 1.441 bit/symbol    (40)

For comparison:
- Without taking the memory into account: H(Z) = lim_{n→∞} H(zn) = Σ_{i=1}^{N} wi · ld(1/wi) ≈ 1.5814 bit/symbol
- H0 = ld 3 ≈ 1.5850 bit/symbol
State-dependent Huffman coding for the example: a separate coding for each state zn−1.

  zn−1    pij for zn = x1, x2, x3    code words for x1, x2, x3    ⟨L⟩|zn−1
  x1      0.2   0.4   0.4            11    10    0                1.6
  x2      0.3   0.5   0.2            10    0     11               1.5
  x3      0.6   0.1   0.3            0     11    10               1.4

Average code word length:

  ⟨L⟩ = ⟨L(zn = xj | zn−1 = xi)⟩ = Σ_{i=1}^{N} Σ_{j=1}^{N} wi · pij · L(zn = xj | zn−1 = xi) ≈ 1.5054 bit/symbol    (41)
Source coding without knowledge of statistical parameters

Run-length coding: a sequence of repeated symbols is replaced by the count and a single copy of the symbol.

Example:
- Source sequence:  aaaabbbccccccccdddddeeeeeaaaaaaabddddd...
- Encoded sequence: 4a3b8c5d5e7a1b5d...

Encoding with dictionaries: repetitions within the data sequence are replaced by (shorter) references to the dictionary.
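A run-length encoder in the form used by the example above can be sketched in one line of logic (counts are written as decimal digits here, so runs are assumed shorter than 10 for this illustration):

```python
from itertools import groupby

def rle(seq: str) -> str:
    """Run-length coding sketch: each run of identical symbols
    becomes count + symbol."""
    return "".join(f"{len(list(g))}{s}" for s, g in groupby(seq))

print(rle("aaaabbbccccccccdddddeeeeeaaaaaaabddddd"))
# → 4a3b8c5d5e7a1b5d
```

Note the weakness visible in the example: an isolated symbol ("b") still costs two output symbols ("1b"), so run-length coding only pays off for sources with long runs.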
- Static dictionary:
  - little compression for most data sources
  - poor matching of the dictionary to particular data
- Semi-adaptive dictionary:
  - dictionary tailored to the message to be encoded
  - the dictionary has to be transmitted via the channel
  - two passes over the data: one to build up the dictionary, one to encode the data
- Adaptive dictionary:
  - single pass for building up the dictionary and encoding
  - Lempel-Ziv algorithm: *.zip, *.gzip files
Lempel-Ziv algorithm (LZ77)

A sliding window consists of a search buffer (already encoded symbols) and a look-ahead buffer:

  search buffer                        look-ahead buffer
  ..... a b c b a a b c b d f e e  |  a a b a a c c d d d c d .....

Search for the longest match between the symbols of the look-ahead buffer and a sequence starting within the search buffer.

The fixed-length code words contain:
- position of the match (counting from 0)
- length of the match
- next symbol in the look-ahead buffer
Parameters:
- symbol alphabet: {x0, x1, x2, ..., x_{α−1}}
- input sequence: S = {S1, S2, S3, S4, ...}
- length of the look-ahead buffer: LS
- window size: n

Code words Ci = {pi, li, Si} with:
- position of the match: pi
- length of the match: li
- next symbol: Si

Length of the code words (the same alphabet is used for data and code words):

  LC = log_α(n − LS) + log_α(LS) + 1
Example:
- symbol alphabet: {0, 1, 2}
- input sequence: S = {0010102102102120210212001120...}
- length of the look-ahead buffer: LS = 9
- window size: n = 18  (so LC = 2 + 2 + 1 = 5 code word symbols)

Step 1:  window  0 0 0 0 0 0 0 0 0 | 0 0 1 0 1 0 2 1 0   (next input: 2 1 0 2 ...)
         C1 = {22 02 1}

Step 2:  window  0 0 0 0 0 0 0 0 1 | 0 1 0 2 1 0 2 1 0   (next input: 2 1 2 0 ...)
         C2 = {21 10 2}
Step 3:  window  0 0 0 0 1 0 1 0 2 | 1 0 2 1 0 2 1 2 0   (next input: 2 1 0 2 ...)
         C3 = {20 21 2}

Step 4:  window  2 1 0 2 1 0 2 1 2 | 0 2 1 0 2 1 2 0 0   (next input: 1 1 2 0 ...)
         C4 = {02 22 0}

Number of encoded source symbols after 4 steps: 3 + 4 + 8 + 9 = 24
Number of code word symbols after 4 steps: 4 × 5 = 20
Decoding:

  Step 1: C1 = {22 02 1}  known window 0 0 0 0 0 0 0 0 0  →  decoded symbols 0 0 1
  Step 2: C2 = {21 10 2}  known window 0 0 0 0 0 0 0 0 1  →  decoded symbols 0 1 0 2
  Step 3: C3 = {20 21 2}  known window 0 0 0 0 1 0 1 0 2  →  decoded symbols 1 0 2 1 0 2 1 2
  Step 4: C4 = {02 22 0}  known window 2 1 0 2 1 0 2 1 2  →  decoded symbols 0 2 1 0 2 1 2 0 0
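The encoder for this example can be sketched under the slide's conventions: a pre-filled search buffer of size n − LS, match positions counted from 0 within the window, matches allowed to run into the look-ahead buffer, and ties resolved toward the largest position (the tie-breaking rule is an assumption chosen to reproduce the slide's code words):

```python
def lz77_encode(s: str, n: int = 18, ls: int = 9, fill: str = "0"):
    """LZ77 sketch following the slide's example conventions."""
    search = fill * (n - ls)
    out, pos = [], 0
    while pos < len(s):
        look = s[pos:pos + ls]
        window = search + look
        best_p, best_l = 0, 0
        for p in range(n - ls):          # match must start in the search buffer
            l = 0
            while l < len(look) - 1 and window[p + l] == look[l]:
                l += 1                   # match may extend into the look-ahead
            if l >= best_l:              # prefer the largest position on ties
                best_p, best_l = p, l
        out.append((best_p, best_l, look[best_l]))
        consumed = best_l + 1            # match length + explicit next symbol
        search = (search + s[pos:pos + consumed])[-(n - ls):]
        pos += consumed
    return out

codes = lz77_encode("0010102102102120210212001120")
print(codes[:4])
# → [(8, 2, '1'), (7, 3, '2'), (6, 7, '2'), (2, 8, '0')]
# Written as two base-3 digits per number, these are the slide's code words
# C1 = {22 02 1}, C2 = {21 10 2}, C3 = {20 21 2}, C4 = {02 22 0}.
```

Decoding runs the same window in reverse: copy best_l symbols starting at best_p, append the explicit next symbol, and slide.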
Transmission of information via a discrete memoryless channel

- Noiseless channel: information at the output = information at the input ⇒ transmitted information = entropy of the source
- Noisy channel: information at the output < information at the input ⇒ transmitted information < entropy of the source

Definition: average mutual information = the information that is actually transmitted.
Discrete memoryless channel (DMC):
- Input signal: X ∈ {x1, x2, ..., x_NX}
- Output signal: Y ∈ {y1, y2, ..., y_NY}

(Diagram: every input symbol xi is connected to every output symbol yj with transition probability pij.)
Transition matrix:

  P = ( p(Y = yj | X = xi) ) = ( pij ) = [ p11    p12    ...  p1NY
                                           p21    p22    ...  p2NY
                                           ...
                                           pNX1   pNX2   ...  pNXNY ]    (42a)

with

  Σ_{j=1}^{NY} pij = 1    (42b)

Input probabilities:  pX = ( p(x1), p(x2), ..., p(xNX) )
Output probabilities: pY = ( p(y1), p(y2), ..., p(yNY) )

Relation between input and output probabilities:

  pY = pX · P    (42c)
Chain of two channels with transition matrices P1 and P2 (X → Y → Z):

Output probabilities:

  pZ = (pX · P1) · P2 = pX · (P1 · P2)    (45)

Resulting transition matrix:

  P = P1 · P2    (46)
Binary channel:

  P = [ p11  p12
        p21  p22 ]    (47)

Probabilities at input and output:

  ( p(y1), p(y2) ) = ( p(x1), p(x2) ) · [ p11  p12
                                          p21  p22 ]

Error probability of a binary channel:

  p(error) = p(x1) · p(y2 | x1) + p(x2) · p(y1 | x2) = p(x1) · p12 + p(x2) · p21    (48)
Binary symmetric channel (BSC)
$$\mathbf{P} = \begin{pmatrix} 1-p_{\text{err}} & p_{\text{err}} \\ p_{\text{err}} & 1-p_{\text{err}} \end{pmatrix} \quad (49)$$
Error probability:
$$p(\text{error}) = p(x_1)\,p_{\text{err}} + p(x_2)\,p_{\text{err}} = \big[p(x_1) + p(x_2)\big]\,p_{\text{err}} = p_{\text{err}} \quad (50)$$
[Figure: BSC diagram; $x_1 \to y_1$ and $x_2 \to y_2$ with probability $1-p_{\text{err}}$, cross transitions with probability $p_{\text{err}}$]
Example for mutual information, qualitative consideration:
Transmission of 1000 binary, statistically independent and equiprobable symbols (p(0) = p(1) = 0.5)
Binary symmetric channel with perr = 0.01
Average number of correctly received symbols: 990
But: T(X,Y) < 0.99 bit/symbol
Reason: the exact positions of the errors are not known
Definitions of entropy at a discrete memoryless channel:
Input entropy = average information content of the input symbols:
$$H(X) = \sum_{i=1}^{N_X} p(x_i) \operatorname{ld} \frac{1}{p(x_i)} \quad (51a)$$
Output entropy = average information content of the output symbols:
$$H(Y) = \sum_{j=1}^{N_Y} p(y_j) \operatorname{ld} \frac{1}{p(y_j)} \quad (51b)$$
Joint entropy = average information content (uncertainty) of the whole transmission system:
$$H(X,Y) = \sum_{i=1}^{N_X} \sum_{j=1}^{N_Y} p(x_i,y_j) \operatorname{ld} \frac{1}{p(x_i,y_j)} \quad (51c)$$
Conditional entropy H(Y|X) = average information content at the output for known input symbols = entropy of the irrelevance:
$$H(Y|X) = \sum_{i=1}^{N_X} \sum_{j=1}^{N_Y} p(x_i,y_j) \operatorname{ld} \frac{1}{p(y_j|x_i)} \quad (51d)$$
Conditional entropy H(X|Y) = average information content at the input for known output symbols = entropy of the information that is lost in the channel = entropy of the equivocation:
$$H(X|Y) = \sum_{i=1}^{N_X} \sum_{j=1}^{N_Y} p(x_i,y_j) \operatorname{ld} \frac{1}{p(x_i|y_j)} \quad (51e)$$
Relations between different types of entropy:
$$H(X,Y) = H(Y,X) = H(X) + H(Y|X) = H(Y) + H(X|Y) \quad (53)$$
$$H(X|Y) \le H(X) \quad (54)$$
$$H(Y|X) \le H(Y) \quad (55)$$
Average information flow (average mutual information):
$$T(X,Y) = H(X) - H(X|Y) \quad (56a)$$
$$\phantom{T(X,Y)} = H(Y) - H(Y|X) \quad (56b)$$
$$\phantom{T(X,Y)} = H(X) + H(Y) - H(X,Y) \quad (56c)$$
Information flow
[Figure: source $H(U)$ → source encoder $H(X)$ → transmission channel → $H(Y)$ → source decoder → sink $H(\hat{U})$; the equivocation $H(X|Y)$ branches off and the irrelevance $H(Y|X)$ is added in the channel; the information flow through the channel is $T(X,Y)$]
Examples, information flow:
Ideal noiseless channel:
$$p_{ij} = \begin{cases} 1 & \text{for } i = j \\ 0 & \text{for } i \ne j \end{cases} \quad (57)$$
Entropies: H(X|Y) = 0, H(Y|X) = 0, H(X,Y) = H(X) = H(Y), T(X,Y) = H(X) = H(Y) (58)
Useless, very noisy channel:
$$p(x_i,y_j) = p(x_i) \cdot p(y_j) = p(y_j|x_i) \cdot p(x_i) \;\Rightarrow\; p(y_j|x_i) = p(y_j) \;\Rightarrow\; p_{ij} = p_{kj} \quad (59)$$
Entropies: H(X|Y) = H(X), H(Y|X) = H(Y), H(X,Y) = H(X) + H(Y), T(X,Y) = 0 (60)
Example for mutual information, quantitative consideration:
Transmission of 1000 binary, statistically independent and equiprobable symbols (p(0) = p(1) = 0.5)
Binary symmetric channel with perr = 0.01
$$T(X,Y) = H(Y) - H(Y|X) = \sum_{j=1}^{N_Y} p(y_j) \operatorname{ld}\frac{1}{p(y_j)} - \sum_{i=1}^{N_X}\sum_{j=1}^{N_Y} p(x_i)\,p(y_j|x_i) \operatorname{ld}\frac{1}{p(y_j|x_i)}$$
$$= 1 - \big(p(0)+p(1)\big)\left[(1-p_{\text{err}})\operatorname{ld}\frac{1}{1-p_{\text{err}}} + p_{\text{err}}\operatorname{ld}\frac{1}{p_{\text{err}}}\right]$$
$$= 1 - S(p_{\text{err}}) \approx 0.9192\ \text{bit/symbol} \quad (61)$$
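The numerical value in eq. (61) can be reproduced with a few lines of Python; $S(p)$ is the binary entropy function used on the slide:

```python
from math import log2

def S(p):
    """Binary entropy function: S(p) = p*ld(1/p) + (1-p)*ld(1/(1-p))."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

# Equiprobable input symbols, BSC with p_err = 0.01, eq. (61)
T = 1.0 - S(0.01)
print(round(T, 4))  # 0.9192
```

So although 99 % of the symbols arrive correctly, only about 0.919 bit of information is transmitted per symbol, because the error positions are unknown.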
Channel capacity
The average mutual information (average information flow) depends on the probability of the source symbols.
Definition of channel capacity:
$$C = \frac{1}{\Delta T} \max_{p(x_i)} T(X,Y) \quad (62)$$
Channel capacity = maximum average mutual information flow
∆T = symbol period
Unit for channel capacity: bit/s
C depends on properties of the channel; it does not depend on the source!
Definitions of the average information flow:
Average information flow = entropy / time:
$$H'(X) = H(X) / \Delta T \quad (63)$$
Average mutual information flow = average mutual information / time:
$$T'(X,Y) = T(X,Y) / \Delta T \quad (64)$$
Decisions for selecting symbols / time:
$$H_0'(X) = H_0(X) / \Delta T \quad (65)$$
Example: binary symmetric channel (BSC), with $p_1 = p(x_1)$:
$$T(X,Y) = H(Y) - H(Y|X)$$
$$= (p_1 + p_{\text{err}} - 2 p_1 p_{\text{err}}) \operatorname{ld}\frac{1}{p_1 + p_{\text{err}} - 2 p_1 p_{\text{err}}} + (1 - p_1 - p_{\text{err}} + 2 p_1 p_{\text{err}}) \operatorname{ld}\frac{1}{1 - p_1 - p_{\text{err}} + 2 p_1 p_{\text{err}}}$$
$$\quad - \left[ p_{\text{err}} \operatorname{ld}\frac{1}{p_{\text{err}}} + (1-p_{\text{err}}) \operatorname{ld}\frac{1}{1-p_{\text{err}}} \right] \quad (66)$$
Average mutual information
[Figure: $T(X,Y)$ in bit/symbol versus $p(x_1)$ for $p_{\text{err}} = 0,\ 0.1,\ 0.2,\ 0.3,\ 0.5$]
Maximum for $p_1 = p(x_1) = 0.5$
Channel capacity
$$C \cdot \Delta T = \max_{p(x_i)} T(X,Y) = 1 - \left[ p_{\text{err}} \operatorname{ld}\frac{1}{p_{\text{err}}} + (1-p_{\text{err}}) \operatorname{ld}\frac{1}{1-p_{\text{err}}} \right] = 1 - S(p_{\text{err}}) \quad (67)$$
[Figure: $C \cdot \Delta T$ in bit versus $p_{\text{err}}$ for the BSC]
Example: binary erasure channel (BEC)
$$\mathbf{P} = \begin{pmatrix} 1-p_{\text{err}} & p_{\text{err}} & 0 \\ 0 & p_{\text{err}} & 1-p_{\text{err}} \end{pmatrix} \quad (68)$$
$$C \cdot \Delta T = 1 - p_{\text{err}} \quad (69)$$
[Figure: $C \cdot \Delta T$ in bit versus $p_{\text{err}}$ for the BEC]
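Equations (67) and (69) can be compared directly; a small sketch computing both capacities per symbol period (the function names are chosen for illustration):

```python
from math import log2

def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def c_bsc(p_err):
    """Eq. (67): C * dT = 1 - S(p_err) for the binary symmetric channel."""
    return 1.0 - binary_entropy(p_err)

def c_bec(p_err):
    """Eq. (69): C * dT = 1 - p_err for the binary erasure channel."""
    return 1.0 - p_err

# For the same p_err the erasure channel carries more information:
# an erasure tells the receiver where information was lost.
for p in (0.0, 0.1, 0.5):
    print(p, round(c_bsc(p), 4), round(c_bec(p), 4))
```

For every error probability the BEC capacity lies above the BSC capacity; at $p_{\text{err}} = 0.5$ the BSC is useless while the BEC still carries half a bit per symbol.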
Theorem of channel capacity (Shannon 1948)
For any ε > 0 and any information flow R of a source smaller than the channel capacity C (R < C), a binary block code of length n (n sufficiently large) can be found, so that the residual error probability after decoding in the receiver is smaller than ε.
Reverse statement: even with the largest coding effort the residual error probability cannot decrease below a certain limit for R > C.
Proof of the theorem uses random block codes (random coding argument):
Proof for an average over all codes
Most known codes are bad codes
No rule for construction of codes
Channel capacity: optimum for infinite code word length ⇒ infinite time delay, infinite complexity
More practical theorem related to the theorem of channel capacity by definition of Gallager's error exponent for a DMC with NX input symbols:
$$E_{\text{G}}(R_{\text{C}}) = \max_{0 \le s \le 1}\ \max_{p(x_i)} \left[ -s\,R_{\text{C}} - \operatorname{ld} \sum_{j=1}^{N_Y} \left( \sum_{i=1}^{N_X} p(x_i)\, p(y_j|x_i)^{\frac{1}{1+s}} \right)^{1+s} \right] \quad (70)$$
There always exists an (n,k) block code with $R_{\text{C}} = \frac{k}{n} \operatorname{ld} N_X < C\,\Delta T$, so that the word error probability is bounded by:
$$P_{\text{w}} < 2^{-n \cdot E_{\text{G}}(R_{\text{C}})} \quad (71)$$
Gallager's error exponent
Properties:
EG(RC) > 0 for RC < C·∆T
EG(RC) = 0 for RC ≥ C·∆T
Definition of R0 (computational cut-off rate):
R0 = EG(RC = 0) (72)
[Figure: $E_{\text{G}}(R_{\text{C}})$ versus $R_{\text{C}}$; the curve starts at $R_0$ for $R_{\text{C}} = 0$ and reaches zero at $R_{\text{C}} = C \cdot \Delta T$]
Computational cut-off rate R0:
The maximum of EG(RC = 0) lies at s = 1:
$$R_0 = E_{\text{G}}(R_{\text{C}} = 0) = \max_{p(x_i)} \left[ -\operatorname{ld} \sum_{j=1}^{N_Y} \left( \sum_{i=1}^{N_X} p(x_i)\, \sqrt{p(y_j|x_i)} \right)^{2} \right] \quad (73)$$
Comparison for s = 1:
$$E_{\text{G}}(R_{\text{C}}) \ge \max_{p(x_i)} \left[ -\operatorname{ld} \sum_{j=1}^{N_Y} \left( \sum_{i=1}^{N_X} p(x_i)\, \sqrt{p(y_j|x_i)} \right)^{2} \right] - R_{\text{C}} = R_0 - R_{\text{C}} \quad (74)$$
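For the binary symmetric channel with equiprobable inputs (which achieve the maximum for this symmetric channel), eq. (73) reduces to $R_0 = 1 - \operatorname{ld}\big(1 + 2\sqrt{p_{\text{err}}(1-p_{\text{err}})}\big)$. A minimal sketch under that assumption:

```python
from math import log2, sqrt

def r0_bsc(p_err):
    """Eq. (73) for a BSC with equiprobable inputs:
    R0 = -ld sum_j (sum_i p(x_i) * sqrt(p(y_j|x_i)))^2
       = 1 - ld(1 + 2*sqrt(p_err*(1 - p_err)))."""
    inner = 0.5 * sqrt(1 - p_err) + 0.5 * sqrt(p_err)
    total = 2 * inner ** 2          # two output symbols, same inner sum
    return -log2(total)

print(round(r0_bsc(0.0), 4))  # 1.0 (noiseless channel)
print(round(r0_bsc(0.5), 4))  # 0.0 (useless channel)
```

Between these extremes $R_0$ stays below the channel capacity $C \cdot \Delta T$, as the comparison plot on the later slide shows.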
R0-theorem
There always exists an (n,k) block code with $R_{\text{C}} = \frac{k}{n} \operatorname{ld} N_X < C\,\Delta T$, so that the word error probability (for maximum likelihood decoding) is bounded by:
$$P_{\text{w}} < 2^{-n (R_0 - R_{\text{C}})} \quad (75)$$
No rules for construction of good codes
Range of values for the code rate:
0 ≤ RC ≤ R0: PW is bounded by (75)
R0 ≤ RC ≤ C ∆T: an upper bound for PW is difficult to calculate
RC > C ∆T: PW cannot become arbitrarily small
Comparison between channel capacity and computational cut-off rate R0:
[Figure: $R_0$ and $C\,\Delta T$ in bit/symbol versus $p_{\text{err}}$ for the binary symmetric channel; $R_0$ lies below $C\,\Delta T$]
Coding Theory: Channel coding in digital communication systems
Goals: principles and examples for block codes
Design of code words:
Binary code words
Redundant codes
Code C = set of all code words
Code word c = (c0, c1, ..., cn−1) with c ∈ C
Encoding is a memoryless assignment:
information word u = (u0, u1, ..., uk−1) → code word c = (c0, c1, ..., cn−1)
k information bits, n code bits, n ≥ k
Identical codes: codes with the same code words
Equivalent codes: codes which become identical after a permutation of bits
General notation: (n,k,dmin)q block code
q = number of symbols (size of alphabet)
Code rate:
$$R_{\text{C}} = \frac{k}{n} \le 1 \quad (76)$$
Number of code words: $N = q^k = 2^k$ for binary codes (77)
Systematic codes: code word c = (u, p)
with m = n − k parity-check bits p; the information bits are copied into the code word:
u0 u1 u2 u3 ... uk−1
↓  ↓  ↓  ↓      ↓
c0 c1 c2 c3 ... ck−1 | ck ck+1 ... cn−1   (78)
Non-systematic codes: information and parity-check bits cannot be separated
Linear block codes can always be converted into equivalent systematic codes
Addition and multiplication in a binary number system (modulo 2):

⊕ | 0 1        ⊗ | 0 1
0 | 0 1        0 | 0 0
1 | 1 0        1 | 0 1

Two binary vectors x and y with the same length
Hamming distance: dH(x,y) = number of different bits for x and y
Example: dH( 0 1 1 1 0 1 0 1, 1 0 1 0 0 1 0 1 ) = 3
(Hamming) weight of a vector x:
$$w_{\text{H}}(\mathbf{x}) = \sum_{i=0}^{n-1} x_i = \text{number of bits different from } 0 \quad (79)$$
Example: wH(0 1 1 1 0 1 0 1) = 5
Hamming distance: dH(x,y) = wH(x + y) (80)
Example: x = ( 0 1 1 1 0 1 0 1 ), y = ( 1 0 1 0 0 1 0 1 )
x + y = ( 1 1 0 1 0 0 0 0 ), wH(x + y) = 3
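The definitions (79) and (80) translate directly into code. A minimal sketch in Python, reproducing the examples from the slide (function names chosen for illustration):

```python
def w_hamming(x):
    """Hamming weight: number of bits different from 0, eq. (79)."""
    return sum(x)

def d_hamming(x, y):
    """Hamming distance: number of positions where x and y differ,
    equal to the weight of the mod-2 sum x + y, eq. (80)."""
    return sum(xi ^ yi for xi, yi in zip(x, y))

x = (0, 1, 1, 1, 0, 1, 0, 1)
y = (1, 0, 1, 0, 0, 1, 0, 1)
print(w_hamming(x))     # 5
print(d_hamming(x, y))  # 3
```

The XOR in `d_hamming` is exactly the modulo-2 addition ⊕ from the table above.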
Transmission via a binary channel:
y = x + e, e = error vector (81)
[Figure: x → ⊕ → y with the error vector e added at the adder]
Repetition code
k = 1 information bit → n − 1 repetitions
2^k = 2 code words: c1 = (0 0 0 ... 0) and c2 = (1 1 1 ... 1)
The repetition code is a systematic code
Simple decoding by majority vote (especially if n is odd)
(n − 1)/2 errors can be corrected, n − 1 errors can be detected
Example for a repetition code: n = 5 ⇒ RC = 1/5
u1 = (0) → c1 = (0 0 0 0 0) and u2 = (1) → c2 = (1 1 1 1 1)
Transmission via a noisy channel:
x1 = (1 1 1 1 1), e1 = (0 1 0 0 1) → y1 = x1 + e1 = (1 0 1 1 0) → û1 = (1)
x2 = (0 0 0 0 0), e2 = (1 1 0 1 0) → y2 = x2 + e2 = (1 1 0 1 0) → û2 = (1)
Two errors can be corrected, four detected
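The majority-vote decoding of the example can be sketched in a few lines (encoder and decoder names are illustrative). Note that the second received word carries three errors, which exceeds the correction capability t = 2, so the majority vote decides wrongly:

```python
def rep_encode(u, n=5):
    """Repetition code: repeat the single information bit n times."""
    return [u] * n

def rep_decode(y):
    """Majority vote (n odd): decide for the more frequent bit."""
    return 1 if sum(y) > len(y) // 2 else 0

x1 = rep_encode(1)                                      # (1 1 1 1 1)
y1 = [xi ^ ei for xi, ei in zip(x1, [0, 1, 0, 0, 1])]   # two errors
x2 = rep_encode(0)                                      # (0 0 0 0 0)
y2 = [xi ^ ei for xi, ei in zip(x2, [1, 1, 0, 1, 0])]   # three errors
print(rep_decode(y1))  # 1 -> two errors corrected
print(rep_decode(y2))  # 1 -> three errors: wrong decision
```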
Parity check code
k information bits, a single parity check bit (m = 1) → n = k + 1
2^k code words
c = (u, p) with p = u0 + u1 + u2 + ... + uk−1 (82)
(the number of ones in a code word c is even)
The parity check code is a systematic code.
No error can be corrected; an odd number of errors can be detected.
Parity check:
s0 = y0 + y1 + y2 + ... + yn−1 = 0 → no error
s0 = y0 + y1 + y2 + ... + yn−1 = 1 → error
Example: k = 3

code word | information bits | parity check bit
c0 | 000 | 0
c1 | 001 | 1
c2 | 010 | 1
c3 | 011 | 0
c4 | 100 | 1
c5 | 101 | 0
c6 | 110 | 0
c7 | 111 | 1

y1 = (0 1 1 0) → no error
y2 = (1 1 1 0) → error
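The encoding rule (82) and the parity check can be sketched directly (function names are illustrative):

```python
def parity_encode(u):
    """Append one parity bit so that the number of ones is even, eq. (82)."""
    return list(u) + [sum(u) % 2]

def parity_syndrome(y):
    """s0 = y0 + y1 + ... + y_{n-1} (mod 2): 0 -> no error detected."""
    return sum(y) % 2

print(parity_encode([0, 1, 1]))       # [0, 1, 1, 0] -> code word c3
print(parity_syndrome([0, 1, 1, 0]))  # 0 -> no error
print(parity_syndrome([1, 1, 1, 0]))  # 1 -> error detected
```

An even number of errors leaves the parity unchanged, which is why only odd error counts are detected.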
Hamming code
A single error in a code word can be corrected.
Example: (7,4) Hamming code
u0 u1 u2 u3
↓  ↓  ↓  ↓
c0 c1 c2 c3 | c4 c5 c6
Parity check bits:
c4 = c0 + c1 + c2 (83a)
c5 = c0 + c1 + c3 (83b)
c6 = c0 + c2 + c3 (83c)
Matrix representation of block codes
x = u G (84)
G = generator matrix (k × n matrix)
Systematic block codes:
G = [Ik P] (85)
Ik = identity matrix k × k, P = parity bit matrix k × (n − k)
Example: (7,4) Hamming code
$$\mathbf{G} = \begin{pmatrix} 1 & 0 & 0 & 0 & | & 1 & 1 & 1 \\ 0 & 1 & 0 & 0 & | & 1 & 1 & 0 \\ 0 & 0 & 1 & 0 & | & 1 & 0 & 1 \\ 0 & 0 & 0 & 1 & | & 0 & 1 & 1 \end{pmatrix} \quad (86)$$
(7,4) Hamming code continued
Calculation of code words by matrix multiplication:
$$\mathbf{x} = \mathbf{u}\,\mathbf{G} = (1\ 0\ 0\ 1) \begin{pmatrix} 1 & 0 & 0 & 0 & 1 & 1 & 1 \\ 0 & 1 & 0 & 0 & 1 & 1 & 0 \\ 0 & 0 & 1 & 0 & 1 & 0 & 1 \\ 0 & 0 & 0 & 1 & 0 & 1 & 1 \end{pmatrix} = (1\ 0\ 0\ 1\ |\ 1\ 0\ 0)$$
Number of code words: 2^k = 16
Number of all possible receiver input words: 2^n = 128
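The matrix multiplication x = u G over GF(2) can be sketched as follows, using the generator matrix (86); the parity bits agree with eqs. (83a) to (83c):

```python
# (7,4) Hamming code in systematic form, G = [I4 | P], eq. (86).
G = [
    [1, 0, 0, 0, 1, 1, 1],
    [0, 1, 0, 0, 1, 1, 0],
    [0, 0, 1, 0, 1, 0, 1],
    [0, 0, 0, 1, 0, 1, 1],
]

def encode(u, G):
    """Code word x = u * G over GF(2), eq. (84)."""
    n = len(G[0])
    return [sum(u[i] * G[i][j] for i in range(len(u))) % 2
            for j in range(n)]

print(encode([1, 0, 0, 1], G))  # [1, 0, 0, 1, 1, 0, 0] -> code word c9
```

Because the code is systematic, the first four bits of the result are the information word itself.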
Table of code words

code word | information bits | parity check bits
c0 | 0000 | 000
c1 | 0001 | 011
c2 | 0010 | 101
c3 | 0011 | 110
c4 | 0100 | 110
c5 | 0101 | 101
c6 | 0110 | 011
c7 | 0111 | 000
c8 | 1000 | 111
c9 | 1001 | 100
c10 | 1010 | 010
c11 | 1011 | 001
c12 | 1100 | 001
c13 | 1101 | 010
c14 | 1110 | 100
c15 | 1111 | 111
Properties of linear block codes
Each code word is a linear combination of rows of G.
The code is given by all linear combinations of rows of G.
The sum of two code words is again a code word.
Any linear block code contains the all-zeros vector (0 0 0 ... 0).
⇒ A code is linear if it can be constructed by matrix multiplication x = u G (u = arbitrary information vector).
Generator matrix: row vectors must be linearly independent.
Elementary row operations do not change a code:
Interchanging two rows
Multiplication of a row with a scalar (different from 0)
Addition of a row to a second row
The minimum distance between two code words equals the minimum weight:
$$d_{\min} = \min\big\{ d_{\text{H}}(\mathbf{c}_1,\mathbf{c}_2) \mid \mathbf{c}_1,\mathbf{c}_2 \in C;\ \mathbf{c}_1 \neq \mathbf{c}_2 \big\} = \min\big\{ w_{\text{H}}(\mathbf{c}) \mid \mathbf{c} \in C;\ \mathbf{c} \neq \mathbf{0} \big\} = w_{\min} \quad (87)$$
Proof:
$$d_{\min} = \min\big\{ d_{\text{H}}(\mathbf{c}_1,\mathbf{c}_2) \mid \mathbf{c}_1,\mathbf{c}_2 \in C;\ \mathbf{c}_1 \neq \mathbf{c}_2 \big\} = \min\big\{ d_{\text{H}}(\mathbf{c}_1+\mathbf{c}_2,\mathbf{0}) \mid \mathbf{c}_1,\mathbf{c}_2 \in C;\ \mathbf{c}_1 \neq \mathbf{c}_2 \big\}$$
$$= \min\big\{ d_{\text{H}}(\mathbf{c},\mathbf{0}) \mid \mathbf{c} \in C;\ \mathbf{c} \neq \mathbf{0} \big\} = \min\big\{ w_{\text{H}}(\mathbf{c}) \mid \mathbf{c} \in C;\ \mathbf{c} \neq \mathbf{0} \big\} = w_{\min}$$
Error correction for a (7,4) Hamming code:
Evaluation of parity check equations:
s0 = y0 + y1 + y2 + y4 (88a)
s1 = y0 + y1 + y3 + y5 (88b)
s2 = y0 + y2 + y3 + y6 (88c)
Syndrome: s = (s0 s1 s2)
The syndrome does not depend on the code word, only on the error.
Table for the assignment of error positions:

syndrome (s0 s1 s2) | meaning
0 0 0 | no error
1 1 1 | error at bit no. 0
1 1 0 | error at bit no. 1
1 0 1 | error at bit no. 2
0 1 1 | error at bit no. 3
1 0 0 | error at bit no. 4
0 1 0 | error at bit no. 5
0 0 1 | error at bit no. 6
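Syndrome decoding of the (7,4) Hamming code can be sketched with the parity check equations (88a) to (88c) and the table above (function names are illustrative):

```python
# Syndrome decoding for the (7,4) Hamming code, eqs. (88a)-(88c).
def syndrome(y):
    s0 = (y[0] + y[1] + y[2] + y[4]) % 2
    s1 = (y[0] + y[1] + y[3] + y[5]) % 2
    s2 = (y[0] + y[2] + y[3] + y[6]) % 2
    return (s0, s1, s2)

# Table from the slide: syndrome -> error position
ERROR_POS = {(1, 1, 1): 0, (1, 1, 0): 1, (1, 0, 1): 2, (0, 1, 1): 3,
             (1, 0, 0): 4, (0, 1, 0): 5, (0, 0, 1): 6}

def correct(y):
    s = syndrome(y)
    if s == (0, 0, 0):
        return list(y)              # no error
    y = list(y)
    y[ERROR_POS[s]] ^= 1            # add a "1" at the error position
    return y

x = [1, 0, 0, 1, 1, 0, 0]           # code word c9
y = x.copy(); y[2] ^= 1             # single error at bit no. 2
print(syndrome(y))                  # (1, 0, 1)
print(correct(y) == x)              # True
```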
Decoding of Hamming codes:
Evaluation of parity check equations → syndrome
Error position read from the syndrome table
Error correction: add a "1" at the error position
m = n − k parity check bits assign 2^m − 1 error positions and the error-free transmission
Code word length: n = 2^m − 1 (89)
Possible parameters for Hamming codes: (2^m − 1, 2^m − 1 − m) block code

m | 2 | 3 | 4 | 5 | 6 | 7 | 8
n | 3 | 7 | 15 | 31 | 63 | 127 | 255
k | 1 | 4 | 11 | 26 | 57 | 120 | 247
Parity check matrix
Definition: c H^T = 0 for all c ∈ C and x H^T ≠ 0 for all x ∉ C (90)
Properties of the parity check matrix:
0 = c H^T = (u G) H^T = u (G H^T) → G H^T = 0 (91)
→ Generator matrix and parity check matrix are orthogonal.
Elementary row operations for H are allowed.
Parity check matrix: H = (P^T I_{n−k}) with G = [Ik P] (92)
Dimensions of the parity check matrix H: (n − k) × n
Proof of orthogonality:
$$\mathbf{G}\,\mathbf{H}^T = \big(\mathbf{I}_k\ \mathbf{P}\big) \begin{pmatrix} \mathbf{P} \\ \mathbf{I}_{n-k} \end{pmatrix} = \mathbf{I}_k\,\mathbf{P} + \mathbf{P}\,\mathbf{I}_{n-k} = \mathbf{P} + \mathbf{P} = \mathbf{0} \quad (93)$$
Dual code
Code C: generator matrix G, parity check matrix H
Dual code Cd: generator matrix Gd = H, parity check matrix Hd = G
Code words of both codes are orthogonal:
with c = u G ∈ C and cd = v H ∈ Cd it follows:
$$\mathbf{c}\,\mathbf{c}_{\text{d}}^T = \mathbf{u}\,\mathbf{G}\,(\mathbf{v}\,\mathbf{H})^T = \mathbf{u}\,(\mathbf{G}\,\mathbf{H}^T)\,\mathbf{v}^T = \mathbf{u}\,\mathbf{0}\,\mathbf{v}^T = 0 \quad (94)$$
Example for dual codes:
Repetition code:
$$\mathbf{G} = \big(1 \ |\ 1\ 1\ \cdots\ 1\big), \qquad \mathbf{H} = \begin{pmatrix} 1 & 1 & 0 & \cdots & 0 \\ 1 & 0 & 1 & & \vdots \\ \vdots & \vdots & & \ddots & 0 \\ 1 & 0 & \cdots & 0 & 1 \end{pmatrix} \quad (95)$$
Parity check code:
$$\mathbf{H} = \big(1 \ |\ 1\ 1\ \cdots\ 1\big), \qquad \mathbf{G} = \begin{pmatrix} 1 & 0 & \cdots & 0 & 1 \\ 0 & 1 & & \vdots & 1 \\ \vdots & & \ddots & 0 & \vdots \\ 0 & \cdots & 0 & 1 & 1 \end{pmatrix} \quad (96)$$
Calculation of the syndrome using a matrix representation:
s = y H^T (97)
Properties of the syndrome:
The syndrome is the all-zeros vector only if y is a code word.
All error vectors e are detected which are not code words.
The syndrome does not depend on the code word:
s = y H^T = (x + e) H^T = e H^T (98)
Example: (7,4) Hamming code
$$\mathbf{G} = \begin{pmatrix} 1&0&0&0&1&1&1 \\ 0&1&0&0&1&1&0 \\ 0&0&1&0&1&0&1 \\ 0&0&0&1&0&1&1 \end{pmatrix}, \qquad \mathbf{P} = \begin{pmatrix} 1&1&1 \\ 1&1&0 \\ 1&0&1 \\ 0&1&1 \end{pmatrix}$$
$$\mathbf{H} = (\mathbf{P}^T\ \mathbf{I}_{n-k}) = \begin{pmatrix} 1&1&1&0&1&0&0 \\ 1&1&0&1&0&1&0 \\ 1&0&1&1&0&0&1 \end{pmatrix}$$
Example: (7,4) Hamming code
$$\mathbf{s} = \mathbf{y}\,\mathbf{H}^T = (y_0\ y_1\ \cdots\ y_6) \begin{pmatrix} 1&1&1 \\ 1&1&0 \\ 1&0&1 \\ 0&1&1 \\ 1&0&0 \\ 0&1&0 \\ 0&0&1 \end{pmatrix} \;\Leftrightarrow\; \begin{aligned} s_0 &= y_0 + y_1 + y_2 + y_4 \\ s_1 &= y_0 + y_1 + y_3 + y_5 \\ s_2 &= y_0 + y_2 + y_3 + y_6 \end{aligned}$$
Design of Hamming codes
The syndrome depends only on the error vector: s = e H^T
A single error at position i (ei = 1) leads to a syndrome which equals the corresponding row of H^T.
⇒ All rows of H^T have to be different, so that the error position can be determined uniquely.
No row of H^T may be the all-zeros vector.
⇒ The rows of H^T / columns of H are built up by all possible sequences except the all-zeros vector.
Parity check matrix of a systematic Hamming code:
$$\mathbf{H} = (\mathbf{P}^T\ \mathbf{I}_{n-k}) \quad (99)$$
where P^T contains all possible column vectors with more than one "1".
Example: (15,11) Hamming code (100)
H = (P^T | I4), where the 11 columns of P^T are the 4-bit column vectors with more than one "1":
0011, 0101, 0110, 0111, 1001, 1010, 1011, 1100, 1101, 1110, 1111
and the last four columns form the identity matrix I4.
Modifications of linear codes
Extending of codes: additional parity check bits
n' > n, k' = k, m' > m, RC' < RC, dmin' ≥ dmin
Puncturing of codes: reducing the number of parity check bits
n' < n, k' = k, m' < m, RC' > RC, dmin' ≤ dmin
Lengthening of codes: additional information bits
n' > n, k' > k, m' = m, RC' > RC, dmin' ≤ dmin
Shortening of codes: reducing the number of information bits
n' < n, k' < k, m' = m, RC' < RC, dmin' ≥ dmin
Example: Hamming code
A single error can be corrected by syndrome decoding. Two errors lead to: s ≠ 0
Extended Hamming code:
Detection of a double error by an additional parity check bit
(2^m, 2^m − 1 − m) block code
Generator matrix of the extended Hamming code:
$$\mathbf{G}_{\text{H,ext}} = \begin{pmatrix} & & \sigma_0 \\ \mathbf{G}_{\text{H}} & & \vdots \\ & & \sigma_{k-1} \end{pmatrix} \quad (101)$$
Parity check matrix of the extended Hamming code:
$$\mathbf{H}_{\text{H,ext}} = \begin{pmatrix} \mathbf{H}_{\text{H}} & \mathbf{0}^T \\ 1\ \cdots\ 1 & 1 \end{pmatrix}$$
σi are the sums of the rows of G: σi = gi,0 + gi,1 + ... + gi,n−1
Error events:
No error: s = 0
A single error: s ≠ 0, sm+1 = 1
Two errors: s ≠ 0, sm+1 = 0
Decoding:
s = 0: receiver input vector = code word
s ≠ 0, sm+1 = 1: odd number of errors ⇒ a single error is assumed and is corrected by evaluation of the syndrome
s ≠ 0, sm+1 = 0: even number of errors ⇒ errors cannot be corrected
Error correction and error detection
Minimum distance between code words:
$$d_{\min} = \min\big\{ d_{\text{H}}(\mathbf{c}_1,\mathbf{c}_2) \mid \mathbf{c}_1,\mathbf{c}_2 \in C;\ \mathbf{c}_1 \neq \mathbf{c}_2 \big\}$$
dmin = 1: a single error can neither be corrected nor detected
dmin = 2: at least a single error can be detected
dmin = 3: at least a single error can be corrected and at least two errors can be detected
Error correction and error detection
Number of errors that can be detected: te = dmin − 1 (102)
Number of errors that can be corrected:
If dmin is even: t = (dmin − 2) / 2 (103)
If dmin is odd: t = (dmin − 1) / 2 (104)
[Figure: decoding spheres for dmin = 3 and dmin = 4]
Singleton bound: minimum distance and minimum weight of a linear code are limited by:
$$d_{\min} = w_{\min} \le n - k + 1 = m + 1 \quad (105)$$
Proof: consider a systematic code word with a single information bit unequal 0:
weight / distance in information bits: dH,information = 1
weight / distance in parity check bits: dH,parity ≤ m = n − k
A code for which equality in (105) holds is called maximum distance separable (MDS).
Error detection (102): te + 1 = dmin ≤ 1 + m ⇒ m ≥ te (106)
At least one parity check bit is needed per error to be detected.
Error correction (103), (104): (dmin − 1) / 2 ≥ t ≥ (dmin − 2) / 2
2 t + 1 ≤ dmin ≤ 1 + m ⇒ m ≥ 2 t (107)
At least two parity check bits are needed per error to be corrected.
Decoding sphere
n-dimensional sphere around a code word with radius t; all vectors within the decoding spheres are decoded as the respective code word.
Total number of vectors within decoding spheres ≤ total number of all possible vectors ⇒
Hamming bound
For a binary (n,k) block code with error correction capability t the following relation holds:
$$2^k \sum_{i=0}^{t} \binom{n}{i} \le 2^n \quad (108)$$
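The Hamming bound (108) is easy to evaluate numerically; a minimal sketch (function names chosen for illustration), which also tests for equality, i.e. for a perfect code:

```python
from math import comb

def hamming_bound_ok(n, k, t):
    """Eq. (108): 2^k * sum_{i=0}^{t} C(n, i) <= 2^n."""
    return 2**k * sum(comb(n, i) for i in range(t + 1)) <= 2**n

def is_perfect(n, k, t):
    """Equality in (108): every received word lies in a decoding sphere."""
    return 2**k * sum(comb(n, i) for i in range(t + 1)) == 2**n

print(hamming_bound_ok(7, 4, 1))  # True
print(is_perfect(7, 4, 1))        # True: 16 * (1 + 7) = 128 = 2^7
print(is_perfect(6, 3, 1))        # False: 8 * 7 = 56 < 64
```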
Proof:
Number of vectors around a code word with dH = 1: $\binom{n}{1}$
Number of vectors around a code word with dH = 2: $\binom{n}{2}$
Number of vectors around a code word with dH = t:
$$\binom{n}{t} = \frac{n\,(n-1)\cdots(n-(t-1))}{t\,(t-1)\cdots 1} \quad (109)$$
Total number of vectors within decoding spheres:
$$2^k \left[ \binom{n}{0} + \binom{n}{1} + \binom{n}{2} + \cdots + \binom{n}{t} \right] \le 2^n$$
Perfect codes
In (108) equality holds. All received words fall into decoding spheres. Only a few perfect codes are known.
Example: (7,4) Hamming code
dmin = wmin = 3 ⇒ t = 1
$$2^4 \left[ \binom{7}{0} + \binom{7}{1} \right] = 16 \cdot (1 + 7) = 128 = 2^7$$
In general: Hamming codes are perfect.
Plotkin bound
For a binary (n,k) block code with minimum distance dmin the following relation holds:
$$d_{\min} \le \frac{n \cdot 2^{k-1}}{2^k - 1} \quad (110)$$
Approximation: $d_{\min} \le \frac{n}{2}$ for $2^k \gg 1$
Proof: minimum weight ≤ average weight
Average weight of any bit of a code word: 1/2
Average weight of a code word of length n: n/2
Average weight without the all-zeros vector:
$$\frac{n}{2} \cdot \frac{2^k}{2^k - 1}$$
Gilbert-Varshamov bound
There is a binary (n,k) block code with a minimum distance dmin if the parameters n, k, dmin are related by:
$$\sum_{i=0}^{d_{\min}-2} \binom{n-1}{i} < 2^{n-k} \quad (111)$$
Bound for the existence of a code; it does not give a construction rule.
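Evaluating (111) numerically shows for which parameter triples a code is guaranteed to exist. A small sketch (the function name is illustrative):

```python
from math import comb

def gv_code_exists(n, k, d_min):
    """Gilbert-Varshamov bound, eq. (111): a binary (n,k) block code with
    minimum distance d_min exists if sum_{i=0}^{d_min-2} C(n-1,i) < 2^(n-k)."""
    return sum(comb(n - 1, i) for i in range(d_min - 1)) < 2**(n - k)

print(gv_code_exists(7, 4, 3))  # True: 1 + 6 = 7 < 8, the (7,4) Hamming code exists
print(gv_code_exists(7, 4, 5))  # False: no existence guarantee from the bound
```

Note that a `False` result only means the bound gives no guarantee, not that no such code exists.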
Proof: parity check matrix
$$\mathbf{H} = \begin{pmatrix} h_{1,1} & h_{1,2} & \cdots & h_{1,n} \\ h_{2,1} & h_{2,2} & \cdots & h_{2,n} \\ \vdots & & & \vdots \\ h_{n-k,1} & h_{n-k,2} & \cdots & h_{n-k,n} \end{pmatrix} = \big( \mathbf{h}_{S1}\ \mathbf{h}_{S2}\ \cdots\ \mathbf{h}_{Sn} \big)$$
c is a code word:
$$\mathbf{c}\,\mathbf{H}^T = \mathbf{0} \;\Rightarrow\; \sum_{i=1}^{n} c_i\,\mathbf{h}_{Si} = \mathbf{0}$$
For each code word there are at least wmin = dmin ones
⇒ dmin columns of H depend linearly on each other
⇒ (dmin − 1) columns of H are linearly independent
A column hSj and (dmin − 2) additional columns have to be linearly independent.
This is fulfilled if the column hSj can take more different values than can be created by linear combinations of the (dmin − 2) additional columns.
Number of different columns hSj: $2^{n-k}$
Number of linear combinations with exactly i columns out of (n − 1) columns: $\binom{n-1}{i}$
Total number of linear combinations for 0 ... (dmin − 2) columns:
$$\sum_{i=0}^{d_{\min}-2} \binom{n-1}{i} < 2^{n-k}$$
Bounds for large code word lengths (n → ∞): relation between code rate RC and normalized distance dmin/n (S denotes the binary entropy function):
Singleton bound:
$$R_{\text{C}} \le 1 - \frac{d_{\min}}{n} \quad (112)$$
Hamming bound:
$$R_{\text{C}} \le 1 - S\!\left(\frac{1}{2} \cdot \frac{d_{\min}}{n}\right) \quad (113)$$
Plotkin bound:
$$R_{\text{C}} \le 1 - 2 \cdot \frac{d_{\min}}{n} \quad (114)$$
Elias bound:
$$R_{\text{C}} \le 1 - S\!\left(\frac{1}{2} - \frac{1}{2}\sqrt{1 - 2\,\frac{d_{\min}}{n}}\right) \quad (115)$$
Gilbert-Varshamov bound:
$$R_{\text{C}} \ge 1 - S\!\left(\frac{d_{\min}}{n}\right) \quad (116)$$
Asymptotic bounds (n → ∞) between code rate RC = k/n and normalized distance dmin/n:
[Figure: RC versus dmin/n; Singleton, Plotkin, Elias and Hamming as upper bounds, Gilbert-Varshamov as lower bound]
Weight distribution
Number of code words with weight i: Ai
Weight function, two representations:
$$A(z) = \sum_{i=0}^{n} A_i z^i = W(1, z) \quad (117)$$
$$W(x, y) = \sum_{i=0}^{n} A_i\, x^{n-i} y^i = x^n\, A\!\left(\frac{y}{x}\right) \quad (118)$$
The weight function can be calculated analytically only for a few codes.
Weight distribution of linear codes:
A0 = 1; An ≤ 1
Ai = 0 for 0 < i < dmin
Some codes have a symmetric weight distribution: Ai = An−i
Example: (4,3) parity check code

code word | information word | parity check bit | weight
c0 | 000 | 0 | 0
c1 | 001 | 1 | 2
c2 | 010 | 1 | 2
c3 | 011 | 0 | 2
c4 | 100 | 1 | 2
c5 | 101 | 0 | 2
c6 | 110 | 0 | 2
c7 | 111 | 1 | 4

$$A(z) = 1 + 6 z^2 + z^4$$
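The weight distribution of this small code can be verified by enumerating all code words:

```python
from itertools import product
from collections import Counter

# (4,3) parity check code: enumerate all code words and count weights.
codewords = [list(u) + [sum(u) % 2] for u in product((0, 1), repeat=3)]
weights = Counter(sum(c) for c in codewords)
print(sorted(weights.items()))  # [(0, 1), (2, 6), (4, 1)] -> A(z) = 1 + 6z^2 + z^4
```

For larger codes this brute-force enumeration over all $2^k$ code words quickly becomes infeasible, which is why the slide notes that the weight function can be calculated analytically only for a few codes.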
The weight distribution allows an exact calculation of the residual error probability.
MacWilliams identity
Given: (n,k) block code with weight function A(z)
Weight function of the dual code:
$$A_{\text{d}}(z) = 2^{-k} \cdot (1+z)^n \cdot A\!\left(\frac{1-z}{1+z}\right), \qquad W_{\text{d}}(x,y) = 2^{-k} \cdot W(x+y,\ x-y) \quad (119)$$
respectively:
$$A(z) = 2^{-(n-k)} \cdot (1+z)^n \cdot A_{\text{d}}\!\left(\frac{1-z}{1+z}\right), \qquad W(x,y) = 2^{-(n-k)} \cdot W_{\text{d}}(x+y,\ x-y) \quad (120)$$
Example: Code C = {000, 011, 101, 110} (parity check code)

generator matrix:
G = ( 1 0 1
      0 1 1 )

weight function: A(z) = 1 + 3 z^2

Dual code C_d = {000, 111} (repetition code)

generator matrix: G_d = ( 1 1 1 )

weight function:
A_d(z) = 2^{−2} · (1 + z)^3 · [1 + 3 ((1 − z)/(1 + z))^2] = 1 + z^3
Channel coding and decoding

Strategies for decoding. Goal: minimum word error probability

p_W = p(û ≠ u) = p(x̂ ≠ x)    (121)

[Block diagram: u → channel encoder → x → transmission channel (error vector e) → y → channel decoder; the channel decoder consists of a code word estimator (y → x̂) followed by the assignment of information words (x̂ → û).]
Possible transmission results:

y = x: error-free transmission
  decoder result: x̂ = x, û = u

y ≠ x, y ∈ C: error into a different code word – cannot be corrected
  decoder result: x̂ ≠ x, û ≠ u

y ≠ x, y ∉ C: erroneous transmission – error can be detected and may be corrected
  decoder result: x̂ = x or x̂ ≠ x, û = u or û ≠ u
Possible decoding results

Correct decoding: x̂ = x, û = u
  error-free channel, or errors are corrected in the right way

Erroneous decoding: x̂ ≠ x, û ≠ u
  an error was corrected in a wrong way

Decoder failure:
  an error cannot be corrected, since the decoder does not find a solution
Word error probability for a DMC:

Assumption: transmitted code words are equiprobable.

p_W = p(x̂ ≠ x) = Σ_{x ∈ C} p(x̂ ≠ x | x sent) · p(x sent)    (122)

Probability of an error:

p(x̂ ≠ x | x sent) = Σ_{y: x̂ ≠ x} p(y received | x sent) = Σ_{y: x̂ ≠ x} p(y | x)    (123)
Word error probability

p_W = 2^{−k} · Σ_{x ∈ C} Σ_{y: x̂ ≠ x} p(y | x)
    = 2^{−k} · Σ_{x ∈ C} [ Σ_y p(y | x) − Σ_{y: x̂ = x} p(y | x) ]
    = 1 − 2^{−k} · Σ_{x ∈ C} Σ_{y: x̂ = x} p(y | x)
    = 1 − 2^{−k} · Σ_y p(y | x̂)    (124)

x̂ has to be chosen such that p(y | x̂) becomes maximum.
Optimum decision strategies

Knowledge of the receiver: y
Strategy of the receiver: maximize the a-posteriori probability p(x|y)

MAP (maximum a posteriori probability) decoder:
search for the code word x̂ = x which maximizes p(x|y):

p(x̂ | y) ≥ p(x | y) for all x ∈ C    (125)
Bayes theorem:

p(x | y) = p(y | x) · p(x) / p(y)    (126)

Assumption: equiprobable information / code words (identical a-priori probabilities): p(x_i) = 2^{−k}

⇒ p(x | y) = k_0(y) · p(y | x)

p(y | x) = likelihood function. A-posteriori probability and likelihood function exhibit their maximum at the same position.

Maximum likelihood decoder: search for the code word x̂ = x which maximizes p(y | x):

p(y | x̂) ≥ p(y | x) for all x ∈ C    (127)
The maximum likelihood decoder minimizes the word error probability.

Maximum likelihood decoder for a BSC:

p(y | x) = Π_{i=0}^{n−1} p(y_i | x_i)  with  p(y_i | x_i) = 1 − p_err for y_i = x_i and p_err for y_i ≠ x_i

⇒ p(y | x) = p_err^{d_H(y,x)} · (1 − p_err)^{n − d_H(y,x)}    (128)

The estimation result is the code word with the minimum Hamming distance to the received vector:

d_H(y, x̂) ≤ d_H(y, x) for all x ∈ C    (129)
Example: Code C = {c1, c2, c3, c4} = {00000, 11100, 00111, 11011}, d_min = 3

Example for maximum likelihood decoding:

y     | dH(y,c1) | dH(y,c2) | dH(y,c3) | dH(y,c4) | x̂
00111 |    3     |    4     |    0     |    3     | c3
10000 |    1     |    2     |    4     |    3     | c1
11000 |    2     |    1     |    5     |    2     | c2
10001 |    2     |    3     |    3     |    2     | c1 or c4
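Minimum-distance decoding for this code can be sketched in a few lines of Python (illustrative only; names are not from the slides):

```python
# Maximum likelihood decoding over a BSC = minimum Hamming distance decoding
C = ["00000", "11100", "00111", "11011"]

def hamming_distance(a, b):
    return sum(x != y for x, y in zip(a, b))

def ml_decode(y, code):
    # return all code words at minimum distance (ties are ambiguous)
    d = {c: hamming_distance(y, c) for c in code}
    dmin = min(d.values())
    return [c for c in code if d[c] == dmin]

print(ml_decode("00111", C))  # ['00111']
print(ml_decode("10001", C))  # ['00000', '11011']  (tie: c1 or c4)
```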
Maximum likelihood decoding (MLD): optimum method
Large computational effort; number of comparisons of vectors: 2^k
More than t errors may be corrected

Bounded distance decoding (BDD)
Sphere around each code word with radius t0
Decoding only those received vectors y which lie within a single sphere
Received vectors y which do not lie in a sphere, or lie in more than one sphere, are not decoded
Bounded minimum distance decoding (BMD)
Sphere around each code word with radius t
Decoding only those received vectors y which lie within spheres
Decoding spheres are disjoint

[Figure: decoding regions of the code words for MLD, BDD and BMD.]
More suboptimum decoding schemes:

Syndrome decoding
  simple realization
  suboptimum, since not all available information is utilized

Majority vote
Probability of errors in code words

Transmission channel: binary symmetric (memoryless) channel with error probability p_err

Average number of errors in a code word:

n_error = n · p_err    (130)

Probability of no error (n factors):

p(e = 0) = (1 − p_err) · (1 − p_err) · ... · (1 − p_err) = (1 − p_err)^n    (131)

Probability of a single error at a certain position (e = e1, w_H(e1) = 1):

p(e = e1) = p_err · (1 − p_err)^{n−1}
Probability of a single error at any arbitrary position (e = e1, w_H(e1) = 1):

p(w_H(e) = 1) = n · p_err · (1 − p_err)^{n−1}    (132)

Probability of n_e errors at arbitrary positions (e = e_ne, w_H(e_ne) = n_e), with C(n, k) denoting the binomial coefficient:

p(w_H(e) = n_e) = C(n, n_e) · p_err^{n_e} · (1 − p_err)^{n − n_e}    (133)

Binomial coefficients:

C(n, k) = n! / (k! · (n − k)!);  C(n, 0) = C(n, n) = 1;  C(n, k) = C(n, n − k)    (134)

Binomial theorem:

(a + b)^n = Σ_{k=0}^{n} C(n, k) a^k b^{n−k};  Σ_{k=0}^{n} C(n, k) = 2^n    (135)
Binomial distribution of bit errors per code word for n = 7

[Figure: p(w_H(e) = n_e) versus n_e for p_err = 0.01, 0.1 and 0.3.]
Error detection: undetected errors

Linear block code C = {c0, c1, c2, ..., c_{2^k−1}} with c_ν = (c_{ν,0}, c_{ν,1}, ..., c_{ν,n−1}) and c0 = 0

An error remains undetected if e ∈ C and e ≠ 0.

Probability of undetected errors p_ue:

p_ue = p(e ∈ C ∩ e ≠ 0) = Σ_{ν=1}^{2^k−1} p(e = c_ν)    (136)

Undetected error: e = c_ν
⇒ e_j = c_{ν,j} = 0 at n − w_H(c_ν) bit positions, with p(e_j = 0) = 1 − p_err
⇒ e_j = c_{ν,j} = 1 at w_H(c_ν) bit positions, with p(e_j = 1) = p_err
Probability of undetected errors p_ue:

p_ue = Σ_{ν=1}^{2^k−1} p(e = c_ν) = Σ_{ν=1}^{2^k−1} p_err^{w_H(c_ν)} · (1 − p_err)^{n − w_H(c_ν)}    (137)

Summation versus the weight of the code words:

p_ue = Σ_{i=d_min}^{n} A_i · p_err^i · (1 − p_err)^{n−i}    (138)

With A(z) = 1 + Σ_{i=d_min}^{n} A_i z^i it results:

p_ue = (1 − p_err)^n · [ A(p_err / (1 − p_err)) − 1 ]    (139)
Approximation for small error probabilities:

p_ue ≈ Σ_{i=d_min}^{n} A_i · p_err^i ≈ A_{d_min} · p_err^{d_min}    (140)

Example: (7,4) Hamming code

Weight function: A(z) = 1 + 7 z^3 + 7 z^4 + z^7,  d_min = 3

p_ue = 7 (1 − p_err)^4 p_err^3 + 7 (1 − p_err)^3 p_err^4 + p_err^7
     = 7 p_err^3 − 21 p_err^4 + 21 p_err^5 − 7 p_err^6 + p_err^7
     ≈ 7 p_err^3
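The exact expression (138) and the approximation (140) can be compared numerically for the (7,4) Hamming code (a sketch, not from the slides):

```python
# Undetected error probability of the (7,4) Hamming code (Eq. 138)
# weight distribution A(z) = 1 + 7 z^3 + 7 z^4 + z^7
A = {0: 1, 3: 7, 4: 7, 7: 1}

def p_undetected(A, n, perr):
    return sum(Ai * perr**i * (1 - perr) ** (n - i)
               for i, Ai in A.items() if i > 0)

perr = 1e-3
exact = p_undetected(A, 7, perr)
approx = 7 * perr**3          # small-perr approximation (Eq. 140)
print(exact, approx)          # the two values agree to within about 0.5 %
```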
Residual error probability

Linear block code C = {c0, c1, c2, ..., c_{2^k−1}} with error correction capability t

Assumption: bounded minimum distance decoding (BMD)

Word error probability:

p_W,BMD = 1 − p(correct decoding)
        = 1 − p(w_H(e) ≤ t) = p(w_H(e) ≥ t + 1)
        = 1 − Σ_{i=0}^{t} p(w_H(e) = i) = Σ_{i=t+1}^{n} p(w_H(e) = i)    (141)
Probability of error vectors with weight w_H(e) = i (Eq. (133)):

p(w_H(e) = i) = C(n, i) · p_err^i · (1 − p_err)^{n−i}    (142)

Word error probability with BMD:

p_W,BMD = 1 − Σ_{i=0}^{t} C(n, i) p_err^i (1 − p_err)^{n−i} = Σ_{i=t+1}^{n} C(n, i) p_err^i (1 − p_err)^{n−i}    (143)

Estimate for small error probabilities:

p_W,BMD < C(n, t+1) · p_err^{t+1}    (144)

Word error probability for MLD:

p_W,MLD ≤ p_W,BMD    (145)
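Eq. (143) and the small-p_err estimate can be checked numerically (an illustrative sketch; the (7,4) Hamming code with t = 1 is used as example):

```python
from math import comb

def p_word_bmd(n, t, perr):
    # word error probability under BMD decoding (Eq. 143)
    return sum(comb(n, i) * perr**i * (1 - perr) ** (n - i)
               for i in range(t + 1, n + 1))

# (7,4) Hamming code: t = 1
perr = 1e-2
pw = p_word_bmd(7, 1, perr)
bound = comb(7, 2) * perr**2       # estimate for small perr
print(pw, bound)                   # pw lies slightly below 21e-4
```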
Estimate for the bit error probability

p_bit = (1/k) · E[number of bit errors per decoded word]
      = (1/k) · Σ_{i=0}^{n} E[number of bit errors per decoded word | w_H(e) = i] · p(w_H(e) = i)    (146)

The number of bit errors is limited to the number of information bits k and to i + t, since

d_H(û, u) ≤ d_H(x̂, x) ≤ d_H(x, y) + d_H(y, x̂),  with d_H(x, y) = i and d_H(y, x̂) ≤ t (BMD)    (147)

⇒ p_bit ≤ (1/k) · Σ_{i=t+1}^{n} min(k, i + t) · p(w_H(e) = i)    (148)
Estimate for the bit error probability:

p_bit ≤ Σ_{i=t+1}^{n} min(1, (i + t)/k) · C(n, i) · p_err^i · (1 − p_err)^{n−i}    (149)

Estimate for small error probabilities:

p_bit ≤ min(1, d_min/k) · C(n, t+1) · p_err^{t+1}    (150)
Example: (7,4) Hamming code:

p_W = 1 − C(7,0) · p_err^0 · (1 − p_err)^7 − C(7,1) · p_err^1 · (1 − p_err)^6
    = 1 − (1 − p_err)^7 − 7 p_err (1 − p_err)^6
    ≈ 1 − (1 − 7 p_err + 21 p_err^2) − 7 p_err (1 − 6 p_err)
    = 21 p_err^2    (151)

Equality in (145) holds since the code is perfect.
Word error probability for Hamming codes:

[Figure: lg p_W versus lg p_err for Hamming codes with n = 3, 7, 15, 31, 63, 127; reference line p_W = p_err.]
Word error probability and probability of undetected errors for Hamming codes:

[Figure: lg p_W and lg p_ue versus lg p_err for n = 3; reference lines p_W = p_err, p_W = p_err^2 and p_W = p_err^3.]
Minimum distance of Hamming codes

All columns of the parity check matrix are different
⇒ any two columns are linearly independent

Three columns are linearly dependent, e.g. 100000...., 010000.... and 110000....
⇒ d_min = 3, t = 1

Weight function:

A(z) = 1/(n + 1) · [ (1 + z)^n + n · (1 + z)^{(n−1)/2} · (1 − z)^{(n+1)/2} ]    (152)
Products of two vectors

x = (x0, x1, ..., x_{n−1}),  y = (y0, y1, ..., y_{n−1})

Scalar product:

x · y^T = x0 y0 + x1 y1 + x2 y2 + ... + x_{n−1} y_{n−1}    (153)

Elementwise vector product:

x × y = (x0 y0, x1 y1, x2 y2, ..., x_{n−1} y_{n−1})    (154)
Sum construction of codes

Given: linear (n, k1, d_min,1) block code C1 and linear (n, k2, d_min,2) block code C2

Sum construction: linear (2n, k1 + k2, d_min) block code C

C = C1 & C2 = { c = (c1, c1 + c2) with c1 ∈ C1 and c2 ∈ C2 }    (155)

Generator matrix:

G = ( G1  G1
       0  G2 )    (156)

Distance:
⇒ d_min = min(2 d_min,1, d_min,2)
(156)
Prof. Dr.-Ing. Andreas Czylwik Coding TheorySS 2011
p. 163Fachgebiet Nachrichtentechnische Systeme
N T SUNIVERSITÄT
D U I S B U R GE S S E N
Coding Theory Channel coding in digital communication systems
Proof:
⇒ dmin ≤ min(2dmin,1,dmin,2)
for c2 = 0 and c1 ≠ 0 : wH((c1,c1)) = 2 wH(c1) ≥ 2 dmin,1
for c2 ≠ 0 and c1 ≠ 0 :
wH((c1,c1 + c2)) = wH(c1) + wH(c1 + c2)
≥ wH(c1 + (c1 + c2)) = wH(c2) ≥ dmin,2
⇒ dmin ≥ min(2dmin,1,dmin,2)
CC ∈∈ ),0(and),( 211 ccc
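The distance result d_min = min(2 d_min,1, d_min,2) can be verified exhaustively for small codes (a sketch, not from the slides; C1 is the (3,2,2) parity check code, C2 the (3,1,3) repetition code):

```python
from itertools import product

def all_codewords(G):
    # all linear combinations of the rows of G over GF(2)
    k, n = len(G), len(G[0])
    words = set()
    for u in product([0, 1], repeat=k):
        c = tuple(sum(u[i] * G[i][j] for i in range(k)) % 2 for j in range(n))
        words.add(c)
    return words

def sum_construction(G1, G2):
    # G = [G1 G1; 0 G2]  (Eq. 156)
    n = len(G1[0])
    return [row + row for row in G1] + [[0] * n + row for row in G2]

def dmin(G):
    return min(sum(c) for c in all_codewords(G) if any(c))

G1 = [[1, 1, 0], [1, 0, 1]]   # (3,2,2) parity check code
G2 = [[1, 1, 1]]              # (3,1,3) repetition code
G = sum_construction(G1, G2)
print(dmin(G))   # min(2*2, 3) = 3
```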
Reed-Muller codes

Sum construction; parameters r, m with 0 ≤ r ≤ m

(n, k, d_min) Reed-Muller code RM(r, m) with

n = 2^m,  k = Σ_{i=0}^{r} C(m, i),  d_min = 2^{m−r}    (157)

Recursion procedure:

RM(r + 1, m + 1) = RM(r + 1, m) & RM(r, m)    (158)

Start values: RM(0, m) = (2^m, 1, 2^m) repetition code
              RM(m, m) = (2^m, 2^m, 1) uncoded
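The recursion (158) with the given start values can be turned into a short recursive construction of a Reed-Muller generator matrix (a sketch, not part of the slides):

```python
def rm_generator(r, m):
    # recursive construction RM(r+1, m+1) = RM(r+1, m) & RM(r, m)  (Eq. 158)
    if r == 0:
        return [[1] * 2**m]                      # repetition code RM(0, m)
    if r == m:
        # uncoded RM(m, m): identity matrix of size 2^m
        return [[int(i == j) for j in range(2**m)] for i in range(2**m)]
    A = rm_generator(r, m - 1)
    B = rm_generator(r - 1, m - 1)
    n = 2 ** (m - 1)
    return [row + row for row in A] + [[0] * n + row for row in B]

G = rm_generator(1, 4)
print(len(G), len(G[0]))   # k = 5, n = 16 for RM(1, 4)
```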
Example for the construction of the generator matrix of the Reed-Muller codes RM(r, m):

RM(1, 2) = RM(1, 1) & RM(0, 1)

Generator matrix of RM(0, 1): ( 1 1 );  generator matrix of RM(1, 1): ( 1 0
                                                                        0 1 )

⇒ generator matrix of RM(1, 2):

G = ( 1 0 1 0
      0 1 0 1
      0 0 1 1 )    (159)

Alternative construction:

G = ( G_0
      G_1
      ...
      G_r )    (160)
G0 = all-ones vector of length n = 2^m: G0 = (1 1 1 1 ... 1)

G1 = m × 2^m matrix: the columns contain all possible binary words of length m

G_l: elementwise vector products of all combinations of l row vectors of G1

Each row of G corresponds to an information bit

⇒ k = 1 + C(m, 1) + C(m, 2) + ... + C(m, r) = Σ_{i=0}^{r} C(m, i)    (161)
Example: m = 4, r = 3 ⇒ n = 16

G0 = ( 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 )    (162)

     ( g_Z1 )   ( 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 )
G1 = ( g_Z2 ) = ( 0 0 0 0 1 1 1 1 0 0 0 0 1 1 1 1 )
     ( g_Z3 )   ( 0 0 1 1 0 0 1 1 0 0 1 1 0 0 1 1 )
     ( g_Z4 )   ( 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 )    (163)
     ( g_Z1 × g_Z2 )   ( 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 )
     ( g_Z1 × g_Z3 )   ( 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 1 )
G2 = ( g_Z1 × g_Z4 ) = ( 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 )
     ( g_Z2 × g_Z3 )   ( 0 0 0 0 0 0 1 1 0 0 0 0 0 0 1 1 )
     ( g_Z2 × g_Z4 )   ( 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 1 )
     ( g_Z3 × g_Z4 )   ( 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 1 )    (164)

     ( g_Z1 × g_Z2 × g_Z3 )   ( 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 )
G3 = ( g_Z1 × g_Z2 × g_Z4 ) = ( 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 )
     ( g_Z1 × g_Z3 × g_Z4 )   ( 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 )
     ( g_Z2 × g_Z3 × g_Z4 )   ( 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 )    (165)
Alternative construction of the example: m = 2, r = 1 ⇒ n = 4

Alternative construction:          Sum construction:
G = ( 1 1 1 1                      G = ( 1 0 1 0
      0 0 1 1                            0 1 0 1
      0 1 0 1 )                          0 0 1 1 )

Both generator matrices generate the same code, the (4,3) even-weight code:
C = {0000, 0011, 0101, 0110, 1001, 1010, 1100, 1111}
(the mapping u → x differs between the two matrices)
Minimum distance: calculation using the results of the sum construction

d_min = min(2 d_min,1, d_min,2)    (166)

d_min(RM(r + 1, m + 1)) = min(2 d_min(RM(r + 1, m)), d_min(RM(r, m)))    (167)

Table: d_min(RM(r, m)) = 2^{m−r}
m | n  | r = 0         | r = 1        | r = 2        | r = 3        | r = 4        | r = 5
0 | 1  | k=1, dmin=1   |              |              |              |              |
1 | 2  | k=1, dmin=2   | k=2, dmin=1  |              |              |              |
2 | 4  | k=1, dmin=4   | k=3, dmin=2  | k=4, dmin=1  |              |              |
3 | 8  | k=1, dmin=8   | k=4, dmin=4  | k=7, dmin=2  | k=8, dmin=1  |              |
4 | 16 | k=1, dmin=16  | k=5, dmin=8  | k=11, dmin=4 | k=15, dmin=2 | k=16, dmin=1 |
5 | 32 | k=1, dmin=32  | k=6, dmin=16 | k=16, dmin=8 | k=26, dmin=4 | k=31, dmin=2 | k=32, dmin=1
Decoding by majority votes: example m = 4, r = 1 ⇒ n = 16, d_min = 8 ⇒ t = 3

    ( G0 )   ( 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 )
G = ( G1 ) = ( 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 )
             ( 0 0 0 0 1 1 1 1 0 0 0 0 1 1 1 1 )
             ( 0 0 1 1 0 0 1 1 0 0 1 1 0 0 1 1 )
             ( 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 )
x = u G ⇒

x0 = u0                        x1 = u0 + u4
x2 = u0 + u3                   x3 = u0 + u3 + u4
x4 = u0 + u2                   x5 = u0 + u2 + u4
...
x14 = u0 + u1 + u2 + u3        x15 = u0 + u1 + u2 + u3 + u4

⇒ u4 = x0 + x1 = x2 + x3 = x4 + x5 = ... = x14 + x15
Equations for other information bits:
u3 = x0 + x2 = x1 + x3 = x4 + x6 = x5 + x7 = x8 + x10 = x9 + x11 = x12 + x14 = x13 + x15
u2 = x0 + x4 = x1 + x5 = x2 + x6 = x3 + x7 = x8 + x12 = x9 + x13 = x10 + x14 = x11 + x15
u1 = x0 + x8 = x1 + x9 = x2 + x10 = x3 + x11 = x4 + x12 = x5 + x13 = x6 + x14 = x7 + x15
u0: majority votes for the elements of the vector
v1 = x + (0, u1, u2, u3, u4) · G
Cosets

Given: (n, k) block code C with parity check matrix H

Syndrome: s = y H^T = (x + e) H^T = e H^T    (168)

Number of different error vectors e: 2^n − 2^k
Number of possible syndromes: 2^{n−k}

Numbering of syndromes: s_µ with 0 ≤ µ ≤ 2^{n−k} − 1

Definition: coset = set M_µ of all error vectors e which yield the same syndrome s_µ:

M_µ = { e | e H^T = s_µ }    (169)
Coset leader = error vector e_µ with the smallest Hamming weight. Coset leaders need not be unique.

Property: for e1, e2 ∈ M_µ it follows e1 + e2 ∈ C, since

(e1 + e2) H^T = e1 H^T + e2 H^T = s_µ + s_µ = 0

M_µ can be generated from:

M_µ = { (e_µ + c) with c ∈ C }
Example: (5,2) code C = {00000, 10110, 01011, 11101}

G = ( 1 0 1 1 0        H = ( 1 0 1 0 0
      0 1 0 1 1 )            1 1 0 1 0
                             0 1 0 0 1 )

µ | e_µ   | M_µ                     | s_µ
0 | 00000 | 00000 10110 01011 11101 | 000
1 | 00001 | 00001 10111 01010 11100 | 001
2 | 00010 | 00010 10100 01001 11111 | 010
3 | 00100 | 00100 10010 01111 11001 | 100
4 | 01000 | 01000 11110 00011 10101 | 011
5 | 10000 | 10000 00110 11011 01101 | 110
6 | 11000 | 11000 01110 10011 00101 | 101
7 | 01100 | 01100 11010 00111 10001 | 111
Cyclic codes

Subset of linear block codes.

Definition: a linear (n, k) block code is called cyclic if any cyclic shift of a code word yields again a code word:

(c0, c1, ..., c_{n−1}) ∈ C ⇒ (c_{n−1}, c0, c1, ..., c_{n−2}) ∈ C    (170)

Example:

G = ( 1 1 0 1 0 0 0
      0 1 1 0 1 0 0
      0 0 1 1 0 1 0
      0 0 0 1 1 0 1 )
Code words:

u    | x         u    | x
0000 | 0000000   1011 | 1111111
1000 | 1101000   1010 | 1110010
0100 | 0110100   0101 | 0111001
0010 | 0011010   1100 | 1011100
0001 | 0001101   0110 | 0101110
1110 | 1000110   0011 | 0010111
0111 | 0100011   1111 | 1001011
1101 | 1010001   1001 | 1100101

For all cyclic codes at least one generator matrix with band structure can be found.
Generator matrices with band structure do not necessarily describe cyclic codes:

G = ( 1 1 0 1 0
      0 1 1 0 1 )

C = {00000, 11010, 01101, 10111}

A single row describes the generator matrix completely.

Description of a vector by a polynomial:

c = (c0 c1 ... c_{n−1}) ⇔ c(x) = Σ_{i=0}^{n−1} c_i x^i    (171)
Examples for polynomials of code words of length n:

c          | c(x)
0000 ... 0 | 0
1000 ... 0 | 1
0100 ... 0 | x
0010 ... 0 | x^2
000 ... 01 | x^{n−1}
1111 ... 1 | 1 + x + x^2 + ... + x^{n−1}
1010 ... 0 | 1 + x^2

Summation of vectors corresponds to summation of polynomials:

c1 + c2 ⇔ c1(x) + c2(x)    (172)
In general, cyclic codes are non-systematic codes.

Generator matrix of a cyclic code: n − k + 1 non-zero entries in each row: g0, g1, g2, ..., g_{n−k}

G = ( g0 g1 g2 ... g_{n−k}  0   0  ...  0
      0  g0 g1 g2 ... g_{n−k}   0  ...  0
      ...
      0  ...  0  g0 g1 g2 ... g_{n−k} )    (173)

(each row is the previous row shifted by one position)
Information word: u = (u0, u1, u2, ..., u_{k−1})    (174)

Code word: c = (c0, c1, c2, ..., c_{n−1})    (175)

Calculation of code words using the generator matrix:

c = u G    (176)

c0 = u0 g0
c1 = u0 g1 + u1 g0
c2 = u0 g2 + u1 g1 + u2 g0
...    (177)
Simplified description of information and code word vectors as well as the generator matrix by polynomials:

u(x) = u0 + u1 x + u2 x^2 + ... + u_{k−1} x^{k−1}    (178)
c(x) = c0 + c1 x + c2 x^2 + ... + c_{n−1} x^{n−1}    (179)
g(x) = g0 + g1 x + g2 x^2 + ... + g_{n−k} x^{n−k}    (180)

Coding rule:

c(x) = u(x) · g(x)    (181)

c0 = u0 g0    (182)
c1 = u0 g1 + u1 g0    (183)
c2 = u0 g2 + u1 g1 + u2 g0    (184)
Theorem: The generator polynomial g(x) is a divisor of any code word c(x).
Example:
g(x) = 1 + x + x3, u(x) = 1 + x2 + x3
⇒ c(x) = u(x) ⋅ g(x) = 1 + x + x2 + x3 + x4 + x5 + x6 ⇔ c = (1111111)
Theorem: The generator polynomial g(x) of an (n,k) cyclic code is a divisor of the polynomial x^n + 1:
xn + 1 = 0 mod g(x) (185)
Proof: The following relation shall hold:

c = (c0, c1, c2, ..., c_{n−1}) ∈ C ⇒ d = (c_{n−1}, c0, c1, ..., c_{n−2}) ∈ C    (186)

g(x) is a divisor of c(x):

c(x) = 0 mod g(x) ⇒ x · c(x) = 0 mod g(x)    (187)

d(x) = c_{n−1} + c0 x + c1 x^2 + ... + c_{n−2} x^{n−1}
     = c_{n−1} + x · (c0 + c1 x + c2 x^2 + ... + c_{n−1} x^{n−1}) + c_{n−1} x^n
     = x · c(x) + c_{n−1} (1 + x^n)    (188)

(the term in parentheses is c(x); the added c_{n−1} x^n cancels the highest-order term of x · c(x) over GF(2))
d is a code word only if g(x) is a divisor of d(x):

⇒ 1 + x^n = 0 mod g(x)    (189)

A cyclic code is obtained only if relation (189) holds for the generator polynomial.

Properties of the generator polynomial: condition for the coefficients:

g0 = g_{n−k} = 1
⇒ g(x) = 1 + g1 x + g2 x^2 + ... + x^{n−k}    (190)
Calculation of a generator polynomial:

Division of 1 + x^n into factors – each factor may be used as generator polynomial.

Example: x^7 + 1 = (1 + x) · (1 + x + x^3) · (1 + x^2 + x^3)

Calculation of the code word length for a given generator polynomial: backward Gaussian division algorithm.

Example: (x^3 + x + 1) · p(x) = x^n + 1. Building up p(x) term by term from the lowest power gives the partial products

x^3 + x + 1
x^4 + x^2 + x
x^5 + x^3 + x^2
x^7 + x^5 + x^4

which sum to x^7 + 1 ⇒ p(x) = 1 + x + x^2 + x^4, n = 7
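The same search for n can be automated with GF(2) polynomial arithmetic (a sketch, not part of the slides; polynomials are coefficient lists, lowest degree first):

```python
def poly_mod(a, b):
    # remainder of a(x) / b(x) over GF(2)
    a = a[:]
    while len(a) >= len(b) and any(a):
        while a and a[-1] == 0:      # drop leading zero coefficients
            a.pop()
        if len(a) < len(b):
            break
        shift = len(a) - len(b)
        for i, bi in enumerate(b):   # subtract (= XOR) a shifted copy of b
            a[shift + i] ^= bi
    return a

def code_length(g):
    # smallest n with x^n + 1 = 0 mod g(x)  (Eq. 185)
    n = 1
    while True:
        xn1 = [1] + [0] * (n - 1) + [1]     # x^n + 1
        if not any(poly_mod(xn1, g)):
            return n
        n += 1

print(code_length([1, 1, 0, 1]))   # g(x) = 1 + x + x^3  ->  n = 7
```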
General representation of a code word polynomial:

c(x) = (u0 + u1 x + u2 x^2 + ... + u_{k−1} x^{k−1}) · g(x)    (191)

⇒ g(x) is the code word polynomial with lowest degree.

Systematic coding

Multiplication of the information polynomial u(x) = u0 + u1 x + u2 x^2 + ... + u_{k−1} x^{k−1} with x^{n−k}:

u(x) · x^{n−k} = u0 x^{n−k} + u1 x^{n−k+1} + u2 x^{n−k+2} + ... + u_{k−1} x^{n−1}    (192)

(corresponds to a shift of the information bits towards the highest exponents)
Division of the polynomial u(x) · x^{n−k} by g(x):

u(x) · x^{n−k} / g(x) = q(x) + r(x) / g(x)    (193)

i.e. u(x) · x^{n−k} = q(x) · g(x) + r(x), with
q(x) = integer quotient
r(x) = remainder of the division, in general r(x) ≠ 0:

r(x) = r0 + r1 x + r2 x^2 + ... + r_{n−k−1} x^{n−k−1}    (194)

Forcing g(x) to be a divisor of the right-hand side (over GF(2), adding r(x) on both sides):

u(x) · x^{n−k} + r(x) = q(x) · g(x) = c(x)    (195)

Code word: c = (r0, r1, r2, ..., r_{n−k−1}, u0, u1, u2, ..., u_{k−1})    (196)
Example for systematic coding: g(x) = x^3 + x + 1, u = (1 0 0 1), u(x) = 1 + x^3
n = 7, k = 4, n − k = 3

u(x) · x^{n−k} = x^6 + x^3

Division of the polynomial u(x) · x^{n−k} by g(x):

(x^6 + x^3) : (x^3 + x + 1) = x^3 + x + (x^2 + x)/(x^3 + x + 1)
 x^6 + x^4 + x^3
 ---------------
       x^4
       x^4 + x^2 + x
       -------------
             x^2 + x = r(x)

c(x) = u(x) · x^{n−k} + r(x) = x^6 + x^3 + x^2 + x ⇒ c = (0 1 1 1 0 0 1)
Parity-check matrix and parity-check polynomial h(x)

Relation for the generator polynomial:

x^n + 1 = g(x) · h(x)    (197)

h(x) is of degree k:

h(x) = h0 + h1 x + h2 x^2 + ... + h_k x^k  with  h0 = h_k = 1    (198)

h(x) is the parity-check polynomial. The parity-check matrix H is of dimensions (n − k) × n.
Parity-check matrix

H = ( h_k h_{k−1} ... h1 h0  0   0  ...  0
      0  h_k h_{k−1} ... h1 h0   0  ...  0
      ...
      0  ...  0  h_k h_{k−1} ... h1 h0 )    (199)

(band structure with the coefficients of h(x) in reversed order, shifted by one position per row)
Decoding of the received vector y = c + e ⇔ y(x) = c(x) + e(x)

The generator polynomial is a divisor of any code word. Division by the generator polynomial yields the information word and tests whether the received word is a code word:

y(x) / g(x) = q(x) + s(x) / g(x),  s(x) = syndrome    (200)

For s(x) ≠ 0, error correction is carried out using a syndrome table.

Realization of the division of polynomials by shift register circuits; no matrix multiplications are necessary for encoding and decoding.
Implementation of encoding and decoding by shift register circuits

Basic circuits

Shift register: each register cell delays by one clock period.

Notation: S_j(i) = contents of register cell j at time i;  S_{j+1}(i+1) = S_j(i)

[Figure: shift register with cells S0, S1, ..., S_{m−1}; input a(x), output x · a(x).]
[Figure: basic circuit symbols. Adding device: inputs a, b, output a + b. Multiplication device: inputs a, b, output a · b. Scaling device: input a, constant b, output a · b.]
Cyclic shift of a polynomial

The circuit calculates x^i · a(x) mod (x^n + 1). It is the most simple example of a feedback shift register.

[Figure: ring of n register cells x^0, x^1, x^2, ..., x^{n−2}, x^{n−1} with feedback from the last cell to the first.]
Multiplication of two polynomials

c(x) = a(x) · [b0 + b1 x + b2 x^2 + ... + b_n x^n]

[Figure: shift register with input a(x); the cell outputs are weighted with b0, b1, b2, ..., b_n and summed to the output c(x).]
Multiplication of two polynomials: alternative structure

c(x) = a(x) · [b0 + b1 x + b2 x^2 + ... + b_n x^n]

[Figure: alternative shift register structure with the adders placed between the register cells; taps b0, b1, ..., b_{n−1}, b_n.]
Example of a multiplication
c(x) = a(x) ⋅ b(x) = (x4 + x2 + x +1 )⋅(x3 + x2 + 1)
= x7 + x6 + x5 + x4 + 0 + 0 + x + 1
Coefficients: b0 = 1, b1 = 0, b2 = 1, b3 = 1
Input sequence (highest order first): 10111, output sequence: 11110011
Register cells: s0, s1, s2, s3

i  | s0 s1 s2 s3 | ci
−1 | 0  0  0  0  | 0
0  | 1  0  0  0  | 1
1  | 1  1  0  0  | 1
2  | 1  1  1  0  | 0
3  | 0  1  1  1  | 0
4  | 1  0  1  1  | 1
5  | 0  1  0  1  | 1
6  | 0  0  1  0  | 1
7  | 0  0  0  1  | 1
8  | 0  0  0  0  | 0
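The multiplication above can be checked with a short simulation; this is a sketch with helper names of my own, not circuits from the lecture:

```python
# Sketch: GF(2) polynomial multiplication, once directly and once as a
# bit-serial shift register with taps b0 ... bm, as in the example
# c(x) = (x^4 + x^2 + x + 1)(x^3 + x^2 + 1).

def gf2_multiply(a_coeffs, b_coeffs):
    """Multiply two GF(2) polynomials; lists hold coefficients, index i = coefficient of x^i."""
    c = [0] * (len(a_coeffs) + len(b_coeffs) - 1)
    for i, ai in enumerate(a_coeffs):
        for j, bj in enumerate(b_coeffs):
            c[i + j] ^= ai & bj          # XOR = addition in GF(2)
    return c

def shift_register_multiply(a_coeffs, b_coeffs):
    """Bit-serial version: feed a(x) low-order-first through a register tapped with b."""
    m = len(b_coeffs) - 1
    state = [0] * m                       # register cells (previous inputs)
    out = []
    for a_in in a_coeffs + [0] * m:       # flush with m zeros at the end
        taps = [a_in] + state             # values seen by taps b0 ... bm
        out.append(sum(b & s for b, s in zip(b_coeffs, taps)) % 2)
        state = [a_in] + state[:-1]       # shift by one clock period
    return out

a = [1, 1, 1, 0, 1]      # x^4 + x^2 + x + 1
b = [1, 0, 1, 1]         # x^3 + x^2 + 1
print(gf2_multiply(a, b))             # [1, 1, 0, 0, 1, 1, 1, 1] = x^7+x^6+x^5+x^4+x+1
print(shift_register_multiply(a, b))  # same coefficients, produced serially
```

Both routines reproduce the output column ci of the table above.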
Division of two polynomials
q(x^−1) = a(x^−1) x^m + q(x^−1) ⋅ [b0 x^m + b1 x^(m−1) + ... + b(m−1) x]

⇒ a(x^−1) = q(x^−1) ⋅ [b0 + b1 x^−1 + ... + b(m−1) x^−(m−1) + x^−m] = q(x^−1) ⋅ b(x^−1)

Substitution x^−1 → x ⇒ a(x) = q(x) ⋅ b(x) ⇒ q(x) = a(x)/b(x)

[Figure: division shift register; input a(x^−1), feedback taps b0, b1, ... , b(m−1); output q(x^−1)]
Division of two polynomials with remainder:
The division process is stopped when the last bit of a(x^−1) has been input into the shift register.
⇒ The integer division is finished.
q(x−1) = output word
s(x−1) = contents of register cells
a(x)/b(x) = q(x) + s(x)/b(x)
Example for the division of two polynomials with remainder:
(x^8 + x^3) : (x^4 + x^3 + x + 1) = x^4 + x^3 + x^2 + (x^3 + x^2)/(x^4 + x^3 + x + 1)

[Figure: division circuit with feedback taps at x^0, x^1, x^3, x^4 and register cells s0, s1, s2, s3; input sequence a(x^−1), output sequence q(x^−1)]
Example for the division of two polynomials (cont'd)
i ai s0 s1 s2 s3 qi
9 | 0 | 0     | 0     | 0 | 0     | 0
8 | 1 | 1+0=1 | 0+0=0 | 0 | 0+0=0 | 0
7 | 0 | 0+0=0 | 1+0=1 | 0 | 0+0=0 | 0
6 | 0 | 0+0=0 | 0+0=0 | 1 | 0+0=0 | 0
5 | 0 | 0+0=0 | 0+0=0 | 0 | 1+0=1 | 0
4 | 0 | 0+1=1 | 0+1=1 | 0 | 0+1=1 | 1
3 | 1 | 1+1=0 | 1+1=0 | 1 | 0+1=1 | 1
2 | 0 | 0+1=1 | 0+1=1 | 0 | 1+1=0 | 1
1 | 0 | 0+0=0 | 1+0=1 | 1 | 0+0=0 | 0
0 | 0 | 0+0=0 | 0+0=0 | 1 | 1+0=1 | 0
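The quotient and remainder of this example can be verified with a small long-division routine; the function below is my own sketch, not part of the lecture:

```python
# Sketch: GF(2) polynomial division with remainder,
# a(x) = q(x)*b(x) + s(x), for the example (x^8 + x^3) : (x^4 + x^3 + x + 1).

def gf2_divmod(a, b):
    """Divide GF(2) polynomial a by b; coefficient lists, index i = coefficient of x^i."""
    a = a[:]                                    # working copy becomes the remainder
    deg_b = len(b) - 1
    q = [0] * max(len(a) - deg_b, 1)
    for i in range(len(a) - 1, deg_b - 1, -1):  # from the highest coefficient down
        if a[i]:
            q[i - deg_b] = 1
            for j, bj in enumerate(b):          # subtract b(x)*x^(i-deg_b): XOR in GF(2)
                a[i - deg_b + j] ^= bj
    return q, a[:deg_b]                         # quotient and remainder (degree < deg b)

a = [0, 0, 0, 1, 0, 0, 0, 0, 1]   # x^8 + x^3
b = [1, 1, 0, 1, 1]               # x^4 + x^3 + x + 1
q, s = gf2_divmod(a, b)
print(q)   # [0, 0, 1, 1, 1] = x^4 + x^3 + x^2
print(s)   # [0, 0, 1, 1]    = x^3 + x^2
```

The remainder [0, 0, 1, 1] matches the final register contents (s0, s1, s2, s3) = (0, 0, 1, 1) in the table.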
Encoding circuit for a systematic cyclic code
Principle
1st step: inputting information bits u0, u1, u2, ... , uk−1, starting with uk−1
Input from the right-hand side corresponds to a multiplication by x^(n−k)
After inputting k information bits, the shift register contains the remainder from the division r(x)
2nd step: moving switches S1 and S2 into the dashed positions (disconnecting the feedback loop)
3rd step: output of parity-check bits
Example: u(x) = x3 + x2 + 1 and g(x) = x3 + x + 1
t s1 b0 b1 b2 s2 s3
0  0  0  0  0  0  0
1  1  1  1  0  1  1
2  1  1  0  1  1  1
3  0  1  0  0  1  0
4  1  1  0  0  1  1
5  0  1  0  0  0
6  0  0  1  0  0
7  0  0  0  0  1
Circuit for decoding: dividing the received word polynomial by the generator polynomial and calculating the syndrome
Circuit for decoding: inputting the received word from the right hand side
Decoding and error correction with the Meggitt decoder
Coding Theory Algebraic foundations for coding
Groups

Definition: A set G and an operation ° are called a group (G, °) if the following conditions are fulfilled:

The operation is associative: a ° (b ° c) = (a ° b) ° c with a, b, c ∈ G

The set G is closed under the operation °: if a, b ∈ G, then also a ° b ∈ G

The set G contains a neutral element e with the property: a ° e = e ° a = a for all a ∈ G

For any element a of G there is an inverse element a′ such that: a ° a′ = a′ ° a = e
The group is called commutative if for any a, b ∈ G: a ° b = b ° a
The neutral element of a group is unique:
Proof with the assumption that there are two neutral elements e1 and e2: e1 = e1 ° e2 = e2 ° e1 = e2
Examples:
The set of integer numbers Z = {... −2, −1, 0, 1, 2, ...} is a commutative group with respect to addition:
neutral element: 0, inverse element for a: −a
The set of integer numbers Z = {... −2, −1, 0, 1, 2, ...} is not a group with respect to multiplication: there are no inverse elements.

Definition: If the group contains only a finite number of elements, it is called a finite group.

Example: finite set of integer numbers G = {0, 1, 2, ... , m−1} with modulo-m addition „⊕“:

a ⊕ b = r ⇔ a + b = q ⋅ m + r

neutral element: 0, inverse element for a: a′ = m − a
Example: binary set B = {0, 1} with modulo-2 addition „⊕“
neutral element: 0, inverse elements: a = 1 ⇒ a′ = 1, a = 0 ⇒ a′ = 0

⊕ | 0 1
0 | 0 1
1 | 1 0

Example: G = {0, 1, 2, 3, 4, 5, 6} with modulo-7 addition „⊕“
neutral element: 0, inverse element for a: a′ = m − a

⊕ | 0 1 2 3 4 5 6
0 | 0 1 2 3 4 5 6
1 | 1 2 3 4 5 6 0
2 | 2 3 4 5 6 0 1
3 | 3 4 5 6 0 1 2
4 | 4 5 6 0 1 2 3
5 | 5 6 0 1 2 3 4
6 | 6 0 1 2 3 4 5
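The four group axioms can be checked exhaustively for a small modulus; a minimal sketch of my own for m = 7:

```python
# Sketch: verify the group axioms for (Z_7, +) with modulo-m addition.

m = 7
G = range(m)
add = lambda a, b: (a + b) % m

# closure and associativity
assert all(add(a, b) in G for a in G for b in G)
assert all(add(a, add(b, c)) == add(add(a, b), c) for a in G for b in G for c in G)
# neutral element 0 and inverse element m - a
assert all(add(a, 0) == a for a in G)
assert all(add(a, (m - a) % m) == 0 for a in G)
# commutativity
assert all(add(a, b) == add(b, a) for a in G for b in G)
print("(Z_7, modulo-7 addition) is a commutative group")
```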
Example: finite set of integer numbers G = {1, 2, 3, ... , p−1} with p = prime number and modulo-p multiplication „⊗“:

a ⊗ b = r ⇔ a ⋅ b = q ⋅ p + r

closed set: the remainder of a ⋅ b / p is non-zero, since p is a prime number
neutral element: 1, inverse elements exist
Example: p = 7 ⇒ G = {1, 2, 3, 4, 5, 6} and modulo-7 multiplication „⊗“
⊗ | 1 2 3 4 5 6
1 | 1 2 3 4 5 6
2 | 2 4 6 1 3 5
3 | 3 6 2 5 1 4
4 | 4 1 5 2 6 3
5 | 5 3 1 6 4 2
6 | 6 5 4 3 2 1
Cyclic groups
Definition: If all elements of a multiplicative group G = {1, g1, g2, ... , g(m−1)} can be represented as powers of at least one element gi, the group is called cyclic.

G then consists of the m powers of gi:

G = {gi^0, gi^1, gi^2, ... , gi^(m−1)}

gi is called a primitive element of the group (with order m)

„One“ element: 1 = gi^0

Order(gk) = number of distinct elements that can be created as powers gk^l
Examples:
multiplicative modulo-5 group: G = {1, 2, 3, 4}

z | z^1 z^2 z^3 z^4 | order
1 |  1   1   1   1  |  1
2 |  2   4   3   1  |  4  (primitive element)
3 |  3   4   2   1  |  4  (primitive element)
4 |  4   1   4   1  |  2

multiplicative modulo-7 group: G = {1, 2, 3, 4, 5, 6}

z | z^1 z^2 z^3 z^4 z^5 z^6 | order
1 |  1   1   1   1   1   1  |  1
2 |  2   4   1   2   4   1  |  3
3 |  3   2   6   4   5   1  |  6  (primitive element)
4 |  4   2   1   4   2   1  |  3
5 |  5   4   6   2   3   1  |  6  (primitive element)
6 |  6   1   6   1   6   1  |  2
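The order columns of these tables can be recomputed directly; a sketch with my own helper:

```python
# Sketch: order of each element of the multiplicative group mod p,
# reproducing the tables for p = 5 and p = 7. An element is primitive
# iff its order equals p - 1.

def order(z, p):
    """Smallest l >= 1 with z^l = 1 (mod p)."""
    x, l = z % p, 1
    while x != 1:
        x = (x * z) % p
        l += 1
    return l

for p in (5, 7):
    orders = {z: order(z, p) for z in range(1, p)}
    primitive = [z for z, l in orders.items() if l == p - 1]
    print(p, orders, "primitive elements:", primitive)
# p = 5: primitive elements 2 and 3; p = 7: primitive elements 3 and 5
```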
Rings

Definition: For the set R the operations ⊕ and ⊗ are defined and the following properties hold:

R is a commutative group with respect to ⊕

R is closed with respect to ⊗: a, b ∈ R → a ⊗ b ∈ R

R is associative with respect to ⊗: a ⊗ (b ⊗ c) = (a ⊗ b) ⊗ c with a, b, c ∈ R

The distributive property holds: a ⊗ (b ⊕ c) = (a ⊗ b) ⊕ (a ⊗ c) with a, b, c ∈ R

Commutative ring, if ⊗ is commutative

Ring with neutral element: a ⊗ 1 = 1 ⊗ a = a with a ∈ R
Example of a commutative ring with a neutral element: integer numbers Z with addition and multiplication
Example of a commutative ring with a neutral element: integer numbers Zm = {0,1, 2, ... , m−1} with modulo-m addition and modulo-m multiplication
Unique inverse element only, if m = prime number
Field

Definition: For the set K the operations ⊕ and ⊗ are defined and the following properties hold:

K is a commutative group with respect to ⊕, 0 is the neutral element

K without 0 is a commutative group with respect to ⊗, 1 is the neutral element

Distributive property: a ⊗ (b ⊕ c) = (a ⊗ b) ⊕ (a ⊗ c) with a, b, c ∈ K

Field with a finite number of elements = Galois field

The integer numbers Zm = {0, 1, 2, ... , m−1} form a Galois field GF(m) = Zm if m is a prime number, m = p. A Galois field also exists for m = p^k, but as an extended Galois field (see below), not as Zm.
Some properties of Galois fields without proof:
a ⊗ 0 = 0 ⊗ a = 0
a ≠ 0 and b ≠ 0 → a ⊗ b ≠ 0 (no divisor of 0 – follows from closed multiplication)
a ⊗ b = 0 and a ≠ 0 → b = 0
− (a ⊗ b) = (−a) ⊗ b = a ⊗ (−b)
a ⊗ b = a ⊗ c and a ≠ 0 → b = c
1 ⊕ 1 ⊕ 1 ⊕ ... ⊕ 1 = 0  (m terms)
For each Galois field, at least one primitive element exists.
Example: GF(2)

⊕ | 0 1
0 | 0 1
1 | 1 0

⊗ | 0 1
0 | 0 0
1 | 0 1

Example: GF(5)

⊕ | 0 1 2 3 4
0 | 0 1 2 3 4
1 | 1 2 3 4 0
2 | 2 3 4 0 1
3 | 3 4 0 1 2
4 | 4 0 1 2 3

⊗ | 0 1 2 3 4
0 | 0 0 0 0 0
1 | 0 1 2 3 4
2 | 0 2 4 1 3
3 | 0 3 1 4 2
4 | 0 4 3 2 1
Examples in GF(5)
Addition by table: 1 + 2 = 3, 2 + 4 = 1, 4 + 4 = 3
Inverse elements of the addition: a + (−a) = 0
Subtraction using inverse elements of the addition:
3 − 2 = 3 + (−2) = 3 + 3 = 1
1 − 4 = 1 + (−4) = 1 + 1 = 2
a  | 0 1 2 3 4
−a | 0 4 3 2 1
Examples in GF(5)
Multiplication by table: 1 ⋅ 2 = 2, 2 ⋅ 4 = 3, 4 ⋅ 4 = 1
Inverse elements of the multiplication: a ⋅ (a−1) = 1
Division using inverse elements of the multiplication:
3 ÷ 2 = 3 ⋅ (2−1) = 3 ⋅ 3 = 4
1 ÷ 3 = 1 ⋅ (3−1) = 1 ⋅ 2 = 2
a    | 1 2 3 4
a^−1 | 1 3 2 4
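These GF(5) calculations are easy to reproduce; a sketch of my own, computing the multiplicative inverse via Fermat's little theorem (a^(p−2) = a^−1 in GF(p)):

```python
# Sketch: GF(5) arithmetic from the slides, with inverses a^-1 = a^(p-2) mod p.

p = 5
inv = lambda a: pow(a, p - 2, p)          # multiplicative inverse in GF(p), a != 0

print((2 + 4) % p)        # 1
print((4 * 4) % p)        # 1
print((3 * inv(2)) % p)   # 3 / 2 = 3 * 3 = 4
print((1 * inv(3)) % p)   # 1 / 3 = 1 * 2 = 2
print({a: inv(a) for a in range(1, p)})   # {1: 1, 2: 3, 3: 2, 4: 4}
```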
Example in GF(5): system of linear equations

I:   2 x0 + x1 = 2
II:  3 x1 + x2 = 3
III: x0 + x1 + 2 x2 = 3

I:  2 x0 = 2 − x1
II: x2 = 3 − 3 x1
2 ⋅ III: (2 − x1) + 2 x1 + 4 (3 − 3 x1) = 2 ⋅ 3

⇒ 2 + 4 ⋅ 3 − 2 ⋅ 3 = x1 (1 − 2 + 4 ⋅ 3)  ⇒ x1 = 3
⇒ x2 = 3 − 3 x1 = 4
⇒ x0 = 2^−1 (2 − x1) = 2
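The solution can be confirmed by exhaustive search over GF(5); a sketch of my own:

```python
# Sketch: solve the GF(5) linear system of the slide by brute force
# over all (x0, x1, x2) in GF(5)^3.

from itertools import product

p = 5
solutions = [
    (x0, x1, x2)
    for x0, x1, x2 in product(range(p), repeat=3)
    if (2 * x0 + x1) % p == 2            # I
    and (3 * x1 + x2) % p == 3           # II
    and (x0 + x1 + 2 * x2) % p == 3      # III
]
print(solutions)   # [(2, 3, 4)]
```

The unique solution (x0, x1, x2) = (2, 3, 4) matches the elimination above.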
Extension fields

G = {0, 1, 2, 3} with modulo-4 addition and modulo-4 multiplication is not a Galois field!

⊕ | 0 1 2 3
0 | 0 1 2 3
1 | 1 2 3 0
2 | 2 3 0 1
3 | 3 0 1 2

⊗ | 0 1 2 3
0 | 0 0 0 0
1 | 0 1 2 3
2 | 0 2 0 2
3 | 0 3 2 1

The element 2 has no multiplicative inverse (its row of the ⊗ table never contains 1), and 2 ⊗ 2 = 0 although 2 ≠ 0.
Extension of Galois fields corresponds to the extension of the real numbers to the complex numbers.

Definition of complex numbers: R → C

x^2 + 1 = 0 does not have a solution for x ∈ R

Extension of the field: α^2 + 1 = 0 (by definition) ⇔ α^2 = −1

Definition of complex numbers: c = c0 + c1 α with c0, c1 ∈ R

Definition of an extended Galois field: GF(2) → GF(2^2)

x^2 + x + 1 = 0 does not have a solution for x ∈ GF(2)

Extension of the field: α^2 + α + 1 = 0 (by definition) ⇔ α^2 = α + 1

Definition of elements of GF(2^2): c = c0 + c1 α with c0, c1 ∈ GF(2)
Definition of an extended Galois field: GF(p) → GF(p^m)

p(x) = x^m + p(m−1) x^(m−1) + ... + p1 x + p0 = 0 does not have a solution for x ∈ GF(p) or x ∈ GF(p^n) with n < m

Extension of the field: p(α) = 0 ⇔ α^m = −p(m−1) α^(m−1) − ... − p1 α − p0

Definition of elements of GF(p^m): c = c0 + c1 α + ... + c(m−1) α^(m−1) with ci ∈ GF(p)

GF(p^m) contains p^m elements

Very important for coding in digital communication systems: GF(p^m) with p = 2
Addition in C:

c = a + b = (a0 + a1 α) + (b0 + b1 α) = (a0 + b0) + (a1 + b1) α    (201)

Multiplication in C (using α^2 = −1):

c = a ⋅ b = (a0 + a1 α) ⋅ (b0 + b1 α)
  = a0 b0 + (a0 b1 + a1 b0) α + a1 b1 α^2
  = (a0 b0 − a1 b1) + (a0 b1 + a1 b0) α    (202)

Addition in GF(2^2):

c = a + b = (a0 + a1 α) + (b0 + b1 α) = (a0 + b0) + (a1 + b1) α    (203)

Multiplication in GF(2^2) (using α^2 = 1 + α):

c = a ⋅ b = (a0 + a1 α) ⋅ (b0 + b1 α)
  = a0 b0 + (a0 b1 + a1 b0) α + a1 b1 α^2
  = (a0 b0 + a1 b1) + (a0 b1 + a1 b0 + a1 b1) α    (204)
Addition and multiplication tables for GF(22) = {0, 1, α, 1+α} in component representation:
⊕   | 0    1    α    1+α
0   | 0    1    α    1+α
1   | 1    0    1+α  α
α   | α    1+α  0    1
1+α | 1+α  α    1    0

⊗   | 0    1    α    1+α
0   | 0    0    0    0
1   | 0    1    α    1+α
α   | 0    α    1+α  1
1+α | 0    1+α  1    α
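The multiplication rule (204) translates directly into code; a sketch with my own bit-pair representation:

```python
# Sketch: GF(2^2) elements as pairs (c0, c1) meaning c0 + c1*α,
# with products reduced via α^2 = 1 + α, as in eq. (204).

def gf4_add(a, b):
    return (a[0] ^ b[0], a[1] ^ b[1])    # componentwise XOR, eq. (203)

def gf4_mul(a, b):
    a0, a1 = a
    b0, b1 = b
    # (a0 + a1 α)(b0 + b1 α) = a0b0 + (a0b1 + a1b0) α + a1b1 α^2, with α^2 = 1 + α
    c0 = (a0 & b0) ^ (a1 & b1)
    c1 = (a0 & b1) ^ (a1 & b0) ^ (a1 & b1)
    return (c0, c1)

alpha, one_plus_alpha = (0, 1), (1, 1)
print(gf4_mul(one_plus_alpha, one_plus_alpha))  # (0, 1): (1+α)(1+α) = α
print(gf4_mul(alpha, alpha))                    # (1, 1): α·α = 1+α
print(gf4_add(alpha, one_plus_alpha))           # (1, 0): α + (1+α) = 1
```

The three printed results agree with the ⊗ and ⊕ tables above.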
Interpretation as the remainder of polynomial divisions:

The condition x^2 + x + 1 = 0 corresponds to a calculation modulo x^2 + x + 1

Example in GF(2^2): (1 + α) ⋅ (1 + α) = 1 + α^2

(α^2 + 1) : (α^2 + α + 1) = 1 + α/(α^2 + α + 1),   remainder r(α) = α

GF(2^2) is defined by all possible results of p(α) modulo α^2 + α + 1
Addition and multiplication tables for GF(22) in vector representation:
⊕  | 00 10 01 11
00 | 00 10 01 11
10 | 10 00 11 01
01 | 01 11 00 10
11 | 11 01 10 00

⊗  | 00 10 01 11
00 | 00 00 00 00
10 | 00 10 01 11
01 | 00 01 11 10
11 | 00 11 10 01
Exponential representation
α^k = α^(k mod 3)

Addition and multiplication tables for GF(2^2) in exponential representation:

⊕   | 0   1   α   α^2
0   | 0   1   α   α^2
1   | 1   0   α^2 α
α   | α   α^2 0   1
α^2 | α^2 α   1   0

⊗   | 0   1   α   α^2
0   | 0   0   0   0
1   | 0   1   α   α^2
α   | 0   α   α^2 1
α^2 | 0   α^2 1   α

0 = 0
1 = α^0 = α^3
α = α^1 = α^4
1+α = α^2 = α^5
...

In general: α^k = α^(k mod (p^m − 1))    (205)
Definition: irreducible polynomial p(x) = polynomial of degree m which cannot be written as a product of two polynomials pa(x), pb(x) with degree ≥ 1 and < m

Example: p(x) = x^2 + x + 1

test polynomials of degree 1: pa(x) = x, pb(x) = x + 1

(x^2 + x + 1) ÷ x = x + 1 + 1/x

(x^2 + x + 1) ÷ (x + 1) = x + 1/(x + 1)

Irreducible polynomials have similar properties as prime numbers

Theorem: irreducible polynomials do not have zeros in GF(p)
Definition: primitive polynomial = irreducible polynomial p(x) of degree m, which exhibits a zero α (i.e. p(α) = 0) with the following property:
The terms αi mod p(α) create all possible (pm − 1) non-zero elements of the Galois field GF(pm)
α ∈ GF(p^m) is called a primitive element, with α^0 = α^n = 1 for n = p^m − 1
For each Galois field GF(pm), there is at least one primitive polynomial p(x)
Primitive polynomials are irreducible.
In general, irreducible polynomials are not primitive polynomials.
Representation of primitive polynomials by their zeros:

p(x) = (x − α)(x − α^p)(x − α^(p^2)) ⋅ ... ⋅ (x − α^(p^(m−1)))    (206)

Factorization of x^n − 1 by all n = p^m − 1 non-zero elements a_i ∈ GF(p^m):

x^n − 1 = (x − a0)(x − a1) ⋅ ... ⋅ (x − a(n−1))    (207)
Table of primitive polynomials
m | primitive polynomial        m  | primitive polynomial
1 | x + 1                       9  | x^9 + x^4 + 1
2 | x^2 + x + 1                 10 | x^10 + x^3 + 1
3 | x^3 + x + 1                 11 | x^11 + x^2 + 1
4 | x^4 + x + 1                 12 | x^12 + x^7 + x^4 + x^3 + 1
5 | x^5 + x^2 + 1               13 | x^13 + x^4 + x^3 + x + 1
6 | x^6 + x + 1                 14 | x^14 + x^8 + x^6 + x + 1
7 | x^7 + x + 1                 15 | x^15 + x + 1
8 | x^8 + x^6 + x^5 + x^4 + 1   16 | x^16 + x^12 + x^3 + x + 1
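That a polynomial from the table is primitive can be checked by generating the powers of α; a sketch of my own using bit masks for m = 4 and p(x) = x^4 + x + 1:

```python
# Sketch: generate all 15 non-zero elements of GF(2^4) as powers of α,
# using the primitive polynomial x^4 + x + 1 (so α^4 = α + 1).
# Each element is a 4-bit mask: bit i = coefficient of α^i.

m, poly = 4, 0b10011          # x^4 + x + 1
x = 1                          # α^0
powers = []
for _ in range(2**m - 1):
    powers.append(x)
    x <<= 1                    # multiply by α
    if x & (1 << m):           # degree reached m: reduce modulo p(α)
        x ^= poly
print(sorted(powers) == list(range(1, 16)))   # True: the powers of α hit all non-zero elements
```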
Discrete Fourier transform (DFT)
Transform of polynomials: A(x) = DFT(a(x)) ⇔ a(x) = IDFT(A(x))

given: polynomials of degree ≤ n − 1 = p^m − 2 with coefficients from GF(p^m), z = primitive element of GF(p^m):

a = (a0, a1, ... , a(n−1)) ⇔ a(x) = a0 + a1 x + ... + a(n−1) x^(n−1)    (208)

A = (A0, A1, ... , A(n−1)) ⇔ A(x) = A0 + A1 x + ... + A(n−1) x^(n−1)    (209)
Representations of the DFT:

a(x) ∘−•  A(x),  a ∘−• A  (∘ time domain, • frequency domain)

Transformation rules:

A_j = −a(z^−j) = −Σ_{i=0..n−1} a_i z^(−i⋅j)    (210)

a_i = A(z^i) = Σ_{j=0..n−1} A_j z^(i⋅j)    (211)
Properties of the DFT
Uniqueness
IDFT(DFT(a(x))) = a(x)
DFT(DFT(a(x))) = −a(x)
DFT(DFT(DFT(DFT(a(x))))) = a(x)
z^−j is a zero of a(x) ⇔ A_j = 0 ⇔ A = (A0, A1, ... , A(j−1), 0, A(j+1), ... , A(n−1))    (212)

z^i is a zero of A(x) ⇔ a_i = 0 ⇔ a = (a0, a1, ... , a(i−1), 0, a(i+1), ... , a(n−1))    (213)
Matrix representation:

[ a0     ]   [ 1  1         1          ...  1               ]   [ A0     ]
[ a1     ]   [ 1  z         z^2        ...  z^(n−1)         ]   [ A1     ]
[ a2     ] = [ 1  z^2       z^4        ...  z^(2(n−1))      ] ⋅ [ A2     ]    (214)
[ ...    ]   [ ...                                          ]   [ ...    ]
[ a(n−1) ]   [ 1  z^(n−1)   z^(2(n−1)) ...  z^((n−1)(n−1))  ]   [ A(n−1) ]

[ A0     ]     [ 1  1         1           ...  1                ]   [ a0     ]
[ A1     ]     [ 1  z^−1      z^−2        ...  z^−(n−1)         ]   [ a1     ]
[ A2     ] = − [ 1  z^−2      z^−4        ...  z^(−2(n−1))      ] ⋅ [ a2     ]    (215)
[ ...    ]     [ ...                                            ]   [ ...    ]
[ A(n−1) ]     [ 1  z^−(n−1)  z^(−2(n−1)) ...  z^(−(n−1)(n−1))  ]   [ a(n−1) ]
Example: GF(7), z = 5, A(x) = 4 + 5x

a_i = A(z^i), with the matrix entries z^(i⋅j) mod 7:

[ a0 ]   [ 1 1 1 1 1 1 ]   [ 4 ]   [ 2 ]
[ a1 ]   [ 1 5 4 6 2 3 ]   [ 5 ]   [ 1 ]
[ a2 ] = [ 1 4 2 1 4 2 ] ⋅ [ 0 ] = [ 3 ]
[ a3 ]   [ 1 6 1 6 1 6 ]   [ 0 ]   [ 6 ]
[ a4 ]   [ 1 2 4 1 2 4 ]   [ 0 ]   [ 0 ]
[ a5 ]   [ 1 3 2 6 4 5 ]   [ 0 ]   [ 5 ]
Inverse transform: GF(7), z = 5, a(x) = 2 + x + 3x^2 + 6x^3 + 5x^5

A_j = −a(z^−j), with z^−1 = 3 and matrix entries z^(−i⋅j) mod 7:

[ A0 ]     [ 1 1 1 1 1 1 ]   [ a0 ]
[ A1 ]     [ 1 3 2 6 4 5 ]   [ a1 ]
[ A2 ] = − [ 1 2 4 1 2 4 ] ⋅ [ a2 ]
[ A3 ]     [ 1 6 1 6 1 6 ]   [ a3 ]
[ A4 ]     [ 1 4 2 1 4 2 ]   [ a4 ]
[ A5 ]     [ 1 5 4 6 2 3 ]   [ a5 ]
A = (4, 5, 0, 0, 0, 0)  ∘−•  a = (2, 1, 3, 6, 0, 5)

[ A0 ]     [ 1 1 1 1 1 1 ]   [ 2 ]     [ 3 ]   [ 4 ]
[ A1 ]     [ 1 3 2 6 4 5 ]   [ 1 ]     [ 2 ]   [ 5 ]
[ A2 ] = − [ 1 2 4 1 2 4 ] ⋅ [ 3 ] = − [ 0 ] = [ 0 ]
[ A3 ]     [ 1 6 1 6 1 6 ]   [ 6 ]     [ 0 ]   [ 0 ]
[ A4 ]     [ 1 4 2 1 4 2 ]   [ 0 ]     [ 0 ]   [ 0 ]
[ A5 ]     [ 1 5 4 6 2 3 ]   [ 5 ]     [ 0 ]   [ 0 ]
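The transform pair of this example can be reproduced with a direct implementation of rules (210) and (211); a sketch with my own function names:

```python
# Sketch: DFT/IDFT over GF(7) with primitive element z = 5, n = 6,
# a_i = A(z^i) and A_j = -a(z^-j).

p, n, z = 7, 6, 5
z_inv = pow(z, p - 2, p)                  # z^-1 = 3 in GF(7)

def idft(A):                              # a_i = sum_j A_j z^(i*j)
    return [sum(Aj * pow(z, i * j, p) for j, Aj in enumerate(A)) % p
            for i in range(n)]

def dft(a):                               # A_j = -sum_i a_i z^(-i*j)
    return [(-sum(ai * pow(z_inv, i * j, p) for i, ai in enumerate(a))) % p
            for j in range(n)]

A = [4, 5, 0, 0, 0, 0]                    # A(x) = 4 + 5x
a = idft(A)
print(a)          # [2, 1, 3, 6, 0, 5]
print(dft(a))     # [4, 5, 0, 0, 0, 0]  — the round trip recovers A
```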
Shift register circuit for the DFT:

A_j = −Σ_{i=0..n−1} a_i z^(−i⋅j)

[Figure: n feedback multipliers with factors z^0, z^−1, ... , z^−(n−1) and start values a0, a1, ... , a(n−1); after j clock cycles the memory contents are a_i z^(−i⋅j), which are summed and negated to give A_j]
Shift register circuit for the IDFT:

a_i = Σ_{j=0..n−1} A_j z^(i⋅j)

[Figure: n feedback multipliers with factors z^0, z^1, ... , z^(n−1) and start values A0, A1, ... , A(n−1); after i clock cycles the memory contents are A_j z^(i⋅j), which are summed to give a_i]
Convolution theorem of the DFT
given: polynomials a(x), b(x), c(x) of degree ≤ n − 1 = p^m − 2 with coefficients from GF(p^m), z = primitive element of GF(p^m)

x is an element of GF(p^m) ⇒ x^n = x^0 = 1 ⇒ calculation modulo x^n − 1
Product of two polynomials: a(x) ⋅ b(x) = c(x)
The product of two polynomials corresponds to the cyclic convolution of coefficients:
(a0 + a1 x + ... + a(n−1) x^(n−1)) ⋅ (b0 + b1 x + ... + b(n−1) x^(n−1)) = c0 + c1 x + ... + c(n−1) x^(n−1)    (216)

c0 = a0 b0 + a1 b(n−1) + a2 b(n−2) + ... + a(n−1) b1
c1 = a0 b1 + a1 b0 + a2 b(n−1) + ... + a(n−1) b2
...
c_i = Σ_{j=0..n−1} a_j ⋅ b_((i−j) mod n)    (217)

a ∗ b  ⇔  a(x) ⋅ b(x) mod (x^n − 1)    (218)
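The equivalence (218) between cyclic convolution and the polynomial product mod (x^n − 1) can be checked numerically; a sketch of my own in GF(7) with n = 6:

```python
# Sketch: cyclic convolution of coefficients (217) versus the polynomial
# product reduced with x^n = 1, both over GF(7).

p, n = 7, 6

def cyclic_convolve(a, b):
    return [sum(a[j] * b[(i - j) % n] for j in range(n)) % p for i in range(n)]

def poly_mult_mod(a, b):
    c = [0] * n
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            c[(i + j) % n] = (c[(i + j) % n] + ai * bj) % p   # wrap exponents: x^n = 1
    return c

a = [2, 1, 3, 6, 0, 5]
b = [1, 0, 4, 0, 2, 0]
print(cyclic_convolve(a, b) == poly_mult_mod(a, b))   # True
```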
Convolution theorem:

a(x) ⋅ b(x) mod (x^n − 1)  ∘−•  −A_j ⋅ B_j    (219)

a_i ⋅ b_i  ∘−•  A(x) ⋅ B(x) mod (x^n − 1)    (220)

with a(x) ∘−• A(x), b(x) ∘−• B(x)

Proof of (219):

c(x) = a(x) ⋅ b(x) + γ(x) ⋅ (x^n − 1)

C_j = −c(z^−j) = −a(z^−j) ⋅ b(z^−j) − γ(z^−j) ⋅ (z^(−jn) − 1) = −(−A_j) ⋅ (−B_j) − 0 = −A_j ⋅ B_j

using a(z^−j) = −A_j, b(z^−j) = −B_j and z^(−jn) = 1
(220)
Theorem for cyclic shift in time and frequency domain:

x ⋅ a(x) mod (x^n − 1)  ∘−•  z^−j ⋅ A_j    (221)

z^i ⋅ a_i  ∘−•  x ⋅ A(x) mod (x^n − 1)    (222)

with a(x) ∘−• A(x)

Proof by inserting b(x) = x or B(x) = x into the convolution theorem
Coding Theory Block codes
Reed-Solomon and Bose-Chaudhuri-Hocquenghem codes

RS and BCH codes were developed around 1960

Analytical construction

Minimum distance can be preset (weight distribution is known)

The Singleton bound (d_min ≤ 1 + n − k) is fulfilled with equality for RS codes.

Powerful and of great practical relevance:

Communication with spacecraft

CD (compact disc), mobile radio communications
Reed-Solomon codes

Fundamental theorem of linear algebra:

A polynomial A(x) = A0 + A1 x + A2 x^2 + ... + A(k−1) x^(k−1) of degree k − 1 with coefficients A_i ∈ GF(p^m) and A(k−1) ≠ 0 has at most k − 1 different zeros.

Proof: representation of the polynomial by linear factors:

A(x) = A(k−1) ⋅ (x − x1)(x − x2) ⋅ ... ⋅ (x − x(k−1))    (223)
Theorem of minimum weight of vectors:

Given: A(x) = A0 + A1 x + A2 x^2 + ... + A(k−1) x^(k−1) with coefficients A_i ∈ GF(p^m)

degree{A(x)} = k − 1 ≤ n − d    (224)

A(x) •−∘ a(x) ⇔ a = (a0, a1, a2, ... , a(n−1))

⇒ wH(a) ≥ d    (225)

Proof:

By (213), zeros z^i of A(x) correspond to elements a_i = 0; A(x) has at most k − 1 zeros

⇒ number of non-zero elements of a: at least n − (k − 1) ≥ n − (n − d) = d
Definition of Reed-Solomon codes:

Given: z is a primitive element of GF(p^m).

a = (a0, a1, a2, ... , a(n−1)) is a code word of length n = p^m − 1 of the RS code C with dimension k = p^m − d and minimum distance d = n − k + 1, if:

C = {a | a_i = A(z^i), degree{A(x)} ≤ k − 1 = n − d}    (226)

a = (a0, a1, a2, ... , a(n−1))  •−∘  A = (A0, A1, A2, ... , A(k−1), 0, 0, ... , 0)

The d − 1 trailing zeros are the parity check frequencies.
For RS codes the Singleton bound

d_min ≤ 1 + (n − k)    (227)

holds with equality: n − k = d_min − 1

optimum case: d_min is odd ⇒ t = (d_min − 1)/2

n − k = 2 t    (228)

⇒ the number of errors that can be detected or corrected can be adjusted

Example:

Choice of the Galois field GF(p^m): p = 2, m = 4 ⇒ GF(2^4)

Code word length: n = p^m − 1 = 16 − 1 = 15

Number of correctable symbols: t = 3 ⇒ n − k = d_min − 1 = 6
Code word in the frequency domain:

A = (A0, A1, A2, A3, A4, A5, A6, A7, A8, 0, 0, 0, 0, 0, 0) with k = n − (d_min − 1) = 9 information symbols and d_min − 1 = 6 parity check frequencies

A_j ∈ {0, 1, z, z+1, z^2, z^2+1, z^2+z, z^2+z+1, z^3, z^3+1, z^3+z, z^3+z+1, z^3+z^2, z^3+z^2+1, z^3+z^2+z, z^3+z^2+z+1}

Number of information symbols: k = 9
Number of information bits: k ⋅ m = 9 ⋅ 4 = 36
Number of code word symbols: n = 15
Number of code word bits: n ⋅ m = 15 ⋅ 4 = 60
⇒ Hexadecimal (15,9) RS code = binary (60,36) code
Capability to correct errors: t = 3 hexadecimal symbols
⇒ maximum number of correctable bits: tb,max = 3 ⋅ 4 = 12
⇒ minimum number of correctable bits: tb,min = 3 ⋅ 1 = 3
Examples of error vectors in binary representation:
e = (0000 0000 0000 1111 1111 1111 0000 0000 0000 0000 0000 0000 0000 0000 0000) ⇒ correctable
e = (0000 0000 0001 1111 1111 1110 0000 0000 0000 0000 0000 0000 0000 0000 0000) ⇒ not correctable
e = (0100 0000 0000 0000 0001 0000 0000 0100 0000 0000 0000 0000 0000 0010 0000) ⇒ not correctable
Any 3 single bit errors and any 9 directly adjacent bit errors are correctable

Error bursts are correctable up to a length of

t_b,burst = m ⋅ (t − 1) + 1 = m ⋅ (d − 3)/2 + 1 bits    (229)

Code rate:

R_C = k/n = (n − (d − 1))/n = 1 − (d − 1)/n    (230)

Example: k/n = 9/15 = 60 %
Generator polynomial

Code word in the frequency domain: A_j = 0 for j = k ... n − 1

DFT: A_j = −a(z^−j)

Each code word polynomial therefore has zeros:

a(z^−j) = 0 for j = k ... n − 1

Product of all linear factors which are obtained from these zeros:

g(x) = ∏_{j=k..n−1} (x − z^−j) = ∏_{j=1..n−k} (x − z^j)    (231)

Each code word polynomial can be represented as: a(x) = u(x) ⋅ g(x) ⇒ g(x) is the generator polynomial
Parity check polynomial

The parity check polynomial contains all the other non-zero elements as zeros:

h(x) = ∏_{j=0..k−1} (x − z^−j) = ∏_{j=n−k+1..n} (x − z^j)    (232)

degree{g(x)} = n − k, degree{u(x)} = k − 1, degree{g(x) ⋅ u(x)} = n − 1, degree{h(x)} = k

Proof:

a(x) ⋅ h(x) = u(x) ⋅ g(x) ⋅ h(x) = u(x) ⋅ ∏_{i=0..n−1} (x − z^i) = u(x) ⋅ (x^n − 1) = 0 mod (x^n − 1)
Example: (6,2) RS code for GF(7) with z = 5

g(x) = ∏_{i=1..4} (x − 5^i) = (x − 5)(x − 4) ⋅ (x − 6)(x − 2)
     = (x^2 + 5x + 6) ⋅ (x^2 + 6x + 5)
     = 2 + 5x + 6x^2 + 4x^3 + x^4    (233)

h(x) = (x^n − 1)/g(x) = (x^6 − 1)/(2 + 5x + 6x^2 + 4x^3 + x^4)
     = 3 + 3x + x^2 = (x − 1)(x − 3)
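Both polynomials of (233) can be recomputed from their zeros; a sketch with my own GF(7) polynomial helper:

```python
# Sketch: generator and parity check polynomial of the (6,2) RS code
# over GF(7) with z = 5, built from the linear factors (x - z^j).

p, n, k, z = 7, 6, 2, 5

def poly_mul(a, b):                       # coefficient lists, lowest order first
    c = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            c[i + j] = (c[i + j] + ai * bj) % p
    return c

g = [1]
for j in range(1, n - k + 1):             # zeros z^1 ... z^(n-k)
    g = poly_mul(g, [-pow(z, j, p) % p, 1])   # linear factor (x - z^j)
h = [1]
for j in range(n - k + 1, n + 1):         # remaining zeros z^(n-k+1) ... z^n
    h = poly_mul(h, [-pow(z, j, p) % p, 1])

print(g)               # [2, 5, 6, 4, 1]  = 2 + 5x + 6x^2 + 4x^3 + x^4
print(h)               # [3, 3, 1]        = 3 + 3x + x^2
print(poly_mul(g, h))  # [6, 0, 0, 0, 0, 0, 1] = x^6 - 1 in GF(7)
```

The last line confirms g(x) ⋅ h(x) = x^6 − 1, as used in the proof of (232).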
Generator matrix (encoding of cyclic codes): a = u ⋅ G

G = [ g0 g1 g2 ... g(n−k) 0  ...  0      ]
    [ 0  g0 g1 g2 ... g(n−k) ...  0      ]
    [ ...                                ]
    [ 0  ...  0  g0 g1 g2 ... g(n−k)     ]    (k rows, cyclically shifted generator coefficients)

Example:

G = [ 2 5 6 4 1 0 ]
    [ 0 2 5 6 4 1 ]
Transformation into a systematic code

substitute the first row by the sum of both rows:

G′ = [ 2 0 4 3 5 1 ]
     [ 0 2 5 6 4 1 ]

multiply all elements by 4:

G_syst = [ 1 0 2 5 6 4 ]
         [ 0 1 6 3 2 4 ]
Parity check matrix, 1st version

H1 = [ h_k h(k−1) ... h0 0  ...  0     ]
     [ 0  h_k h(k−1) ... h0  ...  0    ]
     [ ...                             ]
     [ 0  ...  0  h_k h(k−1) ... h0    ]    (n − k rows, shifted reversed parity check coefficients)

Example:

H1 = [ 1 3 3 0 0 0 ]
     [ 0 1 3 3 0 0 ]
     [ 0 0 1 3 3 0 ]
     [ 0 0 0 1 3 3 ]
Parity check matrix, 2nd version: G = [I_k P] ⇒ H2 = [−P^T I(n−k)]

H2 = [ −2 −6 1 0 0 0 ]   [ 5 1 1 0 0 0 ]
     [ −5 −3 0 1 0 0 ] = [ 2 4 0 1 0 0 ]
     [ −6 −2 0 0 1 0 ]   [ 1 5 0 0 1 0 ]
     [ −4 −4 0 0 0 1 ]   [ 3 3 0 0 0 1 ]

Parity check matrix, 3rd version: direct evaluation of the parity check equations

A_j = −a(z^−j) = −Σ_{i=0..n−1} a_i z^(−i⋅j) = 0 for j = k ... n − 1

A_k = −(a0 z^0 + a1 z^−k + a2 z^−2k + ... + a(n−1) z^(−(n−1)k)) = 0
...
A(n−1) = −(a0 z^0 + a1 z^−(n−1) + ... + a(n−1) z^(−(n−1)(n−1))) = 0
Matrix representation:
    [ 1  z^−k      z^−2k      …  z^−(n−1)k      ]   [ a_0     ]   [ 0 ]
  − [ 1  z^−(k+1)  z^−2(k+1)  …  z^−(n−1)(k+1)  ] ⋅ [ a_1     ] = [ 0 ]   (234)
    [ ⋮                                         ]   [ ⋮       ]   [ ⋮ ]
    [ 1  z^−(n−1)  z^−2(n−1)  …  z^−(n−1)(n−1)  ]   [ a_{n−1} ]   [ 0 ]

Example (z = 5, z^−1 = 3):

          [ 1 z^−2  z^−4   z^−6   z^−8   z^−10 ]     [ 1 2 4 1 2 4 ]
  H_3 = − [ 1 z^−3  z^−6   z^−9   z^−12  z^−15 ] = − [ 1 6 1 6 1 6 ]
          [ 1 z^−4  z^−8   z^−12  z^−16  z^−20 ]     [ 1 4 2 1 4 2 ]
          [ 1 z^−5  z^−10  z^−15  z^−20  z^−25 ]     [ 1 5 4 6 2 3 ]
Calculation of the syndrome

Received signal:  r = a + e          ↔  R = A + E               (235)
                  r(x) = a(x) + e(x) ↔  R(x) = A(x) + E(x)      (236)

Syndrome = received vector in the frequency domain at the parity check frequencies

  A(x) = A_0 + A_1 x + … + A_{k−1} x^{k−1}                                      (237)
  E(x) = E_0 + E_1 x + … + E_{k−1} x^{k−1} + E_k x^k + … + E_{n−1} x^{n−1}      (238)
  R(x) = R_0 + R_1 x + … + R_{k−1} x^{k−1} + E_k x^k + … + E_{n−1} x^{n−1}      (239)

with R_j = A_j + E_j                                                             (240)

Syndrome coefficients:

  S(x) = S_0 + … + S_{n−k−1} x^{n−k−1} = E_k + E_{k+1} x + … + E_{n−1} x^{n−k−1}   (241)
Syndrome as a vector: S = (S0, S1, ... , Sn−k−1)
Syndrome as a multiplication with the parity check matrix:
S = r ⋅ H3T ⇔ ST = H3 ⋅ rT (242)
  [ S_0       ]     [ 1  z^−k      z^−2k      …  z^−(n−1)k      ]   [ r_0     ]
  [ S_1       ] = − [ 1  z^−(k+1)  z^−2(k+1)  …  z^−(n−1)(k+1)  ] ⋅ [ r_1     ]   (243)
  [ ⋮         ]     [ ⋮                                         ]   [ ⋮       ]
  [ S_{n−k−1} ]     [ 1  z^−(n−1)  z^−2(n−1)  …  z^−(n−1)(n−1)  ]   [ r_{n−1} ]
Encoding of RS codes
Extending the definition (226):
C = {a | a_i = A(z^i) with A_j = 0 for n − k cyclically adjacent symbols j}   (244)
Proof: cyclic shift theorem of the DFT
Method 1: encoding in the frequency domain
Transformation of information symbols with the IDFT
  A = (u_0, u_1, u_2, …, u_{k−1}, 0, 0, …, 0)  →  a = (a_0, a_1, a_2, …, a_{n−1})   (245)
Encoding diagram: the information vector (u_0, u_1, …, u_{k−1}) fills the frequency-domain vector (A_0, …, A_{k−1}, A_k = 0, …, A_{n−1} = 0), and the IDFT

  a_i = ∑_{j=0}^{n−1} A_j ⋅ z^{ij}

maps it to the code word (a_0, a_1, …, a_{n−1}).
Example: (6,2) RS code for GF(7) with z = 5
n = p − 1 = 7 − 1 = 6, k = 2 ⇒ degree{A(x)} ≤ k − 1 = 1
A(x) = 3 + 3x,   a_i = A(x = z^i)

  a_0 = A(x = z^0 = 5^0 = 1) = A_0 + A_1 z^0 = 3 + 3 ⋅ 1 = 6
  a_1 = A(x = z^1 = 5^1 = 5) = A_0 + A_1 z^1 = 3 + 3 ⋅ 5 = 4
  a_2 = A(x = z^2 = 5^2 = 4) = A_0 + A_1 z^2 = 3 + 3 ⋅ 4 = 1
  a_3 = A(x = z^3 = 5^3 = 6) = A_0 + A_1 z^3 = 3 + 3 ⋅ 6 = 0
  a_4 = A(x = z^4 = 5^4 = 2) = A_0 + A_1 z^4 = 3 + 3 ⋅ 2 = 2
  a_5 = A(x = z^5 = 5^5 = 3) = A_0 + A_1 z^5 = 3 + 3 ⋅ 3 = 5
A = (3,3,0,0,0,0) a = (6,4,1,0,2,5)
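The frequency-domain encoding above can be sketched in a few lines (a cross-check in Python, assuming the IDFT convention a_i = Σ_j A_j z^{ij}):

```python
# Method 1 for the (6,2) RS code over GF(7): a_i = A(z^i) = sum_j A_j z^(ij).
P, n, z = 7, 6, 5
A = [3, 3, 0, 0, 0, 0]          # information placed at the message frequencies

a = [sum(A[j] * pow(z, i * j, P) for j in range(n)) % P for i in range(n)]
print(a)                         # → [6, 4, 1, 0, 2, 5]
```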
B(x) = 1 + 1x
B = (1,1,0,0,0,0) b = (2,6,5,0,3,4)
dH(a,b) = dH((6,4,1,0,2,5),(2,6,5,0,3,4)) = 5
for comparison: dmin = n − k + 1 = 6 − 2 + 1 = 5
Method 2: non-systematic encoding in the time domain
Multiplication of information and generator polynomial
a(x) = u(x) ⋅ g(x) (246)
Method 3: systematic encoding in the time domain

Multiplication of the information polynomial with x^{n−k}:

  u(x) ⋅ x^{n−k} = u_0 x^{n−k} + u_1 x^{n−k+1} + … + u_{k−1} x^{n−1}   (247)

Division of u(x) ⋅ x^{n−k} by g(x):

  u(x) ⋅ x^{n−k} = q(x) ⋅ g(x) + r(x)                                   (248)

Remainder:

  r(x) = r_0 + r_1 x + … + r_{n−k−1} x^{n−k−1}                          (249)

Code word:

  −r(x) + u(x) ⋅ x^{n−k} = q(x) ⋅ g(x) = a(x)                           (250)

  a = (−r_0, −r_1, −r_2, …, −r_{n−k−1}, u_0, u_1, u_2, …, u_{k−1})      (251)
Example: (6,2) RS code for GF(7) with z = 5

  u = (3,4)  ⇒  u(x) = 3 + 4x  ⇒  u(x) ⋅ x^{n−k} = 3x^4 + 4x^5

Division by g(x) = 2 + 5x + 6x^2 + 4x^3 + x^4:

  (4x^5 + 3x^4) ÷ (x^4 + 4x^3 + 6x^2 + 5x + 2) = 4x + 1 + r(x)/g(x)
  −(4x^5 + 2x^4 + 3x^3 + 6x^2 + x)
  ─────────────────────────────────
        x^4 + 4x^3 + x^2 + 6x
      −(x^4 + 4x^3 + 6x^2 + 5x + 2)
      ──────────────────────────────
                2x^2 + x + 5 = r(x)

  −r(x) + u(x) ⋅ x^{n−k} = 2 + 6x + 5x^2 + 3x^4 + 4x^5
  ⇒ a = (2, 6, 5, 0, 3, 4)   (first n − k symbols: −r, last k symbols: u)
Alternative: Encoding with the generator matrix of a systematic code:
  u = (3,4):
  b = u ⋅ G_syst = (3,4) ⋅ [ 1 0 2 5 6 4 ] = (3, 4, 2, 6, 5, 0)
                           [ 0 1 6 3 2 4 ]

Example: (7,3) RS code for GF(2^3)

» code word length: n = p^m − 1 = 8 − 1 = 7
» number of correctable symbols: t = 2 ⇒ n − k = 4, k = 3
» primitive polynomial: p(x) = 1 + x + x^3
» primitive element: 1 + z + z^3 = 0
» extended Galois field:

    0   = 0    z^1 = z      z^3 = 1 + z      z^5 = 1 + z + z^2
    z^0 = 1    z^2 = z^2    z^4 = z + z^2    z^6 = 1 + z^2

» generator polynomial:
    g(x) = (x − z)(x − z^2)(x − z^3)(x − z^4)
         = (x^2 + z^4 x + z^3)(x^2 + z^6 x + 1) = x^4 + z^3 x^3 + x^2 + z x + z^3

» parity check polynomial:
    h(x) = (x − z^5)(x − z^6)(x − z^7) = x^3 + z^3 x^2 + z^2 x + z^4
» encoding of: u = (1, 1, 1)  ⇒  u(x) = 1 + x + x^2  ⇒  u(x) ⋅ x^{n−k} = x^4 + x^5 + x^6

division by g(x) = x^4 + z^3 x^3 + x^2 + z x + z^3:

  (x^6 + x^5 + x^4) ÷ (x^4 + z^3 x^3 + x^2 + z x + z^3) = x^2 + z x + z^4 + r(x)/g(x)
  +(x^6 + z^3 x^5 + x^4 + z x^3 + z^3 x^2)
  ─────────────────────────────────────────
        z x^5 + z x^3 + z^3 x^2
      +(z x^5 + z^4 x^4 + z x^3 + z^2 x^2 + z^4 x)
      ─────────────────────────────────────────────
               z^4 x^4 + z^5 x^2 + z^4 x
             +(z^4 x^4 + x^3 + z^4 x^2 + z^5 x + 1)
             ──────────────────────────────────────
                    x^3 + x^2 + x + 1 = r(x)

  −r(x) + u(x) ⋅ x^{n−k} = 1 + x + x^2 + x^3 + x^4 + x^5 + x^6  ⇒  a = (1,1,1,1,1,1,1)
» generator matrix:

        [ z^3  z   1  z^3  1   0   0 ]
    G = [ 0   z^3  z   1  z^3  1   0 ]
        [ 0    0  z^3  z   1  z^3  1 ]

Weight function of RS codes (holds also for maximum distance separable codes in general, q = p^m):

  A_i = 1   for i = 0
  A_i = 0   for 1 ≤ i < d                                                        (252)

  A(z) = ∑_{i=0}^{n} A_i z^i                                                     (253)

  A_i = (n choose i) (q − 1) ∑_{j=0}^{i−d} (−1)^j ((i−1) choose j) q^{i−d−j}   for i ≥ d   (254)
Example for a weight distribution: (7,3) RS code for GF(2^3)

Minimum distance: d = (n − k) + 1 = 5

  A_0 = 1
  A_1 = A_2 = A_3 = A_4 = 0

  A_5 = 7!/(5! ⋅ (7−5)!) ⋅ 7 ⋅ 1 = 21 ⋅ 7 = 147

  A_6 = 7!/(6! ⋅ (7−6)!) ⋅ 7 ⋅ (8 ⋅ 1 − 5 ⋅ 1) = 7 ⋅ 7 ⋅ (8 − 5) = 147

  A_7 = 7 ⋅ (8^2 ⋅ 1 − 6 ⋅ 8 + 15) = 7 ⋅ (64 − 48 + 15) = 7 ⋅ 31 = 217
Weight function:

  A(z) = 1 + 147 z^5 + 147 z^6 + 217 z^7

Correctness check:

  ∑_{i=0}^{n} A_i = 1 + 147 + 147 + 217 = 512 = 8^3 = q^k

RS codes are especially suited for the correction of burst errors.

RS codes may be inefficient when correcting single errors, since whole symbols always have to be corrected: the capability of correcting symbols requires more redundancy than that of correcting single bits.
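Eq. (254) and the correctness check can be reproduced directly (a sketch in Python; `A` is our helper name):

```python
# Weight distribution of the (7,3) RS code over GF(8): q = 8, n = 7, d = 5, k = 3.
from math import comb

q, n, d, k = 8, 7, 5, 3

def A(i):
    if i == 0:
        return 1
    if i < d:
        return 0
    # Eq. (254) for MDS codes
    return comb(n, i) * (q - 1) * sum((-1)**j * comb(i - 1, j) * q**(i - d - j)
                                      for j in range(i - d + 1))

weights = [A(i) for i in range(n + 1)]
print(weights)                 # → [1, 0, 0, 0, 0, 147, 147, 217]
print(sum(weights) == q**k)    # → True (512 = 8^3)
```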
Decoding RS codes

Two possibilities:
  Decoding by tables: requires sufficient memory
  Algebraic decoding

Procedure for algebraic decoding:
  Calculation of the syndrome
  Calculation of the error positions (key equation)
  Calculation of the values of the error vector (in general, RS codes are non-binary codes)
  Correction of the errors
Calculation of error positions

Error position polynomial: zeros indicate error positions

  c(x) ↔ C(x)                                          (255)

  definition: c_i = 0 for e_i ≠ 0
  ⇒ c_i ⋅ e_i = 0 for i = 0, 1, …, n − 1               (256)

each error position c_i = 0 generates a zero / a linear factor in C(x):

  C(x) = ∏_{i, e_i≠0} (x − z^i)                        (257)

  degree{C(x)} = number of error positions e
DFT of the product c_i ⋅ e_i:

  c_i ⋅ e_i = 0   ↔   C(x) ⋅ E(x) = 0 mod (x^n − 1)                 (258)

C(x) ⋅ E(x) exhibits all possible zeros

assumption:

  e ≤ t = (n − k)/2                                                  (259)

error position polynomial:

  C(x) = C_0 + C_1 x + … + C_e x^e

selecting C_0 = 1:

  C(x) = ∏_{i, e_i≠0} (1 − z^{−i} ⋅ x) = 1 + C_1 x + … + C_e x^e    (260)
Error position polynomial
a(x) xxxxxxxxxxxxxxxxxxxx A(x) xxxxxxxxxxxxxx000000
e(x) 00000x000x000000x000 E(x) xxxxxxxxxxxxxxxxxxxx
r(x) xxxxxxxxxxxxxxxxxxxx R(x) xxxxxxxxxxxxxxxxxxxx S(x)
c(x) xxxxx0xxx0xxxxxx0xxx C(x) xxxx0000000000000000
ci⋅ei = 0 C(x)⋅E(x) = 0 mod (xn−1)
Calculation of coefficients C_i, assumption: e = t. Representation of C(x) ⋅ E(x) = 0 mod (x^n − 1) as an equation system (one equation per coefficient, indices mod n):

  C_0 E_j + C_1 E_{j−1} + C_2 E_{j−2} + … + C_t E_{j−t} = 0,   j = 0, 1, …, n − 1   (261)

The framed equations, those that contain only the known coefficients E_{n−2t}, …, E_{n−1}, are the ones for j = n − t, …, n − 1:

  C_0 E_{n−t}   + C_1 E_{n−t−1} + … + C_t E_{n−2t}   = 0
  C_0 E_{n−t+1} + C_1 E_{n−t}   + … + C_t E_{n−2t+1} = 0
  ⋮
  C_0 E_{n−1}   + C_1 E_{n−2}   + … + C_t E_{n−t−1}  = 0
Frame in (261): t equations and t variables

All required coefficients for the calculation of the C_i are known:

  S_0 = E_{n−2t},  S_1 = E_{n−2t+1},  …,  S_{2t−1} = E_{n−1}

⇒
  C_0 S_t      + C_1 S_{t−1}  + … + C_t S_0     = 0
  C_0 S_{t+1}  + C_1 S_t      + … + C_t S_1     = 0                 (262)
  ⋮
  C_0 S_{2t−1} + C_1 S_{2t−2} + … + C_t S_{t−1} = 0

= cyclic convolution:

  ∑_{i=0}^{t} C_i ⋅ S_{j−i} = 0   for j = t, …, 2t − 1              (263)
Alternative representation by multiplication of polynomials C(x) ⋅ S(x):

  x^0:      C_0 S_0                                       = T_0
  x^1:      C_0 S_1 + C_1 S_0                             = T_1
  ⋮
  x^{t−1}:  C_0 S_{t−1} + C_1 S_{t−2} + … + C_{t−1} S_0   = T_{t−1}
  x^t:      C_0 S_t + C_1 S_{t−1} + … + C_t S_0           = 0          (264)
  x^{t+1}:  C_0 S_{t+1} + C_1 S_t + … + C_t S_1           = 0
  ⋮
  x^{2t−1}: C_0 S_{2t−1} + C_1 S_{2t−2} + … + C_t S_{t−1} = 0
Definition of the polynomial T(x), which can be used for the calculation of error values:

  T(x) = T_0 + T_1 x + … + T_{t−1} x^{t−1}                        (265)

Comparison of coefficients ⇒ key equation:

  C(x) ⋅ E(x) = 0 mod (x^n − 1)
              = T(x) ⋅ (x^n − 1) = T(x) ⋅ x^n − T(x)              (266)

In general, the number of errors is unknown: e ≠ t. Errors with small weight are more likely than errors with large weight ⇒ assumption: e ≤ t
Searched coefficients: (C_0,) C_1, C_2, …, C_e

Number of variables: e, number of equations: t. Overdetermined equation system ⇒ different approaches are required for C(x).

Matrix representation:

  [ S_0        S_1  …  S_e     ]   [ C_e     ]   [ 0 ]
  [ S_1        S_2  …  S_{e+1} ] ⋅ [ C_{e−1} ] = [ 0 ]   for e = 1, 2, …, t   (267)
  [ ⋮                          ]   [ ⋮       ]   [ ⋮ ]
  [ S_{2t−1−e}     …  S_{2t−1} ]   [ C_0     ]   [ 0 ]

  ∑_{i=0}^{e} C_i ⋅ S_{j−i} = 0   for j = e, …, 2t − 1;  e = 1, 2, …, t       (268)
Starting the search for the coefficients of the error position polynomial with the most likely error event: e = 1, e = 2, …

Decoder failure is possible if no error position polynomial with e zeros is found.

Solution of the key equation with low computational complexity:
  Berlekamp-Massey algorithm
  Euclid's division algorithm
Example: (6,2) RS code for GF(7) with z = 5
Parameters: n = 6, k = 2, d = dH,min = 5, t = 2
Received vector and syndrome:
r = (r0,r1,r2,r3,r4,r5) = (1,2,3,1,1,1)
R = (R0,R1,R2,R3,R4,R5) = (R0,R1,E2,E3,E4,E5) = (5,0,4,6,6,1)
S = (E2,E3,E4,E5) =(S0,S1,S2,S3) = (4,6,6,1)
Assumption e = 1: C(x) = C_0 + C_1 x, normalization: C_1 = 1

  [ S_1 S_0 ]             [ 6 4 ]             [ 0 ]
  [ S_2 S_1 ] ⋅ [ C_0 ] = [ 6 6 ] ⋅ [ C_0 ] = [ 0 ]
  [ S_3 S_2 ]   [ C_1 ]   [ 1 6 ]   [ 1   ]   [ 0 ]
  6 C_0 + 4 = 0 ⇒ C_0 = 4
  6 C_0 + 6 = 0 ⇒ C_0 = 6
  1 C_0 + 6 = 0 ⇒ C_0 = 1

Contradiction ⇒ e = 2: C(x) = C_0 + C_1 x + C_2 x^2, normalization: C_2 = 1

                          [ C_0 ]             [ C_0 ]
  [ S_2 S_1 S_0 ]       ⋅ [ C_1 ] = [ 6 6 4 ] [ C_1 ] = [ 0 ]
  [ S_3 S_2 S_1 ]         [ C_2 ]   [ 1 6 6 ] [ 1   ]   [ 0 ]

  I:    6 C_0 + 6 C_1 + 4 = 0
  II:   1 C_0 + 6 C_1 + 6 = 0
  I−II: 5 C_0 − 2 = 0 ⇒ C_0 = 2 ⋅ 5^{−1} = 6
  II:   6 C_1 + 5 = 0 ⇒ C_1 = −5 ⋅ 6^{−1} = 5
Error position polynomial: C(x) = 6 + 5x + x^2

Search for zeros (Chien search):
  C(x = z^0 = 1) = 6 + 5 + 1 = 5
  C(x = z^1 = 5) = 6 + 4 + 4 = 0 ⇒ zero at z^1
  C(x = z^2 = 4) = 6 + 6 + 2 = 0 ⇒ zero at z^2
  C(x = z^3 = 6) = 6 + 2 + 1 = 2
  C(x = z^4 = 2) = 6 + 3 + 4 = 6
  C(x = z^5 = 3) = 6 + 1 + 2 = 2

Errors at the 1st and 2nd symbol position

Test: C(x) = (x − z^1) ⋅ (x − z^2) = 6 + 5x + x^2
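The Chien search is just an exhaustive evaluation of C(x) at all z^i; as a sketch (variable names are ours):

```python
# Chien search for the (6,2) RS code: evaluate C(x) = 6 + 5x + x^2 at all z^i over GF(7).
P, n, z = 7, 6, 5
C = [6, 5, 1]

values = [sum(c * pow(z, i * j, P) for j, c in enumerate(C)) % P for i in range(n)]
zeros = [i for i, v in enumerate(values) if v == 0]
print(values)   # → [5, 0, 0, 2, 6, 2]
print(zeros)    # → [1, 2]: errors at symbol positions 1 and 2
```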
Recursive calculation of symbol error values

Eq. (258) corresponds to a cyclic convolution of the coefficients:

  c_i ⋅ e_i = 0   ↔   C(x) ⋅ E(x) = 0 mod (x^n − 1)

with C(x) = C_0 + C_1 x + … + C_e x^e
and  E(x) = E_0 + E_1 x + … + E_{k−1} x^{k−1} + E_k x^k + E_{k+1} x^{k+1} + … + E_{n−1} x^{n−1}
            (unknown part: E_0 … E_{k−1}; known part: S)

  ∑_{i=0}^{e} C_i ⋅ E_{j−i} = 0             for j = 0, 1, …, n − 1 (index mod n)   (269)

  C_0 E_j + ∑_{i=1}^{e} C_i ⋅ E_{j−i} = 0   for j = 0, 1, …, n − 1 (index mod n)   (270)
Solving for E_j gives a recursive calculation of the error values:

  E_j = −C_0^{−1} ⋅ ∑_{i=1}^{e} C_i ⋅ E_{j−i}   for j = 0, 1, …, n − 1 (index mod n)   (271)

Example (contd.): (6,2) RS code for GF(7) with z = 5

  R = (R_0,R_1,E_2,E_3,E_4,E_5) = (5,0,4,6,6,1),   C(x) = 6 + 5x + x^2

  E_0 = −C_0^{−1} ⋅ (C_1 ⋅ E_{−1} + C_2 ⋅ E_{−2})
      = −C_0^{−1} ⋅ (C_1 ⋅ E_5 + C_2 ⋅ E_4)
      = −6^{−1} ⋅ (5 ⋅ 1 + 1 ⋅ 6) = 1 ⋅ (5 + 6) = 4
  E_1 = −C_0^{−1} ⋅ (C_1 ⋅ E_0 + C_2 ⋅ E_{−1})
      = −C_0^{−1} ⋅ (C_1 ⋅ E_0 + C_2 ⋅ E_5)
      = −6^{−1} ⋅ (5 ⋅ 4 + 1 ⋅ 1) = 1 ⋅ (6 + 1) = 0

Complete error vector in the frequency domain:

  E = (E_0,E_1,E_2,E_3,E_4,E_5) = (4,0,4,6,6,1)

Transformation back into the time domain:

  e = (e_0,e_1,e_2,e_3,e_4,e_5) = (0,1,2,0,0,0)

Result:

  â = r − e = (1,2,3,1,1,1) − (0,1,2,0,0,0) = (1,1,1,1,1,1)
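The whole decoding example can be replayed end to end (a sketch; `dft`/`idft` are our helpers, assuming the lecture's conventions A_j = n^{−1} Σ a_i z^{−ij} with n^{−1} = −1 in GF(7), and a_i = Σ A_j z^{ij}):

```python
# Decode r = (1,2,3,1,1,1) for the (6,2) RS code over GF(7), z = 5.
P, n, z = 7, 6, 5

def dft(v):
    return [(-sum(v[i] * pow(z, (-i * j) % n, P) for i in range(n))) % P
            for j in range(n)]

def idft(V):
    return [sum(V[j] * pow(z, (i * j) % n, P) for j in range(n)) % P
            for i in range(n)]

r = [1, 2, 3, 1, 1, 1]
R = dft(r)
print(R)                 # → [5, 0, 4, 6, 6, 1]; syndrome S = R[2:] = [4, 6, 6, 1]

C = [6, 5, 1]            # error position polynomial from the Chien search
E = R[:]                 # E_2..E_5 are known; E_0, E_1 follow from the recursion (271)
for j in (0, 1):
    E[j] = (-pow(C[0], P - 2, P) * sum(C[i] * E[(j - i) % n] for i in (1, 2))) % P
print(E)                 # → [4, 0, 4, 6, 6, 1]

e = idft(E)
print(e)                 # → [0, 1, 2, 0, 0, 0]
a = [(ri - ei) % P for ri, ei in zip(r, e)]
print(a)                 # → [1, 1, 1, 1, 1, 1]
```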
Correcting errors and erasures for RS codes
Discrete memoryless channel with erasures for transmission of symbols from GF(pm)
Description of the errors-and-erasures channel:
r = a + e + b
Input vector: a =(a0,a1,a2,...,an−1) with ai ∈ GF(pm)
Output vector: r =(r0,r1,r2,...,rn−1) with ri ∈ {GF(pm),?}
Error vector: e =(e0,e1,e2,...,en−1) with ei ∈ GF(pm)
Erasure vector: b =(b0,b1,b2,...,bn−1) with bi ∈ {0,?}
Calculation rules for erasures (with a_i ∈ GF(p^m)):

  ? + a_i = ?                       (272)

  ? ⋅ a_i = 0   for a_i = 0         (273)

  ? ⋅ a_i = ?   for a_i ≠ 0         (274)
Error position polynomial:

  C_e(x) = ∏_{i, e_i≠0} (x − z^i)         (275)

  degree{C_e(x)} = number of errors

Erasure position polynomial:

  C_b(x) = ∏_{i, b_i=?} (x − z^i)         (276)

  degree{C_b(x)} = number of erasures

Numbers of errors e and erasures b that can be corrected:

  b + 2 ⋅ e ≤ 2 ⋅ t = d − 1 = n − k       (277)
Procedure:
  1. Modification of the received word: multiplication with the erasure position polynomial
  2. Calculation of the error positions
  3. Calculation of the error and erasure values

1. Modification of the received word

Erasure position polynomial in the time domain, C_b(x) ↔ c_b(x):

  c_b,i = C_b(x = z^i) = 0   for {i | b_i = ?}    (278)

  ⇒ b_i ⋅ c_b,i = 0                               (279)
Definition:

  r̃ = r × c_b   (element-wise multiplication)                        (280)

  r̃_i = r_i ⋅ c_b,i = (a_i + e_i + b_i) ⋅ c_b,i
      = a_i ⋅ c_b,i + e_i ⋅ c_b,i = ã_i + ẽ_i                         (281)

Transformation into the frequency domain:

  R̃(x) = R(x) ⋅ C_b(x) = (A(x) + E(x)) ⋅ C_b(x)   mod (x^n − 1)
       = A(x) ⋅ C_b(x) + E(x) ⋅ C_b(x)            mod (x^n − 1)      (282)
       = Ã(x) + Ẽ(x)                              mod (x^n − 1)
Effect of the multiplication r̃_i = r_i ⋅ c_b,i in the frequency domain:

  A = (A_0, …, A_{k−1}, 0, …, 0)                     (n−2t) symbols, then 2t zeros

  Ã = (Ã_0, …, Ã_{k−1}, Ã_k, …, Ã_{k+b−1}, 0, …, 0)  (n−2t) symbols, b symbols, (2t−b) zeros

  R̃ = (R̃_0, …, R̃_{k−1}, R̃_k, …, R̃_{k+b−1}, R̃_{k+b}, …, R̃_{n−1})
  Ẽ = (Ẽ_0, …, Ẽ_{k−1}, Ẽ_k, …, Ẽ_{k+b−1}, Ẽ_{k+b}, …, Ẽ_{n−1})

  R̃ = (R̃_0, …, R̃_{k+b−1}, Ẽ_{k+b}, …, Ẽ_{n−1}),
  where the last (2t−b) symbols form the syndrome S̃
Length of the remainder of the syndrome: (2t − b) symbols

⇒ number of correctable errors:

  e ≤ e_max = t − b/2                                                         (283)

2. Calculation of error positions

Restriction of the key equation to the remainder of the syndrome:

  S̃ = (S̃_0, S̃_1, …, S̃_{2e_max−1}) = (Ẽ_{k+b}, Ẽ_{k+b+1}, …, Ẽ_{n−1})          (284)

  ∑_{i=0}^{e} C_e,i ⋅ S̃_{j−i} = 0   for j = e, …, 2e_max − 1;  e = 1, 2, …, e_max   (285)

⇒ error positions and erasure positions are known
3. Calculation of error and erasure values

Errors-and-erasures polynomial:

  C(x) = C_e(x) ⋅ C_b(x) = ∏_{i, (e_i≠0)∪(b_i=?)} (x − z^i)          (286)

Modification of the received vector r → r̂, substitution of erasures by an arbitrary value (0):

  r̂_i = r_i   for r_i ∈ GF(p^m)
  r̂_i = 0     for r_i = ?                                            (287)

Error vector ê with

  ê_i ≠ 0   for (e_i ≠ 0) ∪ (b_i = ?)
  ê_i = 0   for (e_i = 0) ∩ (b_i = 0)                                (288)
Modified received vector:

  r̂ = â + ê   ↔   R̂ = Â + Ê                                          (289)

with â = estimate for the transmitted code word.

Condition for the errors-and-erasures position polynomial:

  c_i ⋅ ê_i = 0   ↔   C(x) ⋅ Ê(x) = 0 mod (x^n − 1)                   (290)

Recursive calculation of the error values corresponding to Eq. (271):

  Ê_j = −C_0^{−1} ⋅ ∑_{i=1}^{e+b} C_i ⋅ Ê_{j−i}   for j = 0, 1, …, n − 1 (index mod n)   (291)
with C(x) = C_0 + C_1 x + … + C_{e+b} x^{e+b}                        (292)

and

  Ê = (Ê_0, Ê_1, …, Ê_{k−1}, Ê_k, …, Ê_{n−1})                        (293)
      (Ê_0 … Ê_{k−1}: unknown part;  Ê_k … Ê_{n−1}: known part, Ŝ)
Example: (6,2) RS code for GF(7) with z = 5
Parameters: n = 6, k = 2, d = dH,min = 5, t = 2
Code word vector: A = (A0,A1,A2,A3,A4,A5) = (3,3,0,0,0,0)
Code word vector: a = (a0,a1,a2,a3,a4,a5) = (6,4,1,0,2,5)
Error vector: e = (e0,e1,e2,e3,e4,e5) = (0,5,0,0,0,0)
Erasure vector: b = (b0,b1,b2,b3,b4,b5) = (0,0,0,?,?,0)
Received vector: r = (r0,r1,r2,r3,r4,r5) = (6,2,1,?,?,5)
Erasure position polynomial: Cb(x) = (x−z3)⋅(x−z4) = x2 + 6x + 5
Number of erasures: b = 2
Number of correctable errors:

  e ≤ e_max = t − b/2 = 2 − 2/2 = 1
Transformation of the erasure position polynomial into the time domain:

  C_b = (5,6,1,0,0,0)   ↔   c_b = (5,4,3,0,0,4)

1. Modification of the received vector:

  r̃ = r × c_b = (6,2,1,?,?,5) × (5,4,3,0,0,4) = (2,1,3,0,0,6)

Transformation into the frequency domain:

  r̃ = (2,1,3,0,0,6)   ↔   R̃ = (2,1,2,2,4,5)

Syndrome of the modified received vector:

  S̃ = (S̃_0, S̃_1) = (4,5)
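Step 1 can be cross-checked numerically (a sketch; `None` stands in for the erasure symbol '?', and the DFT convention is the lecture's A_j = −Σ a_i z^{−ij} for GF(7)):

```python
# Modification of the received vector for the errors-and-erasures example over GF(7).
P, n, z = 7, 6, 5
r = [6, 2, 1, None, None, 5]          # None marks an erasure '?'
cb = [5, 4, 3, 0, 0, 4]               # c_b,i = C_b(z^i), zero at the erased positions

r_mod = [0 if ri is None else (ri * ci) % P for ri, ci in zip(r, cb)]
print(r_mod)                          # → [2, 1, 3, 0, 0, 6]

R_mod = [(-sum(r_mod[i] * pow(z, (-i * j) % n, P) for i in range(n))) % P
         for j in range(n)]
print(R_mod)                          # → [2, 1, 2, 2, 4, 5]; syndrome (R̃_4, R̃_5) = (4, 5)
```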
Assumption for the error position polynomial: e = 1, e_max = 1

  C_e(x) = C_e,0 + C_e,1 x,   definition: C_e,1 = 1

Key equation:

  (S̃_1  S̃_0) ⋅ [ C_e,0 ] = (5  4) ⋅ [ C_e,0 ] = 0
               [ C_e,1 ]            [ 1     ]

  5 C_e,0 + 4 = 0 ⇒ C_e,0 = −4 ⋅ 5^{−1} = 3 ⋅ 3 = 2

Error position polynomial: C_e(x) = 2 + x

Error position search: C_e(z^i) = 0

  C_e(z^0 = 1) = 2 + 1 = 3,   C_e(z^1 = 5) = 2 + 5 = 0 ⇒ i = 1
Errors-and-erasures polynomial:

  C(x) = C_e(x) ⋅ C_b(x) = (x + 2) ⋅ (x + 1) ⋅ (x + 5) = x^3 + x^2 + 3x + 3

  (C_0, C_1, C_2, C_3) = (3, 3, 1, 1)

2. Modification of the received vector:

  r̂ = (6,2,1,0,0,5)   ↔   R̂ = (0,3,1,0,2,0)

  Ê = (Ê_0, Ê_1, Ê_2, Ê_3, Ê_4, Ê_5) = (Ê_0, Ê_1, 1, 0, 2, 0),   known part: Ŝ = (1,0,2,0)

Recursive calculation of the error values:

  Ê_j = −C_0^{−1} ⋅ ∑_{i=1}^{e+b} C_i ⋅ Ê_{j−i}   for j = 0, 1, …, n − 1 (index mod n)
Estimate of the error vector in the frequency domain:

  Ê_0 = −C_0^{−1} ⋅ (C_1 Ê_{−1} + C_2 Ê_{−2} + C_3 Ê_{−3})
      = −C_0^{−1} ⋅ (C_1 Ê_5 + C_2 Ê_4 + C_3 Ê_3)
      = −3^{−1} ⋅ (3 ⋅ 0 + 1 ⋅ 2 + 1 ⋅ 0) = 2 ⋅ 2 = 4

  Ê_1 = −C_0^{−1} ⋅ (C_1 Ê_0 + C_2 Ê_{−1} + C_3 Ê_{−2})
      = −C_0^{−1} ⋅ (C_1 Ê_0 + C_2 Ê_5 + C_3 Ê_4)
      = −3^{−1} ⋅ (3 ⋅ 4 + 1 ⋅ 0 + 1 ⋅ 2) = 2 ⋅ 0 = 0

  Ê = (Ê_0, Ê_1, Ê_2, Ê_3, Ê_4, Ê_5) = (4, 0, 1, 0, 2, 0)
Error correction:

  Â = R̂ − Ê = (0,3,1,0,2,0) − (4,0,1,0,2,0) = (3,3,0,0,0,0) = A

  â = r̂ − ê = (6,2,1,0,0,5) − (0,5,0,0,5,0) = (6,4,1,0,2,5) = a
BCH codes (Bose-Chaudhuri-Hocquenghem codes)

  Extension of Hamming codes
  Optimized for the correction of several statistically independent single errors
  Special case of RS codes
  Binary code, code word in the time domain: a_i ∈ GF(2)
  Code word in the frequency domain: A_i ∈ GF(2^m)
  Definition similar to RS codes: in the frequency domain by zeros at the parity check frequencies, in the time domain by a generator polynomial
Conjugate elements

Two elements a, b ∈ GF(p^m) are called conjugate with each other, a ∼ b, if a can be written as a power of b:

  a = b^(p^r)                                          (294)

Properties:
  for all a ∈ GF(p^m):       a ∼ a
  for all a, b ∈ GF(p^m):    a ∼ b ⇒ b ∼ a
  for all a, b, c ∈ GF(p^m): a ∼ b and b ∼ c ⇒ a ∼ c
Equivalence class: subsets composed of all conjugate elements
  [z^i] = {z^i, z^j, z^k, z^l, …} with z^i ∼ z^j ∼ z^k ∼ z^l …

Equivalence classes build up a disjoint partition of GF(p^m)

Circular classes

Origin: complex numbers, nth root of 1:

  z = e^(j2π/n)  ⇒  z^n = e^(j2π) = 1

Extended Galois fields can be viewed as circular fields:

  z^(p^m − 1) = z^n = z^0 = 1

Circular classes K_j contain all exponents of elements which are conjugate with each other:

  K_j = {j ⋅ p^i mod n | i = 0, 1, …, m − 1}   with n = p^m − 1       (295)
Example: GF(2^4) with primitive polynomial p(x) = x^4 + x + 1 and primitive element z with z^4 + z + 1 = 0

Elements in GF(2^4): GF(2^4) = {0, z^0, z^1, z^2, …, z^(n−1)}
with n = p^m − 1 = 2^4 − 1 = 15

Calculation modulo n in the exponents: z^0 = z^n

  z^1 ∼ z^2, since z^2 = (z^1)^2
  z^2 ∼ z^4, since z^4 = (z^2)^2
  z^4 ∼ z^8, since z^8 = (z^4)^2
  z^8 ∼ z^1, since (z^8)^2 = z^16 = z^(16 mod 15) = z^1

Equivalence class: {z^1, z^2, z^4, z^8},   circular class: {1, 2, 4, 8}
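Eq. (295) is easy to enumerate for GF(2^4); a sketch (the dictionary bookkeeping is ours):

```python
# Circular (cyclotomic) classes K_j = {j * 2^i mod 15} for GF(2^4): p = 2, m = 4.
p, m = 2, 4
n = p**m - 1

classes = {}
for j in range(n):
    K = tuple(sorted({(j * p**i) % n for i in range(m)}))
    classes.setdefault(K, []).append(j)

for K in classes:
    print(list(K))
# → [0], [1, 2, 4, 8], [3, 6, 9, 12], [5, 10], [7, 11, 13, 14]
```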
Minimal polynomial, definition and properties
For each a ∈ GF(pm) there is a uniquely defined normalized polynomial m[a](x) of minimum degree with coefficients from GF(p) for which a is a zero. ⇒ minimal polynomial
m[a](x) is irreducible
Two conjugate elements have the same minimal polynomial: a ∼ b ⇒ m[a](x) = m[b](x)
Degree of the minimal polynomial = number of elements of the equivalence class
Calculation of the minimal polynomial:

  m_[a](x) = ∏_{b ∈ [a]} (x − b) = ∏_{i ∈ K_[a]} (x − z^i)           (296)

Product of all minimal polynomials from GF(p^m):

  ∏_i m_[a_i](x) = x^n − 1                                           (297)

One of the minimal polynomials corresponds to the primitive polynomial.
Example (continued)

  Circular class       Minimal polynomial
  K_0 = {0}            m_0(x) = (x − z^0) = x + 1
  K_1 = {1,2,4,8}      m_1(x) = (x − z^1)(x − z^2)(x − z^4)(x − z^8) = x^4 + x + 1
  K_3 = {3,6,9,12}     m_3(x) = (x − z^3)(x − z^6)(x − z^9)(x − z^12) = x^4 + x^3 + x^2 + x + 1
  K_5 = {5,10}         m_5(x) = (x − z^5)(x − z^10) = x^2 + x + 1
  K_7 = {7,11,13,14}   m_7(x) = (x − z^7)(x − z^11)(x − z^13)(x − z^14) = x^4 + x^3 + 1

  m_0(x) ⋅ m_1(x) ⋅ m_3(x) ⋅ m_5(x) ⋅ m_7(x) = x^15 − 1
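The last identity of the table can be verified with GF(2) polynomial multiplication (a sketch; `mul_gf2` is our helper, coefficients lowest degree first):

```python
# Check m0*m1*m3*m5*m7 = x^15 + 1 over GF(2) (+ and - coincide in GF(2)).
def mul_gf2(a, b):
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj
    return out

m0 = [1, 1]                      # x + 1
m1 = [1, 1, 0, 0, 1]             # x^4 + x + 1
m3 = [1, 1, 1, 1, 1]             # x^4 + x^3 + x^2 + x + 1
m5 = [1, 1, 1]                   # x^2 + x + 1
m7 = [1, 0, 0, 1, 1]             # x^4 + x^3 + 1

prod = m0
for m in (m1, m3, m5, m7):
    prod = mul_gf2(prod, m)
print(prod == [1] + [0] * 14 + [1])   # → True, i.e. x^15 + 1
```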
Derivation of BCH codes in the frequency domain

Binary code: a_i ∈ GF(2), A_i ∈ GF(2^m)

Code word length: n = 2^m − 1

DFT for odd n: n mod 2 = 1 ⇒ n^{−1} = 1                               (298)

  A_i = a(x = z^{−i}),   a_j = A(x = z^j)                             (299)

Zeros in the frequency domain: A_i = 0 = a(x = z^{−i})                (300)

Theorem for polynomials f(x) in GF(2): [f(x)]^2 = f(x^2)              (301)

  A_2i = a(x = z^{−2i}) = a(x = (z^{−i})^2) = [a(x = z^{−i})]^2 = (A_i)^2
  A_4i = a(x = z^{−4i}) = a(x = (z^{−2i})^2) = [a(x = z^{−2i})]^2 = (A_2i)^2 = (A_i)^4
  A_8i = a(x = z^{−8i}) = a(x = (z^{−4i})^2) = [a(x = z^{−4i})]^2 = (A_4i)^2 = (A_i)^8   (302)
General condition for which a_i ∈ GF(2):

  A_{2^j ⋅ i} = (A_i)^{2^j}                                           (303)

The (n, k, d_min) BCH code is defined by the parameters p, m, d, which can be selected:
  code word length: n = p^m − 1
  minimum distance: d_min ≥ d
  dimension: k ≤ n + 1 − d_min ≤ n + 1 − d
  number of parity check symbols: n − k ≥ 2t

  C = {a | a_i = A(z^i), degree{A(x)} ≤ n − d, a_i ∈ GF(p), A_i ∈ GF(p^m)}   (304)

with z = primitive element from GF(p^m).
Binary BCH code: p = 2

  C = {a | a_i = A(z^i), degree{A(x)} ≤ n − d, a_i ∈ GF(2), A_i ∈ GF(2^m)}
  C = {a | a_i = A(z^i), degree{A(x)} ≤ n − d, A_{2^j ⋅ i} = (A_i)^{2^j}, A_i ∈ GF(2^m)}   (305)

Example: BCH code for GF(2^3) with n = 2^3 − 1 = 7, t = 1, d = 3

Code word vector: A = (A_0,A_1,A_2,A_3,A_4,A_5,A_6) = (A_0,A_1,A_2,A_3,A_4,0,0)

  A_0:  A_{2^j ⋅ 0} = A_0 = (A_0)^{2^j}  ⇒  A_0 ∈ GF(2)
  A_1:  A_{2^j ⋅ 1} = A_{2^j} = (A_1)^{2^j}
        ⇒ A_2 = (A_1)^2,  A_4 = (A_1)^4,  A_{8 mod 7} = A_1 = (A_1)^8  ⇒  A_1 ∈ GF(2^3)
  A_3:  A_{2^j ⋅ 3} = (A_3)^{2^j}
        ⇒ A_6 = (A_3)^2,  A_{12 mod 7} = A_5 = (A_3)^4,  A_{24 mod 7} = A_3 = (A_3)^8  ⇒  A_3 ∈ GF(2^3)
        but: A_5 = A_6 = 0 ⇒ A_3 = 0

Encoding example with p(x) = 1 + x + x^3, 1 + z + z^3 = 0, primitive element z:

    0   = 0    z^1 = z      z^3 = 1 + z      z^5 = 1 + z + z^2
    z^0 = 1    z^2 = z^2    z^4 = z + z^2    z^6 = 1 + z^2

  A_0 = 1, A_1 = z  ⇒  A = (1, z, z^2, 0, z^4, 0, 0),   A(x) = 1 + z x + z^2 x^2 + z^4 x^4

  a_0 = A(z^0) = 1 + z ⋅ 1 + z^2 ⋅ 1^2 + z^4 ⋅ 1^4 = 1 + z + z^2 + z + z^2 = 1
  a_1 = A(z^1) = 1 + z ⋅ z + z^2 ⋅ z^2 + z^4 ⋅ z^4 = 1 + z^2 + z^4 + z^8 = 1 + z^2 + z + z^2 + z = 1
  …
  ⇒ a = (1,1,0,1,0,0,0)
Generating a BCH code by DFT
Constructing BCH codes in the time domain with a generator polynomial

  K_j are circular classes with respect to n = 2^m − 1
  z is a primitive element in GF(p^m)
  M is the union of circular classes: M = K_j1 ∪ K_j2 ∪ …

A BCH code of length n = 2^m − 1 is determined by the generator polynomial:

  g(x) = ∏_{i ∈ M} (x − z^i) = m_j1(x) ⋅ m_j2(x) ⋅ …                 (306)

If there are d − 1 subsequent numbers in M, the designed minimum distance is d.
Dimension: k = n − degree{g(x)}
Example in GF(2^4) with n = 2^4 − 1 = 15

I:
  g(x) = ∏_{i ∈ {K_1, K_2}} (x − z^i) = m_1(x) = x^4 + x + 1

  designed minimum distance: d = 3, since M = {K_1, K_2} = K_1 = {1,2,4,8} contains
  d − 1 = 2 subsequent numbers (M = K_1 = {K_1 ∪ K_2}!)
  Dimension: k = n − degree{g(x)} = 15 − 4 = 11

II:
  g(x) = ∏_{i ∈ {K_1, K_2, K_3, K_4}} (x − z^i) = m_1(x) ⋅ m_3(x)
       = (x^4 + x + 1) ⋅ (x^4 + x^3 + x^2 + x + 1) = x^8 + x^7 + x^6 + x^4 + 1

  designed minimum distance: d = 5, since M = {K_1 ∪ K_3} = {1,2,3,4,6,8,9,12} contains
  d − 1 = 4 subsequent numbers
  Dimension: k = n − degree{g(x)} = 15 − 8 = 7
III:
  g(x) = ∏_{i ∈ {K_1, …, K_6}} (x − z^i) = m_1(x) ⋅ m_3(x) ⋅ m_5(x)
       = (x^4 + x + 1) ⋅ (x^4 + x^3 + x^2 + x + 1) ⋅ (x^2 + x + 1)
       = x^10 + x^8 + x^5 + x^4 + x^2 + x + 1

  designed minimum distance: d = 7, since M = {K_1 ∪ K_3 ∪ K_5} = {1,2,3,4,5,6,8,9,10,12}
  contains d − 1 = 6 subsequent numbers
  Dimension: k = n − degree{g(x)} = 15 − 10 = 5
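The generator polynomials of examples II and III follow from GF(2) polynomial products of the minimal polynomials (a sketch; `mul_gf2` is our helper):

```python
# BCH generator polynomials over GF(2): g_II = m1*m3, g_III = m1*m3*m5.
def mul_gf2(a, b):
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj
    return out

m1 = [1, 1, 0, 0, 1]        # x^4 + x + 1
m3 = [1, 1, 1, 1, 1]        # x^4 + x^3 + x^2 + x + 1
m5 = [1, 1, 1]              # x^2 + x + 1

g2 = mul_gf2(m1, m3)
print(g2)                    # → [1, 0, 0, 0, 1, 0, 1, 1, 1]: x^8 + x^7 + x^6 + x^4 + 1
g3 = mul_gf2(g2, m5)
print(g3)                    # → [1, 1, 1, 0, 1, 1, 0, 0, 1, 0, 1]: x^10 + x^8 + x^5 + x^4 + x^2 + x + 1
```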
Encoding of BCH codes, like any other linear cyclic code:

  non-systematic encoding by multiplication with the generator polynomial
  systematic encoding by division by the generator polynomial

Difference with respect to RS codes: a shift of the parity check frequencies has an influence on the performance

Decoding similar to RS codes

  advantage: for a binary code, the calculation of error values is not necessary
Coding Theory: Convolutional codes
Properties and definition

  no blockwise generation of code words, but convolution of a whole sequence with a set of generator coefficients
  no analytical methods for the construction of code words ⇒ computer search
  simple processing of reliability information from the demodulator (soft decision input)
  sensitive with respect to burst errors
  usually binary codes
  ML decoding with the Viterbi algorithm
Definition of convolutional codes:
Information blocks ur = (ur,1,ur,2, ... ,ur,k)
Code blocks ar = (ar,1,ar,2, ... ,ar,n)
Binary code: ai,j , ui,j ∈ GF(2) = {0,1}
Encoding is an assignment with memory: the current code block is determined by the current and the m previous information blocks:
ar = ar(ur, ur−1, ... , ur−m) (307)
Convolutional codes are linear ⇒ linear combination of information bits:
ar,ν = ∑µ=0…m ∑κ=1…k gκ,µ,ν ⋅ ur−µ,κ (308)
Meaning of indices:
r = block number
k = length of information blocks
n = length of code blocks
m = memory length
µ = memory index
κ = index for the position within an information block
ν = index for the position within a code block
Binary coefficients: gκ,µ,ν ∈ GF(2) = {0,1}
with 1 ≤ κ ≤ k , 0 ≤ µ ≤ m , 1 ≤ ν ≤ n
L = m + 1 = constraint length
Size of memory (in bits): k ⋅ (m + 1) (minimum: k ⋅ m)
Code rate:
Block structure is not of great importance:
typical values: k = 1, 2, n = 2, 3, 4, m ≤ 8
Block codes can be seen as a special case of convolutional codes with memory length m = 0
RC = k / n (309)
Block diagram of a general (n,k,m) convolutional encoder
Example 1:
n = 2, k = 1, m = 2
(2,1,2) convolutional encoder
RC = 1/2
ar,1 = ur + ur−1 + ur−2
ar,2 = ur + ur−2
u = (1,1,0,1,0,0,...)
a = (11,01,01,00,10,11,...)
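The encoding equations of this example can be reproduced with a short shift-register routine (a Python illustration, not from the lecture):

```python
def encode_212(u):
    """(2,1,2) convolutional encoder:
    a_r1 = u_r + u_{r-1} + u_{r-2}, a_r2 = u_r + u_{r-2} (mod 2)."""
    s1 = s2 = 0              # shift register holding u_{r-1}, u_{r-2}
    a = []
    for ur in u:
        a.append((ur ^ s1 ^ s2, ur ^ s2))
        s1, s2 = ur, s1      # shift the register
    return a

print(encode_212([1, 1, 0, 1, 0, 0]))
# [(1, 1), (0, 1), (0, 1), (0, 0), (1, 0), (1, 1)]  ->  a = (11,01,01,00,10,11)
```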
Example 2: n = 3, k = 2, m = 2
(3,2,2) convolutional encoder
RC = 2/3
ar,1 = ur,2 + ur−1,1 + ur−2,2
ar,2 = ur,1 + ur−1,1 + ur−1,2
ar,3 = ur,2
u = (11,10,00,11,01,...)
a = (111,010,010,111,001,...)
Restriction for the following: k = 1
Description by polynomials
Generator polynomial:
gν(x) = ∑µ=0…m gµ,ν ⋅ x^µ (310)
Sequence of information bits:
u(x) = ∑r=0…∞ ur ⋅ x^r (311)
Sequence of code blocks:
a(x) = (a1(x), a2(x), … , an(x)) with aν(x) = ∑r=0…∞ ar,ν ⋅ x^r (312)
Convolutional encoding corresponds to multiplication of polynomials:
aν(x) = u(x) ⋅ gν(x) for ν = 1, … , n (313)
(a1(x), a2(x), … , an(x)) = u(x) ⋅ (g1(x), g2(x), … , gn(x)) (314)
⇔ a(x) = u(x) ⋅ G(x) (315)
with the generator matrix:
G(x) = (g1(x), g2(x), … , gn(x)) (316)
Defining convolutional codes by multiplication of polynomials:
C = { a(x) = u(x) ⋅ G(x) | u(x) = ∑r=0…∞ ur ⋅ x^r, ur ∈ {0,1} } (317)
Memory length:
m = max1≤ν≤n degree{gν(x)} (318)
Example 1:
G(x) = (g1(x),g2(x)) = (1 + x + x2, 1 + x2)
u = (1,1,0,1,0,0,...) ⇔ u(x) = 1 + x + x3
a(x) = (a1(x),a2(x)) = u(x) G(x)
= ((1 + x + x3)(1 + x + x2),(1 + x + x3)(1 + x2))
= (1 + x4 + x5,1 + x + x2 + x5)
⇔ a = (11,01,01,00,10,11,00,00,...)
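The polynomial view of the same example can be checked numerically; a sketch (not part of the lecture), with polynomials as coefficient lists, lowest degree first:

```python
def polymul_gf2(a, b):
    """Multiply two polynomials over GF(2)."""
    c = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            c[i + j] ^= ai & bj
    return c

u  = [1, 1, 0, 1]        # u(x)  = 1 + x + x^3
g1 = [1, 1, 1]           # g1(x) = 1 + x + x^2
g2 = [1, 0, 1]           # g2(x) = 1 + x^2

a1 = polymul_gf2(u, g1)  # 1 + x^4 + x^5
a2 = polymul_gf2(u, g2)  # 1 + x + x^2 + x^5
print(a1, a2)            # [1, 0, 0, 0, 1, 1] [1, 1, 1, 0, 0, 1]
```

Interleaving a1 and a2 blockwise reproduces a = (11,01,01,00,10,11).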
Alternative description of encoding by convolution of vectors
gν = (g0,ν , g1,ν , g2,ν , ... , gm,ν )
gν corresponds to the impulse response of a transversal filter
Code sequence for each bit of an output block:
aν = u ∗ gν ⇔ ai,ν = ∑µ=0…m ui−µ ⋅ gµ,ν for ν = 1, 2, … , n (319)
with u = (u0, u1, u2, … )
Ordering of bits:
a = (a1,0, a2,0, … , an,0, a1,1, a2,1, … , an,1, a1,2, a2,2, … , an,2, … ) (320)
Example 1:
g1(x) = 1 + x + x2, g2(x) = 1 + x2
g1 = (111), g2 = (101)
information sequence: u = (1,1,0,1,0,0,...)
result of graphical convolution:
a1 = (1,0,0,0,1,1,0,0,...)
a2 = (1,1,1,0,0,1,0,0,...)
⇒ a = (11,01,01,00,10,11,00,00,...)
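The graphical convolution above can be reproduced with a plain discrete convolution reduced mod 2 (a Python sketch, not part of the lecture):

```python
def conv_gf2(u, g):
    """Discrete convolution of two binary sequences, reduced mod 2."""
    out = [0] * (len(u) + len(g) - 1)
    for i, ui in enumerate(u):
        for j, gj in enumerate(g):
            out[i + j] ^= ui & gj
    return out

u = [1, 1, 0, 1, 0, 0]
a1 = conv_gf2(u, [1, 1, 1])   # impulse response g1 = (111)
a2 = conv_gf2(u, [1, 0, 1])   # impulse response g2 = (101)
print(a1)  # [1, 0, 0, 0, 1, 1, 0, 0]
print(a2)  # [1, 1, 1, 0, 0, 1, 0, 0]
```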
Graphical convolution
Special classes of convolutional codes

Systematic convolutional codes
Information bits are copied to coded bits
Usually, non-systematic convolutional codes show better properties
Block diagram of a general systematic convolutional encoder
Terminated convolutional codes
Inserting m known blocks (tail bits – usually zeros) after L information blocks
Termination ⇒ block structure ⇒ block code
Code rate: RC,terminated = (L ⋅ k) / ((L + m) ⋅ n) (321)
Punctured convolutional codes (usually k = 1)
Procedure
Convolutional encoding with a mother code with rate RC = 1/n
Combination of P code blocks (P = puncturing length)
Erase l code bits corresponding to a puncturing scheme (puncturing matrix P)
Code rate: RC,punctured = (P ⋅ k) / (P ⋅ n − l) (322)
RC,punctured < 1 ⇒ l < P ⋅ (n − k) (323)
High practical importance, especially RCPC codes (rate compatible punctured convolutional codes)
High rate codes are included in low rate codes.
Simple switching of code rates
Almost the same performance as non-punctured convolutional codes (k > 1) with the same code rate
Less computational effort for decoding if compared with non-punctured convolutional codes with k > 1
Example:
l = 1
u = (1,1,0,1,0,0,...)
a = (11,01,01,00,10,11,...)
apunctured = (11,0 ,01,0 ,10,1 ,...)
RC,punctured = (2 ⋅ 1) / (2 ⋅ 2 − 1) = 2/3
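The puncturing step of this example can be sketched as follows; the 2 × 2 puncturing matrix P used here is inferred from the example output (bit 2 of every second block is erased) and is an illustrative assumption:

```python
# Puncturing matrix over P = 2 blocks of the rate-1/2 mother code:
# column j tells which of the n = 2 bits of block j survive.
P = [[1, 1],   # bit 1: always kept
     [1, 0]]   # bit 2: erased in every second block

def puncture(blocks, P):
    """Erase code bits according to the periodic puncturing matrix P."""
    out = []
    period = len(P[0])
    for r, block in enumerate(blocks):
        for nu, bit in enumerate(block):
            if P[nu][r % period]:
                out.append(bit)
    return out

a = [(1, 1), (0, 1), (0, 1), (0, 0), (1, 0), (1, 1)]
print(puncture(a, P))  # [1,1, 0, 0,1, 0, 1,0, 1] -> (11,0,01,0,10,1)
```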
Representation of convolutional codes

Shift register circuit
Generator polynomials: multiplication of polynomials, convolution
Code tree
Node = state of the memory
2k branches starting at each node
Exponential increase of nodes by each encoding step
Number of possible paths: Npaths = (2^k)^Nblock = 2^(k ⋅ Nblock)
with Nblock = number of processed blocks
Example for a code tree
State diagram
Representation without repetition (no temporal information)
Idea: output block depends only on input block and contents of the memory
The new contents of the memory depend on the input block and the old memory contents
The minimum number of encoding steps to go from state A to state B can be read from a state diagram
Number of states: Nstates = (2^k)^m = 2^(k ⋅ m) (324)
Notation:
state = memory contents = ur−1 ur−2
input bits (output bits) = ur (ar,1 ar,2)

Example 1: n = 2, k = 1, m = 2, RC = 1/2
Example 2: n = 3, k = 2, m = 2, RC = 2/3
Trellis diagram
Corresponds to the state diagram with an additional temporal component
Basis for decoding
Trellis segment with all possible transitions of example 1
Trellis diagram for example 1 (encoding example)
Properties of convolutional codes

Linearity: convolution corresponding to Eq. (319) is a linear operation
The minimum distance is not suitable for convolutional codes − code words of convolutional codes do not have a fixed length
Definition of the free distance df : minimum Hamming distance between code sequences of arbitrary different information sequences
df = min {dH(a1,a2)| u1 ≠ u2} (325)
Linear code ⇒ distance with respect to the all-zeros sequence is sufficient for the calculation
Free distance df = minimum weight of all possible code sequences:
Optimum convolutional code: code with the maximum free distance for a given code rate and memory length
Return-to-zero path − definition: a path that leaves the all-zeros path and returns to it without touching the all-zeros path in between
df = min { wH(u(x) ⋅ G(x)) | u(x) = ∑i=0…∞ ui ⋅ x^i, u(x) ≠ 0 } (326)
Table of optimum convolutional codes

RC = 1/2 (n = 2):
m   df   g1(x)                      g2(x)
2   5    1 + x + x2                 1 + x2
3   6    1 + x + x3                 1 + x + x2 + x3
4   7    1 + x3 + x4                1 + x + x2 + x4
5   8    1 + x2 + x4 + x5           1 + x + x2 + x3 + x5
6   10   1 + x2 + x3 + x5 + x6      1 + x + x2 + x3 + x6

RC = 1/3 (n = 3):
m   df   g1(x)                      g2(x)                      g3(x)
2   8    1 + x + x2                 1 + x2                     1 + x + x2
3   10   1 + x + x3                 1 + x + x2 + x3            1 + x2 + x3
4   12   1 + x2 + x4                1 + x + x3 + x4            1 + x + x2 + x3 + x4
5   13   1 + x2 + x4 + x5           1 + x + x2 + x3 + x5       1 + x3 + x4 + x5
6   15   1 + x2 + x3 + x5 + x6      1 + x + x4 + x6            1 + x + x2 + x3 + x4 + x6
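The free distances of the rate-1/2 codes in the table can be verified by searching for the minimum-weight return-to-zero path; a sketch (not part of the lecture) using Dijkstra's algorithm over the state diagram, for k = 1:

```python
import heapq

def free_distance(g1, g2, m=2):
    """Free distance of a rate-1/2 convolutional code (k = 1):
    weight of the minimum-weight return-to-zero path, found with
    Dijkstra's algorithm. g1, g2 = coefficient lists of length m + 1."""
    def step(state, b):                # state = (u_{r-1}, ..., u_{r-m})
        bits = (b,) + state
        a1 = sum(x & y for x, y in zip(bits, g1)) % 2
        a2 = sum(x & y for x, y in zip(bits, g2)) % 2
        return bits[:m], a1 + a2       # next state, output weight
    zero = (0,) * m
    start, w0 = step(zero, 1)          # leave the all-zeros path with input 1
    heap, best = [(w0, start)], {}
    while heap:
        w, s = heapq.heappop(heap)
        if s == zero:                  # first return to the all-zeros state
            return w
        if best.get(s, float("inf")) <= w:
            continue
        best[s] = w
        for b in (0, 1):
            t, dw = step(s, b)
            heapq.heappush(heap, (w + dw, t))

print(free_distance([1, 1, 1], [1, 0, 1]))           # m = 2 row: df = 5
print(free_distance([1, 1, 0, 1], [1, 1, 1, 1], 3))  # m = 3 row: df = 6
```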
Search for a path that leaves the all-zeros path and returns to the all-zeros path with minimum weight
= return-to-zero path with minimum weight
Procedure:
Cut the state diagram at the all-zeros state
Describing the distance with the parameter D
Shortest path from 00 to 00:
00 → 10 → 01 → 00 with outputs (11), (10), (11) ⇒ wH = 5
Cutting the state diagram at the all-zeros state
Two paths with wH = 6:
00 → 10 → 11 → 01 → 00 with outputs (11), (01), (01), (11) ⇒ wH = 6
00 → 10 → 01 → 10 → 01 → 00 with outputs (11), (10), (00), (10), (11) ⇒ wH = 6

Generating function, transfer function:
T(D) = D^5 + 2D^6 + 4D^7 + … = D^5 ⋅ (1 + 2D + 4D^2 + … + 2^k ⋅ D^k + …) = D^5 / (1 − 2D)
Description of states by formal variables Zi
Equation system for the cut state diagram:
Zb = D^2 ⋅ Za + D^0 ⋅ Zc
Zc = D ⋅ Zb + D ⋅ Zd
Zd = D ⋅ Zb + D ⋅ Zd
Ze = D^2 ⋅ Zc
Solving the equation system yields the generating function:
T(D) = Ze / Za = D^5 / (1 − 2D)
Development into a power series:
1 / (1 − x) = 1 + x + x^2 + x^3 + x^4 + … for |x| < 1

Result:
T(D) = D^5 ⋅ 1 / (1 − 2D) = D^5 + 2D^6 + 4D^7 + 8D^8 + 16D^9 + …
⇒ df = 5
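The power series development can be reproduced by formal long division of polynomials (a small sketch, not from the lecture):

```python
def series(num, den, n):
    """First n coefficients of num(D)/den(D) by formal long division
    (coefficient lists, lowest degree first; den[0] must be 1)."""
    num = num + [0] * n
    out = []
    for i in range(n):
        c = num[i]
        out.append(c)
        for j in range(1, len(den)):       # subtract c * den shifted by i
            if i + j < len(num):
                num[i + j] -= c * den[j]
    return out

# T(D) = D^5 / (1 - 2D)
print(series([0, 0, 0, 0, 0, 1], [1, -2], 10))
# [0, 0, 0, 0, 0, 1, 2, 4, 8, 16] -> lowest non-zero power gives df = 5
```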
Generating function (extended version)
Introducing further formal variables into the state diagram
Variable for the distance (Hamming weight of the code word sequence): D
Variable for input ones (Hamming weight of the information word sequence): N
Variable for numbering of the encoding steps: L
Most important properties of the code can be read directly from the generating function.
Example for a generating function:
T(D,L,N) = Ze / Za = D^5 L^3 N / (1 − D L N (1 + L))
= D^5 L^3 N + D^6 L^4 N^2 + D^6 L^5 N^2 + D^7 L^5 N^3 + 2 D^7 L^6 N^3 + D^7 L^7 N^3 + …
Relation between minimum distance dmin and the number of encoding steps i: distance profile
Catastrophic error propagation: a limited number of transmission errors leads to an infinite number of errors in the decoder output
Criterion to detect a catastrophic encoder:
There are at least two different states that are not left if the information word does not change.
Code sequences for these states are identical.
Requirement to avoid catastrophic encoders (k = 1):
Generator polynomials must not have a common divisor:
gcd( g1(x), g2(x), … , gn(x) ) = 1 (327)
Example for a catastrophic encoder:
g1(x) = 1 + x
g2(x) = 1 + x2 = (1 + x) ⋅ (1 + x)
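Criterion (327) can be checked with a polynomial gcd over GF(2); a sketch (not part of the lecture) using integer bit masks, where bit i is the coefficient of x^i:

```python
def gf2_gcd(a, b):
    """gcd of two GF(2) polynomials given as integer bit masks."""
    while b:
        # reduce a modulo b by carry-less long division
        while a and a.bit_length() >= b.bit_length():
            a ^= b << (a.bit_length() - b.bit_length())
        a, b = b, a
    return a

# catastrophic example: g1 = 1 + x, g2 = 1 + x^2 = (1 + x)(1 + x)
print(bin(gf2_gcd(0b011, 0b101)))  # 0b11 -> common factor 1 + x: catastrophic
# example 1: g1 = 1 + x + x^2, g2 = 1 + x^2
print(gf2_gcd(0b111, 0b101))       # 1 -> coprime: not catastrophic
```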
Decoding
Viterbi decoding
Low computational effort (can be realized up to m ≈ 10)
Optimum (maximum likelihood decoding)
Sequential decoding
Suboptimum, low memory requirement (can be realized up to m ≈ 10 … 20)
Maximum likelihood decoding, Viterbi metric
Terminated code sequence a of length N, received sequence r
Goal: estimation of a code sequence â which coincides as well as possible with the transmitted sequence a
Search for the code sequence â for which the likelihood function p(r|â) becomes maximum
This corresponds to maximizing the log-likelihood function log p(r|â)
Memoryless channel:
p(r|a) = ∏i=0…N−1 p(ri|ai) and log p(r|a) = ∑i=0…N−1 log p(ri|ai)

Maximum-likelihood detection corresponds to maximizing the metric
M(r|a) = ∑i=0…N−1 (α ⋅ log p(ri|ai) + β) = ∑i=0…N−1 µ(ri|ai) (328)
with the metric increment µ(ri|ai) = α ⋅ log p(ri|ai) + β
Binary symmetric channel (BSC)
with ai ∈ {x1, x2} = {−1, +1} and ri ∈ {y1, y2} = {−1, +1}

p(ri|ai) = 1 − perr for ri = ai
p(ri|ai) = perr for ri ≠ ai

[Diagram: BSC with transition probabilities 1 − perr for ri = ai and perr for ri ≠ ai]

Metric increment:
µ(ri|ai) = +1 for ri = ai, −1 for ri ≠ ai, i.e. µ(ri|ai) = ri ⋅ ai (329)
with α = 2 / log((1 − perr) / perr) and β = 1 − α ⋅ log(1 − perr)
Maximizing the Viterbi metric corresponds to:
minimizing the Hamming distance or
maximizing the number of coincident bits.
The product metric is also optimum for the AWGN channel.
Viterbi algorithm: decoding scheme using the trellis diagram
assigning a metric to each of the 2^(k⋅m) states (start value = 0)
comparing the 2^k possible code sequences with the received sequence = calculation of the Viterbi metric
metric of the subsequent state = metric of the old state + metric increment
µ(ri|ai) = ri ⋅ ai
from all 2^k arriving paths, only the path with the highest metric will be maintained – it is called survivor
all other 2^k − 1 paths are erased, since they can never reach a higher metric
paths with the same metric are chosen randomly
Decoding result = path with highest metric
Practical systems: restriction to paths of length 5 ⋅ k ⋅ m bit
Memory requirement ≈ number of states × path length = 2^(k⋅m) ⋅ 5 ⋅ k ⋅ m bit
hard decision decoding: Viterbi decoding after taking hard decisions
soft decision decoding: Viterbi decoding directly with analog samples of the received signal (using reliability information)
Calculation of residual error probability
» simulation
» using the weight function
residual errors: error bursts
Example 1: (2,1,2) convolutional code
Information vector: u = (1,1,0,1,0,0,0,1,0,...)
Code sequence vector: a = (11,01,01,00,10,11,00,11,10,...)
Error vector: f = (00,10,00,00,01,00,00,00,00,...)
Received vector: r = (11,11,01,00,11,11,00,11,10,...)
Hard decision decoding
Used metric: number of coincident bits
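The decoding steps shown on the following slides can be reproduced with a compact Viterbi decoder for this (2,1,2) code; a sketch (not part of the lecture) assuming hard decisions and the coincident-bit metric, with state = (ur−1, ur−2) and ties broken by keeping the first survivor found:

```python
def viterbi_212(r_blocks):
    """Hard-decision Viterbi decoder for the (2,1,2) code with
    g1 = 1 + x + x^2, g2 = 1 + x^2; metric = number of coincident bits."""
    def transition(state, b):                  # state = (u_{r-1}, u_{r-2})
        s1, s2 = state
        return (b, s1), (b ^ s1 ^ s2, b ^ s2)  # next state, output block
    states = [(0, 0), (1, 0), (0, 1), (1, 1)]
    metric = {s: (0 if s == (0, 0) else float("-inf")) for s in states}
    paths = {s: [] for s in states}
    for r in r_blocks:
        new_metric = {s: float("-inf") for s in states}
        new_paths = {s: [] for s in states}
        for s in states:
            for b in (0, 1):
                nxt, out = transition(s, b)
                cand = metric[s] + (out[0] == r[0]) + (out[1] == r[1])
                if cand > new_metric[nxt]:     # keep only the survivor
                    new_metric[nxt] = cand
                    new_paths[nxt] = paths[s] + [b]
        metric, paths = new_metric, new_paths
    best = max(states, key=lambda s: metric[s])
    return paths[best]

r = [(1,1), (1,1), (0,1), (0,0), (1,1), (1,1), (0,0), (1,1), (1,0)]
print(viterbi_212(r))  # [1, 1, 0, 1, 0, 0, 0, 1, 0] -> both channel errors corrected
```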
Example: 1. step
Example: 2. step
Example: 3. step
Example: 4. step
Example: 5. step
Example: 6. step
Example: 7. step
Example: 8. step
Example: 9. step
Coding Theory Coding techniques
Concatenation of codes
Important: the requirements on the outer code are greatly reduced if an adapted interleaver is used
[Block diagram: outer channel encoder → interleaver → inner channel encoder → channel → inner channel decoder → deinterleaver → outer channel decoder]
Interleaver:
Reordering of bits so that at the output of the deinterleaver quasi-single errors are obtained
Transparent transmission
Problem: delay
Interleaving for bits or blocks

Delay in coded transmission systems:
Delay because of the block structure of encoder and decoder
Delay because of interleavers
Delay from the decoding process (fundamental problem for convolutional codes)
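A block interleaver can be sketched as writing row by row and reading column by column; the 3 × 4 size below is an arbitrary illustration, not from the lecture:

```python
def block_interleave(bits, rows, cols):
    """Write row by row into a rows x cols array, read out column by column."""
    assert len(bits) == rows * cols
    return [bits[r * cols + c] for c in range(cols) for r in range(rows)]

x = list(range(12))                 # 12 "bits", labelled by their position
print(block_interleave(x, 3, 4))    # [0, 4, 8, 1, 5, 9, 2, 6, 10, 3, 7, 11]
# neighbouring channel positions stem from original positions cols = 4 apart
# (within one column), so a short error burst is spread into quasi-single errors
```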
Block interleaver
Coding Theory Outlook
Capacity of an AWGN channel with continuous
signal alphabet and discrete
signal alphabet
Thanks for your attention and
much success for the examination!