8/13/2019 1.lecture1
CODING THEORY
A Bird's Eye View
Text Books
Shu Lin and Daniel J. Costello Jr., Error Control Coding: Fundamentals and Applications, Prentice Hall Inc.
R. E. Blahut, Theory and Practice of Error Control Coding, McGraw-Hill.
References:
Rolf Johannesson and Kamil Sh. Zigangirov, Fundamentals of Convolutional Coding, Universities Press (India) Ltd., 2001.
Proakis, Digital Communications, McGraw-Hill.
Shannon's Theorem (1948)
The Noisy Coding Theorem, due to Shannon:
Roughly: consider a channel with capacity C. If we are willing to settle for a transmission rate strictly below C, then there is an encoding scheme for the source data that will reduce the probability of a decision error to any desired level.
Problem: the proof is not constructive! To this day, no one has found a way to construct the coding schemes promised by Shannon's theorem.
Shannon's Theorem (1948) (contd.)
Additional concerns:
Is the coding scheme easy to implement, both in encoding and decoding?
It may require extremely long codes.
The Shannon-Hartley Theorem
Gives us a theoretical maximum bit-rate that can be transmitted with an arbitrarily small bit-error rate (BER), for a given average signal power, over a channel with bandwidth B Hz affected by AWGN.
For any given BER, however small, we can find a coding technique that achieves this BER; the smaller the given BER, the more complicated the coding technique will be.
Shannon-Hartley Theorem (contd.)
Let the channel bandwidth be B Hz and the signal-to-noise ratio be S/N (not in dB). Then

C = B log2(1 + S/N) bits/sec
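As a quick numerical illustration (a sketch; the bandwidth and S/N figures below are hypothetical, not from the lecture):

```python
from math import log2

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bits/sec."""
    return bandwidth_hz * log2(1 + snr_linear)

# Hypothetical telephone-line figures: B = 3000 Hz, S/N = 1000 (30 dB).
print(f"C = {shannon_capacity(3000, 1000):.0f} bits/sec")  # about 29,900 bits/sec
```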
Shannon-Hartley Theorem (contd.)
For a given bandwidth B and a given S/N, we can find a way of transmitting data at a bit-rate R bits/second, with a bit-error rate (BER) as low as we like, as long as R ≤ C.
Now assume we wish to transmit at an average energy/bit of Eb, and that the AWGN has two-sided power spectral density N0/2 Watts per Hz. It follows that the signal power is S = Eb R and the noise power is N = N0 B Watts.
Shannon-Hartley Theorem (contd.)
The ratio R/B is called the bandwidth efficiency, in bits/sec/Hz: how many bits per second we get for each Hz of bandwidth. We want this to be as high as possible.
Eb/N0 is the normalised average energy/bit, where the normalisation is with respect to the one-sided PSD of the noise.
The law gives the following bounds:
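The bound itself (the graph that followed has not survived the conversion) can be reconstructed from the substitutions S = Eb R and N = N0 B; this is the standard derivation, sketched here rather than copied from the slides:

```latex
\frac{C}{B} = \log_2\!\left(1 + \frac{S}{N}\right)
            = \log_2\!\left(1 + \frac{E_b}{N_0}\,\frac{R}{B}\right),
\qquad
R \le C \;\Longrightarrow\; \frac{E_b}{N_0} \;\ge\; \frac{2^{R/B}-1}{R/B}.
```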
Shannon Limit
The bound gives the minimum possible normalised energy per bit satisfying the Shannon-Hartley law.
If we plot (Eb/N0)min against R/B, we observe that (Eb/N0)min never goes below about 0.69 (= ln 2), which is about -1.6 dB.
Therefore, if our normalised energy per bit is less than -1.6 dB, we can never satisfy the Shannon-Hartley law, however inefficient (in terms of bits/sec/Hz) we are prepared to be.
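The -1.6 dB figure can be checked numerically from the bound (a sketch; the function name is mine):

```python
from math import log10

def ebn0_min(r_over_b):
    """Minimum Eb/N0 (linear) allowed by Shannon-Hartley at bandwidth efficiency R/B."""
    return (2 ** r_over_b - 1) / r_over_b

# As R/B -> 0 the bound approaches ln 2 ~ 0.693, i.e. about -1.6 dB.
limit = ebn0_min(1e-9)
print(f"limit = {limit:.4f} ({10 * log10(limit):.2f} dB)")
```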
Shannon Limit (contd.)
There exists a limiting value of Eb/N0 below which there cannot be error-free communication at any transmission rate.
The curve R = C divides the achievable and non-achievable regions.
Modulation-Coding Trade-off
For Pb = 10^-5, BPSK modulation requires Eb/N0 = 9.6 dB (optimum uncoded binary modulation).
For this case, Shannon's work promised a performance improvement of 11.2 dB over uncoded binary modulation, through the use of coding techniques.
Today, Turbo Codes are capable of achieving an improvement close to this. Turbo Codes are near-Shannon-limit error-correcting codes.
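The 9.6 dB figure can be verified with the standard coherent-BPSK error-rate formula Pb = Q(sqrt(2 Eb/N0)), a textbook result not derived on these slides:

```python
from math import erfc, sqrt

def bpsk_ber(ebn0_db):
    """Coherent BPSK bit-error rate: Pb = Q(sqrt(2*Eb/N0)) = 0.5*erfc(sqrt(Eb/N0))."""
    ebn0 = 10 ** (ebn0_db / 10)
    return 0.5 * erfc(sqrt(ebn0))

print(f"Pb at 9.6 dB: {bpsk_ber(9.6):.1e}")  # close to 1e-5
```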
Coding Theory-Introduction
Main problem:
A stream of source data, in the form of 0s and 1s, is being transmitted over a communication channel, such as a telephone line. Occasionally, disruptions can occur in the channel, causing 0s to turn into 1s and vice versa.
Question: How can we tell when the original data has been changed, and when it has, how can we recover the original data?
Coding Theory-Introduction
Easy things to try:
Do nothing. If a channel error occurs with probability p, then the probability of making a decision error is p.
Send each bit 3 times in succession. The bit that occurs the majority of the time gets picked. (E.g. 010 => 0.) Repetition codes!
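A minimal sketch of this majority-vote decoder for the 3-fold repetition code (the function name is mine):

```python
def majority_decode(triplet):
    """Decode one source bit from its 3-fold repetition: pick the majority symbol."""
    return "1" if triplet.count("1") >= 2 else "0"

# The slide's example: a single channel error in "010" is still decoded as 0.
print(majority_decode("010"))  # -> 0
```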
Coding Theory-Introduction
Generalize the above: send each bit n times and choose the majority bit. In this way we can make the probability of a decision error arbitrarily small, but this is inefficient in terms of transmission rate.
As n increases, the achievable BER reduces, at the expense of increased codeword length (reduced code rate).
Repetition coding is inefficient.
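The trade-off can be quantified: with crossover probability p, majority decoding fails only when more than half of the n copies are flipped. A sketch (p = 0.1 is a hypothetical value):

```python
from math import comb

def repetition_error_prob(n, p):
    """Probability that majority decoding of an n-fold repetition fails (n odd):
    more than n/2 of the n transmitted copies are flipped by the channel."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# The decision-error probability falls as n grows, but the code rate is only 1/n.
for n in (1, 3, 5, 7):
    print(n, repetition_error_prob(n, 0.1))
```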
Coding Theory Introduction (contd)
Encode source information by adding additional information (redundancy) that can be used to detect, and perhaps correct, errors in transmission. The more redundancy we add, the more reliably we can detect and correct errors, but the less efficient we become at transmitting the source data.
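A minimal illustration of this trade-off (a sketch, not from the slides): one even-parity bit turns 4 data bits into a rate-4/5 codeword that can detect, but not locate or correct, any single bit error.

```python
def add_even_parity(bits):
    """Append one redundant bit so the total number of 1s is even."""
    return bits + str(bits.count("1") % 2)

def parity_ok(word):
    """Detect any odd number of bit flips (cannot say which bit, cannot correct)."""
    return word.count("1") % 2 == 0

word = add_even_parity("1011")   # -> "10111"
print(parity_ok(word))           # True: accepted
print(parity_ok("00111"))        # False: one flipped bit is detected
```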
Error control applications
Data communication networks (Ethernet, FDDI, WAN, Bluetooth)
Satellite and deep-space communications
Cellular mobile communications
Modems
Computer buses
Magnetic disks and tapes
CDs, DVDs. Digital sound needs ECC!
Error control categories
The error control problem can be classified in several ways:
Types of error control coding: detection vs. correction
Types of errors: how much clustering - random, burst, etc.
Types of codes: block vs. convolutional
Error Control Strategies
Error detection.
Goal: avoid accepting faulty data. Lost data may be unfortunate; wrong data may be disastrous.
(Forward) error correction (FEC or ECC).
Use redundancy in the encoded message to estimate, from the received data, what message was actually sent.
The best estimate is usually the "closest" message. The optimal estimate is the message that is most probable given what was received.
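A sketch of minimum-distance decoding (the codebook and function names are illustrative; for a binary symmetric channel with p < 1/2 the closest codeword is also the most probable one):

```python
def hamming_distance(a, b):
    """Number of positions in which two equal-length words differ."""
    return sum(x != y for x, y in zip(a, b))

def nearest_codeword(received, codebook):
    """Minimum-distance decoding: return the codeword closest to the received word."""
    return min(codebook, key=lambda c: hamming_distance(received, c))

# Illustrative codebook: the two codewords of the 3-fold repetition code.
print(nearest_codeword("010", ["000", "111"]))  # -> "000"
```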