
Info Theory

Page 1: Info Theory

By Saima Israil

Feb 20th, 2014

Page 2: Info Theory

Introduction
General Model of Communication
Shannon's Information Theory
Source Coding Theorem
Channel Coding Theorem
Information Capacity Theorem
Conclusion

Page 3: Info Theory

In defining information, Shannon identified the critical relationships among the elements of a communication system:

the power at the source of a signal
the bandwidth or frequency range of an information channel through which the signal travels
the noise of the channel, such as unpredictable static on a radio, which will alter the signal by the time it reaches the last element of the system
the receiver, which must decode the signal

Page 4: Info Theory
Page 5: Info Theory

All communication involves three steps:

Coding a message at its source
Transmitting the message through a communications channel
Decoding the message at its destination

Page 6: Info Theory

Noise can be considered data without meaning; that is, data that is not being used to transmit a signal, but is simply produced as an unwanted by-product of other activities. Noise is still considered information, in the sense of Information Theory.

Page 7: Info Theory

A quantitative measure of the disorder of a system, inversely related to the amount of energy available to do work in an isolated system. The more the energy has become dispersed, the less work it can perform and the greater the entropy.

Page 8: Info Theory

The theorem can be stated as follows: given a discrete source of entropy H(S), the average code-word length L for any distortionless source coding is bounded as

L ≥ H(S)

This theorem provides the mathematical tool for assessing data compaction.

The source coding theorem is also known as the "noiseless coding theorem", in the sense that it establishes the condition under which error-free encoding is possible.
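
As a quick sanity check of the bound (not part of the original slides), here is a minimal Python sketch that computes H(S) for a small source and compares it with the average code-word length of a Huffman code. For the dyadic distribution chosen, the two coincide, showing the bound can be met with equality.

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy H(S) in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Code-word lengths of an optimal (Huffman) prefix code."""
    # Heap of (probability, counter, symbol-indices); the counter breaks
    # ties so probabilities are never compared against lists.
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for sym in s1 + s2:  # each merge adds one bit to every member
            lengths[sym] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]  # a dyadic source
H = entropy(probs)
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(f"H(S) = {H:.3f} bits, average length L = {L:.3f} bits")  # both 1.75
```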

Page 9: Info Theory

For reliable communication, we need channel encoding and decoding. The question is: is there a coding scheme that gives an error probability as small as desired, while being efficient enough that the code rate is not too small?

Let a source with alphabet X have entropy H(X) and produce symbols once every Ts seconds, and let the channel have capacity C and be used once every Tc seconds. Then,

i) if H(X)/Ts ≤ C/Tc, there exists a coding scheme with arbitrarily small error probability;

ii) if H(X)/Ts > C/Tc, it is not possible to transmit with arbitrarily small error.
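
As a small illustration of the condition (function name and numbers are our own, not from the slides), this Python sketch checks H(X)/Ts ≤ C/Tc for concrete values:

```python
def reliable_transmission_possible(H_X, Ts, C, Tc):
    """True if the source information rate fits within the channel's rate."""
    return H_X / Ts <= C / Tc

# Source: 2 bits/symbol, one symbol per millisecond -> 2000 bit/s.
# Channel: 3 bits/use, one use every 0.5 ms -> 6000 bit/s.
print(reliable_transmission_possible(H_X=2.0, Ts=1e-3, C=3.0, Tc=0.5e-3))  # True
```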

Page 10: Info Theory

This theorem defines an upper limit to the capacity of a link, in bits per second (bps), as a function of the available bandwidth and the signal-to-noise ratio of the link.

The theorem can be stated as:

C = B * log2(1 + S/N)

where C is the achievable channel capacity in bits per second, B is the bandwidth of the link in hertz, S is the average signal power and N is the average noise power.
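
The formula is easy to evaluate directly. Below is a minimal Python sketch (the function name shannon_capacity is our own, not from the slides); its output matches the telephone worked example later in the slides.

```python
from math import log2

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * log2(1 + snr_linear)

# Telephone-style link: 4 kHz bandwidth, SNR of 20 dB (S/N = 100).
print(shannon_capacity(4000, 100))  # ~26633 bit/s, i.e. 26.63 kbit/s
```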

Page 11: Info Theory

The phrase signal-to-noise ratio, often abbreviated SNR or S/N, is the ratio between the magnitude of a signal (meaningful information) and the magnitude of background noise.

The signal-to-noise ratio is usually expressed in decibels (dB), given by the formula:

SNR(dB) = 10 * log10(S/N)

so, for example, a signal-to-noise ratio of 1000 is commonly expressed as 10 * log10(1000) = 30 dB.
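
A short Python sketch of the conversion in both directions (helper names are illustrative):

```python
from math import log10

def snr_db(snr_linear):
    """Convert a linear power ratio S/N to decibels."""
    return 10 * log10(snr_linear)

def snr_linear(db):
    """Convert decibels back to a linear power ratio."""
    return 10 ** (db / 10)

print(snr_db(1000))    # 30.0 dB, as in the slide
print(snr_linear(20))  # 100.0, used in the telephone example below
```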

Page 12: Info Theory

Define the ideal system as one operating at Rb = C, so that the transmitted power is P = Eb * C, where Eb is the transmitted energy per bit.

Then C/B = log2(1 + (Eb/N0) * (C/B)), which rearranges to

Eb/N0 = (2^(C/B) - 1) / (C/B)

For an infinite-bandwidth channel this bound approaches its minimum:

Eb/N0 -> ln 2 = 0.693, i.e. -1.6 dB (Shannon's limit)

and the capacity itself approaches

C_inf = lim(B -> inf) C = (P/N0) * log2(e)

[Figure: bandwidth efficiency C/B (0.1 to 20) versus Eb/N0 in dB (-10 to 50), showing the boundary curve Rb = C separating the achievable region Rb < C from the unachievable region Rb > C, with Shannon's limit at -1.6 dB.]
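
The bound Eb/N0 = (2^(C/B) - 1)/(C/B) can be tabulated numerically. This Python sketch (function name is our own) also confirms the -1.6 dB limit as C/B approaches zero:

```python
from math import log, log10

def ebno_min(spectral_efficiency):
    """Minimum Eb/N0 (linear) to operate at a given C/B in the ideal system."""
    r = spectral_efficiency
    return (2 ** r - 1) / r

for r in (0.1, 1, 2, 8):
    print(f"C/B = {r}: Eb/N0 >= {10 * log10(ebno_min(r)):.2f} dB")

# As C/B -> 0 the bound approaches ln 2, Shannon's limit of about -1.6 dB.
print(f"Shannon limit: {10 * log10(log(2)):.2f} dB")
```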

Page 13: Info Theory

If the SNR is 20 dB, and the bandwidth available is 4 kHz, which is appropriate for telephone communications, then C = 4 * log2(1 + 100) = 4 * log2(101) = 26.63 kbit/s. Note that the value of 100 is appropriate for an SNR of 20 dB.

Page 14: Info Theory

If it is required to transmit at 50 kbit/s, and a bandwidth of 1 MHz is used, then the minimum SNR required is given by 50 = 1000 * log2(1 + S/N), so S/N = 2^(C/B) - 1 = 0.035, corresponding to an SNR of -14.5 dB. This shows that it is possible to transmit using signals which are actually much weaker than the background noise level.
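
A quick numerical check of this example, as a minimal Python sketch (helper name is ours):

```python
from math import log10

def min_snr_linear(rate_bps, bandwidth_hz):
    """Smallest S/N that still allows rate_bps over bandwidth_hz (C = rate)."""
    return 2 ** (rate_bps / bandwidth_hz) - 1

snr = min_snr_linear(50_000, 1_000_000)
print(f"S/N = {snr:.3f}  ({10 * log10(snr):.1f} dB)")  # ~0.035, about -14.5 dB
```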

Page 15: Info Theory

o Shannon's Information Theory provides the basis for the field of Information Theory

o Source coding is bounded as L ≥ H(S)

o If H(X)/Ts ≤ C/Tc, there exists a reliable coding scheme

o Channel capacity C = B * log2(1 + S/N)

