
Prepared by: Engr. Jo-Ann C. Viñas

MODULE 2

ENTROPY


OBJECTIVES:

1. Define Information

2. Discuss the Characteristics of Information

3. Introduce Types of Information Sources and Types of Communication Systems

4. Explain Information Theory


INFORMATION

- defined as knowledge or intelligence communicated or received

- a numerical quantity that measures the uncertainty in the outcome of an experiment to be performed


CHARACTERISTICS OF INFORMATION

1. Only the quantity of information and its integrity are important, not the meaning

2. No information is transmitted by a continuous symbol

3. Information requires change


TYPES OF INFORMATION SOURCES

1. Analog Information Source

- produces messages that are defined on a continuum

2. Digital Information Source

- produces a finite set of possible messages


UNITS OF INFORMATION

1. Bits (logarithm base 2)

2. Dits, or hartleys (base 10)

3. Nats (base e)


INFORMATION TRANSFER RATE

- the number of binary digits (bits) transmitted per unit of time

- expressed in bits per second (bps)

- often called the bit rate


SIGNALLING RATE

- is the rate at which transmission changes occur

- specifies how fast the signal states change in a communication channel

- often called baud rate


CHARACTERISTICS OF A SQUARE WAVE

1. Has a frequency spectrum that contains the fundamental frequency and all of its odd harmonics

2. The magnitude of the harmonic components decreases as the order goes up

3. In practice, the significant amplitude of the harmonics is limited to about the 9th harmonic


CHARACTERISTICS OF A PULSE TRAIN

1. Has a spectrum with many components

2. The frequencies of these components, and the bandwidth required to convey the signal, depend on the pattern of the bits in the pulse train

3. The pattern that requires the greatest bandwidth is the one in which alternate bits change state


THINGS TO REMEMBER

1. A practical communication channel has to be capable of conveying any pattern of data transmitted

2. The absolute minimum bandwidth of a channel is, theoretically, the fundamental frequency of the square waveform produced when alternate bits change state

3. In practice, the bandwidth used is greater than this absolute minimum


DATA

- a form of information that is suitable for storage in, or processing by, a computer


INFORMATION THEORY

- the study of the ideal amount of data that should be transmitted so that data can be conveyed efficiently, without sending redundant data


PROPERTIES OF A QUANTITATIVE MEASURE OF INFORMATION

1. If a particular message is known by the user prior to being transmitted, the message contains zero information.

2. If the potential messages from a source are all equally likely, then the information contained in each particular message should be equal to the number of "1"s and "0"s required to uniquely identify the message.

3. If two potential messages are not equally likely, the one with the lesser probability contains the greater amount of information.


I. INFORMATION MEASURE (Ii)

The information sent from a digital source when the ith message is transmitted is given by:

Ii = log2 (1/Pi) = -log2 Pi   bits

where: Pi = probability of transmitting the ith message. Using log base 10 gives the result in dits; using the natural logarithm gives nats.
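
A quick numeric check (a Python sketch added here, not part of the original slides; the function name is illustrative):

```python
import math

def information(p, base=2):
    """Information content of a message with probability p.

    base=2 gives bits, base=10 gives dits, base=math.e gives nats.
    """
    return math.log(1 / p, base)

# A message with probability 1/4 carries 2 bits of information.
print(information(0.25))           # 2.0 bits
print(information(0.25, 10))       # ~0.602 dits
print(information(0.25, math.e))   # ~1.386 nats
```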


EXAMPLE 1

Suppose that equal numbers of letter grades A, B, C, D, and F are given in a certain course. How much information in bits have you received when the instructor tells you that your grade is:

a. not F?

b. either A or B?

c. repeat (a) and (b), solving for the amount of information in dits
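
A worked check in Python (a sketch, not part of the slides): with five equally likely grades, P(not F) = 4/5 and P(A or B) = 2/5.

```python
import math

p_not_f = 4 / 5   # 4 of the 5 equally likely grades remain possible
p_a_or_b = 2 / 5  # 2 of the 5 equally likely grades remain possible

for p in (p_not_f, p_a_or_b):
    print(f"P = {p}: {math.log2(1 / p):.4f} bits, {math.log10(1 / p):.4f} dits")
# P = 0.8: 0.3219 bits, 0.0969 dits
# P = 0.4: 1.3219 bits, 0.3979 dits
```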


EXAMPLE 2

A card is drawn at random from an ordinary deck of 52 playing cards. Find the information in bits that you receive when you are told that the card is:

a) a heart

b) a face card

c) a heart face card
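
A numeric check (a sketch, using the standard deck composition):

```python
import math

events = {
    "heart": 13 / 52,            # 13 hearts in the deck
    "face card": 12 / 52,        # J, Q, K in each of the 4 suits
    "heart face card": 3 / 52,   # J, Q, K of hearts
}
for name, p in events.items():
    print(f"{name}: {math.log2(1 / p):.4f} bits")
# heart: 2.0000, face card: 2.1155, heart face card: 4.1155
```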


EXAMPLE 3

Find the information content of a message that consists of a digital word 12 digits long in which each digit may take on one of four possible levels. The probability of sending any of the four levels is assumed to be equal, and the level in any digit does not depend on the values taken on by previous digits.
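
Since the digits are independent and each of the 4 levels is equally likely, the information adds across digits (a sketch):

```python
import math

digits, levels = 12, 4
info_per_digit = math.log2(levels)   # equally likely levels: 2 bits each
print(digits * info_per_digit)       # 24.0 bits for the whole word
```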


EXAMPLE 4

Consider a source flipping a coin. How much information is contained in the message “the coin landed heads up”?


EXAMPLE 5

Consider a fast-food restaurant in which a customer is nine times as likely to order a hamburger as a fish sandwich. How much information is contained in the message "the customer wants a hamburger"? How much information is contained in the message "the customer wants a fish sandwich"?
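
Translating the 9:1 ratio into probabilities 0.9 and 0.1 (a sketch):

```python
import math

p_burger, p_fish = 0.9, 0.1      # 9:1 odds -> probabilities 0.9 and 0.1
print(math.log2(1 / p_burger))   # ~0.152 bits: likely message, little info
print(math.log2(1 / p_fish))     # ~3.322 bits: unlikely message, more info
```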


EXAMPLE 6

How much information is contained in the message “you are reading this example”?


II. AVERAGE INFORMATION (ENTROPY, H)

- the average information content of a message from a particular source

- the expected value of Ii, expressed in bits per symbol:

H = Σ Pi log2 (1/Pi)   (summed over all N source symbols)
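
A small helper implementing H (illustrative, not from the slides):

```python
import math

def entropy(probs):
    """Average information in bits/symbol for the given probabilities."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit/symbol (fair coin)
print(entropy([0.9, 0.1]))   # ~0.469 bits/symbol (biased source)
```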


III. RELATIVE ENTROPY

The ratio of the entropy of a source to the maximum value the entropy could take for the same source alphabet:

Hr = H / Hmax

where: Hmax = logb N; N = total number of symbols


IV. REDUNDANCY

The portion of a message that can be removed without losing information:

Redundancy = 1 - Hr = 1 - H/Hmax


V. RATE OF INFORMATION

The average rate at which a source produces information:

R = rH   bits/sec

where: r = symbol rate in symbols/sec; H = entropy in bits/symbol
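
A sketch tying sections II-V together; the source probabilities and symbol rate below are invented for illustration:

```python
import math

probs = [0.5, 0.25, 0.125, 0.125]   # hypothetical source alphabet
r = 1000                            # hypothetical symbol rate, symbols/sec

H = sum(p * math.log2(1 / p) for p in probs)   # II: entropy = 1.75 bits/symbol
H_max = math.log2(len(probs))                  # Hmax = log2 N = 2
print(H / H_max)      # III: relative entropy = 0.875
print(1 - H / H_max)  # IV: redundancy = 0.125
print(r * H)          # V: information rate = 1750 bits/sec
```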


EXAMPLE 1

A telephone touch-tone keypad has the digits 0 to 9, plus the * and # keys. Assume the probability of sending * or # is 0.005 each and the probability of sending each of 0 to 9 is 0.099, so the probabilities sum to 1. If the keys are pressed at a rate of 2 keys/sec, compute the entropy and data rate for this source.
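
A numeric check of this example (sketch):

```python
import math

probs = [0.005] * 2 + [0.099] * 10   # *, #, then the digits 0-9
H = sum(p * math.log2(1 / p) for p in probs)
print(H)        # ~3.38 bits/key
print(2 * H)    # ~6.76 bits/sec at 2 keys/sec
```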


EXAMPLE 2

Determine the following:

a) Entropy
b) Relative Entropy
c) Rate of Information


EXAMPLE 3

Determine the ideal number of bits that should be allocated to each of the following characters with the probabilities given.


EXAMPLE 4

Consider a transmitter that is to send only the first 6 characters of the alphabet. Each will be expressed as a digital signal. By convention, each letter would be allocated 3 bits.

Find the entropy to determine the most economical way of transmitting the data.
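
Assuming the six letters are equally likely, the entropy falls below the 3 bits allocated by convention (a sketch):

```python
import math

H = math.log2(6)   # entropy of 6 equally likely letters
print(H)           # ~2.585 bits/letter, vs 3 bits/letter by convention
```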


EXAMPLE 5

Suppose our fast-food restaurant serves an average of eight customers per minute. What is the information rate of the food orders?


EXAMPLE 6

Suppose that in the fast-food restaurant mentioned previously each customer orders either one hamburger or one fish sandwich. What is the average information content in a customer’s order?
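
A sketch covering Examples 5 and 6 together, using the 0.9/0.1 probabilities from earlier:

```python
import math

p_burger, p_fish = 0.9, 0.1
H = p_burger * math.log2(1 / p_burger) + p_fish * math.log2(1 / p_fish)
print(H)             # Example 6: ~0.469 bits per order
print((8 / 60) * H)  # Example 5: ~0.0625 bits/sec at 8 orders/min
```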


VI. OTHER PARAMETERS

1. Code Word Length: ni = log2 (1/Pi)   (the ideal number of bits for the ith symbol)

2. Average Code Word Length: L = Σ Pi ni

3. Coding Efficiency: η = H / L   (× 100%)

4. Coding Redundancy: 1 - η


EXAMPLE

Calculate the coding efficiency in representing the 26 letters of the alphabet using a binary and decimal system
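
Assuming all 26 letters are equally likely and fixed-length codes (5 bits or 2 decimal digits), a sketch:

```python
import math

N = 26   # letters, assumed equally likely

for base, name in ((2, "binary"), (10, "decimal")):
    ideal = math.log(N, base)   # information per letter, in that base's unit
    actual = math.ceil(ideal)   # whole code digits needed: 5 bits or 2 dits
    print(f"{name}: {ideal:.4f} / {actual} = {ideal / actual:.2%}")
# binary: 4.7004 / 5 = 94.01%
# decimal: 1.4150 / 2 = 70.75%
```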


SEATWORK

1. Calculate H(x) for a discrete memoryless source having six symbols with the following probabilities:

P(A) =1/2

P(B) =1/4

P(C) = 1/8

P(D) = P(E) = 1/20

P(F) = 1/40

Find the amount of information contained in the messages BAD, BED, BEEF, CAB, FACE, BABAE, ABACADA, BEAD, FADE

2. Determine the entropy for the word “YABBBADDDABBBADDDOOOOO”.


3. Suppose a source emits r = 2000 symbols/sec selected from an alphabet of size M = 4 with the symbol probabilities listed below. Find the information rate.
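
A generic helper (a sketch, not an answer key) for checking entropy from empirical letter frequencies, as in item 2:

```python
import math
from collections import Counter

def word_entropy(word):
    """Entropy in bits/symbol from the empirical letter frequencies."""
    n = len(word)
    return sum((c / n) * math.log2(n / c) for c in Counter(word).values())

print(word_entropy("YABBBADDDABBBADDDOOOOO"))   # ~2.158 bits/symbol
```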


ASSIGNMENT

Answer Problems 9.1 and 9.7 (a-b) in:

Communication Systems Analysis and Design
by Harold P. E. Stern & Samy A. Mahmoud

