Page 1:

Coding and Information Theory
Chapter 3: Entropy
Xuejun Liang

2018 Fall

Page 2:

Chapter 3: Entropy

3.1 Information and Entropy

3.2 Properties of the Entropy Function

3.3 Entropy and Average Word-length

3.4 Shannon-Fano Coding

3.5 Entropy of Extensions and Products

3.6 Shannon's First Theorem

3.7 An Example of Shannon's First Theorem

Page 3:

The aim of this chapter

• Introduce the entropy function, which measures the amount of information emitted by a source

• Examine the basic properties of this function

• Show how it is related to the average word-lengths of encodings of the source

Page 4:

3.1 Information and Entropy

• Define a number 𝐼(𝑠𝑖), for each 𝑠𝑖 ∈ 𝑆, which represents
  • how much information is gained by knowing that 𝑆 has emitted 𝑠𝑖
  • our prior uncertainty as to whether 𝑠𝑖 will be emitted, and our surprise on learning that it has been emitted

• We therefore require that:
  1) 𝐼(𝑠𝑖) is a decreasing function of the probability 𝑝𝑖 of 𝑠𝑖, with 𝐼(𝑠𝑖) = 0 if 𝑝𝑖 = 1;
  2) 𝐼(𝑠𝑖𝑠𝑗) = 𝐼(𝑠𝑖) + 𝐼(𝑠𝑗), where 𝑆 emits 𝑠𝑖 and 𝑠𝑗 consecutively and independently.

Page 5:

Entropy Function

• We define

𝐼(𝑠𝑖) = 𝑙𝑜𝑔𝑟(1/𝑝𝑖) = −𝑙𝑜𝑔𝑟(𝑝𝑖)

where 𝑝𝑖 = Pr(𝑠𝑖), so that 𝐼 satisfies (1) and (2).

• Example 3.1
  • Let 𝑆 be an unbiased coin, with 𝑠1 and 𝑠2 representing heads and tails. Then 𝐼(𝑠1) = ? and 𝐼(𝑠2) = ?

Page 6:

The 𝑟-ary Entropy of 𝑆

• The average amount of information conveyed by 𝑆 (per source-symbol) is given by the function

𝐻𝑟(𝑆) = Σ𝑖 𝑝𝑖 𝐼(𝑠𝑖) = Σ𝑖 𝑝𝑖 𝑙𝑜𝑔𝑟(1/𝑝𝑖)

• This is called the 𝑟-ary entropy of 𝑆.

• The base 𝑟 is often omitted. (A small computational sketch follows.)
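As an aside (not from the original slides), here is a minimal Python sketch of the two definitions above; the helper names `information` and `entropy` are just illustrative choices.

```python
import math

def information(p, r=2):
    """I(s_i) = log_r(1/p_i): information gained when a symbol of probability p is emitted."""
    return math.log(1 / p, r)

def entropy(probs, r=2):
    """H_r(S) = sum_i p_i * log_r(1/p_i)."""
    return sum(p * math.log(1 / p, r) for p in probs if p > 0)

# Example 3.1: an unbiased coin; each of heads and tails carries 1 bit.
print(information(1 / 2))        # 1.0
print(entropy([1 / 2, 1 / 2]))   # 1.0
```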

Page 7:

Examples

• Example 3.2
  • Let 𝑆 have 𝑞 = 2 symbols, with probabilities 𝑝 and 1 − 𝑝. Writing 𝑝̄ = 1 − 𝑝, we have

𝐻(𝑝) = 𝑝 𝑙𝑜𝑔(1/𝑝) + 𝑝̄ 𝑙𝑜𝑔(1/𝑝̄)

  • 𝐻(𝑝) is maximal when 𝑝 = ½
  • Compute 𝐻(𝑝) when 𝑝 = ½ and 𝑝 = 2⁄3 (checked in the sketch below)

• Example 3.3
  • If 𝑆 has 𝑞 = 5 symbols with probabilities 𝑝𝑖 = 0.3, 0.2, 0.2, 0.2, 0.1, as in §2.2, Example 2.5, we find that 𝐻2(𝑆) ≈ 2.246.
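A quick numerical check of Examples 3.2 and 3.3 (an illustrative sketch, not part of the slides): it evaluates the binary entropy at p = 1/2 and p = 2/3, and the entropy of the five-symbol source.

```python
import math

def H2(probs):
    """Binary entropy, with 0 * log(1/0) taken as 0."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(H2([1/2, 1/2]))                    # 1.0, the maximum of H(p), attained at p = 1/2
print(H2([2/3, 1/3]))                    # ~0.918
print(H2([0.3, 0.2, 0.2, 0.2, 0.1]))     # ~2.246, as in Example 3.3
```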

Page 8:

Examples (Cont.)

• If 𝑆 has 𝑞 equiprobable symbols, then 𝑝𝑖 = 1/𝑞 for each 𝑖, so

𝐻𝑟(𝑆) = Σ𝑖 (1/𝑞) 𝑙𝑜𝑔𝑟(𝑞) = 𝑙𝑜𝑔𝑟(𝑞)

• Examples 3.4 and 3.5
  • Let 𝑞 = 5; then 𝐻2(𝑆) = 𝑙𝑜𝑔25 ≈ 2.321
  • Let 𝑞 = 6; then 𝐻2(𝑆) = 𝑙𝑜𝑔26 ≈ 2.585

• Example 3.6
  • Using the known frequencies of the letters of the alphabet, the entropy of English text has been computed as approximately 4.03.

Page 9:

Compare average word-length of binary Huffman coding with entropy

• As in Example 3.2 with 𝑝 = 2⁄3 (see the sketch below):
  • 𝐻2(𝑆) ≈ 0.918
  • 𝐿(𝐶1) = 1, 𝐿(𝐶2)/2 ≈ 0.944, 𝐿(𝐶3)/3 ≈ 0.938, where 𝐶𝑛 is a binary Huffman code for 𝑆^𝑛 as in §2.6

• As in Example 3.3:
  • 𝐻2(𝑆) ≈ 2.246
  • 𝐿(𝐶1) = 2.3

• As in Example 3.4 (five equiprobable symbols):
  • 𝐻2(𝑆) = 𝑙𝑜𝑔25 ≈ 2.321
  • 𝐿(𝐶1) = 2.4
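The figures above can be reproduced. The sketch below (illustrative, not from the slides) computes the average word-length of a binary Huffman code directly from the probabilities, for S, S² and S³ with p = 2⁄3, and prints L(Cn)/n next to H2(S).

```python
import heapq
import itertools
import math

def huffman_avg_length(probs):
    """Average word-length of a binary Huffman code for the given probabilities.
    Each merge of two subtrees adds one code symbol to every word below it,
    so summing the merged probabilities gives the average length."""
    count = itertools.count()                  # tie-breaker so heap entries always compare
    heap = [(p, next(count)) for p in probs]
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        p1, _ = heapq.heappop(heap)
        p2, _ = heapq.heappop(heap)
        total += p1 + p2
        heapq.heappush(heap, (p1 + p2, next(count)))
    return total

p = [2/3, 1/3]
H = sum(q * math.log2(1 / q) for q in p)       # H2(S) ~ 0.918
for n in (1, 2, 3):
    ext = [math.prod(t) for t in itertools.product(p, repeat=n)]   # distribution of S^n
    print(n, round(huffman_avg_length(ext) / n, 3), round(H, 3))
# prints 1.0, 0.944, 0.938 against 0.918
```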

Page 10:

3.2 Properties of the Entropy Function

• Theorem 3.7
  • 𝐻𝑟(𝑆) ≥ 0, with equality if and only if 𝑝𝑖 = 1 for some 𝑖 (so that 𝑝𝑗 = 0 for all 𝑗 ≠ 𝑖).

• Lemma 3.8
  • For all 𝑥 > 0 we have ln 𝑥 ≤ 𝑥 − 1, with equality if and only if 𝑥 = 1.
  • Converting to some other base 𝑟, we have 𝑙𝑜𝑔𝑟(𝑥) ≤ 𝑙𝑜𝑔𝑟(𝑒) ∙ (𝑥 − 1), with equality if and only if 𝑥 = 1.

Page 11:

Properties of the Entropy Function

• Corollary 3.9
  • Let 𝑥𝑖 ≥ 0 and 𝑦𝑖 > 0 for 𝑖 = 1, ..., 𝑞, and let Σ𝑖 𝑥𝑖 = Σ𝑖 𝑦𝑖 = 1 (so (𝑥𝑖) and (𝑦𝑖) are probability distributions, with 𝑦𝑖 ≠ 0). Then

Σ𝑖 𝑥𝑖 𝑙𝑜𝑔𝑟(1/𝑥𝑖) ≤ Σ𝑖 𝑥𝑖 𝑙𝑜𝑔𝑟(1/𝑦𝑖)

  (that is, Σ𝑖 𝑥𝑖 𝑙𝑜𝑔(𝑦𝑖/𝑥𝑖) ≤ 0), with equality if and only if 𝑥𝑖 = 𝑦𝑖 for all 𝑖.

• Theorem 3.10
  • If a source 𝑆 has 𝑞 symbols then 𝐻𝑟(𝑆) ≤ 𝑙𝑜𝑔𝑟(𝑞), with equality if and only if the symbols are equiprobable. (A numerical check follows.)
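A numerical sanity check of Corollary 3.9 and Theorem 3.10 (illustrative only): for a random pair of distributions it compares Σ x_i log(1/x_i) with Σ x_i log(1/y_i), and checks the log q bound.

```python
import math
import random

def cross_sum(x, y):
    """sum_i x_i * log2(1/y_i); reduces to the entropy of x when y = x."""
    return sum(xi * math.log2(1 / yi) for xi, yi in zip(x, y) if xi > 0)

q = 5
x = [random.random() for _ in range(q)]
y = [random.random() for _ in range(q)]
sx, sy = sum(x), sum(y)
x = [v / sx for v in x]   # normalise to probability distributions
y = [v / sy for v in y]

print(cross_sum(x, x) <= cross_sum(x, y))   # Corollary 3.9: always True
print(cross_sum(x, x) <= math.log2(q))      # Theorem 3.10: always True
```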

Page 12:

3.3 Entropy and Average Word-length

• Theorem 3.11
  • If 𝐶 is any uniquely decodable 𝑟-ary code for a source 𝑆, then 𝐿(𝐶) ≥ 𝐻𝑟(𝑆).

• The interpretation
  • Each symbol emitted by 𝑆 carries 𝐻𝑟(𝑆) units of information, on average.
  • Each code-symbol conveys one unit of information, so on average each code-word of 𝐶 must contain at least 𝐻𝑟(𝑆) code-symbols, that is, 𝐿(𝐶) ≥ 𝐻𝑟(𝑆).
  • In particular, sources emitting more information require longer code-words.

Page 13:

Entropy and Average Word-length (Cont.)

• Corollary 3.12
  • Given a source 𝑆 with probabilities 𝑝𝑖, there is a uniquely decodable 𝑟-ary code 𝐶 for 𝑆 with 𝐿(𝐶) = 𝐻𝑟(𝑆) if and only if 𝑙𝑜𝑔𝑟(𝑝𝑖) is an integer for each 𝑖, that is, each 𝑝𝑖 = 𝑟^𝑒𝑖 for some integer 𝑒𝑖 ≤ 0.

• Example 3.13
  • If 𝑆 has 𝑞 = 3 symbols 𝑠𝑖, with probabilities 𝑝𝑖 = 1⁄4, 1⁄2, and 1⁄4 (see Examples 1.2 and 2.1):

  • 𝐻2(𝑆) = (1⁄4)·2 + (1⁄2)·1 + (1⁄4)·2 = 3⁄2

  • A binary Huffman code 𝐶 for 𝑆 has word-lengths 2, 1, 2, so

  • 𝐿(𝐶) = (1⁄4)·2 + (1⁄2)·1 + (1⁄4)·2 = 3⁄2 = 𝐻2(𝑆)

Page 14:

More examples

• Example 3.14
  • Let 𝑆 have 𝑞 = 5 symbols, with probabilities 𝑝𝑖 = 0.3, 0.2, 0.2, 0.2, 0.1, as in Example 2.5.
  • In Example 3.3, 𝐻2(𝑆) ≈ 2.246, and in Example 2.5, 𝐿(𝐶) = 2.3, where 𝐶 is a binary Huffman code for 𝑆.
  • By Theorem 2.8, every uniquely decodable binary code 𝐶 for 𝑆 satisfies 𝐿(𝐶) ≥ 2.3 > 𝐻2(𝑆).
  • Thus no such code satisfies 𝐿(𝐶) = 𝐻2(𝑆).
  • What is the reason?

• Example 3.15
  • Let 𝑆 have 3 symbols 𝑠𝑖, with probabilities 𝑝𝑖 = 1⁄2, 1⁄2, 0. (See the sketch below.)
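A note on Example 3.15 (an illustrative aside, not from the slides): with the usual convention that 0 · log(1/0) = 0, the symbol of probability 0 contributes nothing to the entropy, so 𝐻2(𝑆) = 1; yet Kraft's inequality prevents any uniquely decodable binary code for three symbols from having 𝐿(𝐶) = 1. A minimal sketch:

```python
import math

def H2(probs):
    """Binary entropy; symbols with probability 0 are skipped (0 * log(1/0) := 0)."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(H2([1/2, 1/2, 0]))   # 1.0 -- the zero-probability symbol adds no information
```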

Page 15:

Code Efficiency and Redundancy

• If 𝐶 is an 𝑟-ary code for a source 𝑆, its efficiency is defined to be

𝜂 = 𝐻𝑟(𝑆) / 𝐿(𝐶)

• So 0 ≤ 𝜂 ≤ 1 for every uniquely decodable code 𝐶 for 𝑆.

• The redundancy of 𝐶 is defined to be 𝜂̄ = 1 − 𝜂.
  • Thus increasing redundancy reduces efficiency.

• In Examples 3.13 and 3.14, 𝜂 = 1 and 𝜂 ≈ 0.977, respectively (checked below).
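A one-line check of the quoted efficiencies (an illustrative sketch; the helper name `efficiency` is just for this example):

```python
def efficiency(H, L):
    """eta = H_r(S) / L(C); the redundancy is 1 - eta."""
    return H / L

print(efficiency(1.5, 1.5))      # Example 3.13: eta = 1
print(efficiency(2.246, 2.3))    # Example 3.14: eta ~ 0.977
```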

Page 16:

3.4 Shannon-Fano Coding

• Shannon-Fano codes are close to optimal, but it is easier to estimate their average word-lengths.

• A Shannon-Fano code 𝐶 for 𝑆 has word-lengths

𝑙𝑖 = ⌈𝑙𝑜𝑔𝑟(1/𝑝𝑖)⌉

• So we have Σ𝑖 𝑟^(−𝑙𝑖) ≤ Σ𝑖 𝑝𝑖 = 1, and Theorem 1.20 (Kraft's inequality) implies that there is an instantaneous 𝑟-ary code 𝐶 for 𝑆 with these word-lengths 𝑙𝑖. (See the sketch below.)
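A minimal sketch of the word-length rule above (illustrative, not from the slides): it computes l_i = ⌈log_r(1/p_i)⌉ and checks that these lengths satisfy Kraft's inequality.

```python
import math

def shannon_fano_lengths(probs, r=2):
    """Word-lengths l_i = ceil(log_r(1/p_i)) of an r-ary Shannon-Fano code."""
    return [math.ceil(math.log(1 / p, r)) for p in probs]

p = [2/3, 1/3]
lengths = shannon_fano_lengths(p)
print(lengths)                             # [1, 2]
print(sum(2 ** -l for l in lengths) <= 1)  # Kraft's inequality holds, so an
                                           # instantaneous code with these lengths exists
```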

Page 17:

Shannon-Fano Coding (Cont.)

• Theorem 3.16
  • Every 𝑟-ary Shannon-Fano code 𝐶 for a source 𝑆 satisfies 𝐻𝑟(𝑆) ≤ 𝐿(𝐶) < 1 + 𝐻𝑟(𝑆).

• Corollary 3.17
  • Every optimal 𝑟-ary code 𝐷 for a source 𝑆 satisfies 𝐻𝑟(𝑆) ≤ 𝐿(𝐷) ≤ 1 + 𝐻𝑟(𝑆).

• Compute the word-lengths 𝑙𝑖 of a Shannon-Fano code: 𝑙𝑖 = ⌈𝑙𝑜𝑔𝑟(1/𝑝𝑖)⌉.

Page 18:

Examples

• Example 3.18
  • Let 𝑆 have 5 symbols, with probabilities 𝑝𝑖 = 0.3, 0.2, 0.2, 0.2, 0.1, as in Example 2.5.
  • Compute the Shannon-Fano code word-lengths 𝑙𝑖, 𝐿(𝐶), and 𝜂.
  • Compare with the Huffman code (see the sketch after this slide).

• Example 3.19
  • If 𝑝1 = 1 and 𝑝𝑖 = 0 for all 𝑖 > 1, then 𝐻𝑟(𝑆) = 0. An 𝑟-ary optimal code 𝐷 for 𝑆 has average word-length 𝐿(𝐷) = 1, so here the upper bound 1 + 𝐻𝑟(𝑆) is attained.
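A sketch of the computation requested in Example 3.18 (illustrative only): Shannon-Fano word-lengths for the given probabilities, the resulting L(C) and efficiency, compared with the Huffman figure L = 2.3 from Example 2.5.

```python
import math

p = [0.3, 0.2, 0.2, 0.2, 0.1]
lengths = [math.ceil(math.log2(1 / q)) for q in p]   # Shannon-Fano word-lengths
L = sum(q * l for q, l in zip(p, lengths))           # average word-length of C
H = sum(q * math.log2(1 / q) for q in p)             # H2(S) ~ 2.246

print(lengths)      # [2, 3, 3, 3, 4]
print(L)            # 2.8
print(H / L)        # efficiency of the Shannon-Fano code, ~0.80
print(H / 2.3)      # efficiency of the Huffman code of Example 2.5, ~0.977
```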

Page 19:

3.5 Entropy of Extensions and Products

• Recall from §2.6
  • The 𝑛th extension 𝑆^𝑛 has 𝑞^𝑛 symbols 𝑠𝑖1 … 𝑠𝑖𝑛, with probabilities 𝑝𝑖1 ⋯ 𝑝𝑖𝑛.

• Theorem 3.20
  • If 𝑆 is any source then 𝐻𝑟(𝑆^𝑛) = 𝑛𝐻𝑟(𝑆). (Checked numerically below.)

• Lemma 3.21
  • If 𝑆 and 𝑇 are independent sources then 𝐻𝑟(𝑆 × 𝑇) = 𝐻𝑟(𝑆) + 𝐻𝑟(𝑇).

• Corollary 3.22
  • If 𝑆1, … , 𝑆𝑛 are independent sources then 𝐻𝑟(𝑆1 × ⋯ × 𝑆𝑛) = 𝐻𝑟(𝑆1) + ⋯ + 𝐻𝑟(𝑆𝑛).
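A numerical check of Theorem 3.20 (illustrative sketch): build the distribution of S^n as products of the p_i and compare H2(S^n) with n·H2(S).

```python
import itertools
import math

def H2(probs):
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

p = [0.3, 0.2, 0.2, 0.2, 0.1]
for n in (1, 2, 3):
    ext = [math.prod(t) for t in itertools.product(p, repeat=n)]   # distribution of S^n
    print(n, round(H2(ext), 4), round(n * H2(p), 4))               # the two agree
```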

Page 20:

3.6 Shannon's First Theorem

• Theorem 3.23
  • By encoding 𝑆^𝑛 with 𝑛 sufficiently large, one can find uniquely decodable 𝑟-ary encodings of a source 𝑆 with average word-lengths arbitrarily close to the entropy 𝐻𝑟(𝑆).

• Recall that
  • if a code for 𝑆^𝑛 has average word-length 𝐿𝑛, then as an encoding of 𝑆 it has average word-length 𝐿𝑛/𝑛.

• Note that
  • the encoding process for 𝑆^𝑛 with large 𝑛 is complicated and time-consuming;
  • the decoding process involves delays.

Page 21:

3.7 An Example of Shannon's First Theorem

• Let 𝑆 be a source with two symbols 𝑠1, 𝑠2 of probabilities 𝑝𝑖 = 2⁄3, 1⁄3, as in Example 3.2.
  • In §3.1, we have 𝐻2(𝑆) ≈ 0.918.
  • In §2.6, using binary Huffman codes for 𝑆^𝑛 with 𝑛 = 1, 2 and 3, we have 𝐿1 = 1, 𝐿2/2 ≈ 0.944, 𝐿3/3 ≈ 0.938.

• For larger 𝑛 it is simpler to use Shannon-Fano codes, rather than Huffman codes.
  • Compute 𝐿𝑛 for 𝑆^𝑛.
  • Verify 𝐿𝑛/𝑛 → 𝐻2(𝑆).
  • You will need to use this formula (see the sketch below).
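The convergence asked for above can be checked directly. The sketch below is illustrative; the binomial grouping is one convenient way to organise the sum and may differ from the formula the slide refers to. It computes the Shannon-Fano average word-length L_n for S^n with p = 2⁄3, 1⁄3 and prints L_n/n, which approaches H2(S) ≈ 0.918 as n grows.

```python
import math

p, q = 2/3, 1/3
H = p * math.log2(1 / p) + q * math.log2(1 / q)    # H2(S) ~ 0.918

for n in (1, 2, 4, 8, 16, 64):
    # A symbol of S^n containing k copies of s2 has probability p^(n-k) * q^k,
    # and there are C(n, k) such symbols; group them instead of listing all 2^n words.
    Ln = sum(math.comb(n, k) * p ** (n - k) * q ** k
             * math.ceil(math.log2(1 / (p ** (n - k) * q ** k)))
             for k in range(n + 1))
    print(n, round(Ln / n, 4), round(H, 4))   # L_n/n approaches H2(S) from above
```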

