  • DISCRETE MEMORYLESS SOURCE

    Communication Systems by Simon Haykin, Chapter 9: Fundamental Limits in Information Theory

  • INTRODUCTION

    The purpose of a communication system is to carry information-bearing baseband signals from one place to another over a communication channel.

  • INFORMATION THEORY

    It deals with the mathematical modeling and analysis of a communication system rather than with physical sources and physical channels.

    It is a highly theoretical study of the efficient use of bandwidth to propagate information through electronic communications systems.

  • INFORMATION THEORY

    It provides answers to two fundamental questions:

    1. What is the irreducible complexity below which a signal cannot be compressed?

    2. What is the ultimate transmission rate for reliable communication over a noisy channel?

    The answers to these questions lie in the ENTROPY of a source and the CAPACITY of a channel, respectively.

  • INFORMATION THEORY

    Entropy

    It is defined in terms of the probabilistic behavior of a source of information. It is named in deference to the parallel use of this concept in thermodynamics.

    Capacity

    The intrinsic ability of a channel to convey information. It is naturally related to the noise characteristics of the channel.

  • INFORMATION THEORY

    A remarkable result that emerges from information theory is that if the entropy of the source is less than the capacity of the channel, then error-free communication over the channel can be achieved.

  • UNCERTAINTY, INFORMATION, AND ENTROPY

  • DISCRETE RANDOM VARIABLE, S

    Suppose that a probabilistic experiment involves the observation of the output emitted by a discrete source during every unit of time (signaling interval). The source output is modeled as a discrete random variable, S, which takes on symbols from a fixed finite alphabet:

    S = {s_0, s_1, ..., s_{K-1}}    (9.1)

  • DISCRETE RANDOM VARIABLE, S

    with probabilities:

    P(S = s_k) = p_k,  k = 0, 1, ..., K-1    (9.2)

    that must satisfy the condition:

    Σ_{k=0}^{K-1} p_k = 1    (9.3)

  • DISCRETE MEMORYLESS SOURCE

    Assume that the symbols emitted by the source during successive signaling intervals are statistically independent. A source having this property is called a DISCRETE MEMORYLESS SOURCE, memoryless in the sense that the symbol emitted at any time is independent of previous choices.
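
    As a minimal sketch of this definition in Python: every signaling interval is an independent draw from one fixed distribution. The three-symbol alphabet and its probabilities below are illustrative assumptions, not taken from the text.

```python
import random

# Hypothetical alphabet and symbol probabilities, chosen for illustration.
alphabet = ["s0", "s1", "s2"]
probs = [0.25, 0.25, 0.5]

# Memoryless: each draw is independent of all previous draws, so
# random.choices (i.i.d. sampling with the given weights) models the source.
emitted = random.choices(alphabet, weights=probs, k=10)
print(emitted)  # e.g. ['s2', 's0', 's2', 's2', 's1', ...]
```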

  • DISCRETE MEMORYLESS SOURCE

    Can we find a measure of how much information is produced by a DISCRETE MEMORYLESS SOURCE?

    Note: the idea of information is closely related to that of uncertainty or surprise.

  • EVENT S = s_k

    Consider the event S = s_k, describing the emission of symbol s_k by the source with probability p_k.

    Before the event occurs: there is an amount of uncertainty.

    During the event: there is an amount of surprise.

    After the event: there is a gain in the amount of information, which is the resolution of uncertainty.

  • The amount of information is related to the inverse of the probability of occurrence. The amount of information gained after observing the event S = s_k, which occurs with probability p_k, is the logarithmic function

    I(s_k) = log(1/p_k)    (9.4)

    Note: the base of the logarithm is arbitrary.

  • LOGARITHMIC FUNCTION

    This definition exhibits the following important properties, which are intuitively satisfying:

    1. I(s_k) = 0 for p_k = 1    (9.5)

    If we are absolutely certain of the outcome of an event, even before it occurs, there is no information gained.

    2. I(s_k) ≥ 0 for 0 ≤ p_k ≤ 1    (9.6)

    The occurrence of an event S = s_k either provides some information or none, but never brings about a loss of information.

  • LOGARITHMIC FUNCTION

    3. I(s_k) > I(s_i) for p_k < p_i    (9.7)

    The less probable an event is, the more information we gain when it occurs.

    4. I(s_k s_l) = I(s_k) + I(s_l), if s_k and s_l are statistically independent.

  • BIT

    Using Equation 9.4 with logarithm base 2, the resulting unit of information is called the bit (a contraction of binary digit):

    I(s_k) = log2(1/p_k) = -log2(p_k) bits    (9.8)

  • ONE BIT

    When p_k = 1/2, we have I(s_k) = 1 bit. Hence, one bit is the amount of information that we gain when one of two possible and equally likely events occurs.
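
    A small sketch of Equation 9.8 in Python (the function name and the example probabilities are ours):

```python
import math

def self_information(p_k: float) -> float:
    """Information gained on observing a symbol of probability p_k,
    in bits (Equation 9.8: I(s_k) = -log2 p_k)."""
    return -math.log2(p_k)

print(self_information(0.5))    # 1.0 bit: one of two equally likely events
print(self_information(1.0))    # 0.0 bits: a certain event (property 1)
print(self_information(0.125))  # 3.0 bits: rarer events carry more information
```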

  • I(s_k)

    The amount of information I(s_k) produced by the source during an arbitrary signaling interval depends on the symbol s_k emitted by the source at that time. Indeed, I(s_k) is a discrete random variable that takes on the values I(s_0), I(s_1), ..., I(s_{K-1}) with probabilities p_0, p_1, ..., p_{K-1}, respectively.

  • MEAN OF I(s_k): ENTROPY

    The mean of I(s_k) over the source alphabet S is given by

    H(S) = E[I(s_k)] = Σ_{k=0}^{K-1} p_k I(s_k) = Σ_{k=0}^{K-1} p_k log2(1/p_k)    (9.9)

  • ENTROPY OF A DISCRETE MEMORYLESS SOURCE

    The important quantity H(S) is called the entropy of a discrete memoryless source with source alphabet S. It is a measure of the average information content per source symbol. It depends only on the probabilities of the symbols in the alphabet S of the source.
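
    The entropy of Equation 9.9 translates directly into code. A minimal sketch, assuming the probabilities are given as a plain list (the function name is ours):

```python
import math

def entropy(probs):
    """Entropy of a discrete memoryless source in bits per symbol,
    per Equation 9.9: H(S) = sum over k of p_k * log2(1/p_k)."""
    # Zero-probability symbols contribute nothing (x * log2(1/x) -> 0 as x -> 0).
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

print(entropy([0.25, 0.25, 0.5]))  # 1.5 bits per symbol
```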

  • SOME PROPERTIES OF ENTROPY

    Consider a discrete memoryless source whose mathematical model is defined by Equations 9.1 and 9.2. The entropy H(S) of such a source is bounded as follows:

    0 ≤ H(S) ≤ log2(K)    (9.10)

    where K is the radix (number of symbols) of the alphabet of the source.

  • SOME PROPERTIES OF ENTROPY

    Furthermore, we may make two statements:

    1. H(S) = 0 if and only if the probability p_k = 1 for some k and the remaining probabilities in the set are all zero; this lower bound on entropy corresponds to no uncertainty.

    2. H(S) = log2(K) if and only if p_k = 1/K for all k; this upper bound on entropy corresponds to maximum uncertainty.
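
    Both bounds can be checked numerically. A short sketch, assuming a radix of K = 8 for illustration (the entropy helper repeats Equation 9.9 from the sketch above):

```python
import math

def entropy(probs):
    # Equation 9.9, repeated here so the sketch is self-contained.
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

K = 8  # assumed radix of the alphabet

degenerate = [1.0] + [0.0] * (K - 1)  # one certain symbol: no uncertainty
uniform = [1.0 / K] * K               # all symbols equally likely

print(entropy(degenerate))  # 0.0 -> lower bound of Equation 9.10
print(entropy(uniform))     # 3.0 -> upper bound, log2(K)
print(math.log2(K))         # 3.0
```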

  • EXAMPLE 9.1: ENTROPY OF A BINARY MEMORYLESS SOURCE

    Consider a binary memoryless source for which symbol 0 occurs with probability p_0 and symbol 1 with probability p_1 = 1 - p_0, with entropy of:

    H(S) = -p_0 log2(p_0) - p_1 log2(p_1)    (9.15)

  • EXAMPLE 9.1 SOLUTION

    From which we observe the following:

    1. When p_0 = 0, the entropy H(S) = 0; this follows from the fact that x log x → 0 as x → 0.

    2. When p_0 = 1, the entropy H(S) = 0.

    3. The entropy H(S) attains its maximum value, H_max = 1 bit, when p_1 = p_0 = 1/2, that is, symbols 1 and 0 are equally probable.

  • EXAMPLE 9.1 SOLUTION

    The function of p_0 defined by

    H(p_0) = -p_0 log2(p_0) - (1 - p_0) log2(1 - p_0)    (9.16)

    is frequently encountered in information-theoretic problems and is called the entropy function. It is a function of the prior probability p_0 defined on the interval [0, 1]. Figure 9.2 plots the entropy function H(p_0) versus p_0 on this interval.

  • FIGURE 9.2 ENTROPY FUNCTION

    The curve highlights the observations made under points 1, 2, and 3.
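
    A short sketch evaluating the entropy function of Equation 9.16 at a few sample points traces the curve of Figure 9.2; the printed values are zero at the endpoints and peak at p_0 = 1/2 (the function name H is ours):

```python
import math

def H(p0: float) -> float:
    """Entropy function of Equation 9.16, in bits."""
    if p0 in (0.0, 1.0):
        return 0.0  # x log2 x -> 0 as x -> 0 (points 1 and 2)
    return -p0 * math.log2(p0) - (1 - p0) * math.log2(1 - p0)

for p0 in [0.0, 0.1, 0.25, 0.5, 0.75, 0.9, 1.0]:
    print(f"H({p0}) = {H(p0):.3f}")
# Maximum H(0.5) = 1.000 bit, matching point 3 of Example 9.1.
```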

  • EXTENSION OF A DISCRETE MEMORYLESS SOURCE

    Consider blocks rather than individual symbols, each block consisting of n successive source symbols. Since the source is memoryless, the probability of a symbol in the extended alphabet S^n is equal to the product of the probabilities of the n source symbols in S constituting it. The entropy of the extended source is then

    H(S^n) = n H(S)    (9.17)

  • EXAMPLE 9.2: ENTROPY OF AN EXTENDED SOURCE

    Consider a discrete source with source alphabet S = {s_0, s_1, s_2} and respective probabilities:

    p_0 = 1/4
    p_1 = 1/4
    p_2 = 1/2

    Find the entropy of the extended source.

  • EXAMPLE 9.2: SOLUTION

    The entropy of the source is:

    H(S) = p_0 log2(1/p_0) + p_1 log2(1/p_1) + p_2 log2(1/p_2)
         = (1/4) log2(4) + (1/4) log2(4) + (1/2) log2(2)
         = 3/2 bits
  • EXAMPLE 9.2: SOLUTION

    Consider next the second-order extension of the source. With the source alphabet S consisting of three symbols, the extended alphabet S^2 has nine symbols. Table 9.1 presents the nine symbols, their corresponding sequences of symbols of S, and their probabilities.

  • Table 9.1

    Alphabet particulars of the second-order extension of a discrete memoryless source

    Symbols of S^2:             σ_0    σ_1    σ_2    σ_3    σ_4    σ_5    σ_6    σ_7    σ_8
    Corresponding sequences
    of symbols of S:            s0s0   s0s1   s0s2   s1s0   s1s1   s1s2   s2s0   s2s1   s2s2
    Probability p(σ_i),
    i = 0, 1, ..., 8:           1/16   1/16   1/8    1/16   1/16   1/8    1/8    1/8    1/4

  • EXAMPLE 9.2: SOLUTION

    The entropy of the extended source is:

    H(S^2) = Σ_{i=0}^{8} p(σ_i) log2(1/p(σ_i))
           = 4 × (1/16) log2(16) + 4 × (1/8) log2(8) + (1/4) log2(4)
           = 1 + 3/2 + 1/2
           = 3 bits

    Which proves H(S^2) = 2 H(S), in accordance with Equation 9.17.
  • Presented by Roy Sencil and Janyl Jane Nicart

    END OF PRESENTATION

