UNIT - IV_ppt
  • 7/29/2019 UNIT - IV_ppt


    UNIT - IV

    COMPRESSION TECHNIQUES


Need for Data Compression / Advantages

    Huge amounts of data are generated as text, images, audio, speech and video.

    Because of compression, the transmission data rate is reduced.

    Storage requirements are reduced due to compression.

    Due to video compression it is possible to store one complete movie on two CDs.

    Transportation of the data is easier.


    Drawbacks

In lossy compression, some of the data is lost.

    Compression and decompression increase the complexity of the transmitter and receiver.

    Coding time is increased due to compression and decompression.


Principles of Data Compression

    Information Source → Source Encoder (Compression) → Network → Receiver / Decoder (Decompression) → Destination


Lossless Compression and Lossy Compression

    Lossless Compression

    No part of the original information is lost

    during compression

    Lossy Compression

Some information is lost during

    compression.


Comparison between Lossless and Lossy Compression

    Sr. No. | Lossless Compression                         | Lossy Compression
    1       | No information is lost                       | Some information is lost
    2       | Completely reversible                        | Not reversible
    3       | Used for text and data                       | Used for speech and video
    4       | Compression ratio is less                    | High compression ratio
    5       | Compression is independent of human response | Compression depends upon the sensitivity of the human ear, eyes, etc.
    6       | Huffman coding, run-length coding are examples | Transform coding, vector quantization are examples


    Entropy Coding

Entropy coding is based on the entropy of the source.

    It assigns codewords to the source alphabets according to the probability of their occurrence.

    It is lossless compression.

    Ex. run-length coding, prefix coding and Huffman coding.

    They are used for compression of text files.
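The entropy the slide refers to can be computed directly. A minimal sketch (the function name is illustrative): the Shannon entropy H = -Σ p·log₂p is the theoretical lower bound, in bits per symbol, that any lossless entropy code tries to approach.

```python
import math
from collections import Counter

def entropy(text):
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    freq = Counter(text)
    total = len(text)
    return -sum((f / total) * math.log2(f / total) for f in freq.values())

print(entropy("aabb"))  # → 1.0 (two equally likely symbols need 1 bit each)
```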


    Runlength Coding

Used for the data generated by scanning documents, fax machines, typewriters, etc.

    These information sources produce data that contains long runs of 1s and 0s.

    1111110000000011110000..

    The above string is coded using run-length coding as

    1,6 ; 0,8 ; 1,4 ; 0,4
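The encoding above can be sketched in a few lines of Python (the function name is illustrative):

```python
from itertools import groupby

def run_length_encode(bits):
    """Encode a binary string as (symbol, run-length) pairs."""
    return [(symbol, len(list(run))) for symbol, run in groupby(bits)]

print(run_length_encode("1111110000000011110000"))
# → [('1', 6), ('0', 8), ('1', 4), ('0', 4)]
```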


Statistical Encoding

    Exploits the statistical properties of the information.

    For e.g. the alphabets e, a, i have higher probabilities of occurrence compared to alphabets like q, x, z.

    Huffman coding is an example of statistical encoding.

    Here short-length codewords are assigned to frequently occurring alphabets and larger-length codewords are assigned to rarely occurring alphabets.

    This is also called entropy coding.


    Source Encoding

Source encoding is based on a particular property of the source.

    Examples

    Differential Encoding

    Transform Encoding


    Differential Encoding

The difference between two successive samples is encoded.

    Normally the values of the samples are large but the difference between them is very small.

    Hence fewer bits are required to encode the difference.

    DPCM and Delta Modulation are based on this principle.
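The principle can be sketched as follows (function names are illustrative); note how the large sample values shrink to small differences:

```python
def differential_encode(samples):
    """Store the first sample, then only the successive differences."""
    diffs = [samples[0]]
    for prev, curr in zip(samples, samples[1:]):
        diffs.append(curr - prev)
    return diffs

def differential_decode(diffs):
    """Rebuild the original samples by accumulating the differences."""
    samples = [diffs[0]]
    for d in diffs[1:]:
        samples.append(samples[-1] + d)
    return samples

samples = [1000, 1002, 1001, 1005, 1004]
print(differential_encode(samples))  # → [1000, 2, -1, 4, -1]
```

The small differences can then be represented with far fewer bits than the original sample values.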


    Transform Encoding

Transform coding is a powerful coding technique.

    Consider an image of N×N pixels. If these pixels are scanned horizontally, an electrical signal is generated.

    The frequency of this signal is called Spatial Frequency.


Transform Encoding Contd.

    The human eye is much more sensitive to low spatial frequencies than to high spatial frequencies.

    Hence the less sensitive high-frequency components are redundant and can be removed.

    This removal of high-frequency components provides compression, since the overall size of the data is reduced.

    Conversion of the image into the spatial-frequency domain is obtained with the help of the DCT (Discrete Cosine Transform).

    When thresholding is applied, some of the small DCT coefficients are set to zero, which increases the compression further.
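A rough sketch of the idea, using a naive, unnormalized 1D DCT-II plus thresholding (a real codec such as JPEG uses a fast 2D DCT with quantization; names and the threshold value here are illustrative):

```python
import math

def dct(signal):
    """Naive unnormalized DCT-II: samples -> spatial-frequency coefficients."""
    N = len(signal)
    return [sum(x * math.cos(math.pi * k * (2 * n + 1) / (2 * N))
                for n, x in enumerate(signal))
            for k in range(N)]

def threshold(coeffs, limit):
    """Zero out coefficients whose magnitude is below the limit."""
    return [c if abs(c) >= limit else 0.0 for c in coeffs]

# A smooth row of pixels: almost all its energy sits at low spatial frequency.
samples = [100, 101, 99, 100, 102, 100, 98, 100]
coeffs = threshold(dct(samples), limit=10.0)
# Only the DC (k=0) coefficient, 800.0, survives; the rest are zeroed,
# so the row compresses to essentially one number.
```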


    Text Compression

Text compression must be strictly lossless; it cannot be lossy.

    Therefore lossless compression techniques such as entropy coding are used.

    Two types of statistical encoding methods:

    1) Huffman Coding and Arithmetic Coding

    An optimum set of codewords is derived for single characters.

    2) Lempel-Ziv (LZ) algorithm
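One widely used member of the LZ family is LZW; a minimal encoder sketch (the function name is illustrative, and real implementations also bound the dictionary size):

```python
def lzw_encode(text):
    """LZW: replace repeated strings with dictionary indices built on the fly."""
    dictionary = {chr(i): i for i in range(256)}  # start with single characters
    next_code = 256
    phrase = ""
    output = []
    for ch in text:
        if phrase + ch in dictionary:
            phrase += ch                          # keep extending the match
        else:
            output.append(dictionary[phrase])
            dictionary[phrase + ch] = next_code   # learn the new string
            next_code += 1
            phrase = ch
    if phrase:
        output.append(dictionary[phrase])
    return output

print(lzw_encode("ababab"))  # → [97, 98, 256, 256]: 6 characters in 4 codes
```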


The coding used for text can be Static or Dynamic.

    1) Static Coding

    The codewords assigned to the alphabets do not change during compression.

    2) Dynamic Coding

    The codewords are dynamically computed during compression. The codeword for a particular alphabet or string does not remain fixed throughout the compression.

    Also called Adaptive Coding.


Sr. No. | Static Coding                                    | Dynamic Coding
    1       | Codewords are fixed throughout compression       | Codewords change dynamically during compression
    2       | Statistical characteristics of the data are known | Statistical characteristics of the data are not known
    3       | Receiver knows the set of codewords              | Receiver dynamically calculates the codewords
    4       | Ex. Static Huffman Coding                        | Ex. Dynamic Huffman Coding


    Static Huffman Coding

In static Huffman coding, the character string to be transmitted is analyzed.

    The frequency of occurrence of each character is determined.

    Variable-length codewords are then assigned to each character.

    This coding operation creates an unbalanced tree, also called the Huffman coding tree.
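The steps above can be sketched as follows (a compact, illustrative implementation that merges the two least-frequent subtrees until one tree remains; it assumes at least two distinct characters, and a real coder would also transmit the code table):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a static Huffman code table from character frequencies."""
    freq = Counter(text)                      # step 1: count each character
    # Heap entries: (frequency, tiebreak, {char: code-so-far})
    heap = [(f, i, {ch: ""}) for i, (ch, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:                      # step 2: merge two rarest subtrees
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        merged = {ch: "0" + c for ch, c in left.items()}
        merged.update({ch: "1" + c for ch, c in right.items()})
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]                         # step 3: variable-length codewords

codes = huffman_codes("aaaabbbccd")
# Frequent 'a' receives a 1-bit code; the rare 'c' and 'd' receive 3-bit codes.
```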


    Arithmetic Coding

