
SWE 423: Multimedia Systems



  1. SWE 423: Multimedia Systems Chapter 7: Data Compression (3)

  2. Outline
  • Entropy Encoding
    • Arithmetic Coding

  3. Entropy Encoding: Arithmetic Coding
  • Initial idea introduced by Shannon in 1948
  • Many researchers worked on this idea
  • Modern arithmetic coding can be attributed to Pasco (1976) and to Rissanen and Langdon (1979)
  • Arithmetic coding treats the whole message as one unit
  • In practice, the input data is usually broken up into chunks to avoid error propagation

  4. Entropy Encoding: Arithmetic Coding
  • A message is represented by a half-open interval [a,b), where 0 ≤ a < b ≤ 1
  • General idea of encoding:
    • Map the message into the half-open interval [a,b)
    • Find a binary fractional number with minimum length that falls in this interval; this is the encoded message
  • Initially, [a,b) = [0,1)
  • As the message becomes longer, the interval shortens and the number of bits needed to represent it increases
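
  A quick illustration (my example, not from the slides): take a two-symbol alphabet with Pr(x) = 0.6 and Pr(y) = 0.4, so x owns the range [0, 0.6) and y owns [0.6, 1). Encoding the message xy first narrows [0,1) to [0, 0.6), then to [0.36, 0.6) (low = 0.6 × 0.6 = 0.36). The one-bit binary fraction 0.1 (= 0.5 decimal) already falls inside [0.36, 0.6), so xy can be coded in a single bit.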

  5. Entropy Encoding: Arithmetic Coding
  • Coding Algorithm

    Algorithm ArithmeticCoding
    // Input:  symbol: input stream of the message
    //         terminator: terminator symbol
    //         Low[] and High[]: all symbols' ranges
    // Output: binary fractional code of the message
    low = 0; high = 1; range = 1;
    while (symbol != terminator) {
        get(symbol);
        high = low + range * High(symbol);
        low = low + range * Low(symbol);
        range = high - low;
    }
    return CodeWord(low, high);
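
  A runnable Python sketch of this loop (mine, not from the slides; the ranges dict and function name are my own):

    def arithmetic_encode(message, ranges):
        """Narrow [low, high) once per symbol.
        'ranges' maps each symbol to its (Low, High) pair;
        the message is assumed to end with the terminator."""
        low, high = 0.0, 1.0
        for symbol in message:
            rng = high - low
            high = low + rng * ranges[symbol][1]   # High(symbol)
            low = low + rng * ranges[symbol][0]    # Low(symbol)
        return low, high

  Note that a float-based sketch like this loses precision on long messages; practical coders use integer renormalization and emit bits incrementally.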

  6. Entropy Encoding: Arithmetic Coding
  • Binary code generation

    Algorithm CodeWord
    // Input:  low and high
    // Output: binary fractional code
    code = 0; k = 1;
    while (value(code) < low) {
        assign 1 to the kth binary fraction bit;
        if (value(code) >= high)    // outside the half-open interval [low, high)
            replace the kth bit by 0;
        k++;
    }
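
  In Python (again a sketch under the same assumptions), the same greedy bit-by-bit search looks like this:

    def codeword(low, high):
        """Shortest binary fraction 0.b1 b2 ... whose value lies in [low, high)."""
        bits = []
        value = 0.0
        k = 1
        while value < low:
            bits.append('1')
            value += 2.0 ** -k       # tentatively set the kth bit to 1
            if value >= high:        # overshoots the interval: kth bit must be 0
                bits[-1] = '0'
                value -= 2.0 ** -k
            k += 1
        return ''.join(bits)

  For the interval [0.33184, 0.3322) computed in the next slide's example, this returns '01010101', i.e. the binary fraction 0.01010101 = 0.33203125.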

  7. Entropy Encoding: Arithmetic Coding
  • Example: Assume S = {A,B,C,D,E,F,$}, where $ is the terminator symbol. In addition, assume the following probabilities for each character:
    • Pr(A) = 0.2
    • Pr(B) = 0.1
    • Pr(C) = 0.2
    • Pr(D) = 0.05
    • Pr(E) = 0.3
    • Pr(F) = 0.05
    • Pr($) = 0.1
  • Generate the fractional binary code of the message CAEE$
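
  Worked trace (mine; it assumes the cumulative ranges are assigned in the order the symbols are listed, i.e. A:[0, 0.2), B:[0.2, 0.3), C:[0.3, 0.5), D:[0.5, 0.55), E:[0.55, 0.85), F:[0.85, 0.9), $:[0.9, 1.0)):

    symbol    low        high      range
    (init)    0          1         1
    C         0.3        0.5       0.2
    A         0.3        0.34      0.04
    E         0.322      0.334     0.012
    E         0.3286     0.3322    0.0036
    $         0.33184    0.3322    0.00036

  CodeWord(0.33184, 0.3322) then yields 01010101, since 0.01010101 in binary = 0.33203125 lies in [0.33184, 0.3322): CAEE$ is coded in 8 bits. With the Python sketches above (up to floating-point rounding):

    ranges = {'A': (0.0, 0.2), 'B': (0.2, 0.3), 'C': (0.3, 0.5), 'D': (0.5, 0.55),
              'E': (0.55, 0.85), 'F': (0.85, 0.9), '$': (0.9, 1.0)}
    low, high = arithmetic_encode('CAEE$', ranges)   # -> (0.33184, 0.3322)
    print(codeword(low, high))                       # -> '01010101'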

  8. Entropy Encoding: Arithmetic Coding
  • It can be proven that ⌈log2(1 / (P1 × P2 × ... × Pn))⌉ is the upper bound on the number of bits needed to encode a message of n symbols
  • In our case, P(C)·P(A)·P(E)·P(E)·P($) = 0.2 × 0.2 × 0.3 × 0.3 × 0.1 = 0.00036 (the final range), so the upper bound is ⌈log2(1/0.00036)⌉ = ⌈11.44⌉ = 12 bits
  • When the length of the message increases, the range decreases and the upper bound on the number of bits increases
  • Generally, arithmetic coding outperforms Huffman coding
    • It treats the whole message as one unit, whereas Huffman coding must spend an integral number of bits on each character
  • Redo the previous example CAEE$ using Huffman coding and notice how many bits are required to code this message (a sketch follows below)
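
  A minimal sketch of that exercise (my own code, not from the slides), using Python's heapq to build a Huffman code for the same alphabet; ties in the probabilities mean the tree is not unique, so individual code lengths can vary:

    import heapq
    import itertools

    def huffman_code(probs):
        """Build a Huffman code; returns {symbol: bitstring}."""
        tiebreak = itertools.count()   # keeps heapq from comparing dicts
        heap = [(p, next(tiebreak), {s: ''}) for s, p in probs.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            p1, _, c1 = heapq.heappop(heap)
            p2, _, c2 = heapq.heappop(heap)
            merged = {s: '0' + bits for s, bits in c1.items()}
            merged.update({s: '1' + bits for s, bits in c2.items()})
            heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
        return heap[0][2]

    probs = {'A': 0.2, 'B': 0.1, 'C': 0.2, 'D': 0.05,
             'E': 0.3, 'F': 0.05, '$': 0.1}
    code = huffman_code(probs)
    print(sum(len(code[s]) for s in 'CAEE$'))   # 12 bits here, vs. 8 with arithmetic coding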

  9. Entropy Encoding: Arithmetic Coding
  • Decoding Algorithm

    Algorithm ArithmeticDecoding
    // Input:  code: binary code
    //         Low[] and High[]: all symbols' ranges
    // Output: the decoded message
    value = convert2decimal(code);
    do {
        find a symbol s so that Low(s) <= value < High(s);
        output s;
        low = Low(s);
        high = High(s);
        range = high - low;
        value = (value - low) / range;
    } while (s is not the terminator symbol);
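
  A matching Python sketch (same assumed ranges dict as in the encoder above):

    def arithmetic_decode(value, ranges, terminator='$'):
        """Recover the message from the decimal value of the codeword."""
        message = []
        while True:
            # find the symbol whose range contains the current value
            for s, (lo, hi) in ranges.items():
                if lo <= value < hi:
                    break
            message.append(s)
            if s == terminator:
                return ''.join(message)
            value = (value - lo) / (hi - lo)   # rescale back into [0, 1)

    print(arithmetic_decode(0.33203125, ranges))   # -> 'CAEE$' (round trip)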

  10. Entropy Encoding: Arithmetic Coding • Example
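
  As an illustration (mine), decoding the 8-bit codeword 0.01010101 = 0.33203125 for CAEE$ proceeds as follows, rescaling the value after each symbol:

    value         symbol   its range        output so far
    0.33203125    C        [0.3, 0.5)       C
    0.16015625    A        [0.0, 0.2)       CA
    0.80078125    E        [0.55, 0.85)     CAE
    0.8359375     E        [0.55, 0.85)     CAEE
    0.953125      $        [0.9, 1.0)       CAEE$  (terminator reached: stop)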
