
Context Clustering in Lossless Compression of Gray-scale Image




  1. Context Clustering in Lossless Compression of Gray-scale Image Mantao Xu and Pasi Fränti UNIVERSITY OF JOENSUU DEPARTMENT OF COMPUTER SCIENCE JOENSUU, FINLAND

  2. Problem Lossless compression of gray-scale images, addressed from the point of view of the achievable compression ratio

  3. Related works • LOCO-I: • Fixed predictor: Median Edge Detector (MED) • Fixed-size context quantizer and bias cancellation • Golomb-Rice coder: there is still some gap between the data rate and the entropy, and the separation between modelling and entropy coding is not very clear • CALIC: • Adaptive predictor: Gradient-Adjusted Predictor (GAP) • Fixed-size context quantizer and bias cancellation • Adaptive m-ary arithmetic coder

  4. Motivation • Investigation of an algorithm that uses: • Variable-size scalar quantizers • Achieving a data rate close to the entropy • Reduction of probability storage • A clear separation between modelling and entropy coding

  5. LOCO-I MED predictor
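The figure for slide 5 is not reproduced in this transcript. For reference, the LOCO-I Median Edge Detector can be sketched as follows, with a, b and c the west, north and north-west causal neighbours of the current pixel:

```python
def med_predict(a, b, c):
    """LOCO-I Median Edge Detector (MED) prediction.

    a = west neighbour, b = north neighbour, c = north-west neighbour.
    """
    if c >= max(a, b):
        # c above both neighbours suggests a horizontal edge: pick the lower
        return min(a, b)
    elif c <= min(a, b):
        # c below both neighbours suggests a vertical edge: pick the higher
        return max(a, b)
    else:
        # smooth region: assume a locally planar surface
        return a + b - c
```

The predicted error pixel is then the difference between the actual pixel and this prediction, which is what the later histograms are computed over.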

  6. Histogram of image pixels

  7. Histogram of predicted error pixels

  8. Context modelling • Context definition: • g1 = d - b, g2 = b - c, g3 = c - a • Context quantization: • Lloyd-Max scalar quantizer • Variable numbers of quantization levels: • 7, 9, 19 • Bias cancellation
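The gradients above can be quantized and combined into a single context index. The sketch below assumes a shared scalar quantizer given by its sorted decision boundaries, and the packing of the quantized triple into one index is an illustrative choice, not the paper's exact construction (a, b, c, d are the W, N, NW and NE neighbours):

```python
import bisect

def context_index(a, b, c, d, boundaries):
    """Map the causal neighbourhood to a context number.

    boundaries: sorted decision thresholds of the scalar quantizer,
    applied identically to each of the three gradients.
    """
    g = (d - b, b - c, c - a)                    # g1, g2, g3 from the slide
    levels = len(boundaries) + 1                 # quantizer cells per gradient
    q = [bisect.bisect(boundaries, gi) for gi in g]
    # pack the quantized triple into a single mixed-radix index
    return (q[0] * levels + q[1]) * levels + q[2]
```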

  9. Lloyd-Max quantizer
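The Lloyd-Max quantizer itself can be designed by alternating its two optimality conditions: thresholds at the midpoints of adjacent reproduction levels, and reproduction levels at the centroid of each decision region. A minimal one-dimensional sketch (iteration count and quantile initialisation are illustrative choices, not taken from the paper):

```python
import numpy as np

def lloyd_max(samples, levels, iters=50):
    """One-dimensional Lloyd-Max scalar quantizer design (a sketch)."""
    x = np.sort(np.asarray(samples, dtype=float))
    # initialise reproduction levels at evenly spaced sample quantiles
    centroids = np.quantile(x, (np.arange(levels) + 0.5) / levels)
    for _ in range(iters):
        # nearest-neighbour condition: thresholds midway between centroids
        thresholds = (centroids[:-1] + centroids[1:]) / 2
        cells = np.searchsorted(thresholds, x)
        # centroid condition: each level is the mean of its region
        for k in range(levels):
            if np.any(cells == k):
                centroids[k] = x[cells == k].mean()
    return thresholds, centroids
```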

  10. Histogram of quantized contexts

  11. Histogram of error pixels conditioned on quantized contexts Entropy = 3.804, real code length = 3.876; Entropy = 3.999, real code length = 4.153

  12. Bias cancellation
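The figure for slide 12 is not reproduced here. Bias cancellation, as used in LOCO-I and CALIC, keeps running error statistics per context and re-centres the prediction so each context's error distribution sits around zero. A minimal sketch, assuming a simple rounded-mean correction:

```python
class BiasCanceller:
    """Per-context bias cancellation (a minimal sketch)."""

    def __init__(self, num_contexts):
        self.sum = [0] * num_contexts    # accumulated prediction errors
        self.count = [0] * num_contexts  # pixels seen in each context

    def correct(self, ctx, prediction):
        """Add the rounded mean error of this context to the prediction."""
        if self.count[ctx] == 0:
            return prediction
        bias = int(round(self.sum[ctx] / self.count[ctx]))
        return prediction + bias

    def update(self, ctx, error):
        """Record the actual error (actual - prediction) after coding."""
        self.sum[ctx] += error
        self.count[ctx] += 1
```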

  13. Histogram of corrected error pixels

  14. Context clustering • Definition: • Clustering on the conditional probability density function (PDF) of the error pixel in each context • Advantage: • Merging contexts that have similar PDFs reduces the number of contexts used in entropy coding • Difficulty: • High dimensionality of the PDF vector

  15. Definition of PDF vector • Decomposition of the error pixel: • Definition of PDF vectors: • ypdf(C) = (f, p) • f is the frequency of context C • p is the conditional probability density function of each decomposed error pixel in context C

  16. Clustering in PDF vector space • Cluster representatives of PDF vectors: • Kullback-Leibler distance: • Clustering distortion function:
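The distance and distortion formulas on this slide were images and are lost in the transcript. The Kullback-Leibler distance between two discrete conditional PDFs can be sketched as follows; the eps smoothing against zero probabilities is an implementation assumption:

```python
import math

def kl_distance(p, q, eps=1e-12):
    """Kullback-Leibler distance D(p || q) between two discrete PDFs.

    Serves as the dissimilarity between the conditional error-pixel PDFs
    of two contexts; contexts at small KL distance are merge candidates.
    """
    return sum(pi * math.log2((pi + eps) / (qi + eps))
               for pi, qi in zip(p, q))
```

Note that KL is asymmetric and zero only for identical distributions, so the cluster representative is typically chosen to minimise the total distance over the cluster's members.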

  17. Adaptive arithmetic coding • An example string to be coded: • S = { aaabbaaccbbccbb } • Frequencies of characters (transmitted): • fa = 5, fb = 6 and fc = 4 • Adaptive probabilities of the encoder: • { 5/15(a), 4/14(a), 3/13(a), 6/12(b), 5/11(b), 2/10(a), 1/9(a), 4/8(c), 3/7(c), 4/6(b), 3/5(b), 2/4(c), 1/3(c), 2/2(b), 1/1(b) }

  18. Pseudocode of the example coder
  PROCEDURE AdaptiveCoder(S, fa, fb, fc)
    ga ← fa; gb ← fb; gc ← fc; N ← fa + fb + fc;
    FOR i ← 1 TO NumberOfCharacters DO
      ArithmeticCoding(S(i), ga/N, gb/N, gc/N);
      IF S(i) = a THEN ga ← ga - 1;
      ELSE IF S(i) = b THEN gb ← gb - 1;
      ELSE gc ← gc - 1;
      N ← N - 1;
  END PROCEDURE
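The adaptive model in this pseudocode can be checked directly. The sketch below reproduces the probability fed to the arithmetic coder at each step; the arithmetic-coding step itself is omitted:

```python
from fractions import Fraction

def adaptive_probabilities(s, freqs):
    """Probability supplied to the arithmetic coder for each symbol of s.

    The total frequency of each character is transmitted in advance, and
    the remaining count of a symbol is decremented after it is coded, so
    the model sharpens as coding proceeds.
    """
    g = dict(freqs)        # remaining occurrences per symbol
    n = sum(g.values())    # remaining symbols overall
    probs = []
    for ch in s:
        probs.append(Fraction(g[ch], n))
        g[ch] -= 1         # this occurrence is used up
        n -= 1
    return probs
```

For S = aaabbaaccbbccbb with fa = 5, fb = 6, fc = 4 this yields exactly the sequence 5/15(a), 4/14(a), 3/13(a), 6/12(b), ... listed on slide 17, ending with the certain symbol 1/1(b).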

  19. Test images Camera Bridge Missa001 Boats Missa002 Lena

  20. Comparison of compression ratios Compression results for the test images in bits/pixel; fixed-size quantizer results are shown in brackets

  21. Conclusions • Context clustering is an effective solution for lossless compression and solves the storage problem of the PDF vectors. • The variable-size quantizers have better performance, and the adaptive arithmetic coder achieves a better compression rate

  22. Further work to be done • Estimation of the optimal division number d • Estimation of the optimal number of context clusters • Application of vector quantization in the context gradient space
