
Blockiness Reduction By Brute Force Matrix Alteration






Presentation Transcript


  1. Blockiness Reduction By Brute Force Matrix Alteration Bryan Berns & Dirk Werschmoeller

  2. JPEG - History & Importance • The JPEG group was formed in 1985 • JPEG compression became a standard in the mid-90s • Most widely used format for internet web content • Exploits the perceptual biases of the human visual system

  3. The JPEG Algorithm • Uses the idea that only a few dominant frequencies contribute most of an image's energy • Breaks the image into 8 x 8 blocks and transforms each into its frequency components: - eliminates unimportant values via a quantization matrix - encodes the surviving values (DCT coefficients) with run-length/Huffman coding (see the sketch below)
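
To make the per-block pipeline concrete, here is a minimal NumPy sketch of the transform and quantization step for a single 8 x 8 block. The Q50 table is the standard JPEG luminance quantization matrix; the function names are illustrative and are not taken from the presentation.

    import numpy as np

    def dct_matrix(n=8):
        """Orthonormal DCT-II basis matrix (n x n)."""
        k = np.arange(n).reshape(-1, 1)                      # frequency index
        i = np.arange(n).reshape(1, -1)                      # sample index
        c = np.cos(np.pi * (2 * i + 1) * k / (2 * n)) * np.sqrt(2.0 / n)
        c[0, :] /= np.sqrt(2.0)                              # DC row normalisation
        return c

    # Standard JPEG luminance quantization table (quality ~50).
    Q50 = np.array([
        [16, 11, 10, 16,  24,  40,  51,  61],
        [12, 12, 14, 19,  26,  58,  60,  55],
        [14, 13, 16, 24,  40,  57,  69,  56],
        [14, 17, 22, 29,  51,  87,  80,  62],
        [18, 22, 37, 56,  68, 109, 103,  77],
        [24, 35, 55, 64,  81, 104, 113,  92],
        [49, 64, 78, 87, 103, 121, 120, 101],
        [72, 92, 95, 98, 112, 100, 103,  99],
    ], dtype=float)

    def jpeg_block_roundtrip(block, qmatrix):
        """Transform one 8x8 block, quantize, dequantize, and invert."""
        C = dct_matrix(8)
        coeffs = C @ (np.asarray(block, dtype=float) - 128.0) @ C.T   # forward 2-D DCT
        quantized = np.round(coeffs / qmatrix)              # high frequencies round to 0
        decoded = C.T @ (quantized * qmatrix) @ C + 128.0   # dequantize and invert
        return quantized, decoded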

  4. Blockiness • These individually processed 8 x 8 sub-images: - are adequately sized to balance visual ambiguity and compression ratio - lead to a 'blocky' effect when certain image frequencies are discarded

  5. A Brute Force Approach • The standard quantization matrix (Qs) was designed to work well on average • However, Qs is most likely not optimal for a given image • The brute force method generates a new matrix QN that is better adapted to a specific image • Readily available computational power makes brute force practical • The process is transparent to the decoder as long as the matrix is transferred with the image

  6. Brute Force Pseudo-code of the brute force method (a Python sketch follows):

      QMatrix = Quality(x)                       // predefined starting matrix
      Repeat {
          Test_Matrix = RandomChange(QMatrix);
          Error  = EvalError(Test_Matrix, Image);
          Size   = EvalSize(Test_Matrix, Image);
          Blocky = EvalBlocky(Test_Matrix, Image);
          If (Size < Prev_Size And Error < Prev_Error And Blocky < Prev_Blocky)
              QMatrix = Test_Matrix;             // keep only all-around improvements
          If <Convergence Test> Break
      }
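
A Python rendering of the same loop, as a sketch only: the estimators and the perturbation function correspond to EvalError / EvalSize / EvalBlocky / RandomChange above and are passed in as callables taking (qmatrix, image); plausible versions of each are sketched after slides 8-11 and would be wrapped to this signature. The convergence test used here (stop after a fixed number of rejected candidates) is an assumption.

    import numpy as np

    def optimize_qmatrix(image, q_start, perturb, eval_size, eval_error, eval_blocky,
                         max_iters=10000, patience=500):
        """Randomized hill climbing over the 8x8 quantization matrix.

        A candidate replaces the current matrix only if it improves size,
        error, and blockiness at the same time (the three constraints on
        the next slide)."""
        q_best = np.asarray(q_start, dtype=float)
        best = (eval_size(q_best, image),
                eval_error(q_best, image),
                eval_blocky(q_best, image))
        rejected = 0
        for _ in range(max_iters):
            q_test = perturb(q_best)
            scores = (eval_size(q_test, image),
                      eval_error(q_test, image),
                      eval_blocky(q_test, image))
            if all(s < b for s, b in zip(scores, best)):
                q_best, best, rejected = q_test, scores, 0
            else:
                rejected += 1
            if rejected > patience:          # crude convergence test (assumption)
                break
        return q_best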

  7. Why Three Constraints? • If size is not bounded, the search could simply preserve all frequencies (quality = 100) • If error is not bounded, it could discard all frequency components to obtain zero blockiness

  8. Blockiness Estimation • A simple, computationally efficient approach is sufficient to estimate the blockiness of an image • The ideal (unblocked) original image would have a blockiness estimate of '1'
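
The formula on this slide is not preserved in the transcript, so the sketch below is only one plausible measure with the stated property that an unblocked original scores about 1: it compares the average pixel difference across 8 x 8 block boundaries with the average difference inside blocks.

    import numpy as np

    def eval_blocky(decoded, block=8):
        """Ratio of mean |difference| across vertical block edges to mean
        |difference| between horizontally adjacent pixels inside blocks.
        (A full measure would also include horizontal block edges.)"""
        img = np.asarray(decoded, dtype=float)
        diffs = np.abs(np.diff(img, axis=1))      # differences between columns j and j+1
        j = np.arange(diffs.shape[1])
        on_edge = (j % block) == (block - 1)      # pairs straddling a block boundary
        return diffs[:, on_edge].mean() / diffs[:, ~on_edge].mean()   # ~1 if unblocked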

  9. Size Estimation • Actual JPEG uses run-length/Huffman encoding • Only the relative size matters, so an entropy estimate is adequate (see the sketch below) • Computationally more efficient than running the actual encoder
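
The entropy equation itself is not in the transcript; a standard first-order entropy estimate over the quantized DCT coefficients, sketched below, matches the description (it only needs to rank candidate matrices, not predict the exact file size). For brevity it takes the already-quantized coefficients of the whole image; in the search loop it would be wrapped to quantize with the candidate matrix first.

    import numpy as np

    def eval_size(quantized_coeffs):
        """Estimate coded size in bits as (entropy per coefficient) times
        (number of coefficients), from the empirical value distribution."""
        coeffs = np.asarray(quantized_coeffs, dtype=int)
        values, counts = np.unique(coeffs, return_counts=True)
        p = counts / counts.sum()
        bits_per_symbol = -np.sum(p * np.log2(p))
        return bits_per_symbol * coeffs.size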

  10. Error Estimation • Evaluates the error of the compressed image relative to the original • Computationally more efficient than mean-squared error
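
The exact error measure is likewise not in the transcript; one choice that is cheaper than mean-squared error (no squaring) and yields a relative value is a normalized sum of absolute differences, sketched here.

    import numpy as np

    def eval_error(original, decoded):
        """Relative error of the decoded image with respect to the original,
        using absolute rather than squared differences."""
        original = np.asarray(original, dtype=float)
        decoded = np.asarray(decoded, dtype=float)
        return np.abs(original - decoded).sum() / np.abs(original).sum()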

  11. Methods To Perform Changes • A randomly generated sub-matrix is added to the original quantization matrix (sketched below) • The efficiency of the brute force search is evaluated by: - varying the sub-matrix size - changing the random number distribution - applying it to a set of images - comparing against the predefined JPEG quantization matrix
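
A sketch of the perturbation step, assuming a uniform integer distribution; the sub-matrix size and the distribution are exactly the knobs varied on the next two slides. The parameter names and the clipping range are illustrative.

    import numpy as np

    def random_change(qmatrix, sub_size=3, spread=2, rng=None):
        """Add a small random sub-matrix at a random position of the 8x8
        quantization matrix."""
        rng = np.random.default_rng() if rng is None else rng
        q_new = np.asarray(qmatrix, dtype=float).copy()
        r = rng.integers(0, 8 - sub_size + 1)     # top-left corner of the sub-matrix
        c = rng.integers(0, 8 - sub_size + 1)
        delta = rng.integers(-spread, spread + 1, size=(sub_size, sub_size))
        q_new[r:r + sub_size, c:c + sub_size] += delta
        return np.clip(q_new, 1, 255)             # keep quantizer entries valid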

  12. Varying Matrix Size

  13. Varying Sub Matrix Generator

  14. Reverse Engineering • On average, the changes contributing to the decrease in blockiness are marked by the colored dots • Red dots = the frequencies most important for image quality

  15. Application To A Set Of Images

  16. QN Applied To 'Lena' • To show the value of any quantization matrix, we must validate it on 'Lena' (which was not in the training set):

      >> imgcomp('test/lena.png', QN)
      Estimated Size of Q50:       321458.68918628
      Estimated Size of QN:        329104.90241524
      Estimated Blockiness of Q50: 1.43134678
      Estimated Blockiness of QN:  1.40737543
      Error of Q50:                783662.34465455
      Error of QN:                 767224.82010086

  17. Final Comparison • After many iterations, one can see a reduction in blockiness: compressed with the JPEG standard matrix (Q = 50) vs. compressed with QN • The image (macroscopically) blends upward; using the optimized quantization matrix helps account for this and reduces blockiness

  18. Summary • A process to decrease the blockiness (and, in general, the error) of JPEG-processed files has been described • The generated QN also generally improves image compression quality • The reduction of perceptual errors increases with the number of program iterations

  19. References
  Websites:
  JPEG, http://www.jpeg.org
  Image Compression, http://www.dspexperts.com/dsp/projects/dctcomp
  JPEG Image Compression, http://www.vectorsite.net/ttdcmp2.html
  JPEG Compression, http://netghost.narod.ru/gff/graphics/book/ch09_06.htm
  An Introduction to JPEG Encoding, http://www.online.redwoods.cc.ca.us/instruct/darnold
  Data Compression, http://www.datacompression.info
  NASA Vision Group, http://vision.arc.nasa.gov/
  Books:
  Gonzalez, R.C.; Woods, R.E.: Digital Image Processing, second edition
