
ASAC: Automatic Sensitivity Analysis for Approximate Computing

Pooja Roy, Rajarshi Ray, Chundong Wang, Weng Fai Wong. National University of Singapore. LCTES 2014.


Presentation Transcript


  1. ASAC: Automatic Sensitivity Analysis for Approximate Computing. Pooja Roy, Rajarshi Ray, Chundong Wang, Weng Fai Wong. National University of Singapore. LCTES 2014

  2. Why Approximate?

  3. Why Approximate? Quality of Service (QoS) spans from high QoS down to relaxed accuracy; outputs that stay within this QoS band are of acceptable QoS, so accuracy can be traded for efficiency.

  4. Exploring Previous Works
     • Algorithm: Ansel et al. (CGO'11)
     • Programming (API): Carbin et al. (OOPSLA'13), Sampson et al. (PLDI'11)
     • Compilation: Carbin et al. (ISSTA'10), Misailovic et al. (TECS'13), Zhu et al. (POPL'12), Baek et al. (PLDI'10), Sidiroglou-Douskos et al. (FSE'11)
     • Architecture: Esmaeilzadeh et al. (ASPLOS'12), Chippa et al. (DAC'10), Hoffman et al. (ASPLOS'11), Sampson et al. (MICRO'13)
     • Circuit: Gupta et al. (ISLPED'11), Venkataramani et al. (MICRO'13), Kahng et al. (DAC'12)


  6. Approximation-based Programming Paradigm. The new programming paradigm requires explicit classification of program data (variables, methods, etc.): code is split into approximable and non-approximable data, backed by a compilation framework that supports approximation.

  7. Need of Automation. Today, the original code must be rewritten using new language constructs — via programmer annotations or by providing multiple versions — before the compilation framework can separate approximable from non-approximable data.

  8. Need of Automation. Manual annotation does not scale: rewrite 'binutils' from scratch? Expect app developers to provide many versions? Recompile and test 'Picasa' or 'VLC' against multiple QoS requirements? Annotate entire Android/iOS kernels?

  9. Our Approach: ASAC
     • Automatic sensitivity analysis
     • Statistical perturbation-based framework
     • Scalable
     • Specifically considers internal program data for approximation

  10. Key Idea. Perturb each variable in the code and compare the perturbed output against the acceptable QoS. Sensitivity measures a particular variable's contribution towards the output. Variables are ranked by sensitivity: low-ranked variables can be approximated, while higher-ranked variables are critical.

  11. Key Idea. Two questions remain: How to systematically perturb the variables? How to translate the perturbed output into a sensitivity ranking?

  12. Hyperbox Sampling. Running example:

     double sum() {
         int i;
         double a = 0.1, sum = 0.0;
         for (i = 0; i < 10; i++) {
             sum += a / 10;
         }
         return sum;
     }

     Step 1: create a hyperbox with the value range of each variable (one dimension each for i, a, and sum).
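The hyperbox can be represented as one value interval per program variable. A minimal C sketch follows; the names (`Dim`, `box`, `dim_width`) and the concrete ranges are illustrative assumptions, not taken from the paper:

```c
#include <assert.h>

/* One dimension of the hyperbox: the value range of one variable. */
typedef struct {
    const char *name;
    double lo, hi;
} Dim;

/* Hyperbox for the sum() example: one dimension per variable.
 * The ranges below are assumed for illustration. */
#define NDIMS 3
static const Dim box[NDIMS] = {
    { "i",   0.0, 10.0 },  /* loop counter runs over 0..9 */
    { "a",   0.0,  1.0 },  /* assumed range for a          */
    { "sum", 0.0,  1.0 },  /* assumed range for sum        */
};

/* Width of a dimension's range. */
double dim_width(const Dim *d) { return d->hi - d->lo; }
```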

  13. Hyperbox Sampling (same example). Step 2: discretize each dimension by a constant 'k', splitting every variable's range into k cells.

  14. Hyperbox Sampling (same example). Step 3: choose sample points from the discretized hyperbox based on Latin hypercube sampling, so that each of the k cells of every dimension is sampled exactly once.
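Steps 2 and 3 can be sketched generically in C: split each dimension into k equal cells, randomly permute the cell indices per dimension, and let the j-th sample take a value from the j-th cell of each permutation. This is a textbook Latin hypercube sketch under those assumptions, not the paper's implementation; all function names are mine:

```c
#include <stdlib.h>

/* Fisher-Yates shuffle of the cell indices 0..k-1. */
static void shuffle(int *p, int k) {
    for (int i = k - 1; i > 0; i--) {
        int j = rand() % (i + 1);
        int t = p[i]; p[i] = p[j]; p[j] = t;
    }
}

/* Fill samples[j*ndims + d] with k Latin-hypercube samples over ndims
 * dimensions; dimension d has range [lo[d], hi[d]] split into k cells. */
void latin_hypercube(int ndims, int k,
                     const double *lo, const double *hi,
                     double *samples) {
    int *perm = malloc(k * sizeof *perm);
    for (int d = 0; d < ndims; d++) {
        for (int c = 0; c < k; c++) perm[c] = c;
        shuffle(perm, k);
        double cell = (hi[d] - lo[d]) / k;
        for (int j = 0; j < k; j++) {
            /* midpoint of the chosen cell; a random offset inside
             * the cell would work equally well */
            samples[j * ndims + d] = lo[d] + (perm[j] + 0.5) * cell;
        }
    }
    free(perm);
}
```

Each dimension's k cells are each hit exactly once across the k samples, which is the defining property of a Latin hypercube design.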

  15. Hyperbox Sampling (same example). Step 4: controlled perturbation — run the program with a chosen sample point (e.g. the sampled values 0.2, 3, and 0.7 for the three variables) in place of the original values.
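Controlled perturbation can be sketched by re-running the program with one variable's value replaced by a sampled point and recording the output. The `sum_perturbed` wrapper below is an illustrative restructuring of the slide's example, not the paper's mechanism:

```c
/* The slide's example, with the perturbed variable 'a'
 * exposed as a parameter instead of a local constant. */
double sum_perturbed(double a) {
    double sum = 0.0;
    for (int i = 0; i < 10; i++)
        sum += a / 10;
    return sum;
}

/* Reference output uses the original value a = 0.1. */
double reference(void) { return sum_perturbed(0.1); }
```

Calling `sum_perturbed(0.2)` instead of `reference()` then yields one perturbed output to check against the QoS band.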

  16. Perturbed Outputs. Not trivial!
     • Rule 1
     • For a program with 'n' variables, discretization constant 'k', and 'm' randomly chosen points, the number of perturbed outputs is m × Π_{i=0}^{n−1} (k − i).
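Rule 1 can be evaluated directly; a small helper (the function name is mine):

```c
/* Rule 1: number of perturbed outputs = m * product over
 * i = 0 .. n-1 of (k - i), for n variables, discretization
 * constant k, and m randomly chosen points. */
long num_perturbed_outputs(long n, long k, long m) {
    long prod = 1;
    for (long i = 0; i < n; i++)
        prod *= (k - i);
    return m * prod;
}
```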

  17. Key Idea (recap). How to systematically perturb the variables? ✓ How to translate the perturbed output into a sensitivity ranking?

  18. Perturbed Outputs. Build a cumulative distribution function (CDF) for each variable, separating 'good' samples (within the QoS band) from 'bad' samples (falling outside the QoS band).

  19. Hypothesis Testing
     • The Kolmogorov-Smirnov test calculates the maximum distance between the two curves
     • Rule 2
     • The maximum distance between the curves is the sensitivity score for the variable. The higher the score, the more the variable contributes towards the program output.
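The two-sample Kolmogorov-Smirnov statistic is the maximum vertical distance between the two empirical CDFs. A generic sketch over pre-sorted samples (the function name is mine; this is the standard KS statistic, not code from the paper):

```c
#include <math.h>
#include <stddef.h>

/* Maximum distance between the empirical CDFs of two
 * samples a[0..na-1] and b[0..nb-1], each sorted ascending. */
double ks_statistic(const double *a, size_t na,
                    const double *b, size_t nb) {
    size_t i = 0, j = 0;
    double d = 0.0;
    while (i < na && j < nb) {
        /* advance past all occurrences of the smaller value,
         * in both samples, so ties are handled consistently */
        double x = (a[i] <= b[j]) ? a[i] : b[j];
        while (i < na && a[i] == x) i++;
        while (j < nb && b[j] == x) j++;
        double diff = fabs((double)i / na - (double)j / nb);
        if (diff > d) d = diff;
    }
    return d;
}
```

Identical samples give a score of 0 (curves coincide); fully separated samples give 1 (one CDF reaches 1 before the other leaves 0).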

  20. Approximable vs. Critical
     • A variable with sensitivity score > 0.5 is critical
     • For evaluation
     • Mild error injection: 1/3 (or 1/2) of approximable variables
     • Medium error injection: 1/6 of approximable variables
     • Aggressive error injection: all approximable variables
     • Programs: SciMark2, MiBench (JPEG), SPEC2006 (464.h264ref)
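Classification by score then reduces to a threshold check, using the 0.5 cutoff stated on the slide (the helper name is mine):

```c
#include <stdbool.h>

/* Cutoff from the slide: scores above 0.5 mark critical variables. */
#define CRITICAL_THRESHOLD 0.5

/* A variable with KS sensitivity score above the threshold is
 * critical; otherwise it is a candidate for approximation. */
bool is_critical(double sensitivity_score) {
    return sensitivity_score > CRITICAL_THRESHOLD;
}
```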

  21. ASAC Correctness

  22. ASAC Correctness (*as compared to the 'manually annotated baseline', EnerJ, PLDI'11)

  23. ASAC: JPEG [figure: input image alongside Encode and Decode outputs under mild and aggressive error injection]

  24. ASAC: H264

  25. ASAC Runtime

  26. ASAC Sanity Check
     • JPEG: Encode and Decode with errors injected into variables marked as 'non-approximable'
     • H264: application crash

  27. Concluding
     • ASAC
     • Automatic classification of approximable and non-approximable data
     • Scalable
     • No profiling
     • Can be applied to programs without available source code
     • Approximation
     • Saves energy without performance loss

  28. Thank you
