
From PDEs to Information Science and Back



  1. From PDEs to Information Science and Back Russel Caflisch IPAM Mathematics Department, UCLA 2013 SIAM Great Lakes Section

  2. Collaborators & Support • UCLA: Stan Osher, Hayden Schaeffer • Oak Ridge National Labs: Cory Hauck

  3. Themes • Over the last 20 years, there has been a lively influx of ideas and techniques from analysis and PDEs into information science • e.g., image processing • Rapid progress in information science has produced wonderful mathematics • e.g., compressed sensing • Ideas and techniques from information science are starting to be used for PDEs

  4. Information Science From Wikipedia Information science (or information studies) is an interdisciplinary field primarily concerned with the analysis, collection, classification, manipulation, storage, retrieval, movement, and dissemination of information. This talk will focus on the analysis and manipulation of data in the form of images and signals.

  5. Image Denoising

  6. Image Denoising • Removing noise from an image • Example • From “Image Processing and Analysis: Variational, PDE, Wavelet and Stochastic Methods” by T. Chan and J. Shen (2005)

  7. Denoising by Wiener Filter • Noise is random or has rapid oscillations • So it can be canceled by local averaging • Describe the original noisy image as a function u0(x) • x is position and u0 is gray scale • The Wiener filter transforms u0 to a local average: a convolution u = G ∗ u0 with a smoothing kernel G

  8. Wiener Filter as a PDE • The Wiener filter transforms u0 to u(x, t) = ∫ G(x − y, t) u0(y) dy • G(x, t) = (4πt)⁻¹ exp(−|x|²/4t) is the fundamental solution for the heat equation, i.e., Gt = ΔG • Therefore ut = Δu, in which u(x, 0) = u0(x)
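
A minimal sketch of this heat-equation smoothing, assuming a grayscale image stored as a 2D NumPy array. Convolution with the heat kernel G(x, t) equals a Gaussian blur with standard deviation σ = √(2t); scipy's gaussian_filter stands in for the convolution, and the time t below is an illustrative choice.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def heat_denoise(u0, t=2.0):
        """Smooth u0 by convolving with the heat kernel at time t,
        i.e., a Gaussian blur with sigma = sqrt(2t)."""
        return gaussian_filter(u0, sigma=np.sqrt(2.0 * t))

    # Usage: denoise a synthetic noisy image
    rng = np.random.default_rng(0)
    clean = np.zeros((64, 64)); clean[16:48, 16:48] = 1.0  # bright square
    noisy = clean + 0.2 * rng.standard_normal(clean.shape)
    smoothed = heat_denoise(noisy, t=2.0)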

  9. Denoising by Rudin-Osher PDE • Variational principle for noise removal • L. Rudin, S. Osher and E. Fatemi (1992) • For noisy image u0, the denoised image u minimizes E(u) = ∫ |∇u| dx + (λ/2) ∫ (u − u0)² dx • Gradient descent is the nonlinear parabolic PDE ut = ∇·(∇u/|∇u|) − λ(u − u0)
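
A minimal sketch of this gradient-descent iteration on a periodic grid, assuming the noisy image u0 is a 2D NumPy array. The step size dt, fidelity weight lam, and the small eps that regularizes |∇u| in flat regions are illustrative choices, not the authors' settings.

    import numpy as np

    def rof_denoise(u0, lam=0.1, dt=0.1, eps=1e-6, steps=200):
        """Gradient descent for u_t = div(grad u / |grad u|) - lam*(u - u0)."""
        u = u0.copy()
        for _ in range(steps):
            # forward differences with periodic boundaries
            ux = np.roll(u, -1, axis=1) - u
            uy = np.roll(u, -1, axis=0) - u
            mag = np.sqrt(ux**2 + uy**2 + eps)   # regularized |grad u|
            px, py = ux / mag, uy / mag
            # backward-difference divergence of the scaled gradient field
            div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
            u = u + dt * (div - lam * (u - u0))
        return u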

  10. Significance of Rudin-Osher • The Rudin-Osher variational principle • λ is a Lagrange multiplier • u minimizes ∫ |∇u| dx, for a constant value of ∫ (u − u0)² dx • ∫ |∇u| dx measures the total variation (TV) of u • TV used for nonlinear hyperbolic PDEs • Promotes steep gradients, as in shock waves and edges • Edges are the dominant feature of images

  11. Comparison of Rudin-Osher to Wiener • Rudin-Osher variational principle uses ∫ |∇u| dx • The L² alternative ∫ |∇u|² dx leads to the heat equation (with lower-order terms), almost the same as the PDE for Wiener filtering • Rather than promoting edges like Rudin-Osher, Wiener filtering smooths gradients

  12. Results for Rudin-Osher vs. Wiener Rudin, Osher, Fatemi (1992)

  13. Extensions of the Variational Approach to Imaging Applications • Segmentation • Inpainting • Texture

  14. Image Segmentation • Find boundaries Γ of objects in image region Ω • Active contour model: given image u, Γ minimizes E(Γ) = |Γ| + λ ∫Ω (u − c)² dx • c = average of u inside each component of Γ • T. Chan and L. Vese (2001) • Earlier variational principle of Mumford-Shah
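
A minimal usage sketch, assuming scikit-image is available; its skimage.segmentation.chan_vese routine implements a level-set version of this active contour model. The weights below (mu for the length of Γ, lambda1/lambda2 for the inside/outside fidelity) are illustrative values.

    import numpy as np
    from skimage.segmentation import chan_vese

    # Synthetic image: a bright disk on a dark background, plus noise
    yy, xx = np.mgrid[0:128, 0:128]
    image = ((xx - 64)**2 + (yy - 64)**2 < 30**2).astype(float)
    image += 0.2 * np.random.default_rng(0).standard_normal(image.shape)

    # Returns a boolean segmentation: True inside the recovered contour
    seg = chan_vese(image, mu=0.25, lambda1=1.0, lambda2=1.0,
                    max_num_iter=200)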

  15. Image segmentation by Active Contour Model Chan, Vese (2001)

  16. Image Inpainting • Extend image to a region where information is missing • TV inpainting model: given image u0 and region D, the inpainted image u minimizes ∫Ω |∇u| dx + (λ/2) ∫Ω\D (u − u0)² dx • Information in D is found by continuing in from the boundary Γ • T. Chan and J. Shen (2002)
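
A minimal sketch, reusing the same gradient-descent discretization as the denoising example above; the fidelity term is simply switched off inside the missing region D (where mask is True), so total variation alone continues the image in from the boundary. All parameters are illustrative.

    import numpy as np

    def tv_inpaint(u0, mask, lam=1.0, dt=0.1, eps=1e-6, steps=500):
        """Minimize TV, with fidelity to u0 enforced only outside D.

        mask: boolean array, True inside the hole D (missing data).
        """
        u = u0.copy()
        fidelity = lam * (~mask)              # weight 0 inside D, lam outside
        for _ in range(steps):
            ux = np.roll(u, -1, axis=1) - u
            uy = np.roll(u, -1, axis=0) - u
            mag = np.sqrt(ux**2 + uy**2 + eps)
            px, py = ux / mag, uy / mag
            div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
            u = u + dt * (div - fidelity * (u - u0))
        return u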

  17. TV Inpainting Chan, Shen (2005)

  18. Texture • Texture is regular, oscillatory features in an image • Y. Meyer (2001): texture should average to 0, so that it belongs in the dual of BV • Image model: u0 = u + v + w • with u = regular component, including contours; v = oscillatory texture component; w = oscillatory, unattractive noise component • Variational principle: for image u0, choose u, g to minimize ∫ |∇u| dx + λ ∫ (u0 − u − div g)² dx + μ ∫ |g| dx, with texture v = div g

  19. Example of Texture Bertalmio, Vese, Sapiro, Osher (2003)

  20. Inpainting of Texture Bertalmio, Vese, Sapiro, Osher (2003)

  21. New Methods in Information Science • Wavelets • Sparsity and Compressed Sensing

  22. Wavelets • Wavelets • An orthonormal basis • Indexed by position and scale (~ wavenumber) • Based on translation and scaling of a single function • Easy forward and inverse transforms • Localized in both x and k • Invention • Wavelet transform developed: Morlet 1981 • Nontrivial wavelet basis: Yves Meyer 1986 • Compact and smooth wavelets: Daubechies 1988
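
A minimal sketch of the forward and inverse transforms, assuming the PyWavelets package; 'db4' is one of Daubechies' compactly supported wavelets mentioned above, and the signal is an illustrative example localized in both position and wavenumber.

    import numpy as np
    import pywt

    # A pulse localized in both x and k
    t = np.linspace(0, 1, 1024)
    signal = np.sin(40 * np.pi * t) * np.exp(-100 * (t - 0.5)**2)

    # Forward transform: one coarse block plus detail coefficients,
    # indexed by scale (list level) and position (array index)
    coeffs = pywt.wavedec(signal, 'db4', level=5)

    # Inverse transform reconstructs the signal up to round-off
    recon = pywt.waverec(coeffs, 'db4')
    print(np.allclose(signal, recon[:signal.size]))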

  23. Sparsity • Sparsity in datasets (e.g., sensor signals) • Signal x in Rᴺ which is “m-sparse”, with ‖x‖₀ ≤ m • i.e., x has at most m non-zero components • n measurements of x correspond to f = Ax, with A an n × N matrix • Objectives • How many measurements are required? What is the value of n? • How hard is it to compute x? Tractable or intractable?

  24. Compressed Sensing • Compressed sensing 2006 • David Donoho • Emmanuel Candes, Justin Romberg & Terry Tao

  25. Compressed Sensing • Problem statement • Find x that is m-sparse and solves Ax = f • Assuming that an m-sparse solution exists • Standard methods: min ‖x‖₀ subject to constraint Ax = f • note ‖x‖₀ = #(nonzero components of x) • Compressed sensing: min ‖x‖₁ subject to constraint Ax = f • note ‖x‖₁ = Σ |xᵢ|
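
A minimal sketch of the ℓ¹ problem as a linear program, assuming SciPy. Splitting x = x⁺ − x⁻ with x⁺, x⁻ ≥ 0 turns min ‖x‖₁ subject to Ax = f into a standard-form LP; the sizes and sparse vector below are illustrative.

    import numpy as np
    from scipy.optimize import linprog

    def basis_pursuit(A, f):
        """min ||x||_1 s.t. Ax = f, via the split x = xp - xm, xp, xm >= 0."""
        n, N = A.shape
        c = np.ones(2 * N)                 # objective: sum(xp) + sum(xm)
        A_eq = np.hstack([A, -A])          # A @ xp - A @ xm = f
        res = linprog(c, A_eq=A_eq, b_eq=f, bounds=(0, None))
        return res.x[:N] - res.x[N:]

    # Recover a 2-sparse x in R^40 from only 12 random measurements
    rng = np.random.default_rng(1)
    A = rng.standard_normal((12, 40))
    x_true = np.zeros(40); x_true[[5, 17]] = [1.5, -2.0]
    x_hat = basis_pursuit(A, A @ x_true)
    print(np.max(np.abs(x_hat - x_true)))  # near 0 with high probability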

  26. How many measurements are required? • For m &lt;&lt; N, find the m-sparse solution of Ax = f, with A an n × N matrix • Standard methods require: n = N • #(equations) = #(unknowns) • Compressed sensing: n = m (log N) • n &lt;&lt; N. Many fewer equations than unknowns! • Solution is exact with high probability! • Restricted isometry property (RIP) • convex programming

  27. How hard is it to compute x? • Standard methods (min ‖x‖₀): NP-hard = intractable • Compressed sensing (min ‖x‖₁): tractable and fast • convex programming

  28. Why Does L1 Promote Sparsity? • Compressed sensing: min ‖x‖₁ subject to constraint Ax = f • Two simplified problems: Find x in R² solving • 1. min ‖x‖₁ subject to a linear constraint a·x = f (a line in R²) • 2. min λ‖x‖₁ + ½|x − y|² for given y in R²

  29. Version 1: Geometric solution • Find the x on the line with smallest ‖x‖₁ • For all but 45° lines, the L¹ norm is smallest at a vertex of the L¹ ball • Vertices are sparse points, since a component is 0 • Works in higher dimensions

  30. Version 2: Analytic Solution • Given y, find x that minimizes λ‖x‖₁ + ½|x − y|² • The minimum x has each component xᵢ minimizing λ|xᵢ| + ½(xᵢ − yᵢ)² • Exercise: Show that the minimum is xᵢ = Sλ(yᵢ) = sign(yᵢ) max(|yᵢ| − λ, 0) • The operator Sλ is “soft-thresholding”
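
A minimal sketch of the soft-thresholding operator, with a brute-force numerical check of the one-component minimization above; all names and values are illustrative.

    import numpy as np

    def soft_threshold(y, lam):
        """S_lam(y) = sign(y) * max(|y| - lam, 0), componentwise."""
        return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

    # Check against direct minimization of lam*|x| + 0.5*(x - y)^2
    lam, y = 0.7, 1.2
    xs = np.linspace(-3, 3, 600001)
    x_brute = xs[np.argmin(lam * np.abs(xs) + 0.5 * (xs - y)**2)]
    print(soft_threshold(y, lam), x_brute)   # both approximately 0.5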

  31. Soft Thresholding [plot of x = Sλ(y) against y: zero for |y| ≤ λ, slope-1 line shifted toward 0 by λ outside]

  32. Applications of Information Science to PDEs • Wavelets for turbulence • Sparsity for PDEs

  33. Wavelets for Turbulence • Turbulent solutions of the incompressible Navier-Stokes equations • Marie Farge and co-workers • Transformed velocity into wavelet basis • Deleted wavelet components with small coefficients to get “coherent part” • In a 2D 256² computation, 0.7% of wavelet coefficients retain 99.2% of the energy and 94% of the enstrophy. Farge, et al. 1999 • In a 3D 256³ computation, 3% of wavelet coefficients retain 99% of the energy and 75% of the enstrophy. Okamoto, et al. 2007.
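
A minimal sketch of this coherent/incoherent splitting, assuming PyWavelets and a 2D field w. Keeping the largest 1% of coefficients is an illustrative fraction; Farge's actual threshold is adapted to the noise level rather than fixed.

    import numpy as np
    import pywt

    def coherent_part(w, keep=0.01, wavelet='db4'):
        """Zero all but the largest `keep` fraction of wavelet coefficients."""
        coeffs = pywt.wavedec2(w, wavelet)
        arr, slices = pywt.coeffs_to_array(coeffs)
        k = max(1, int(keep * arr.size))
        cutoff = np.sort(np.abs(arr), axis=None)[-k]   # k-th largest magnitude
        arr = np.where(np.abs(arr) >= cutoff, arr, 0.0)
        coeffs = pywt.array_to_coeffs(arr, slices, output_format='wavedec2')
        return pywt.waverec2(coeffs, wavelet)

    # Fraction of energy retained by the coherent part of a sample field
    w = np.random.default_rng(0).standard_normal((256, 256))
    wc = coherent_part(w, keep=0.01)
    print(np.sum(wc**2) / np.sum(w**2))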

  34. Vorticity in 2D Turbulence [panels: total field, coherent part, incoherent part] Farge, Schneider & Kevlahan 1999

  35. Vorticity in 3D Turbulence Okamoto, Yoshimatsu, Schneider, Farge, Kaneda 2007

  36. Sparsity for PDEs • PDE: ut = L(u) • Schaeffer, Osher, Caflisch & Hauck, 2013 • How should soft-thresholding be used? • Discretization in time: uⁿ = u(tⁿ), tⁿ = n Δt; F = Fourier transform • Each step, apply soft-thresholding Sλ to the Fourier coefficients: uⁿ⁺¹ = F⁻¹ Sλ F ũⁿ⁺¹, where ũⁿ⁺¹ is the standard time step from uⁿ • λ = c Δt² • Promotes sparsity • Alternatives to Fourier (e.g., framelets) now being used
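
A minimal sketch of one step of this scheme for the 1D periodic heat equation ut = uxx, assuming NumPy. The explicit-Euler spectral step, the constant c, and the scaling of λ by the FFT size are illustrative choices, not the paper's exact setup.

    import numpy as np

    def soft(z, lam):
        """Complex soft-thresholding: shrink |z| toward 0 by lam, keep phase."""
        mag = np.abs(z)
        return z * np.maximum(mag - lam, 0.0) / np.maximum(mag, 1e-30)

    def sparse_heat_step(u, dt, dx, c=1.0):
        """One step of u_t = u_xx: spectral Euler step, then threshold."""
        k = 2 * np.pi * np.fft.fftfreq(u.size, d=dx)
        u_hat = np.fft.fft(u) * (1 - dt * k**2)   # explicit Euler in Fourier space
        u_hat = soft(u_hat, c * dt**2 * u.size)   # lam = c*dt^2, scaled for FFT
        return np.real(np.fft.ifft(u_hat))

    # Usage: evolve a pulse; most Fourier modes stay exactly zero each step
    N = 256
    x = np.linspace(0, 2 * np.pi, N, endpoint=False)
    u = np.exp(-10 * (x - np.pi)**2)
    for _ in range(100):
        u = sparse_heat_step(u, dt=1e-4, dx=x[1] - x[0])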

  37. Examples for the Sparse PDE Solver • Schaeffer, Osher, Caflisch & Hauck, 2013 • Examples • Convection equation with rapidly varying coefficients • Parabolic equation with rapidly varying coefficients • Viscous Burgers equation with rapidly varying convection term • 2D Navier-Stokes vorticity equation, with rapidly oscillatory forcing f; f is rapidly oscillating in x, constant in t

  38. Sparse Solution of 2D Navier-Stokes for Interacting Vortices with Oscillatory Forcing Schaeffer, Osher, Caflisch & Hauck, 2013

  39. Possible Future Directions • Texture in solutions of PDEs • Proposed model for “incoherent component of vorticity” by Farge • Combination of network-based models, data-driven models and continuum models • Empirical mode decomposition (EMD) • Norden Huang, Tom Hou, Nathan Kutz • Machine learning for many applications • Klaus Muller for materials • Many possibilities!
