
Monte Carlo Integration

Presentation Transcript


  1. Monte Carlo Integration Robert Lin April 20, 2004

  2. Outline • Integration Applications • Random variables, probability, expected value, variance • Integration Approximation • Monte Carlo Integration • Variance Reduction (sampling methods)

  3. Integration Applications • Antialiasing

  4. Integration Applications • Soft Shadows

  5. Integration Applications • Indirect Lighting

  6. Random Variables, Probability Density Function • Continuous random variable x: scalar or vector quantity that randomly takes on a value in (-∞, +∞) • Probability Density Function p associated with x (denoted x ~ p) describes the distribution of x • Properties:
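
The property equations did not survive in this transcript; a standard statement consistent with the definition above is:

    p(x) \ge 0 \quad \text{for all } x, \qquad \int_{-\infty}^{+\infty} p(x)\,dx = 1, \qquad \Pr(a \le x \le b) = \int_a^b p(x)\,dx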

  7. Random Variables, Probability Density Function • Example: Let ε be a random variable taking on values in [0, 1) uniformly • Probability Density Function ε ~ q • The probability that ε takes on a value in [a, b] ⊂ [0, 1) is
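
The formulas for this slide are missing from the transcript; for a uniform pdf on [0, 1) they would read:

    q(x) = 1 \;\text{for } x \in [0, 1), \quad q(x) = 0 \;\text{otherwise}, \qquad \Pr(a \le \varepsilon \le b) = \int_a^b q(x)\,dx = b - a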

  8. Expected Value • The average value of a function f(x) with probability density function (pdf) p(x) is called the expected value • The expected value of a 1D random variable can be calculated by letting f(x) = x • Expected Value Properties: 1. 2.
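
The defining equation and the two numbered properties did not survive the transcript; the standard forms consistent with the text above (the properties being linearity) are:

    E[f(x)] = \int f(x)\,p(x)\,dx, \qquad E[x] = \int x\,p(x)\,dx

    1.\; E[f(x) + g(x)] = E[f(x)] + E[g(x)] \qquad 2.\; E[c\,f(x)] = c\,E[f(x)] \;\text{for a constant } c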

  9. Multidimensionality • Random variables and expected values extend easily to multiple dimensions • Let S represent a multidimensional space with measure μ • Let x be a random variable with pdf p • The probability that x takes on a value in a region Si, a subset of S, is
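
The probability formula is missing from the transcript; in the notation above it is:

    \Pr(x \in S_i) = \int_{S_i} p(x)\,d\mu(x)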

  10. Multidimensionality • Example: Let α be a 2D random variable uniformly distributed on a disk of radius R • p(α) = 1 / (πR²)
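
As a check (not shown in the transcript), this pdf integrates to 1 over the disk, and the probability of landing in a sub-region A is proportional to its area:

    \int_{\text{disk}} p(\alpha)\,dA = \frac{1}{\pi R^2}\,\pi R^2 = 1, \qquad \Pr(\alpha \in A) = \frac{\text{area}(A)}{\pi R^2}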

  11. Multidimensionality • Example • Given a unit square S = [0, 1] × [0, 1] • Given pdf p(x, y) = 4xy • The expected value of the x coordinate is found by setting f(x, y) = x:
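
The computation itself is not in the transcript; carrying it out with the pdf above gives:

    E[x] = \int_0^1 \!\! \int_0^1 x \cdot 4xy \; dx\,dy = 4 \left( \int_0^1 x^2\,dx \right) \left( \int_0^1 y\,dy \right) = 4 \cdot \frac{1}{3} \cdot \frac{1}{2} = \frac{2}{3}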

  12. Variance • The variance of a random variable is defined as the expected value of the square of the difference between x and E(x). • Some algebra lets us convert this to the form:
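
The two forms referred to above are:

    \sigma^2(x) = E\big[(x - E[x])^2\big] = E[x^2] - \big(E[x]\big)^2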

  13. Integration Problems • Integrals for rendering can be difficult to evaluate • Multi-dimensional integrals • Non-continuous functions • Highlights • Occluders

  14. Integration Approximation • How to evaluate integral of f(x)?

  15. Integration Approximation • Can approximate using another function g(x)

  16. Integration Approximation • Can approximate by taking the average value

  17. Integration Approximation • Estimate the average by taking N samples

  18. Monte Carlo Integration • Im = Monte Carlo estimate • N = number of samples • x1, x2, …, xN are uniformly distributed random numbers between a and b
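
The estimator equation is missing from the transcript; given the symbols defined above it is:

    I_m = \frac{b - a}{N} \sum_{i=1}^{N} f(x_i) \;\approx\; \int_a^b f(x)\,dx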

  19. Monte Carlo Integration

  20. Monte Carlo Integration • We have the definition of expected value and how to estimate it with a sum of samples. • Since the expected value is itself an integral, that integral is approximated by the same sum. • To evaluate a general integral of g(x), write g(x) = f(x)p(x), i.e. f(x) = g(x)/p(x), so the integral becomes an expected value under p.
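
A minimal Python sketch of this estimator (the function names and the test integrand are illustrative assumptions, not from the slides): draw samples x_i from a pdf p and average g(x_i)/p(x_i); with a uniform pdf this reduces to the basic estimator on slide 18.

    import random

    def mc_integrate(g, sample, pdf, n):
        """Estimate the integral of g by averaging g(x)/p(x) over n samples x ~ p."""
        total = 0.0
        for _ in range(n):
            x = sample()              # draw x_i from the pdf p
            total += g(x) / pdf(x)
        return total / n

    # Example: integrate g(x) = x^2 over [0, 1] with a uniform pdf (exact value 1/3).
    a, b = 0.0, 1.0
    estimate = mc_integrate(
        g=lambda x: x * x,
        sample=lambda: random.uniform(a, b),
        pdf=lambda x: 1.0 / (b - a),
        n=100_000,
    )
    print(estimate)   # close to 1/3 for large n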

  21. Variance • The variance describes how much the sampled values vary from each other. • Variance proportional to 1/N

  22. Variance • Standard Deviation is just the square root of the variance • Standard Deviation proportional to 1 / sqrt(N) • Need 4X samples to halve the error
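
For the basic estimator over [a, b] with N independent uniform samples, the standard result behind these statements is:

    \mathrm{Var}(I_m) = \frac{(b - a)^2}{N}\,\mathrm{Var}\big(f(x)\big), \qquad \sigma(I_m) = \frac{(b - a)\,\sigma\big(f(x)\big)}{\sqrt{N}}

so quadrupling N halves the standard deviation (the error).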

  23. Variance • Problem: • Variance (noise) decreases slowly • Using more samples only removes a small amount of noise

  24. Variance Reduction • There are several ways to reduce the variance • Importance Sampling • Stratified Sampling • Quasi-random Sampling • Metropolis Random Mutations

  25. Importance Sampling • Idea: use more samples in important regions of the function • If the function is large over only a small part of the domain, concentrate samples there

  26. Importance Sampling • Want g/p to have low variance • Choose a pdf p with a shape similar to g, so that g/p is nearly constant:
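
A small Python sketch of the idea (the pdf p(x) = 2x and the integrand are illustrative choices, not from the slides): both estimators below target the integral of x² over [0, 1], which is 1/3, but sampling from p(x) = 2x, which roughly follows the integrand, makes g/p = x/2 and gives a much lower-variance estimate than uniform sampling.

    import math
    import random

    def importance_estimate(g, sample_p, pdf_p, n):
        """Average g(x)/p(x) over n samples x ~ p; variance is low when p has a shape similar to g."""
        return sum(g(x) / pdf_p(x) for x in (sample_p() for _ in range(n))) / n

    g = lambda x: x * x   # integrate g over [0, 1]; exact value is 1/3

    # Uniform sampling: p(x) = 1 on [0, 1).
    uniform = importance_estimate(g, random.random, lambda x: 1.0, 100_000)

    # Importance sampling with p(x) = 2x (cdf x^2, inverted via a square root);
    # 1 - random.random() keeps x > 0 so p(x) never vanishes.
    weighted = importance_estimate(
        g,
        sample_p=lambda: math.sqrt(1.0 - random.random()),
        pdf_p=lambda x: 2.0 * x,
        n=100_000,
    )
    print(uniform, weighted)   # both near 1/3; the second fluctuates far less across runs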

  27. Stratified Sampling • Partition S into smaller domains Si • Evaluate integral as sum of integrals over Si • Example: jittering for pixel sampling • Often works much better than importance sampling in practice
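
A short Python sketch of jittered sampling on [0, 1] (illustrative, not taken from the slides): split the domain into N equal strata and draw one uniform sample inside each, which typically gives a far less noisy estimate than N independent uniform samples.

    import random

    def stratified_estimate(f, n):
        """Jittered sampling: one uniform sample in each of n equal strata of [0, 1]."""
        total = 0.0
        for i in range(n):
            x = (i + random.random()) / n    # random point inside stratum [i/n, (i+1)/n)
            total += f(x)
        return total / n

    print(stratified_estimate(lambda x: x * x, 1_000))   # near 1/3, the exact value of the integral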

  28. Examples

  29. Examples

  30. Conclusion • Monte Carlo Integration Pros • Good to estimate integrals with many dimensions • Good to estimate integrals with complex functions • General integration method with many applications • Monte Carlo Integration Cons • Variance reduces slowly (error appears as noise) • Reduce variance with importance sampling, stratified sampling, etc. • Can use other methods (filtering) to remove noise

  31. References • Peter Shirley, R. Keith Morley. Realistic Ray Tracing, Natick, MA: A K Peters, Ltd., 2003, pages 47-51, 145-154. • Henrik Wann Jensen. Realistic Image Synthesis Using Photon Mapping, Natick, MA: A K Peters, Ltd., 2001, pages 153-155. • Pat Hanrahan. Monte Carlo Integration 1 (Lecture Notes): http://graphics.stanford.edu/courses/cs348b-02/lectures/lecture6 • Thomas Funkhouser. Monte Carlo Integration For Image Synthesis: http://www.cs.princeton.edu/courses/archive/fall02/cs526/lectures/montecarlo.pdf • Eric Veach. Robust Monte Carlo Methods for Light Transport Simulation. Ph.D. Thesis, Stanford University, Dec 1997.
