Advanced Computer Graphics: Antialiasing
David Luebke
cs551dl@cs.virginia.edu
http://www.cs.virginia.edu/~cs551dl
Administrivia
• Assignment 1 sample scenes
Recap
• Prefiltering
  • Before sampling the image, apply a low-pass filter to eliminate frequencies above the Nyquist limit (sketch below)
  • This blurs the image…
  • But it ensures that no high frequencies will be misrepresented as low frequencies
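A minimal sketch of prefiltering, assuming scipy’s gaussian_filter as a stand-in for an ideal low-pass filter; the checkerboard test image and the sigma heuristic are illustrative choices, not part of the lecture.

    # Blur away frequencies above the new Nyquist limit, then sample.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def prefilter_and_sample(image, factor):
        """Low-pass the image, then keep every factor-th sample."""
        blurred = gaussian_filter(image, sigma=factor / 2.0)  # rough cutoff heuristic
        return blurred[::factor, ::factor]

    # A fine checkerboard would alias badly if sampled directly at 1/8 resolution;
    # prefiltering turns it into smooth gray rather than false low-frequency bands.
    x, y = np.meshgrid(np.arange(512), np.arange(512))
    checker = ((x // 4 + y // 4) % 2).astype(float)
    small = prefilter_and_sample(checker, 8)   # 64 x 64 result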
Recap
• Supersampling
  • Sample the image at a higher resolution than the final image, then “average down” (sketch below)
  • “Average down” means multiplying by a low-pass function in the frequency domain
  • Which means convolving by that function’s FT in the space domain
  • Which equates to a weighted average of nearby samples at each pixel
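A minimal sketch of the “average down” step, assuming a plain box filter over n x n blocks of supersamples; the 4x buffer mentioned in the comment is hypothetical.

    # Average every n x n block of supersamples into one output pixel (box filter).
    import numpy as np

    def downsample(samples, n):
        h, w = samples.shape[0] // n, samples.shape[1] // n
        blocks = samples[:h * n, :w * n].reshape(h, n, w, n)
        return blocks.mean(axis=(1, 3))

    # e.g. if a renderer produced a 4x supersampled 2048 x 2048 buffer,
    # downsample(buffer, 4) yields the final 512 x 512 antialiased image.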
Recap
• Supersampling cons
  • Doesn’t eliminate aliasing, just shifts the Nyquist limit higher
  • Can’t fix some scenes (e.g., the checkerboard)
  • Badly inflates storage requirements
• Supersampling pros
  • Relatively easy
  • Often works all right in practice
  • Can be added to a standard renderer
Antialiasing in the Continuous Domain
• Problem with prefiltering:
  • Sampling and image generation are inextricably linked in most renderers
    • Z-buffer algorithm
    • Ray tracing
    • Why?
• Still, some approaches try to approximate the effect of convolution in the continuous domain
Antialiasing in the Continuous Domain
• [Figure: polygons overlapping a pixel grid, with a filter kernel centered on one pixel]
Antialiasing in the Continuous Domain
• The good news
  • Exact polygon coverage of the filter kernel can be evaluated (sketch below)
  • What does this entail?
    • Clipping
    • Hidden surface determination
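One way to picture the coverage computation: clip each polygon to the pixel square and measure the clipped area. The sketch below assumes Sutherland-Hodgman clipping and the shoelace formula, and it leaves out hidden-surface determination entirely.

    # Clip a polygon (list of (x, y) vertices) to the axis-aligned pixel
    # [x0, x1] x [y0, y1], then measure the covered area.
    def clip_to_pixel(poly, x0, y0, x1, y1):
        def clip_edge(pts, inside, intersect):
            out = []
            for i, p in enumerate(pts):
                q = pts[(i + 1) % len(pts)]
                if inside(p):
                    out.append(p)
                    if not inside(q):
                        out.append(intersect(p, q))
                elif inside(q):
                    out.append(intersect(p, q))
            return out

        def x_cross(p, q, x):   # intersection with a vertical clip line
            t = (x - p[0]) / (q[0] - p[0])
            return (x, p[1] + t * (q[1] - p[1]))

        def y_cross(p, q, y):   # intersection with a horizontal clip line
            t = (y - p[1]) / (q[1] - p[1])
            return (p[0] + t * (q[0] - p[0]), y)

        poly = clip_edge(poly, lambda p: p[0] >= x0, lambda p, q: x_cross(p, q, x0))
        poly = clip_edge(poly, lambda p: p[0] <= x1, lambda p, q: x_cross(p, q, x1))
        poly = clip_edge(poly, lambda p: p[1] >= y0, lambda p, q: y_cross(p, q, y0))
        poly = clip_edge(poly, lambda p: p[1] <= y1, lambda p, q: y_cross(p, q, y1))
        return poly

    def area(poly):             # shoelace formula; coverage = area / pixel area
        if len(poly) < 3:
            return 0.0
        return 0.5 * abs(sum(p[0] * q[1] - q[0] * p[1]
                             for p, q in zip(poly, poly[1:] + poly[:1])))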
Antialiasing in the Continuous Domain
• The bad news
  • Evaluating coverage is very expensive
  • The intensity variation is too complex to integrate over the area of the filter
    • Q: Why does intensity make it harder?
    • A: Because polygons might not be flat-shaded
    • Q: How bad a problem is this?
    • A: Intensity varies slowly within a pixel, so shape changes are more important
Catmull’s Algorithm
• Find fragment areas
• Multiply by fragment colors
• Sum for final pixel color (sketch below)
• [Figure: a pixel broken into polygon fragments with areas A1, A2, A3 and uncovered background area AB]
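A minimal sketch of the per-pixel sum, assuming the visible fragments (area fractions plus colors) have already been produced by clipping and depth sorting; the fragments list and the background argument are hypothetical.

    # Pixel color = sum over visible fragments of (area fraction x color),
    # with any uncovered area filled by the background color.
    def catmull_pixel_color(fragments, background=(0.0, 0.0, 0.0)):
        covered = sum(a for a, _ in fragments)
        uncovered = max(0.0, 1.0 - covered)
        return tuple(sum(a * c[i] for a, c in fragments) + uncovered * background[i]
                     for i in range(3))

    # e.g. a red fragment covering 30% and a green one covering 20% of the pixel:
    color = catmull_pixel_color([(0.3, (1, 0, 0)), (0.2, (0, 1, 0))])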
Catmull’s Algorithm
• First real attempt to filter in the continuous domain
• Very expensive
  • Clipping polygons to fragments
  • Sorting polygon fragments by depth (what’s wrong with this as a hidden-surface algorithm?)
• Equates to a box filter (is that good?)
The A-Buffer
• Idea: approximate continuous filtering by subpixel sampling
• Summing areas now becomes simple
The A-Buffer
• Advantages:
  • Incorporating it into a scanline renderer reduces storage costs dramatically
  • Processing per pixel depends only on the number of visible fragments
  • Can be implemented efficiently using bitwise logical ops on subpixel masks (sketch below)
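A minimal sketch of the subpixel-mask idea, assuming a 4 x 8 coverage mask packed into a 32-bit integer and fragments already sorted front to back; the mask layout and resolve loop are illustrative, not the full A-buffer algorithm.

    # Coverage and overlap reduce to bitwise ops plus a popcount.
    def coverage_fraction(mask):
        """Fraction of the 32 subpixel cells set in this mask."""
        return bin(mask & 0xFFFFFFFF).count("1") / 32.0

    def resolve_front_to_back(fragments):
        """fragments: list of (mask, rgb) pairs, nearest first. Each fragment
        contributes only in subpixels not already covered by nearer fragments."""
        covered = 0
        color = (0.0, 0.0, 0.0)
        for mask, rgb in fragments:
            visible = mask & ~covered            # subpixels still unclaimed
            w = coverage_fraction(visible)
            color = tuple(c + w * f for c, f in zip(color, rgb))
            covered |= mask
        return color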
The A-Buffer
• Disadvantages
  • Still basically a supersampling algorithm
  • Not a hardware-friendly algorithm
    • Lists of potentially visible polygons can grow without limit
    • Work per pixel is non-deterministic
The A-Buffer
• Comments
  • The book claims this is the most common algorithm for high-quality rendering
    • I’m not so sure anymore
  • The book gives much gory detail
    • I won’t test you on it
Stochastic Sampling
• Sampling theory tells us that with a regular sampling grid, frequencies higher than the Nyquist limit will alias
• Q: What about irregular sampling?
• A: High frequencies appear as noise, not aliases
  • This turns out to bother our visual system less!
Stochastic Sampling
• An intuitive argument:
  • In stochastic sampling, every region of the image has a finite probability of being sampled
  • Thus small features that would fall between uniform sample points tend to be detected by non-uniform samples
Stochastic Sampling
• Integrating with different renderers:
  • Ray tracing:
    • It is just as easy to fire a ray in one direction as in another (sketch below)
  • Z-buffer: hard, but possible
    • Notable example: the REYES system (?)
    • Image jittering is easier (more later)
  • A-buffer: nope
    • Totally built around a square pixel filter and primitive-to-sample coherence
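A minimal sketch of stochastic sampling in a ray tracer, assuming a hypothetical trace(x, y) that fires a ray through image-plane position (x, y) and returns an (r, g, b) radiance triple.

    import random

    def render_pixel(trace, px, py, samples_per_pixel=16):
        """Fire rays through random positions inside pixel (px, py) and average."""
        total = (0.0, 0.0, 0.0)
        for _ in range(samples_per_pixel):
            # a ray through a random point in the pixel costs no more than
            # a ray through the pixel center
            x = px + random.random()
            y = py + random.random()
            total = tuple(t + c for t, c in zip(total, trace(x, y)))
        return tuple(t / samples_per_pixel for t in total)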
Stochastic Sampling
• Idea: randomizing the distribution of samples scatters aliases into noise
• Problem: what type of random distribution to adopt?
  • Reason: the type of randomness used affects the spectral characteristics of the noise into which high frequencies are converted
Stochastic Sampling
• Problem: given a pixel, how to distribute points (samples) within it?
Stochastic Sampling
• Poisson distribution:
  • Completely random
  • Add points at random until the area is full
  • Uniformly distributed, so some neighboring samples end up close together and some far apart
Stochastic Sampling
• Poisson disc distribution:
  • A Poisson distribution with a minimum-distance constraint between samples
  • Add points at random, rejecting any that fall too close to a previously accepted point (sketch below)
  • Very even-looking distribution
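A minimal sketch of Poisson-disc sampling by dart throwing over a unit pixel; min_dist and max_tries are illustrative parameters (a production sampler would use something smarter, such as Bridson’s algorithm). Setting min_dist to zero degenerates to the plain Poisson pattern of the previous slide.

    import random

    def poisson_disc(min_dist, max_tries=10000):
        """Keep proposing random points, accepting only those at least
        min_dist away from every previously accepted point."""
        points = []
        for _ in range(max_tries):
            c = (random.random(), random.random())        # inside the unit pixel
            if all((c[0] - p[0]) ** 2 + (c[1] - p[1]) ** 2 >= min_dist ** 2
                   for p in points):
                points.append(c)
        return points

    samples = poisson_disc(0.2)   # evenly spread samples over one pixel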
Stochastic Sampling
• Jittered distribution
  • Start with a regular grid of samples
  • Perturb each sample slightly in a random direction (sketch below)
  • More “clumpy” or granular in appearance
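A minimal sketch of jittered sampling over a unit pixel, assuming an n x n grid with one sample placed at a random position within each grid cell.

    import random

    def jittered_grid(n):
        """One sample per cell of an n x n grid, jittered within its cell."""
        cell = 1.0 / n
        return [((i + random.random()) * cell, (j + random.random()) * cell)
                for i in range(n) for j in range(n)]

    samples = jittered_grid(4)   # 16 samples, exactly one per subcell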
Stochastic Sampling
• Spectral characteristics of these distributions:
  • Poisson: completely uniform (white noise); high and low frequencies equally present
  • Poisson disc: a pulse at the origin (the DC component of the image), surrounded by an empty ring (no low frequencies), surrounded by white noise
  • Jitter: approximates the Poisson disc spectrum, but with a smaller empty disc
Stochastic Sampling
• Watt & Watt, p. 134