
Curve Sampling and Conditional Simulation


Presentation Transcript


  1. Curve Sampling and Conditional Simulation Ayres Fan John W. Fisher III Alan S. Willsky MIT Stochastic Systems Group

  2. Outline • Overview • Curve evolution • Markov chain Monte Carlo • Curve sampling • Conditional simulation

  3. Overview • Curve evolution attempts to find a curve C (or curves Ci) that best segments an image (according to some model) • The goal is to minimize an energy functional E(C) (viewed as a negative log-likelihood) • Find a local minimum using gradient descent

  4. Sampling instead of optimization • Draw multiple samples from a probability distribution p (e.g., uniform, Gaussian) • Advantages: • Naturally handles multi-modal distributions • Avoids local minima • Provides higher-order statistics (e.g., variances) • Enables conditional simulation

  5. Outline • Overview • Curve evolution • Markov chain Monte Carlo • Curve sampling • Conditional simulation

  6. Planar curves • A curve is a function C : [0, 1] → R² • We wish to minimize an energy functional with a data fidelity term and a regularization term: E(C) = E_data(C) + α ∮_C ds • This results in a gradient flow: ∂C/∂t = −∇E(C) • We can write any flow in terms of the normal N: ∂C/∂t = f N

  7. Euclidean curve shortening flow • Let's examine E(C) = ∮_C ds, the length of the curve • E is minimized when C is short • The gradient flow should be in the direction that decreases the curve length the fastest • Using the Euler-Lagrange equation, we see: ∂C/∂t = κN, where κ is the curvature
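
As a concrete illustration of the flow above, here is a minimal numerical sketch (not from the original slides) of curve shortening flow on a closed marker-point curve, using the standard discrete-Laplacian approximation of κN; all function and variable names are my own.

```python
import numpy as np

def curvature_flow_step(pts, dt=0.01):
    """One explicit Euler step of curve shortening flow C_t = kappa*N
    on a closed polygonal curve (N x 2 array of marker points)."""
    prev = np.roll(pts, 1, axis=0)
    nxt = np.roll(pts, -1, axis=0)
    # The discrete Laplacian of the marker points approximates kappa*N
    # (up to parameterization) for roughly uniform point spacing.
    lap = prev - 2.0 * pts + nxt
    return pts + dt * lap

# Example: a noisy circle smooths out and shrinks under the flow.
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
curve = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * np.random.randn(200, 2)
for _ in range(100):
    curve = curvature_flow_step(curve)
```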

  8. Level-Set Methods • A curve is a function (infinite dimensional) • A natural implementation approach is to use marker points on the boundary (snakes) • Reinitialization issues • Difficulty handling topological change • Level set methods instead evolve a surface (one dimension higher than our curve) whose zeroth level set is the curve (Sethian and Osher)

  9. Embedding the curve • Force the level-set function φ to be zero on the curve: φ(C(p), t) = 0 • The chain rule gives us φ_t + ∇φ · C_t = 0; for C_t = f N this becomes φ_t = −f |∇φ|
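
The embedded flow is straightforward to step numerically. Below is a minimal sketch, assuming the usual sign convention (φ negative inside the curve) and omitting the upwind differencing a robust implementation would need; the function name is hypothetical.

```python
import numpy as np

def level_set_step(phi, f, dt=0.1):
    """One explicit Euler step of the level-set PDE phi_t = -f*|grad phi|,
    the embedded form of the curve flow C_t = f*N. `phi` is a 2D array,
    `f` a speed (scalar or array broadcastable over the grid)."""
    gy, gx = np.gradient(phi)                 # central differences
    grad_mag = np.sqrt(gx**2 + gy**2)
    return phi - dt * f * grad_mag
```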

  10. References • Snakes: Active contour models, Kass, Witkin, Terzopoulos 1987 • Geodesic active contours, Caselles, Kimmel, Sapiro 1995 • Region competition, Zhu and Yuille, 1996 • Mumford-Shah approach, Tsai, Yezzi, Willsky 2001 • Active contours without edges, Chan and Vese, 2001

  11. Examples-I

  12. Examples-II

  13. Examples-III

  14. Examples-IV

  15. Examples-V

  16. Outline • Overview • Curve evolution • Markov chain Monte Carlo • Curve sampling • Conditional simulation

  17. General MAP Model • For segmentation, the posterior is p(x | y; S) ∝ p(y | x) p(x; S): • x is a curve • y is the observed image (can be vector-valued) • S is a shape model • The data model is usually i.i.d. given the curve • We wish to sample from p(x | y; S), but cannot do so directly
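
In code, the MAP model amounts to an unnormalized log-posterior that both optimization and sampling can target; a trivial sketch (names are illustrative, not from the slides):

```python
def log_posterior(x, y, log_likelihood, log_prior):
    """Unnormalized log p(x | y; S) = log p(y | x) + log p(x; S) + const.
    Gradient descent maximizes this; MCMC samples from it."""
    return log_likelihood(y, x) + log_prior(x)
```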

  18. Markov Chain Monte Carlo • A class of sampling methods that iteratively generates candidates based on the previous iterate (forming a Markov chain) • Examples include Gibbs sampling and Metropolis-Hastings • Instead of sampling from p(x | y; S) directly, sample from a proposal distribution q and keep samples according to an acceptance rule a

  19. Metropolis algorithm • Metropolis algorithm: • 1. Start with x0 • 2. At time t, generate a candidate x̂t (given xt−1) • 3. Set xt = x̂t with probability min(1, p(x̂t)/p(xt−1)); otherwise xt = xt−1 • 4. Go back to step 2
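
A generic implementation of these steps might look like the following sketch (assuming only that we can evaluate an unnormalized log-density and draw symmetric proposals; all names are mine):

```python
import numpy as np

def metropolis(log_p, propose, x0, n_iters, rng=None):
    """Metropolis sampler: propose(x, rng) must be symmetric, i.e.
    q(x' | x) = q(x | x'). Accepts with probability min(1, p(x')/p(x)),
    computed in log space for numerical stability."""
    rng = rng or np.random.default_rng()
    x, samples = x0, []
    for _ in range(n_iters):
        cand = propose(x, rng)
        if np.log(rng.uniform()) < log_p(cand) - log_p(x):
            x = cand                      # accept the candidate
        samples.append(x)                 # otherwise keep x_{t-1}
    return samples
```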

  20. Asymptotic Convergence • Two main requirements ensure the iterates converge to samples from p: • q spans the support of p • This means that any element with non-zero probability under p must be reachable by a sequence of samples from q • Detailed balance is achieved

  21. Metropolis-Hastings • The Metropolis update rule only results in detailed balance when q is symmetric: q(x′ | x) = q(x | x′) • When this is not true, we need to evaluate the Hastings ratio r = [p(x̂t) q(xt−1 | x̂t)] / [p(xt−1) q(x̂t | xt−1)] and keep a candidate with probability min(1, r)
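
The only change from the Metropolis sketch above is the acceptance test, which picks up the proposal correction; a small sketch (log_q(a, b) here denotes log q(a | b), a naming convention of mine):

```python
def hastings_log_ratio(log_p, log_q, x, cand):
    """log of the Hastings ratio
    r = [p(cand) q(x | cand)] / [p(x) q(cand | x)];
    accept the candidate with probability min(1, r)."""
    return (log_p(cand) + log_q(x, cand)) - (log_p(x) + log_q(cand, x))
```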

  22. Outline • Overview • Curve evolution • Markov chain Monte Carlo • Curve sampling • Conditional simulation

  23. Basic model • Modified Chan-Vese energy functional • We can view the data model as being i.i.d. inside and outside the curve, with different Gaussian distributions
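
The slides do not spell out the modification, but the standard Chan-Vese energy they build on is easy to evaluate for a binary segmentation mask; a minimal sketch, with a crude neighbor-difference perimeter standing in for the curve-length term:

```python
import numpy as np

def chan_vese_energy(img, inside, alpha=1.0):
    """Standard piecewise-constant Chan-Vese energy for a boolean mask
    `inside`: squared deviation from the two region means plus a
    length penalty (here a 4-neighbor perimeter estimate)."""
    m1, m2 = img[inside].mean(), img[~inside].mean()
    data = ((img[inside] - m1) ** 2).sum() + ((img[~inside] - m2) ** 2).sum()
    mask = inside.astype(float)
    perim = (np.abs(np.diff(mask, axis=0)).sum()
             + np.abs(np.diff(mask, axis=1)).sum())
    return data + alpha * perim
```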

  24. MCMC Curve Sampling • Generate a perturbation on the curve: C′ = C + f N • Sample f by adding smooth random fields: f = h ∗ n • The smoothing kernel controls the degree of smoothness of the field • Note that for portions where f is negative, shocks may develop (the so-called prairie-fire model) • Implement using white noise and circular convolution
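
The white-noise-plus-circular-convolution construction is simple to implement; a sketch assuming a periodic Gaussian kernel (the kernel choice and the name smooth_field are mine):

```python
import numpy as np

def smooth_field(n_points, sigma, rng=None):
    """Smooth random perturbation along a closed curve: white noise
    circularly convolved (via FFT) with a periodic Gaussian kernel.
    Larger `sigma` (in samples) gives a smoother field."""
    rng = rng or np.random.default_rng()
    noise = rng.standard_normal(n_points)
    k = np.arange(n_points)
    dist = np.minimum(k, n_points - k)       # circular distance to 0
    kernel = np.exp(-0.5 * (dist / sigma) ** 2)
    kernel /= kernel.sum()
    return np.real(np.fft.ifft(np.fft.fft(noise) * np.fft.fft(kernel)))
```

Each marker point is then displaced by f along its normal to form the candidate curve C′ = C + f N.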

  25. Coverage/Detailed balance • It is easy to show that we can go from any curve C1 to any other curve C2 (e.g., by shrinking to a point) • For detailed balance, we need to compute the probability of generating C′ from C (and vice versa) • If we assume that the normals are approximately the same, then we see f = f′ (symmetric)

  26. Initial results

  27. Smoothness issues • While detailed balance assures asymptotic convergence, we may need to wait a very long time • In this case, smooth curves have non-zero probability under q, but are very unlikely to occur • We can view the accumulation of perturbations as Σi h ∗ ni (h is the smoothing kernel, ni is white noise) • Solution: make q more likely to move towards high-probability regions of p

  28. Adding mean force • We can add deterministic elements to f (i.e., a mean to q) • The average behavior should then be to move towards higher-probability areas of p • In the limit, setting f to be the gradient flow of the energy functional results in always accepting the perturbation • We keep the random field but give it a non-zero mean: f = μ + h ∗ n
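
Adding the mean is a one-line change to the proposal; a sketch reusing the hypothetical smooth_field from the earlier snippet, with mu standing for the deterministic (e.g., gradient-flow) term:

```python
def forced_field(mu, sigma, rng=None):
    """Proposal field with non-zero mean, f = mu + h * n: the drift
    `mu` (one value per curve point) biases moves toward higher
    probability; the smooth random part preserves exploration."""
    return mu + smooth_field(len(mu), sigma, rng)
```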

  29. Detailed balance redux • Before we just assumed that q was symmetric and used Metropolis updates • Now we need to evaluate q and use Metropolis-Hastings updates • Probability of going from C to C´ is the probability of generating f (which is Gaussian) and the reverse is the probability of f ´ (also Gaussian)

  30. Detailed balance continued • The relationship between f and f′ is complicated by the fact that the normal function changes • Various levels of exactness: • Assume the normals are approximately equal • Infinitesimal approximation (ignore tangential components) • Trace along the normals (technical issues)

  31. New results

  32. Further research • We would like q to naturally generate smooth curves • Different structure for the perturbation • Speed always an issue for MCMC approaches • Multiresolution perturbations • Parameterized perturbations (efficient basis) • Hierarchical models • How to properly use samples?

  33. Outline • Overview • Curve evolution • Markov chain Monte Carlo • Curve sampling • Conditional simulation

  34. User Information • In many problems, the model admits many reasonable solutions • Currently user input largely limited to initialization • We can use user information to reduce the number of reasonable solutions • Regions of inclusion or exclusion • Partial segmentations • Can help with both convergence speed and accuracy • Interactive segmentation

  35. Conditional simulation • With conditional simulation, we are given the values on a subset of the variables • We then wish to generate sample paths that fill in the remainder of the variables (e.g., simulating Brownian motion)

  36. Simulating curves • Say we are given Cs, a subset of C (with some uncertainty associated with it) • We wish to sample the unknown part of the curve, Cu • One way to view this is as sampling from p(Cu | Cs, y) ∝ p(y | C) p(Cs | Cu) p(Cu) • The difficulty is being able to evaluate the middle term, as we theoretically need to integrate p(C)

  37. Simplifying Cases Under special cases, evaluation of the middle term p(Cs | Cu) is tractable: • When C is low-dimensional (we can do analytical or Monte Carlo integration) • When Cs is assumed to be exact • When p(C) has a special form (e.g., independent) • When we are willing to approximate

  38. Basic model in 3D • Energy functional with surface-area regularization: E(S) = E_data(S) + α · Area(S) • With a slice-based model, we can write the regularization term as a sum of per-slice and neighboring-slice terms (derived on the next slide)

  39. Zero-order hold approximation • Approximate the volume as piecewise-constant “cylinders”, one per slice • Then we see that the surface area is the sum over slices of the curve length times the slice spacing, plus the area of the symmetric difference between neighboring slices • We see terms related to the curve length and the difference between neighboring slices • This is an upper bound to the correct surface area
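
Under the zero-order-hold model, the surface-area bound reduces to perimeters and slice differences, both cheap to compute from per-slice masks; a minimal sketch (end caps ignored, names mine):

```python
import numpy as np

def zoh_surface_area(masks, dz=1.0):
    """Upper bound on surface area for a stack of boolean slice masks
    under the cylinder approximation: lateral area = perimeter * dz
    per slice, plus the symmetric-difference area between neighbors."""
    def perimeter(m):
        m = m.astype(float)
        return (np.abs(np.diff(m, axis=0)).sum()
                + np.abs(np.diff(m, axis=1)).sum())
    lateral = sum(perimeter(m) * dz for m in masks)
    steps = sum(np.logical_xor(a, b).sum()
                for a, b in zip(masks[:-1], masks[1:]))
    return lateral + steps
```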

  40. Overall regularization term • Adding everything together results in a sum of self potentials (one per slice) and edge potentials (coupling between neighboring slices)

  41. 2.5D Approach • In the 3D world, there is a natural (or built-in) partition of volumes into slices • Assume a Markov relationship among slices • We then have local potentials (e.g., PCA) and edge potentials (coupling between slices) • This naturally lends itself to a local Metropolis-Hastings approach (iterating over the slices)

  42. 2.5D Model • We can model this as a simple chain structure with pairwise interactions • This admits the following factorization: p(c1, …, cn) ∝ ∏i ψi(ci) ∏i ψi,i+1(ci, ci+1)
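
The chain factorization makes local updates cheap: changing one slice only touches its self potential and the two adjacent edge potentials. A sketch with hypothetical potential-function arguments:

```python
def chain_log_prob(slices, log_self, log_edge):
    """Unnormalized log-probability of the whole chain:
    sum of self potentials plus pairwise neighbor couplings."""
    lp = sum(log_self(c) for c in slices)
    return lp + sum(log_edge(a, b)
                    for a, b in zip(slices[:-1], slices[1:]))

def local_log_prob(i, slices, log_self, log_edge):
    """Terms of the chain log-probability that involve slice i only;
    this is all a slice-by-slice Metropolis-Hastings update needs,
    and it is what conditioning on fixed neighboring slices uses."""
    lp = log_self(slices[i])
    if i > 0:
        lp += log_edge(slices[i - 1], slices[i])
    if i < len(slices) - 1:
        lp += log_edge(slices[i], slices[i + 1])
    return lp
```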

  43. Partial segmentations • Assume that we are given segmentations of every other slice • We now want to sample surfaces conditioned on the fact that certain slices are fixed • Markovianity tells us that c2 and c4 are independent conditioned on c3

  44. Log probability for c2 • We can then construct the probability for c2 conditioned on its neighbors using the potential functions defined previously: log p(c2 | c1, c3) = log ψ2(c2) + log ψ1,2(c1, c2) + log ψ2,3(c2, c3) + const

  45. Results • [Figures: neighbor segmentations; our result (cyan) with expert segmentation (green)]

  46. Larger spacing • Apply a local Metropolis-Hastings algorithm where we sample on a slice-by-slice basis • Theory shows that asymptotic convergence is unchanged • Unfortunately, larger spacing is similar to having less regularization • We currently have issues with poor data models that need to be resolved

  47. Full 3D • In medical imaging, we have multiple slice orientations (axial, sagittal, coronal) • In seismic imaging, both vertical and horizontal shape structure are expected • With a 2.5D approach, this introduces complexity to the graph structure

  48. Incorporating perpendicular slices • The perpendicular slice c⊥ is now coupled to all of the horizontal slices • c⊥ only gives information about a subset of each slice • Specify edge potentials coupling c⊥ to each horizontal slice, e.g., penalizing disagreement where they intersect

  49. Other extensions • Additional features can be added to the base model: • Uncertainty on the expert segmentations • Shape models (semi-local) • Exclusion/inclusion regions • Topological change (through level sets)

  50. Applications • Currently looking for applications to demonstrate utility of curve sampling • Multi-modal distributions • Morphology changes • Confidence levels
