
EM Algorithm with Markov Chain Monte Carlo Method for Bayesian Image Analysis



Presentation Transcript


  1. EM Algorithm with Markov Chain Monte Carlo Method for Bayesian Image Analysis. Kazuyuki Tanaka, Graduate School of Information Sciences, Tohoku University, http://www.smapip.is.tohoku.ac.jp/~kazu/. Collaborator: D. M. Titterington (Department of Statistics, University of Glasgow)

  2. Contents: Introduction; Gaussian Graphical Model and EM Algorithm; Markov Chain Monte Carlo Method; Concluding Remarks

  3. Contents: Introduction; Gaussian Graphical Model and EM Algorithm; Markov Chain Monte Carlo Method; Concluding Remarks

  4. MRF and Statistical Inference • Geman and Geman (1984): IEEE Transactions on PAMI • Image processing with Markov Random Fields (MRF) (simulated annealing, line fields). How can we estimate the hyperparameters in the degradation process and in the prior model only from the observed data? • EM Algorithm: in the EM algorithm we have to calculate some statistical quantities in the posterior and the prior models. • Belief Propagation • Markov Chain Monte Carlo Method

  5. Statistical Analysis in the EM Algorithm • K. Tanaka, H. Shouno, M. Okada and D. M. Titterington: J. Phys. A 2004. Hyperparameter estimation by using belief propagation (BP) for the Gaussian graphical model in image processing. • J. Inoue and K. Tanaka: Phys. Rev. E 2002, J. Phys. A 2003. Statistical behaviour of the EM algorithm for MRF (graphical models on the complete graph). It is possible to estimate the statistical behaviour of the EM algorithm with belief propagation analytically. • K. Tanaka and D. M. Titterington: J. Phys. A 2007. Statistical trajectory of the approximate EM algorithm for probabilistic image processing.

  6. Contents: Introduction; Gaussian Graphical Model and EM Algorithm; Markov Chain Monte Carlo Method; Concluding Remarks

  7. Bayesian Image Restoration [Figure: the original image is corrupted by noise during transmission, producing the degraded image]

  8. Bayes Formula and Probabilistic Image Processing [Diagram: the prior probability of the original image and the degradation process producing the degraded image are combined into the posterior probability; each node is a pixel]
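
The Bayes formula on this slide is preserved only as an image. In the usual notation, with f the original image and g the degraded image, it reads:

```latex
P(f \mid g) \;=\; \frac{P(g \mid f)\, P(f)}{P(g)}
\;\propto\; P(g \mid f)\, P(f)
\qquad \text{(degradation process } \times \text{ prior probability)}.
```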

  9. Prior Probability in Probabilistic Image Processing. W: set of all the nodes (pixels); B: set of all the nearest-neighbour pairs of pixels. Samples from the prior are generated by the Markov Chain Monte Carlo Method.
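
The prior density itself is not preserved in the transcript. For the Gaussian graphical model used in this talk, the nearest-neighbour prior is commonly written in the following form, where α denotes the smoothness hyperparameter (the symbol is an assumption of this note):

```latex
P(f \mid \alpha) \;\propto\; \exp\!\Big( -\frac{\alpha}{2} \sum_{(i,j)\in B} (f_i - f_j)^2 \Big),
\qquad f = (f_i)_{i \in W}.
```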

  10. Degradation Process: Additive White Gaussian Noise [Figure: histogram of Gaussian random numbers]
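
For additive white Gaussian noise, each observed pixel is the original pixel plus independent Gaussian noise of variance σ² (the symbol σ is assumed here, since the slide's formula is not in the transcript):

```latex
g_i = f_i + n_i, \quad n_i \sim \mathcal{N}(0, \sigma^2),
\qquad
P(g \mid f, \sigma) \;\propto\; \exp\!\Big( -\frac{1}{2\sigma^2} \sum_{i \in W} (g_i - f_i)^2 \Big).
```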

  11. Degradation Process and Prior [Equations: prior probability density function, posterior probability density function, multi-dimensional Gaussian integral formula]
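
Because both the prior and the degradation process are Gaussian, the posterior is a multivariate Gaussian, and its normalization and moments follow from the multi-dimensional Gaussian integral formula. A hedged reconstruction, with C the Laplacian matrix of the nearest-neighbour graph (a notational assumption):

```latex
P(f \mid g, \alpha, \sigma)
\;\propto\;
\exp\!\Big( -\frac{1}{2\sigma^2}\, \|g - f\|^2 \;-\; \frac{\alpha}{2}\, f^{\mathsf T} C f \Big),
\qquad
f \mid g \;\sim\; \mathcal{N}\!\big( S^{-1} g / \sigma^2,\; S^{-1} \big),
\quad
S = \frac{1}{\sigma^2} I + \alpha C .
```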

  12. Maximization of Marginal Likelihood by EM Algorithm [Equations: marginal likelihood and Q-function]. EM algorithm: iterate the following EM steps until convergence. A. P. Dempster, N. M. Laird and D. B. Rubin, "Maximum likelihood from incomplete data via the EM algorithm," J. Roy. Stat. Soc. B, 39 (1977).
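
The marginal likelihood, the Q-function, and the EM iteration appear only as images in the original slides. In the notation used above (writing the slides' hyperparameters a and s as α and σ), the standard definitions are:

```latex
\ln P(g \mid \alpha, \sigma) \;=\; \ln \int P(g \mid f, \sigma)\, P(f \mid \alpha)\,\mathrm{d}f
\qquad \text{(marginal likelihood)},
\\[4pt]
\mathcal{Q}(\alpha, \sigma \mid \alpha^{(t)}, \sigma^{(t)}, g)
\;=\; \int P(f \mid g, \alpha^{(t)}, \sigma^{(t)})\, \ln P(f, g \mid \alpha, \sigma)\,\mathrm{d}f,
\qquad
(\alpha^{(t+1)}, \sigma^{(t+1)}) \;=\; \arg\max_{\alpha,\sigma}\, \mathcal{Q}(\alpha, \sigma \mid \alpha^{(t)}, \sigma^{(t)}, g).
```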

  13. Maximization of Marginal Likelihood by EM (Expectation Maximization) Algorithm: extremum conditions of Q(a, s | a(t), s(t), g) with respect to a and s.
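
The extremum conditions themselves are not preserved in the transcript. For the Gaussian graphical model above they reduce to moment-matching conditions; the following is a reconstruction under that assumption, not a verbatim copy of the slide:

```latex
\Big\langle \textstyle\sum_{(i,j)\in B} (f_i - f_j)^2 \Big\rangle_{P(f \mid \alpha^{(t+1)})}
\;=\;
\Big\langle \textstyle\sum_{(i,j)\in B} (f_i - f_j)^2 \Big\rangle_{P(f \mid g,\, \alpha^{(t)},\, \sigma^{(t)})},
\qquad
\big(\sigma^{(t+1)}\big)^2
\;=\;
\frac{1}{|W|}\, \Big\langle \textstyle\sum_{i \in W} (g_i - f_i)^2 \Big\rangle_{P(f \mid g,\, \alpha^{(t)},\, \sigma^{(t)})}.
```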

  14. Maximization of Marginal Likelihood by EM (Expectation Maximization) Algorithm

  15. Statistical Behaviour of EM (Expectation Maximization) Algorithm [Figures: numerical experiments for a standard image; statistical behaviour of the EM algorithm]

  16. Contents: Introduction; Gaussian Graphical Model and EM Algorithm; Markov Chain Monte Carlo Method; Concluding Remarks

  17. Maximization of Marginal Likelihood by EM (Expectation Maximization) Algorithm: the expectations in the EM update are evaluated by the Markov Chain Monte Carlo method.
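
The equation indicated on this slide is not preserved. The idea is to replace each exact expectation in the EM update by an empirical average over M samples f^(1), ..., f^(M) drawn by MCMC:

```latex
\big\langle h(f) \big\rangle_{P(f \mid g,\, \alpha^{(t)},\, \sigma^{(t)})}
\;\approx\;
\frac{1}{M} \sum_{m=1}^{M} h\big(f^{(m)}\big),
\qquad
f^{(m)} \;\text{generated by the Markov chain Monte Carlo method}.
```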

  18. Markov Chain Monte Carlo Method [Diagram: basic step; node i with neighbourhood c_i is updated from f_i(t) to f_i(t+1) according to the transition probability w_i(f(t+1)|f(t))]
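
A minimal single-site (non-synchronized) update consistent with this picture is a Gibbs sweep over the pixels; for the Gaussian posterior above, the local conditional at each pixel is itself Gaussian. The following Python sketch assumes a 4-neighbour grid with free boundaries; variable names are illustrative, not taken from the slides:

```python
import numpy as np

def gibbs_sweep(f, g, alpha, sigma2, rng):
    """One non-synchronized sweep of single-site Gibbs updates for the Gaussian posterior
    P(f | g) ~ exp(-|g - f|^2 / (2*sigma2) - (alpha/2) * sum_B (f_i - f_j)^2).
    f, g: 2-D float arrays (current sample and degraded image)."""
    H, W = f.shape
    for y in range(H):
        for x in range(W):
            # sum and count of nearest-neighbour values (4-neighbour grid, free boundary)
            nbr_sum, nbr_cnt = 0.0, 0
            for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                yy, xx = y + dy, x + dx
                if 0 <= yy < H and 0 <= xx < W:
                    nbr_sum += f[yy, xx]
                    nbr_cnt += 1
            # local conditional: precision = 1/sigma2 + alpha * (number of neighbours)
            prec = 1.0 / sigma2 + alpha * nbr_cnt
            mean = (g[y, x] / sigma2 + alpha * nbr_sum) / prec
            f[y, x] = rng.normal(mean, np.sqrt(1.0 / prec))
    return f
```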

  19. Markov Chain Monte Carlo Method [Figure: frequency histogram of f_i]. Marginal probabilities can be estimated from histograms of the sampled values.
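
For the EM updates only a few posterior moments are actually needed, and they can be accumulated directly from the MCMC samples instead of full histograms. A small sketch (names assumed, continuing the example above):

```python
import numpy as np

def posterior_moments(samples, g):
    """Estimate the posterior expectations entering the EM updates from a list of
    MCMC samples, each a 2-D array of the same shape as the degraded image g."""
    noise_term = np.mean([np.sum((g - f) ** 2) for f in samples])       # <sum_i (g_i - f_i)^2>
    smooth_term = np.mean([np.sum(np.diff(f, axis=0) ** 2) +            # <sum_B (f_i - f_j)^2>
                           np.sum(np.diff(f, axis=1) ** 2) for f in samples])
    return noise_term, smooth_term
```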

  20. Markov Chain Monte Carlo Method [Diagram: combining MCMC with the EM algorithm]
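
Putting the pieces together, one possible MCMC-EM loop consistent with the slides looks as follows. This is an illustrative sketch, not the authors' code: the initial hyperparameters, the sweep counts, and the closed-form α update (which assumes the intrinsic Gaussian prior on a connected grid, so that the prior expectation of the smoothness term is (|W|-1)/α) are all assumptions of this note.

```python
import numpy as np

def mcmc_em(g, n_iter=30, n_sweeps=50, n_samples=20, seed=0):
    """EM loop whose E-step expectations are estimated by MCMC
    (reuses gibbs_sweep and posterior_moments from the sketches above)."""
    rng = np.random.default_rng(seed)
    alpha, sigma2 = 1.0e-3, np.var(g)        # rough initial hyperparameters (assumption)
    f = g.astype(float)                      # initialise the chain at the degraded image
    for _ in range(n_iter):
        # E-step: draw samples from P(f | g, alpha, sigma2) by non-synchronized Gibbs sweeps
        samples = []
        for s in range(n_sweeps):
            f = gibbs_sweep(f, g, alpha, sigma2, rng)
            if s >= n_sweeps - n_samples:     # keep the last n_samples sweeps
                samples.append(f.copy())
        noise_term, smooth_term = posterior_moments(samples, g)
        # M-step: moment-matching updates (see the extremum conditions above)
        sigma2 = noise_term / g.size
        # assuming an intrinsic Gaussian prior on a connected grid:
        # <sum_B (f_i - f_j)^2>_prior = (|W| - 1) / alpha, so moment matching gives
        alpha = (g.size - 1) / smooth_term
    return alpha, sigma2, np.mean(samples, axis=0)   # posterior-mean estimate of the image
```

In the slides' notation, MCMC (t=1) and MCMC (t=50) presumably correspond to using one and fifty MCMC steps per EM iteration, respectively.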

  21. Markov Chain Monte Carlo Method [Figures: numerical experiments for a standard image; input and output images for the exact calculation and for EM with MCMC (t=1) and MCMC (t=50); non-synchronized update, 20 samples]

  22. Markov Chain Monte Carlo Method [Figures: numerical experiments for a standard image; input and output images for the exact calculation and for EM with MCMC (t=1) and MCMC (t=50); non-synchronized update, 20 samples]

  23. Contents: Introduction; Gaussian Graphical Model and EM Algorithm; Markov Chain Monte Carlo Method; Concluding Remarks

  24. Summary [Figures: input/output images for EM with MCMC (t=1), EM with MCMC (t=50), and the exact calculation]. We construct EM algorithms by means of the Markov Chain Monte Carlo method and compare them with some exact calculations.

  25. New Project 1 [Figures: input/output images for EM with MCMC (t=1); diagram of the basic step with transition probability w_i(f(t+1)|f(t)) updating f_i(t) to f_i(t+1) at node i with neighbourhood c_i]. Can we derive the trajectory of the EM algorithm in the case of t=1 by solving the master equations for any step t?

  26. New Project 1: Transition Probability. From the solution of the master equation we calculate the statistical quantities at node i and its neighbourhood c_i (shown as equations in the original slide); these are included in the EM update rules.
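
The master equation referred to here is preserved only as an image. In standard notation for a chain with transition probability w, it would read (a reconstruction, with f continuous in this Gaussian model):

```latex
P_{t+1}(f) \;=\; \int w\big(f \mid f'\big)\, P_t(f')\, \mathrm{d}f',
\qquad
\big\langle h(f) \big\rangle_{t+1} \;=\; \int h(f)\, P_{t+1}(f)\, \mathrm{d}f .
```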

  27. New Project 2 [Figures: input/output images for EM with MCMC]. Can we replace the calculation of the statistical quantities in the prior probability by the MCMC?

  28. New Project 3: our previous works on the EM algorithm and Loopy Belief Propagation • K. Tanaka, H. Shouno, M. Okada and D. M. Titterington: J. Phys. A 2004. Hyperparameter estimation by using belief propagation (BP) for the Gaussian graphical model in image processing. • K. Tanaka and D. M. Titterington: J. Phys. A 2007. Statistical trajectory of the approximate EM algorithm for probabilistic image processing.

  29. New Project 3: Loopy Belief Propagation [Figures: restored images for the exact calculation and for loopy BP; MSE: 315 and MSE: 327]

  30. New Project 3 [Figures: numerical experiments for a standard image and statistical behaviour of the EM algorithm, comparing loopy BP with the exact calculation]

  31. New Project 3: More Practical Algorithm [Figures: input/output images for BP combined with EM] • Can we update both the messages and the hyperparameters in the same step? • Can we calculate the statistical trajectory?
