
Gaussian Mixture Model classification of Multi-Color Fluorescence In Situ Hybridization M-FISH Images


Presentation Transcript


    1. Gaussian Mixture Model classification of Multi-Color Fluorescence In Situ Hybridization (M-FISH) Images. Amin Fazel, 2006.

    2. Motivation and Goals. Chromosomes store genetic information. Chromosome images can indicate genetic disease, cancer, radiation damage, etc. Research goals: locate and classify each chromosome in an image; locate chromosome abnormalities.

    3. Karyotyping. The 46 human chromosomes form 24 types: 22 different pairs plus 2 sex chromosomes, X and Y. Chromosomes are grouped and ordered by length.

    4. Multi-spectral Chromosome Imaging. Multiplex Fluorescence In-Situ Hybridization (M-FISH) [1996] uses five color dyes (fluorophores). Each human chromosome type absorbs a unique combination of the dyes, and the 32 (2^5) possible dye combinations are enough to distinguish the 24 human chromosome types.
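A quick way to see the counting argument is to enumerate the binary dye patterns. The sketch below uses an illustrative assignment of patterns to chromosome types, not the actual M-FISH labeling scheme.

```python
from itertools import product

# 5 binary dye labels give 2**5 = 32 on/off combinations; even after dropping
# the all-off pattern, 31 remain, more than the 24 chromosome types that need
# unique codes. The chr-to-pattern mapping here is illustrative only.
combos = [c for c in product([0, 1], repeat=5) if any(c)]
dye_code = {f"chr{k + 1}": combos[k] for k in range(24)}
print(len(combos), dye_code["chr1"])   # 31 (0, 0, 0, 0, 1)
```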

    5. M-FISH Images. A 6th dye (DAPI) binds to all chromosomes.

    6. M-FISH Images. Images of each dye are obtained with an appropriate optical filter. Each pixel is a six-dimensional vector, and each vector element gives the contribution of one dye at that pixel. The chromosomal origin is distinguishable at a single pixel (unless chromosomes overlap), so it is unnecessary to estimate length, relative centromere position, or banding pattern.
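As a minimal sketch of this data layout (the image size and random intensities are assumptions, not from the slides), the six registered dye images can be stacked so that each pixel becomes one six-dimensional feature vector:

```python
import numpy as np

# Hypothetical 512 x 512 registered images, one per dye channel (6 total).
H, W = 512, 512
channels = [np.random.rand(H, W) for _ in range(6)]   # placeholder intensities

# Stack along a new last axis and flatten: one row per pixel, 6 features each.
features = np.stack(channels, axis=-1).reshape(-1, 6)
print(features.shape)   # (262144, 6)
```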

    7. Bayesian Classification. Based on probability theory. A feature vector is denoted x = [x_1, x_2, ..., x_D]^T, where D is the dimension of the vector. The probability that a feature vector x belongs to class ω_k is the posterior p(ω_k | x), which can be computed via Bayes' rule, p(ω_k | x) = p(x | ω_k) P(ω_k) / p(x), and p(x) = \sum_k p(x | ω_k) P(ω_k).
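A minimal sketch of this posterior computation, assuming the class-conditional likelihoods p(x | ω_k) and priors P(ω_k) are already available as arrays (the function name and example numbers are illustrative):

```python
import numpy as np

def posteriors(likelihoods, priors):
    """Bayes' rule: p(w_k | x) = p(x | w_k) P(w_k) / p(x)."""
    joint = likelihoods * priors      # p(x | w_k) P(w_k) for each class k
    evidence = joint.sum()            # p(x) = sum_k p(x | w_k) P(w_k)
    return joint / evidence           # posterior probabilities, summing to 1

# Example with three classes
print(posteriors(np.array([0.20, 0.05, 0.60]), np.array([0.5, 0.3, 0.2])))
```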

    8. Gaussian Probability Density Function. In the D-dimensional space, p(x | ω_k) = \frac{1}{(2\pi)^{D/2} |\Sigma_k|^{1/2}} \exp\left(-\tfrac{1}{2}(x - \mu_k)^T \Sigma_k^{-1} (x - \mu_k)\right), where \mu_k is the mean vector and \Sigma_k is the covariance matrix. Using a single Gaussian assumes that the class model is truly a model of one basic class.
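The density above can be evaluated directly; the sketch below is a straightforward NumPy translation (not taken from the slides):

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    """Evaluate the D-dimensional Gaussian N(x; mu, Sigma).

    x, mu : length-D vectors; sigma : D x D covariance matrix.
    """
    D = x.shape[0]
    diff = x - mu
    norm = (2 * np.pi) ** (-D / 2) * np.linalg.det(sigma) ** (-0.5)
    return norm * np.exp(-0.5 * diff @ np.linalg.inv(sigma) @ diff)

# Example: standard 2-D Gaussian at the origin (peak value 1 / (2*pi))
print(gaussian_pdf(np.zeros(2), np.zeros(2), np.eye(2)))
```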

    9. Gaussian Mixture Model (GMM). A GMM is a set of several Gaussians that represent groups/clusters of the data, and therefore different subclasses inside one class. The PDF is defined as a weighted sum of Gaussians: p(x | ω_k) = \sum_{c=1}^{C} w_c N(x; \mu_c, \Sigma_c), with \sum_{c=1}^{C} w_c = 1.
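The weighted sum can be written directly as code. This sketch leans on scipy.stats.multivariate_normal, with example parameters chosen only for illustration:

```python
import numpy as np
from scipy.stats import multivariate_normal

def gmm_pdf(x, weights, means, covs):
    """p(x) = sum_c w_c N(x; mu_c, Sigma_c), with the weights summing to 1."""
    return sum(w * multivariate_normal(mean=mu, cov=cov).pdf(x)
               for w, mu, cov in zip(weights, means, covs))

# Example: a two-component mixture in 2-D
weights = [0.6, 0.4]
means = [np.zeros(2), np.array([3.0, 3.0])]
covs = [np.eye(2), 2.0 * np.eye(2)]
print(gmm_pdf(np.array([0.5, 0.5]), weights, means, covs))
```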

    10. Gaussian Mixture Models. Equations for GMMs in the multi-dimensional case: the mean μ becomes a mean vector μ, and the variance σ² becomes a covariance matrix Σ. Σ is often assumed to be a diagonal matrix, which simplifies the density to a product of one-dimensional Gaussians.
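With a diagonal Σ the multivariate density reduces to a product of 1-D Gaussians, one per feature dimension. A small sketch of that simplification (variable names are illustrative):

```python
import numpy as np

def diag_gaussian_pdf(x, mu, var):
    """Gaussian with diagonal covariance: product of per-dimension 1-D Gaussians.

    var holds the diagonal entries of Sigma (one variance per dimension).
    """
    return np.prod(np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var))

# Matches the full formula when Sigma = diag(var)
print(diag_gaussian_pdf(np.zeros(3), np.zeros(3), np.ones(3)))
```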

    11. GMM. A Gaussian Mixture Model (GMM) is characterized by the number of components, the means and covariance matrices of the Gaussian components, and the weight (height) of each component.

    12. GMM. The GMM has the same dimension as the feature space (here a 6-dimensional GMM); for visualization purposes, the slide shows 2-dimensional GMMs.

    13. GMM. These parameters are tuned using an iterative procedure called Expectation Maximization (EM). The EM algorithm alternately updates the distribution of each Gaussian component and the conditional (membership) probabilities so as to increase the likelihood of the data.
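As a sketch of one EM iteration for a single class's GMM (array shapes and variable names are assumptions, not taken from the slides): the E-step computes each component's responsibility for each sample, and the M-step re-estimates weights, means, and covariances from those responsibilities.

```python
import numpy as np
from scipy.stats import multivariate_normal

def em_step(X, weights, means, covs):
    """One EM iteration for a GMM; X is an (N, D) data matrix."""
    N, _ = X.shape

    # E-step: responsibility r[n, c] = w_c N(x_n; mu_c, Sigma_c) / p(x_n)
    r = np.column_stack([w * multivariate_normal(mean=mu, cov=cov).pdf(X)
                         for w, mu, cov in zip(weights, means, covs)])
    r /= r.sum(axis=1, keepdims=True)

    # M-step: re-estimate parameters from the soft assignments
    Nc = r.sum(axis=0)                          # effective count per component
    new_weights = Nc / N
    new_means = (r.T @ X) / Nc[:, None]
    new_covs = []
    for c in range(len(weights)):
        diff = X - new_means[c]
        new_covs.append((r[:, c, None] * diff).T @ diff / Nc[c])
    return new_weights, new_means, new_covs
```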

    14. GMM Training Flow Chart (1). Initialize the Gaussian means μ_i using the K-means clustering algorithm. Initialize the covariance matrices from the distance to the nearest cluster. Initialize the weights to 1/C so that all Gaussians are equally likely.
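A sketch of this initialization, assuming scikit-learn's KMeans is available; the covariances here are taken from each cluster's own spread, a common variant of the slide's distance-based rule:

```python
import numpy as np
from sklearn.cluster import KMeans   # assumed available

def init_gmm(X, C):
    """Initialize C-component GMM parameters from a K-means clustering of X."""
    km = KMeans(n_clusters=C, n_init=10).fit(X)
    means = km.cluster_centers_
    # Per-cluster sample covariance, lightly regularized to stay invertible.
    covs = [np.cov(X[km.labels_ == c], rowvar=False) + 1e-6 * np.eye(X.shape[1])
            for c in range(C)]
    weights = np.full(C, 1.0 / C)    # all Gaussians start equally likely
    return weights, means, covs
```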

    15. GMM Training Flow Chart (2)

    16. GMM Training Flow Chart (3). Recompute w_{n,c} using the new weights, means, and covariances. Stop training if |w_{n+1,c} - w_{n,c}| < threshold, or if the number of epochs reaches the specified value; otherwise, continue the iterative updates.
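Putting the pieces together, a training-loop sketch with the slide's stopping criterion; it reuses the hypothetical init_gmm and em_step from the sketches above, and the threshold and epoch limit are illustrative defaults:

```python
import numpy as np

def train_gmm(X, C, threshold=1e-4, max_epochs=100):
    """Run EM until the mixture weights stop changing or max_epochs is hit."""
    weights, means, covs = init_gmm(X, C)
    for _ in range(max_epochs):
        new_weights, means, covs = em_step(X, weights, means, covs)
        if np.max(np.abs(new_weights - weights)) < threshold:
            weights = new_weights
            break                     # converged: weight change below threshold
        weights = new_weights
    return weights, means, covs
```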

    17. GMM Test Flow Chart. Present each input pattern x and compute the confidence for each class k: p(c_k | x) \propto p(x | c_k) P(c_k), where P(c_k) is the prior probability of class c_k, estimated by counting the number of training patterns per class. Classify pattern x as the class with the highest confidence.
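A sketch of the test step under the same assumptions: one trained GMM per class (a triple of weights, means, and covariances), priors estimated from training-pattern counts, and the gmm_pdf sketch from slide 9 as the class-conditional density.

```python
import numpy as np

def classify(x, class_gmms, class_priors):
    """Confidence for class k: p(c_k | x) proportional to p(x | c_k) P(c_k)."""
    scores = np.array([gmm_pdf(x, *gmm) * prior      # gmm = (weights, means, covs)
                       for gmm, prior in zip(class_gmms, class_priors)])
    confidences = scores / scores.sum()
    return int(np.argmax(confidences)), confidences  # predicted class, confidences
```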

    18. Results

    19. Results

