Lecture 14



  1. Lecture 14 PCA, pPCA, ICA

  2. Principal Components Analysis • PCA is a data-analysis technique that finds the subspace of input space carrying most of the variance of the data. • It is therefore useful as a tool to reduce the dimensionality of input space. • The solution is found by an eigenvalue decomposition of the sample covariance matrix. • pPCA is a probabilistic model whose maximum-likelihood (ML) solution equals the PCA solution. It is a special case of factor analysis (FA) with isotropic variance. • Therefore, the EM algorithm for FA is applicable for learning.
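The eigendecomposition route on this slide can be sketched as follows. This is a minimal NumPy sketch on synthetic data; the data matrix, its dimensions, and the variable names are illustrative assumptions, not part of the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: 200 points in 3-D, with most variance along one axis.
X = rng.normal(size=(200, 3)) @ np.diag([3.0, 1.0, 0.2])

# Center the data and form the sample covariance matrix.
Xc = X - X.mean(axis=0)
S = Xc.T @ Xc / (len(Xc) - 1)

# Eigenvalue decomposition of the covariance; eigh returns eigenvalues
# in ascending order, so re-sort to put the largest variance first.
eigvals, eigvecs = np.linalg.eigh(S)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Dimensionality reduction: project onto the top-k principal components.
k = 2
Z = Xc @ eigvecs[:, :k]

print(Z.shape)  # (200, 2)
```

The eigenvalues are the variances carried by each principal component, so the fraction of variance retained by the top-k subspace is `eigvals[:k].sum() / eigvals.sum()`.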

  3. Independent Component Analysis • FA and pPCA have Gaussian prior models. In ICA we use non-Gaussian prior models (i.e. heavy-tailed or bi-modal). • We also do not insist on dimensionality reduction; it is possible, but not necessary. • The canonical example is 2 speakers producing different mixtures of sound in 2 microphones that we wish to unmix. • The source distributions are non-Gaussian but independent; the noise model is typically Gaussian. • The simplest ICA model is square and has no noise. We can use a change of variables to go from sources to inputs. • Learning is through gradient descent with the "natural gradient".
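The square, noiseless model can be sketched as below. Under the change of variables x = A s, the log-likelihood of an unmixing matrix W is sum_i log p_s(w_i^T x) + log |det W|, and the natural-gradient ascent step takes the form W <- W + eta (I - phi(y) y^T) W with y = W x. The mixing matrix, the Laplace (heavy-tailed) sources, the tanh score function, and the step size here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000

# Two independent heavy-tailed (Laplace) sources -- non-Gaussian, as ICA requires.
s = rng.laplace(size=(2, n))

# Hypothetical square mixing matrix: each "microphone" hears both "speakers".
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
x = A @ s

# Natural-gradient learning for the square, noiseless model:
#   y = W x,  W <- W + eta * (I - phi(y) y^T / n) W
# with score function phi(y) = tanh(y), a common choice for heavy-tailed sources.
W = np.eye(2)
eta = 0.05
for _ in range(2000):
    y = W @ x
    phi = np.tanh(y)
    W += eta * (np.eye(2) - phi @ y.T / n) @ W
```

After learning, W @ A should be close to a permutation of a diagonal matrix: each recovered component matches one source up to the scale and order ambiguities inherent in ICA.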
