
Lecture 20 SVD and Its Applications


Presentation Transcript


  1. Lecture 20: SVD and Its Applications. Shang-Hua Teng

  2. Spectral Theorem and Spectral Decomposition Every symmetric matrix A can be written as $A = \lambda_1 x_1 x_1^T + \cdots + \lambda_n x_n x_n^T$, where $x_1, \ldots, x_n$ are the n orthonormal eigenvectors of A; they are the principal axes of A. Each $x_i x_i^T$ is the projection matrix onto $x_i$.
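A quick numerical check of the spectral decomposition, assuming NumPy (the 2 x 2 matrix is an arbitrary illustrative choice):

```python
import numpy as np

# Verify A = sum_i lambda_i x_i x_i^T for a small symmetric matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is for symmetric matrices: eigenvalues come back in ascending
# order, with orthonormal eigenvectors as the columns of X.
lam, X = np.linalg.eigh(A)

# Rebuild A as a sum of rank-1 terms lambda_i * x_i x_i^T.
A_rebuilt = sum(lam[i] * np.outer(X[:, i], X[:, i]) for i in range(len(lam)))
assert np.allclose(A, A_rebuilt)

# Each x_i x_i^T really is a projection matrix: P @ P == P and P.T == P.
P0 = np.outer(X[:, 0], X[:, 0])
assert np.allclose(P0 @ P0, P0)
assert np.allclose(P0.T, P0)
```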

  3. Singular Value Decomposition • Any m by n matrix A may be factored as $A = U \Sigma V^T$ • U: m by m, orthogonal; its columns are the left singular vectors • V: n by n, orthogonal; its columns are the right singular vectors • $\Sigma$: m by n, diagonal, with r nonzero singular values
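A minimal sketch of the factorization and its shapes, assuming NumPy (the sizes m = 5, n = 3 are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))           # m = 5, n = 3

# Full SVD: U is m x m, Vt is n x n, s holds the singular values.
U, s, Vt = np.linalg.svd(A, full_matrices=True)

Sigma = np.zeros(A.shape)                 # Sigma is m x n, diagonal
np.fill_diagonal(Sigma, s)

assert np.allclose(A, U @ Sigma @ Vt)     # A = U Sigma V^T
assert np.allclose(U.T @ U, np.eye(5))    # U orthogonal
assert np.allclose(Vt @ Vt.T, np.eye(3))  # V orthogonal
```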

  4. The Singular Value Decomposition $A = U \Sigma V^T$, with shapes (m x n) = (m x m)(m x n)(n x n). r = the rank of A = number of linearly independent columns/rows

  5. SVD Properties • U, V give us orthonormal bases for the subspaces of A: • 1st r columns of U: column space of A • Last m - r columns of U: left nullspace of A • 1st r columns of V: row space of A • Last n - r columns of V: nullspace of A • IMPLICATION: Rank(A) = r
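The subspace claims can be read directly off U and V; a sketch with a deliberately rank-deficient matrix (the rank-1 example is an arbitrary choice):

```python
import numpy as np

A = np.outer([1.0, 2.0, 3.0], [1.0, 1.0])   # 3x2, rank r = 1

U, s, Vt = np.linalg.svd(A)
r = np.sum(s > 1e-10)                        # numerical rank

col_space  = U[:, :r]       # first r columns of U: basis of C(A)
left_null  = U[:, r:]       # last m - r columns of U: basis of N(A^T)
row_space  = Vt[:r, :].T    # first r columns of V: basis of C(A^T)
null_space = Vt[r:, :].T    # last n - r columns of V: basis of N(A)

assert np.allclose(A @ null_space, 0)        # A kills its nullspace
assert np.allclose(left_null.T @ A, 0)       # left nullspace kills A
```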

  6. The Singular Value Decomposition • Full form: $A = U \Sigma V^T$, with shapes (m x n) = (m x m)(m x n)(n x n) • Reduced form: $A = U \Sigma V^T$, with shapes (m x n) = (m x r)(r x r)(r x n)
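A sketch contrasting the two forms (the 4 x 3 rank-1 matrix is illustrative); dropping the zero singular values changes the shapes but not the product:

```python
import numpy as np

A = np.outer([1.0, 2.0, 3.0, 4.0], [1.0, -1.0, 2.0])  # 4x3, rank 1

# Full form: U is m x m, Sigma is m x n, V^T is n x n.
U, s, Vt = np.linalg.svd(A, full_matrices=True)
print(U.shape, s.shape, Vt.shape)    # (4, 4) (3,) (3, 3)

# Reduced form: keep only the r columns/rows carrying nonzero
# singular values; the product is unchanged.
r = np.sum(s > 1e-10)
A_reduced = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]
assert np.allclose(A, A_reduced)
```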

  7. Singular Value Decomposition $A = \sigma_1 u_1 v_1^T + \cdots + \sigma_r u_r v_r^T$ • where • $u_1, \ldots, u_r$ are the r orthonormal vectors that form a basis of C(A) and • $v_1, \ldots, v_r$ are the r orthonormal vectors that form a basis of C(A^T)
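The same identity as a sum of rank-1 matrices, checked numerically (the random matrix is illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 3))

# Reduced SVD, then rebuild A one rank-1 term sigma_i * u_i v_i^T at a time.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_sum = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(s)))
assert np.allclose(A, A_sum)
```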

  8. SVD Proof • Any m x n matrix A has two symmetric covariance matrices: $AA^T$ (m x m) and $A^TA$ (n x n)

  9. Spectral Decomposition of Covariance Matrices • (m x m) $AA^T = U \Lambda_1 U^T$ • the columns of U are called the left singular vectors of A • (n x n) $A^TA = V \Lambda_2 V^T$ • the columns of V are called the right singular vectors of A • Claim: the nonzero eigenvalues of $\Lambda_1$ and $\Lambda_2$ are the same
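A numerical check of the claim, assuming NumPy (the random 5 x 3 matrix is illustrative): the two covariance matrices share their nonzero eigenvalues, and the larger one just picks up extra zeros.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 3))

lam1 = np.linalg.eigvalsh(A @ A.T)   # 5 eigenvalues, ascending
lam2 = np.linalg.eigvalsh(A.T @ A)   # 3 eigenvalues, ascending

# The 3 largest eigenvalues agree; the remaining 2 of lam1 are ~0.
assert np.allclose(lam1[-3:], lam2)
assert np.allclose(lam1[:2], 0, atol=1e-10)
```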

  10. Singular Value Decomposition Proof • From $A^TA = V \Lambda_2 V^T$, set $\sigma_i = \sqrt{\lambda_i}$ and $u_i = A v_i / \sigma_i$ for $i = 1, \ldots, r$; the $u_i$ are orthonormal, so $AV = U\Sigma$ and hence $A = U \Sigma V^T$.
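The proof as computation, a sketch assuming A has full column rank so no $\sigma_i$ is zero:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 3))

# Eigendecomposition of A^T A, reordered to descending eigenvalues.
lam, V = np.linalg.eigh(A.T @ A)
lam, V = lam[::-1], V[:, ::-1]

sigma = np.sqrt(lam)         # singular values sigma_i = sqrt(lambda_i)
U = A @ V / sigma            # u_i = A v_i / sigma_i, column by column

assert np.allclose(U.T @ U, np.eye(3))            # the u_i are orthonormal
assert np.allclose(A, U @ np.diag(sigma) @ V.T)   # A = U Sigma V^T
```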

  11. All Singular Values Are Nonnegative • $\sigma_i^2 = \lambda_i(A^TA) \ge 0$, since $x^T A^T A x = \|Ax\|^2 \ge 0$ makes $A^TA$ positive semidefinite.

  12. Row and Column Space Projection • Suppose A is an m by n matrix of rank r, with r << n and r << m. • Then A has r non-zero singular values • Let $A = U \Sigma V^T$ be the reduced SVD of A, where $\Sigma$ is an r by r diagonal matrix • Examine the reduced factorization:

  13. The Singular Value Projection $A = U \Sigma V^T$, with shapes (m x n) = (m x r)(r x r)(r x n)

  14. Therefore • Rows of $U\Sigma$ are r-dimensional projections of the rows of A • Columns of $\Sigma V^T$ are r-dimensional projections of the columns of A • So we can compute their distances or dot products in a lower-dimensional space
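A sketch of why dot products survive the projection: $AA^T = (U\Sigma V^T)(V \Sigma^T U^T) = (U\Sigma)(U\Sigma)^T$, so row dot products of A equal dot products of the r-dimensional rows of $U\Sigma$ (the rank-1 matrix below is an illustrative choice):

```python
import numpy as np

A = np.outer([1.0, 2.0, 3.0, 4.0], [2.0, 1.0])   # 4x2, rank r = 1

U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = np.sum(s > 1e-10)
rows_proj = U[:, :r] * s[:r]     # rows of U Sigma: r numbers per row of A

# All pairwise dot products between rows of A, computed in r dimensions.
assert np.allclose(A @ A.T, rows_proj @ rows_proj.T)
```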

  15. Eigenvalues and Determinants • Product law: $\det(A) = \lambda_1 \lambda_2 \cdots \lambda_n$ • Summation law: $\mathrm{trace}(A) = \lambda_1 + \lambda_2 + \cdots + \lambda_n$ • Both can be proved by examining the characteristic polynomial
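Both laws checked numerically (the random 4 x 4 matrix is illustrative; the eigenvalues may be complex, but their product and sum are real):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((4, 4))

lam = np.linalg.eigvals(A)   # possibly complex, in conjugate pairs

# Product law: det(A) = product of eigenvalues.
assert np.allclose(np.linalg.det(A), np.prod(lam).real, atol=1e-8)
# Summation law: trace(A) = sum of eigenvalues.
assert np.allclose(np.trace(A), np.sum(lam).real, atol=1e-8)
```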

  16. Eigenvalues and Pivots If A is symmetric, the number of positive (negative) eigenvalues equals the number of positive (negative) pivots in $A = LDL^T$. Topological proof: scale the off-diagonal entries of L continuously down to 0, i.e., move L continuously to I. Any sign change in an eigenvalue must cross 0, and no eigenvalue of $L(t) D L(t)^T$ can be 0 because its determinant equals $\det(D) \ne 0$ throughout; at L = I the eigenvalues are exactly the pivots.
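A sketch of the statement using SciPy's $LDL^T$ factorization (assuming SciPy is available; the 3 x 3 indefinite matrix is an illustrative choice). The signs of the eigenvalues of the block-diagonal factor D play the role of the pivot signs:

```python
import numpy as np
from scipy.linalg import ldl

A = np.array([[ 2.0,  1.0,  0.0],
              [ 1.0, -3.0,  1.0],
              [ 0.0,  1.0,  1.0]])   # symmetric, indefinite

L, D, perm = ldl(A)                  # D may contain 1x1 or 2x2 blocks
assert np.allclose(L @ D @ L.T, A)

pivots = np.linalg.eigvalsh(D)       # pivot signs, read off D
eigs = np.linalg.eigvalsh(A)

# Sylvester's law of inertia: the sign counts agree.
assert np.sum(pivots > 0) == np.sum(eigs > 0)
assert np.sum(pivots < 0) == np.sum(eigs < 0)
```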

  17. Next Lecture • Dimensionality reduction for Latent Semantic Analysis • Eigenvalue problems in Web analysis
