
Numerical Analysis – Linear Equations (II)



  1. Numerical Analysis – Linear Equations (II) Hanyang University Jong-Il Park

  2. Singular Value Decomposition (SVD) • Why SVD? • Gaussian elimination and LU decomposition fail to give satisfactory results for singular or numerically near-singular matrices • SVD can cope with over- or under-determined problems • SVD constructs orthonormal basis vectors

  3. What is SVD? Any M×N matrix A whose number of rows M is greater than or equal to its number of columns N can be written as the product of an M×N column-orthogonal matrix U, an N×N diagonal matrix W with positive or zero elements (the singular values), and the transpose of an N×N orthogonal matrix V: A = U W Vᵀ

  4. Properties of SVD • Orthonormality • UᵀU = I ⇒ U⁻¹ = Uᵀ • VᵀV = I ⇒ V⁻¹ = Vᵀ • Uniqueness • The decomposition can always be done, no matter how singular the matrix is, and it is almost unique.
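
As a concrete check of the definition and the orthonormality properties, here is a minimal NumPy sketch (the matrix is an arbitrary illustration, not from the slides):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])                        # M = 3 rows >= N = 2 columns

# Thin SVD: U is MxN column-orthogonal, w holds the singular values,
# and Vt is V^T for the NxN orthogonal matrix V.
U, w, Vt = np.linalg.svd(A, full_matrices=False)

print(np.allclose(A, U @ np.diag(w) @ Vt))        # True: A = U W V^T
print(np.allclose(U.T @ U, np.eye(2)))            # U^T U = I
print(np.allclose(Vt @ Vt.T, np.eye(2)))          # V^T V = I
```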

  5. SVD of a square matrix • Columns of U: an orthonormal set of basis vectors • Columns of V whose corresponding w_j's are zero: an orthonormal basis for the nullspace

  6. Review of basic concepts in linear algebra

  7. Homogeneous equation • Homogeneous equations (b = 0) with singular A • Any column of V whose corresponding w_j is zero yields a solution
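
A small NumPy sketch of this fact (the singular matrix below is an arbitrary example): the rows of Vᵀ whose singular values are numerically zero are nullspace solutions of Ax = 0.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],          # row 2 = 2 * row 1 -> A is singular
              [1.0, 0.0, 1.0]])

U, w, Vt = np.linalg.svd(A)
null_vecs = Vt[w < 1e-12]               # rows of V^T <=> columns of V

for x in null_vecs:
    print(np.allclose(A @ x, 0.0))      # True: each one solves Ax = 0
```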

  8. Nonhomogeneous equation • Nonhomogeneous equation with singular A: x = V·diag(1/w_j)·Uᵀb, where we replace 1/w_j by zero if w_j = 0 ※ The solution x obtained by this method is the solution vector of smallest length. ※ If b is not in the range of A, SVD finds the solution x in the least-squares sense, i.e. the x that minimizes r = |Ax − b|
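
A sketch of this recipe in NumPy (matrix and right-hand side are illustrative): build x = V·diag(1/w_j)·Uᵀb with 1/w_j zeroed wherever w_j vanishes.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],                  # singular matrix
              [1.0, 0.0, 1.0]])
b = np.array([6.0, 12.0, 2.0])                  # b lies in the range of A

U, w, Vt = np.linalg.svd(A)
w_inv = np.array([1.0 / wj if wj > 1e-12 else 0.0 for wj in w])
x = Vt.T @ np.diag(w_inv) @ (U.T @ b)           # x = V diag(1/w_j) U^T b

print(np.allclose(A @ x, b))                    # True; x has minimal length
```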

  9. SVD solution - concept

  10. SVD – under/over-determined problems • SVD for fewer equations than unknowns • SVD for more equations than unknowns • SVD yields the least-squares solution • In general, non-singular
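
For the more-equations-than-unknowns case, a sketch with made-up data: fitting y ≈ c₀ + c₁t by solving a 4×2 overdetermined system through the SVD, which yields the least-squares coefficients.

```python
import numpy as np

t = np.array([0.0, 1.0, 2.0, 3.0])            # made-up sample points
y = np.array([1.1, 1.9, 3.2, 3.9])
A = np.column_stack([np.ones_like(t), t])     # 4 equations, 2 unknowns

U, w, Vt = np.linalg.svd(A, full_matrices=False)
c = Vt.T @ ((U.T @ y) / w)                    # safe here: all w_j > 0

print(c)                                      # roughly [1.07, 0.97]
```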

  11. Applications of SVD • Applications • Constructing an orthonormal basis • M-dimensional vector space • Problem: Given N vectors, find an orthonormal basis • Solution: Columns of the matrix U are the desired orthonormal basis • Approximation of Matrices
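
For the matrix-approximation application, the usual construction (presumably what the slide intends) keeps only the k largest singular values; a sketch with a random test matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))                   # arbitrary test matrix

U, w, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
A_k = U[:, :k] @ np.diag(w[:k]) @ Vt[:k, :]       # best rank-k approximation

print(np.linalg.norm(A - A_k, 2))                 # equals w[k], the largest
print(w[k])                                       # discarded singular value
```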

  12. Vector norm

  13. l2 norm

  14. l∞ norm

  15. Distance between vectors
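
The formulas on slides 12–15 did not survive the transcript; the standard definitions the titles refer to are:

```latex
\|\mathbf{x}\|_2 = \Bigl(\sum_{i=1}^{n} x_i^2\Bigr)^{1/2},
\qquad
\|\mathbf{x}\|_\infty = \max_{1 \le i \le n} |x_i|,
\qquad
d(\mathbf{x}, \mathbf{y}) = \|\mathbf{x} - \mathbf{y}\|
```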

  16. Natural matrix norm • Eg.

  17. l∞ norm of a matrix
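
Again the slide formulas are missing; the standard definitions for slides 16–17 are the induced (natural) norm and, for the l∞ case, the maximum absolute row sum:

```latex
\|A\| = \max_{\|\mathbf{x}\| = 1} \|A\mathbf{x}\|,
\qquad
\|A\|_\infty = \max_{1 \le i \le n} \sum_{j=1}^{n} |a_{ij}|
```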

  18. Eigenvalues and eigenvectors * To be discussed later in detail.

  19. Spectral radius

  20. Convergent matrix equivalences
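
For slides 18–20 the standard statements are: an eigenpair satisfies Av = λv, the spectral radius is the largest eigenvalue magnitude, and a matrix is convergent exactly when its powers die out:

```latex
A\mathbf{v} = \lambda \mathbf{v},
\qquad
\rho(A) = \max_i |\lambda_i|,
\qquad
\lim_{k \to \infty} A^k = 0 \;\Longleftrightarrow\; \rho(A) < 1
```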

  21. Convergence of a sequence • An important connection exists between the eigenvalues of the matrix T and the expectation that the iterative method will converge → the spectral radius

  22. Iterative Methods – Jacobi Iteration

  23. Jacobi Iteration • Initial guess • Convergence condition: the Jacobi iteration is convergent, irrespective of the initial guess, if the matrix A is diagonally dominant: |a_ii| > Σ_{j≠i} |a_ij| for every row i
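
A minimal Jacobi sketch in NumPy (the diagonally dominant system is an arbitrary illustration): every component of the new iterate is computed from the old iterate only.

```python
import numpy as np

def jacobi(A, b, x0, iters=50):
    D = np.diag(A)                     # diagonal entries a_ii
    R = A - np.diagflat(D)             # off-diagonal part of A
    x = x0.copy()
    for _ in range(iters):
        x = (b - R @ x) / D            # every component uses the OLD x
    return x

A = np.array([[4.0, 1.0, 1.0],         # diagonally dominant:
              [1.0, 5.0, 2.0],         # |a_ii| > sum of |a_ij|, j != i
              [1.0, 2.0, 6.0]])
b = np.array([6.0, 8.0, 9.0])

print(jacobi(A, b, np.zeros(3)))       # converges for any initial guess
print(np.linalg.solve(A, b))           # direct solution, for comparison
```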

  24. Eg. Jacobi iteration

  25. Gauss-Seidel Iteration • Idea: utilize the most recently updated values • Iteration formula • Convergence condition: the same as for the Jacobi iteration • Advantage over Jacobi iteration: faster convergence
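
The same illustrative system solved with a Gauss-Seidel sketch: the only change from Jacobi is that each freshly updated component is used immediately within the sweep.

```python
import numpy as np

def gauss_seidel(A, b, x0, iters=50):
    n = len(b)
    x = x0.copy()
    for _ in range(iters):
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i+1:] @ x[i+1:]
            x[i] = (b[i] - s) / A[i, i]   # x[:i] is already updated
    return x

A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [1.0, 2.0, 6.0]])
b = np.array([6.0, 8.0, 9.0])

print(gauss_seidel(A, b, np.zeros(3)))
```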

  26. Eg. Gauss-Seidel iteration

  27. Jacobi vs. Gauss-Seidel • Comparison: Eg. 1 (Jacobi) vs. Eg. 2 (Gauss-Seidel); Gauss-Seidel shows faster convergence

  28. Variation of Gauss-Seidel Iteration • Successive Over-Relaxation (SOR): fast convergence; well-suited for linear problems • Successive Under-Relaxation (SUR): slow convergence but stable; well-suited for nonlinear problems
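
A sketch of the relaxation idea (ω = 1.25 is an arbitrary choice): blend the Gauss-Seidel candidate with the old value; ω > 1 over-relaxes, ω < 1 under-relaxes, and ω = 1 recovers plain Gauss-Seidel.

```python
import numpy as np

def sor(A, b, x0, omega=1.25, iters=50):
    n = len(b)
    x = x0.copy()
    for _ in range(iters):
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i+1:] @ x[i+1:]
            x_gs = (b[i] - s) / A[i, i]            # Gauss-Seidel candidate
            x[i] = (1 - omega) * x[i] + omega * x_gs
    return x

A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [1.0, 2.0, 6.0]])
b = np.array([6.0, 8.0, 9.0])

print(sor(A, b, np.zeros(3)))          # omega = 1 reduces to Gauss-Seidel
```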

  29. Eg. Gauss-Seidel vs. SOR

  30. (cont.) SOR shows faster convergence

  31. Iterative Improvement of a Solution • Iterative improvement • x: exact solution of Ax = b • x + δx: contaminated solution • b + δb = A(x + δx): wrong product • Algorithm derivation: since Ax = b, A·δx = δb = A(x + δx) − b. Solving this equation for δx gives the improvement; repeat the procedure until convergence.
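
A sketch of the improvement cycle (system and perturbation invented for illustration; a production version would reuse the LU factors of A instead of calling solve repeatedly):

```python
import numpy as np

A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [1.0, 2.0, 6.0]])
b = np.array([6.0, 8.0, 9.0])

x = np.linalg.solve(A, b) + 1e-4       # deliberately contaminated solution
for _ in range(3):
    r = A @ x - b                      # computable residual delta-b
    dx = np.linalg.solve(A, r)         # solve A * dx = delta-b
    x = x - dx                         # apply the correction
print(np.max(np.abs(A @ x - b)))       # residual shrinks toward round-off
```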

  32. Numerical Aspect of Iterative Improvement • Numerical aspect • Using LU decomposition: once we have the decomposition, just a substitution using the existing LU factors will give the incremental correction • (Refer to mprove() on p.56 of NR in C) • Measure of ill-conditioning • A′: normalized matrix • If ‖A′⁻¹‖ is large → ill-conditioned → read Box 10.1
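
The ill-conditioning measure can be probed directly (illustrative near-singular matrix): NumPy's cond computes the condition number ‖A‖·‖A⁻¹‖.

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])   # nearly dependent rows -> nearly singular
print(np.linalg.cond(A))        # ~4e4: input errors amplified ~40000x
```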

  33. Application: Circuit analysis • Kirchhoff's current and voltage laws • Current rule: 4 nodes • Voltage rule: 2 meshes • 6 unknowns, 6 equations
