
Support Vector Machine


Presentation Transcript


  1. Le Do Hoang Nam – CNTN08 Support Vector Machine

  2. Linear Programming • General form with x in R^n • Linear objective, linear constraints, …
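
The general form on this slide was shown as an image; reconstructed in standard notation (the usual LP statement, not copied from the slide), it reads:

```latex
\min_{x \in \mathbb{R}^n} \; c^\top x
\quad \text{s.t.} \quad Ax \le b, \;\; x \ge 0
```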

  3. Linear Programming • An example: the Diet Problem • How do we come up with the cheapest meal that meets all nutrition standards?

  4. Linear Programming • Let x1, x2, and x3 be the amounts, in kilograms, of carrot, cabbage, and cucumber in the dish. • Mathematically:
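
The formulation on this slide was an image. A symbolic reconstruction, where the prices p_i, nutrient contents a_{ij}, and requirements r_j are placeholder symbols rather than the slide's actual numbers:

```latex
\begin{aligned}
\min_{x_1, x_2, x_3} \;\; & p_1 x_1 + p_2 x_2 + p_3 x_3 \\
\text{s.t.} \;\; & a_{1j} x_1 + a_{2j} x_2 + a_{3j} x_3 \ge r_j \quad \text{for each nutrient } j, \\
& x_1, x_2, x_3 \ge 0
\end{aligned}
```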

  5. Linear Programming • In canonical form: • How to solve? • Simplex. • Newton's method. • Gradient descent.
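
As a concrete illustration (not part of the original slides), a diet LP of this shape can be solved with scipy.optimize.linprog; every number below is a made-up placeholder:

```python
# Toy diet LP; all prices and nutrient values are made-up placeholders.
from scipy.optimize import linprog

cost = [0.75, 0.50, 0.90]        # price per kg: carrot, cabbage, cucumber

# Nutrition requirements are ">=" constraints; linprog only accepts "<=",
# so both sides are negated.
A_ub = [
    [-35.0, -0.5, -0.5],         # -(vitamin A per kg of each vegetable)
    [-60.0, -300.0, -10.0],      # -(vitamin C per kg of each vegetable)
]
b_ub = [-50.0, -100.0]           # -(daily requirements)

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
print(res.x, res.fun)            # optimal amounts in kg and the minimal cost
```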

  6. LP and Classification • Given a set of N samples (mi, li): • mi is the feature vector. • li = -1 or 1 is the label. • If a sample is correctly classified by a hyperplane wTx + c = 0 (with (w, c) scaled suitably), then: li(wTmi + c) ≥ 1, which is linear in (w, c).

  7. LP and Classification • (w, c) is a good classifier if it satisfies: li(wTmi + c) ≥ 1, i = 1..N, which are linear constraints → LP form:
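
The LP form on the slide was an image; the feasibility problem it describes is, reconstructed:

```latex
\text{find } (w, c)
\quad \text{s.t.} \quad l_i \left( w^\top m_i + c \right) \ge 1, \quad i = 1, \dots, N
```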

  8. LP and Classification • Without any objective function, we have ALL possible solutions: [Figure: two linearly separable classes (Class 1, Class 2) with several valid separating hyperplanes.]

  9. LP and Classification • If the data is not linearly separable: → minimize the number of errors. [Figure: overlapping Class 1 and Class 2 samples that no hyperplane separates perfectly.]

  10. LP and Classification • Our objective becomes: • But the cardinal (0-1 counting) function is non-linear → not an LP.
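
The objective on the slide was an image; reconstructed from context, it counts the misclassified samples with a 0-1 indicator:

```latex
\min_{w, c} \; \sum_{i=1}^{N} \mathbf{1}\!\left[\, l_i \left( w^\top m_i + c \right) < 1 \,\right]
```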

  11. LP and Classification • Cardinal function: f(x). • Solution: approximate it with the hinge-loss function. [Figure: plot of the 0-1 cardinal function f(x), dropping from 1 to 0 at x = 1.]

  12. LP and Classification • Hinge-loss function: • Or: [Figure: plot of the hinge loss f(x), equal to 0 for x ≥ 1 and rising linearly as x falls below 1.]
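
The two formulas on this slide were images; the standard hinge loss they refer to is:

```latex
f(x) = \max(0, \; 1 - x)
\qquad \text{or, equivalently,} \qquad
f(x) = (1 - x)_+
```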

  13. LP and Classification • The classification problem now becomes the following, which can be solved as an LP:
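
The formulation was an image; introducing a slack variable εi for each sample's hinge loss gives the standard LP it corresponds to:

```latex
\begin{aligned}
\min_{w, c, \varepsilon} \;\; & \sum_{i=1}^{N} \varepsilon_i \\
\text{s.t.} \;\; & l_i \left( w^\top m_i + c \right) \ge 1 - \varepsilon_i, \quad
\varepsilon_i \ge 0, \quad i = 1, \dots, N
\end{aligned}
```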

  14. LP and Classification • Geometry view: [Figure: parallel hyperplanes wTx + c = 1, wTx + c = 0, and wTx + c = -1 between Class 1 and Class 2; samples mi and mj on the wrong side of their margin carry slacks εi and εj.]

  15. LP and Classification • Another problem: some samples are uncertain. [Figure: Class 1 and Class 2 samples lying close to the decision boundary.]

  16. LP and Classification • Solution: maximize the margin d. [Figure: separating hyperplane with a margin of width d between Class 1 and Class 2.]

  17. LP and Classification • All samples are outside the margin. • Every sample's distance to the boundary is at least d/2. That means:
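
The condition was shown as an image; written out with the point-to-hyperplane distance, it reads:

```latex
\frac{l_i \left( w^\top m_i + c \right)}{\lVert w \rVert} \ge \frac{d}{2}, \quad i = 1, \dots, N
```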

  18. LP and Classification • Because the hyperplane is invariant to rescaling of (w, c), we can choose w such that: • The objective function:
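
Both formulas on this slide were images; the usual normalization and the objective it yields are, reconstructed:

```latex
\min_i \; l_i \left( w^\top m_i + c \right) = 1
\quad \Longrightarrow \quad
\frac{d}{2} = \frac{1}{\lVert w \rVert},
\qquad
\max_{w, c} \; d \;\; \Longleftrightarrow \;\; \min_{w, c} \; \lVert w \rVert
```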

  19. LP and Classification • The problem now becomes:
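
The formulation was an image; the hard-margin problem it describes is, reconstructed:

```latex
\begin{aligned}
\min_{w, c} \;\; & \lVert w \rVert \\
\text{s.t.} \;\; & l_i \left( w^\top m_i + c \right) \ge 1, \quad i = 1, \dots, N
\end{aligned}
```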

  20. Support Vector Machine • Together with the error minimization, we obtain the SVM: • λ sets the trade-off between error and robustness.
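
The SVM formulation was an image; combining the margin objective with the slack penalties gives the standard soft-margin problem:

```latex
\begin{aligned}
\min_{w, c, \varepsilon} \;\; & \lVert w \rVert + \lambda \sum_{i=1}^{N} \varepsilon_i \\
\text{s.t.} \;\; & l_i \left( w^\top m_i + c \right) \ge 1 - \varepsilon_i, \quad
\varepsilon_i \ge 0, \quad i = 1, \dots, N
\end{aligned}
```

In practice this is essentially the model behind scikit-learn's LinearSVC, which minimizes the squared norm with C playing the role of λ; a minimal sketch on made-up toy data:

```python
# Minimal linear SVM sketch; the toy data below is made up.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 1.0, (50, 2)),   # class -1 cluster
               rng.normal(+2.0, 1.0, (50, 2))])  # class +1 cluster
y = np.array([-1] * 50 + [1] * 50)

# C plays the role of lambda: large C penalizes errors harder,
# small C favors a wider, more robust margin.
clf = LinearSVC(C=1.0)
clf.fit(X, y)
print(clf.coef_, clf.intercept_)                 # learned w and c
```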

  21. Kernel Method
