Artificial Intelligence 6. Machine Learning, Version Space Method



  1. Artificial Intelligence 6. Machine Learning, Version Space Method. Japan Advanced Institute of Science and Technology (JAIST), Yoshimasa Tsuruoka

  2. Outline • Introduction to machine learning • What is machine learning? • Applications of machine learning • Version space method • Representing hypotheses, version space • Find-S algorithm • Candidate-Elimination algorithm • http://www.jaist.ac.jp/~tsuruoka/lectures/

  3. Recognizing handwritten digits Hastie, Tibshirani and Friedman (2008). The Elements of Statistical Learning (2nd edition). Springer-Verlag.

  4. Natural language processing • GENIA tagger • Tokenization • Part-of-speech tagging • Shallow parsing • Named entity recognition http://www-tsujii.is.s.u-tokyo.ac.jp/GENIA/tagger/

  5. Applications of machine learning • Image/speech recognition • Part-of-speech tagging, syntactic parsing, word sense disambiguation • Detection of spam emails • Intrusion detection • Credit card fraud detection • Automatic driving • AI players in computer games • etc.

  6. Types of machine learning • Supervised learning • “correct” output is given for each instance • Unsupervised learning • No output is given • Analyses relations between instances • Reinforcement learning • Supervision is given via “rewards”

  7. Applications of unsupervised learning • Search engine + clustering http://clusty.com

  8. Reinforcement learning • Autonomous helicopters and robots Inverted autonomous helicopter flight via reinforcement learning, Andrew Y. Ng, Adam Coates, Mark Diel, Varun Ganapathi, Jamie Schulte, Ben Tse, Eric Berger and Eric Liang. In International Symposium on Experimental Robotics, 2004. Quadruped Robot Obstacle Negotiation via Reinforcement Learning, Honglak Lee, Yirong Shen, Chih-Han Yu, Gurjeet Singh, and Andrew Y. Ng. In Proceedings of the IEEE International Conference on Robotics and Automation, 2006.

  9. What is machine learning? • What does a machine learn? • What “machine learning” can do: • Classification, regression, structured prediction • Clustering • Machine learning involves • Numerical optimization, probabilistic modeling, graphs, search, logic, etc.

  10. Why machine learning? • Why not write rules manually? • Detecting spam emails • If the mail contains the word “Nigeria” then it is spam • If the mail comes from IP X.X.X.X then it is spam • If the mail contains a large image then it is spam • … • Too many rules • Hard to keep consistency • Each rule may not be completely correct

  11. Version space method (Chapter 2 of Mitchell, T., Machine Learning, 1997) • Concept learning • Training examples • Representing hypotheses • Find-S algorithm • Version space • Candidate-Elimination algorithm

  12. Learning a concept with examples • Training examples • The concept we want to learn: days on which my friend Aldo enjoys his favorite water sports • The training examples and their attributes (attribute names besides Weather and Wind follow Mitchell, 1997):

    Example  Weather  AirTemp  Humidity  Wind    Water  Forecast  EnjoySport
    x1       Sunny    Warm     Normal    Strong  Warm   Same      Yes
    x2       Sunny    Warm     High      Strong  Warm   Same      Yes
    x3       Rainy    Cold     High      Strong  Warm   Change    No
    x4       Sunny    Warm     High      Strong  Cool   Change    Yes

  13. Hypotheses • Representing hypotheses: h1 = <Sunny, ?, ?, Strong, ?, ?> means Weather = Sunny, Wind = Strong (the other attributes can take any values); h2 = <Sunny, ?, ?, ?, ?, ?> means Weather = Sunny • General and specific: h1 is more specific than h2 (h2 is more general than h1)
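A minimal Python sketch of this representation: a hypothesis is a tuple over the six attributes, where "?" matches any value and "0" is the empty constraint that matches nothing. The function names (matches, more_general_or_equal) are illustrative, not from the lecture:

    def matches(h, x):
        """True if hypothesis h covers instance x."""
        return all(a == "?" or a == v for a, v in zip(h, x))

    def more_general_or_equal(h2, h1):
        """True if h2 is at least as general as h1, i.e. h2 covers
        every instance that h1 covers."""
        if "0" in h1:
            return True  # the empty hypothesis is more specific than everything
        return all(a2 == "?" or a2 == a1 for a2, a1 in zip(h2, h1))

    h1 = ("Sunny", "?", "?", "Strong", "?", "?")
    h2 = ("Sunny", "?", "?", "?", "?", "?")
    print(more_general_or_equal(h2, h1))  # True: h2 is more general than h1
    print(more_general_or_equal(h1, h2))  # False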

  14. Find-S Algorithm • Initialize h to the most specific hypothesis in H • For each positive training instance x • For each attribute constraint ai in h • If the constraint ai is satisfied by x • Then do nothing • Else replace ai in h by the next more general constraint that is satisfied by x • Output hypothesis h
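A minimal sketch of Find-S under the tuple representation above; the training set D is the Aldo data from slide 15 below, and the output reproduces the trace on that slide:

    def find_s(examples):
        """Find-S: start from the most specific hypothesis and generalize
        it just enough to cover each positive example."""
        h = ["0"] * len(examples[0][0])
        for x, label in examples:
            if label != "yes":
                continue  # Find-S ignores negative examples
            for i, xi in enumerate(x):
                if h[i] == "0":
                    h[i] = xi   # first positive example: adopt its value
                elif h[i] != xi:
                    h[i] = "?"  # generalize a violated constraint
        return tuple(h)

    D = [  # the four training examples from slide 15
        (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), "yes"),
        (("Sunny", "Warm", "High", "Strong", "Warm", "Same"), "yes"),
        (("Rainy", "Cold", "High", "Strong", "Warm", "Change"), "no"),
        (("Sunny", "Warm", "High", "Strong", "Cool", "Change"), "yes"),
    ]
    print(find_s(D))  # ('Sunny', 'Warm', '?', 'Strong', '?', '?')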

  15. Example
  h0 = <0, 0, 0, 0, 0, 0>
  x1 = <Sunny, Warm, Normal, Strong, Warm, Same>, yes → h1 = <Sunny, Warm, Normal, Strong, Warm, Same>
  x2 = <Sunny, Warm, High, Strong, Warm, Same>, yes → h2 = <Sunny, Warm, ?, Strong, Warm, Same>
  x3 = <Rainy, Cold, High, Strong, Warm, Change>, no → h3 = <Sunny, Warm, ?, Strong, Warm, Same>
  x4 = <Sunny, Warm, High, Strong, Cool, Change>, yes → h4 = <Sunny, Warm, ?, Strong, ?, ?>

  16. Problems with the Find-S algorithm • It is not clear whether the output hypothesis is the “correct” hypothesis • There can be other hypotheses that are consistent with the training examples • Why prefer the most specific hypothesis? • Cannot detect when the training data is inconsistent

  17. Version Space • Definition • Hypothesis space H • Training examples D • Version space: the subset of hypotheses from H consistent with the training examples in D
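Stated as code, the defining test is a one-line consistency check; a sketch reusing matches() and the training set D from the sketches above:

    def consistent(h, examples):
        """h is consistent with D if it classifies every example correctly."""
        return all(matches(h, x) == (label == "yes") for x, label in examples)

    print(consistent(("Sunny", "Warm", "?", "Strong", "?", "?"), D))  # True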

  18. LIST-THEN-ELIMINATE algorithm • VersionSpace ← a list containing every hypothesis in H • For each training example <x, c(x)> • Remove from VersionSpace any hypothesis h for which h(x) ≠ c(x) • Output the list of hypotheses in VersionSpace
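A sketch of LIST-THEN-ELIMINATE for this tiny domain, reusing consistent() and D from above. Brute-force enumeration of H is only feasible because the attribute value sets are so small; the value lists are taken from the training data plus "?", and hypotheses containing "0" are omitted since they classify nothing as positive:

    from itertools import product

    values = [
        ["Sunny", "Rainy", "?"], ["Warm", "Cold", "?"], ["Normal", "High", "?"],
        ["Strong", "?"], ["Warm", "Cool", "?"], ["Same", "Change", "?"],
    ]
    H = list(product(*values))  # every combination of attribute constraints
    version_space = [h for h in H if consistent(h, D)]
    print(len(version_space))  # 6, exactly the hypotheses shown on slide 19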

  19. Version Space • Specific boundary and general boundary
  S: { <Sunny, Warm, ?, Strong, ?, ?> }
  <Sunny, ?, ?, Strong, ?, ?>   <Sunny, Warm, ?, ?, ?, ?>   <?, Warm, ?, Strong, ?, ?>
  G: { <Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?> }
  • The version space can be represented with S and G; you don’t have to list all the hypotheses.
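Continuing the sketch, one can check that these boundaries really delimit the version space computed above: every consistent hypothesis lies between some member of S and some member of G in the generality ordering.

    S = [("Sunny", "Warm", "?", "Strong", "?", "?")]
    G = [("Sunny", "?", "?", "?", "?", "?"), ("?", "Warm", "?", "?", "?", "?")]
    print(all(
        any(more_general_or_equal(h, s) for s in S) and
        any(more_general_or_equal(g, h) for g in G)
        for h in version_space
    ))  # True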

  20. Candidate-Elimination algorithm • Initialization • G: the set of maximally general hypotheses in H • S: the set of maximally specific hypotheses in H • For each training example d, do • If d is a positive example • Remove from G any hypothesis inconsistent with d • For each hypothesis s in S that is not consistent with d • Remove s from S • Add to S all minimal generalizations h of s such that • h is consistent with d, and some member of G is more general than h • Remove from S any hypothesis that is more general than another hypothesis in S • If d is a negative example • …
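Below is a minimal sketch of the positive-example branch of Candidate-Elimination, reusing matches(), more_general_or_equal(), and D from the earlier sketches; the negative-example branch, which dually specializes G, is elided here as on the slide. For this conjunctive representation the minimal generalization of s that covers x is unique. Replaying the first two (positive) training examples reproduces S1 and S2 from the slides that follow:

    def min_generalization(s, x):
        """The unique minimal generalization of s that covers x."""
        return tuple(xi if a == "0" else (a if a == xi else "?")
                     for a, xi in zip(s, x))

    def update_positive(S, G, x):
        """Candidate-Elimination update for a positive example x."""
        G = [g for g in G if matches(g, x)]  # drop inconsistent members of G
        S2 = []
        for s in S:
            if matches(s, x):
                S2.append(s)
            else:
                h = min_generalization(s, x)  # minimally generalize s
                if any(more_general_or_equal(g, h) for g in G):
                    S2.append(h)
        # keep only the maximally specific members of S
        S = [s for s in S2
             if not any(t != s and more_general_or_equal(s, t) for t in S2)]
        return S, G

    S, G = [("0",) * 6], [("?",) * 6]
    for x, label in D[:2]:  # the first two training examples are positive
        S, G = update_positive(S, G, x)
    print(S)  # [('Sunny', 'Warm', '?', 'Strong', 'Warm', 'Same')] = S2, slide 22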

  21. Example: 1st training example <Sunny, Warm, Normal, Strong, Warm, Same>, yes
  S0: { <0, 0, 0, 0, 0, 0> }
  S1: { <Sunny, Warm, Normal, Strong, Warm, Same> }
  G0, G1: { <?, ?, ?, ?, ?, ?> }

  22. Example: 2nd training example <Sunny, Warm, High, Strong, Warm, Same>, yes
  S1: { <Sunny, Warm, Normal, Strong, Warm, Same> }
  S2: { <Sunny, Warm, ?, Strong, Warm, Same> }
  G0, G1, G2: { <?, ?, ?, ?, ?, ?> }

  23. Example: 3rd training example <Rainy, Cold, High, Strong, Warm, Change>, no
  S2, S3: { <Sunny, Warm, ?, Strong, Warm, Same> }
  G2: { <?, ?, ?, ?, ?, ?> }
  G3: { <Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?>, <?, ?, ?, ?, ?, Same> }

  24. Example: 4th training example <Sunny, Warm, High, Strong, Cool, Change>, yes
  S3: { <Sunny, Warm, ?, Strong, Warm, Same> }
  S4: { <Sunny, Warm, ?, Strong, ?, ?> }
  G3: { <Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?>, <?, ?, ?, ?, ?, Same> }
  G4: { <Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?> }

  25. The final version space
  S4: { <Sunny, Warm, ?, Strong, ?, ?> }
  <Sunny, ?, ?, Strong, ?, ?>   <Sunny, Warm, ?, ?, ?, ?>   <?, Warm, ?, Strong, ?, ?>
  G4: { <Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?> }
