
Understanding Few-Shot Learning: Approaches and Advantages

Few-shot learning trains models from only a handful of labeled examples, using meta-learning algorithms, metric-learning methods, or initialization-based approaches. Adding a compositional regularization to a strong baseline enforces constraints on the learned representation and improves classification performance, particularly in the 1-shot scenario, allowing efficient learning from limited data.


Presentation Transcript


  1. Learning from Constraints in the Few-Shot Learning Scenario, by Gabriele Ciravegna

  2. What is few-shot learning? • The "real application" of transfer learning • Classification of data coming from unseen classes (the Query Set) • Based on a few labelled examples (the Support Set) • Very few examples per class (a.k.a. "shots") • Different from semi-supervised learning (see the episode-sampling sketch below)
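
To make the Support Set / Query Set split concrete, here is a minimal sketch (not from the slides) of sampling an N-way K-shot episode; the dataset layout, function name, and default sizes are illustrative assumptions.

```python
import random
from collections import defaultdict

def sample_episode(dataset, n_way=5, k_shot=1, n_query=15):
    """Sample an N-way K-shot episode: a small Support Set used for
    adaptation and a Query Set of unseen examples to classify.
    `dataset` is assumed to be a list of (example, label) pairs."""
    by_class = defaultdict(list)
    for x, y in dataset:
        by_class[y].append(x)
    # Pick N classes, then K support shots and n_query query shots per class.
    classes = random.sample(list(by_class), n_way)
    support, query = [], []
    for episode_label, c in enumerate(classes):
        shots = random.sample(by_class[c], k_shot + n_query)
        support += [(x, episode_label) for x in shots[:k_shot]]
        query += [(x, episode_label) for x in shots[k_shot:]]
    return support, query
```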

  3. How is it performed? Meta-learning algorithms: • Few-shot learning as a complete training procedure: predictions are always based on both the Query Set Q_{b,n} and the Support Set S_{b,n} • "The prediction is conditioned on a small support set S, because this makes the training procedure explicitly learn to learn from a given small support set" (see the episodic training sketch below)
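
A schematic episodic training loop in that spirit, reusing the sample_episode sketch from slide 2; `model` is an assumed network whose predictions are conditioned on the support set, and the optimizer is a placeholder.

```python
import torch

def meta_train(model, dataset, optimizer, n_episodes=1000):
    """Episodic training: every step mimics the test-time task, so the model
    explicitly 'learns to learn' from a small support set (the S_{b,n} and
    Q_{b,n} of the slides). `model(support, query_xs)` is assumed to return
    per-class logits for the query examples, conditioned on the support set."""
    for _ in range(n_episodes):
        support, query = sample_episode(dataset)        # S_{b,n}, Q_{b,n}
        logits = model(support, [x for x, _ in query])  # conditioned on S
        targets = torch.tensor([y for _, y in query])
        loss = torch.nn.functional.cross_entropy(logits, targets)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```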

  4. How is it performed? (2) Metric-learning methods: • They try to learn a metric (a distance) under which similar images lie close together, so that a query image can be classified by comparing it with the labelled support examples
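
One well-known instantiation of this idea is the prototypical-network approach: average the support embeddings of each class into a prototype, then classify queries by distance to the prototypes. The slides do not commit to a specific metric-learning method, so this is an illustrative sketch assuming an embedding network f.

```python
import torch

def prototype_classify(f, support_x, support_y, query_x, n_way):
    """Metric-learning classification: embed everything with f, build one
    prototype per class from the Support Set, and score each query by its
    negative Euclidean distance to each prototype."""
    z_support = f(support_x)                      # [N*K, dim]
    z_query = f(query_x)                          # [Q, dim]
    prototypes = torch.stack([
        z_support[support_y == c].mean(dim=0) for c in range(n_way)
    ])                                            # [N, dim]
    # Similar images end up close together in the learnt metric space.
    dists = torch.cdist(z_query, prototypes)      # [Q, N]
    return (-dists).softmax(dim=1)                # class probabilities
```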

  5. How is it performed? (3) Initialization-based methods: • These approaches try to learn a good model initialization, from which novel classes can be learnt quickly with only a few gradient updates.
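
MAML is a representative initialization-based method; below is a first-order sketch (my illustration, not from the slides) in which a copy of the shared initialization is adapted on the Support Set and the Query-Set loss then updates the initialization itself. All names are placeholders.

```python
import copy
import torch

def maml_step(model, loss_fn, support_x, support_y,
              query_x, query_y, meta_opt, inner_lr=0.01):
    """First-order MAML sketch: adapt a copy of the shared initialization
    on the Support Set, then use the Query-Set loss to improve the
    initialization so that future adaptation is fast."""
    adapted = copy.deepcopy(model)
    # Inner loop: one gradient step quickly learns the novel classes.
    inner_loss = loss_fn(adapted(support_x), support_y)
    grads = torch.autograd.grad(inner_loss, adapted.parameters())
    with torch.no_grad():
        for p, g in zip(adapted.parameters(), grads):
            p -= inner_lr * g
    # Outer loop: the query loss measures how well the initialization adapts.
    outer_loss = loss_fn(adapted(query_x), query_y)
    outer_loss.backward()
    # First-order approximation: reuse the adapted copy's gradients
    # as gradients for the shared initialization.
    meta_opt.zero_grad()
    for p, p_adapted in zip(model.parameters(), adapted.parameters()):
        p.grad = p_adapted.grad
    meta_opt.step()
```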

  6. A direct approach: "Baseline" [1] • Baseline: standard transfer learning (pre-train a feature extractor on the base classes, then fine-tune a linear classifier on the support set) • Data augmentation • Baseline++: • Weight vectors w_j act as class prototypes • Similarity score s_{i,j} = cos(f_θ(x_i), w_j) • P(y = j | x_i) = softmax(s_i)_j [1] Chen, Wei-Yu, et al. "A Closer Look at Few-shot Classification." ICLR 2019
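
In Chen et al.'s Baseline++, the similarity score is the cosine between the extracted feature and a per-class weight vector, normalized with a softmax. A sketch of such a classification head (the scale factor is a common practical detail, not taken from the slides):

```python
import torch
import torch.nn.functional as F

class CosineClassifier(torch.nn.Module):
    """Baseline++-style head: each class j keeps a weight vector w_j acting
    as a class prototype; the score for image x_i is cos(f(x_i), w_j), and
    P(y = j | x_i) is the softmax over those scores."""
    def __init__(self, feat_dim, n_classes, scale=10.0):
        super().__init__()
        self.weight = torch.nn.Parameter(torch.randn(n_classes, feat_dim))
        self.scale = scale  # sharpens the softmax over cosine scores

    def forward(self, features):
        # Cosine similarity = dot product of L2-normalized vectors.
        scores = F.normalize(features, dim=1) @ F.normalize(self.weight, dim=1).T
        return self.scale * scores  # logits for softmax / cross-entropy
```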

  7. State of the art comparison

  8. Compositional Learning for FSC [2] • Humans learn new categories with just a few examples • A limit of DL: the amount of training data it needs • Motivation: the compositional structure of the human brain [2] Tokmakov, P., et al. "Learning Compositional Representations for Few-Shot Recognition." ICCV 2019.

  9. Compositional Learning for FSC (2) • Adds a compositional regularization to Baseline++ • Uses category-level attributes: a class is defined as having a certain attribute if more than 50% of its images are labelled with it (see the sketch below) • 3 datasets: • CUB • SUN397 • ImageNet
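
The 50% rule for category-level attributes translates directly into code; a small sketch assuming binary image-level attribute annotations:

```python
import numpy as np

def class_attributes(image_attrs, image_labels, n_classes, threshold=0.5):
    """Category-level attributes: a class is said to have an attribute if
    more than 50% of its images are labelled with it.
    image_attrs: [n_images, n_attrs] binary matrix; image_labels: [n_images]."""
    class_attrs = np.zeros((n_classes, image_attrs.shape[1]), dtype=bool)
    for c in range(n_classes):
        mask = image_labels == c
        class_attrs[c] = image_attrs[mask].mean(axis=0) > threshold
    return class_attrs
```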

  10. Compositional Regularization A representation φ(x) is defined compositional over the attribute set D_0 if: • φ(x) = Σ_{d∈D} φ_d(x) Loss with hard constraints: • ℓ_h(x, D) = d( φ(x), Σ_{d∈D} φ_d(x) ) Loss with soft constraints: • ℓ_s(x, D) = −Σ_{d∈D} φ_d(x) ⋅ φ(x) Overall loss: • L(x, y) = ℓ_cls(x, y) + λ ℓ_c(x, D) + γ ( ‖φ(x)‖ − 1 )
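
A sketch of these losses in PyTorch, under two assumptions of mine: the per-attribute components φ_d(x) are available as a tensor with inactive attributes zeroed out, and the third term of the overall loss is a norm penalty keeping ‖φ(x)‖ near 1 (which prevents the soft constraint from inflating norms).

```python
import torch

def hard_constraint_loss(phi_x, phi_parts):
    """Hard constraint: the full representation phi(x) should equal the sum
    of its attribute components (squared Euclidean distance as d).
    phi_x: [B, dim]; phi_parts: [B, n_attrs, dim], inactive attributes zeroed."""
    return ((phi_x - phi_parts.sum(dim=1)) ** 2).sum(dim=1).mean()

def soft_constraint_loss(phi_x, phi_parts):
    """Soft constraint: only encourage each attribute component to align
    with phi(x) via inner products, rather than enforcing exact equality."""
    return -(phi_parts * phi_x.unsqueeze(1)).sum(dim=(1, 2)).mean()

def total_loss(cls_loss, constraint_loss, phi_x, lam=0.1, gamma=0.01):
    """Overall loss: classification + weighted constraint + norm penalty
    (the norm term is an assumption about the slide's third term)."""
    norm_penalty = (phi_x.norm(dim=1) - 1.0).abs().mean()
    return cls_loss + lam * constraint_loss + gamma * norm_penalty
```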

  11. Compositional Learning for FSC - Results

  12. Learning from Constraints in FSC • Similar learning scenario • Improves classification performance, especially in the 1-shot classification scenario Advantage: • Constraints can also be enforced during fine-tuning: L(x) = ℓ_cls(x) + ℓ_c(x) • All evaluation details remain the same (see the fine-tuning sketch below)
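
A sketch of fine-tuning on the novel Support Set with L(x) = ℓ_cls(x) + ℓ_c(x), reusing the constraint loss from slide 10; the attribute-component function, step count, and learning rate are assumptions.

```python
import torch

def finetune_with_constraints(model, head, support_x, support_y,
                              attr_parts_fn, steps=100, lr=0.01):
    """Fine-tune on the Support Set with L(x) = l_cls(x) + l_c(x): the
    compositional constraint keeps shaping the features even with a single
    shot per class, where the classification loss alone overfits easily.
    `attr_parts_fn` is an assumed hook returning the phi_d(x) tensor."""
    params = list(model.parameters()) + list(head.parameters())
    opt = torch.optim.SGD(params, lr=lr)
    for _ in range(steps):
        phi = model(support_x)
        cls_loss = torch.nn.functional.cross_entropy(head(phi), support_y)
        l_c = hard_constraint_loss(phi, attr_parts_fn(support_x))
        loss = cls_loss + l_c
        opt.zero_grad()
        loss.backward()
        opt.step()
```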

  13. Thank you for your attention
