
PSMS for Neural Networks on the Agnostic vs Prior Knowledge Challenge




  1. PSMS for Neural Networks on the Agnostic vs Prior Knowledge Challenge Hugo Jair Escalante, Manuel Montes and Enrique Sucar Computer Science Department, National Institute of Astrophysics, Optics and Electronics (INAOE), México IJCNN-2007 ALvsPK Challenge Orlando, Florida, August 17, 2007

  2. Outline • Introduction • Particle swarm optimization • Particle swarm model selection • Results • Conclusions

  3. Introduction: model selection • Agnostic learning • General-purpose methods • No knowledge of the task at hand or of machine learning is required • Prior knowledge • Prior knowledge can increase a model’s accuracy • Domain expertise is needed

  4. Introduction • Problem: Given a set of preprocessing methods, feature selection and learning algorithms (CLOP), select the best combination of them, together with their hyperparameters • Solution: Bio-inspired search strategy (PSO) Fish schooling Bird flocking

  5. Particle swarm optimization (PSO) • A population of individuals is created (Swarm) • Each individual (particle) represents a solution to the problem at hand • Particles fly through the search space by considering the best global and individual solutions • A fitness function is used for evaluating solutions

  6. Particle swarm optimization (PSO) • Begin • Initialize swarm • Locate leader (pg) • it=0 • While it < max_it • For each particle • Update Position (2) • Evaluation (fitness) • Update particle’s best (p) • EndFor • Update leader (pg) • it++; • EndWhile • End
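The loop on this slide can be sketched as a minimal PSO in Python. This is an illustrative implementation of the standard velocity/position update (inertia plus pulls toward the particle's best and the swarm leader), not the exact CLOP/PSMS code; the parameter values `w`, `c1`, `c2` and the search bounds are assumptions for the sketch.

```python
import random

def pso(fitness, dim, n_particles=10, max_it=50,
        w=0.5, c1=1.4, c2=1.4, lo=-1.0, hi=1.0):
    """Minimize `fitness` over [lo, hi]^dim with a basic particle swarm."""
    # Initialize swarm: random positions, zero velocities.
    x = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    p = [xi[:] for xi in x]                      # each particle's best position
    p_fit = [fitness(xi) for xi in x]
    best_i = min(range(n_particles), key=lambda i: p_fit[i])
    g, g_fit = p[best_i][:], p_fit[best_i]       # locate leader (pg)

    for _ in range(max_it):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity blends inertia, pull toward own best, pull toward leader.
                v[i][d] = (w * v[i][d]
                           + c1 * r1 * (p[i][d] - x[i][d])
                           + c2 * r2 * (g[d] - x[i][d]))
                x[i][d] = min(hi, max(lo, x[i][d] + v[i][d]))
            f = fitness(x[i])                    # evaluation (fitness)
            if f < p_fit[i]:                     # update particle's best (p)
                p[i], p_fit[i] = x[i][:], f
                if f < g_fit:                    # update leader (pg)
                    g, g_fit = x[i][:], f
    return g, g_fit
```

For example, minimizing the sphere function `sum(t*t for t in z)` in two dimensions drives the leader toward the origin within a few dozen iterations.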

  7. PSO for model selection (PSMS) • Each particle encodes a CLOP model • Cross-validation BER is used for evaluating models
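The fitness used by PSMS is the cross-validation balanced error rate (BER), i.e. the average of the per-class error rates. A minimal sketch of the metric, assuming binary labels coded as +1/-1 (the label coding is an assumption for illustration):

```python
def balanced_error_rate(y_true, y_pred):
    """BER: the mean of the per-class error rates for binary labels +1 / -1."""
    pos = [(t, p) for t, p in zip(y_true, y_pred) if t == 1]
    neg = [(t, p) for t, p in zip(y_true, y_pred) if t == -1]
    err_pos = sum(1 for t, p in pos if p != t) / len(pos)  # positive-class error
    err_neg = sum(1 for t, p in neg if p != t) / len(neg)  # negative-class error
    return 0.5 * (err_pos + err_neg)
```

Unlike plain error rate, BER weights both classes equally, which matters on skewed datasets such as HIVA.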

  8. Experimental settings • Standard parameters for PSO • 10 particles per swarm • PSMS applied to ADA, GINA, HIVA and SYLVA • 5-fold cross-validation was used

  9. Results up to March 1st Corrida_final • 500 iterations for ADA • 100 iterations for HIVA, GINA • 50 iterations for SYLVA • Trial and error for NOVA

  10. Results up to March 1st Agnostic learning best ranked entries as of March 1st, 2007 Best ave. BER still held by Reference (Gavin Cawley) with “the bad”. Note that the best entry for each dataset is not necessarily the best entry overall. Some of the best agnostic entries of individual datasets were made as part of prior knowledge entries (the bottom four); there is no corresponding overall agnostic ranking.

  11. Results up to March 1st [Plots: 500 iterations (ADA); 100 iterations]

  12. Results up to August 1st * Models selected by trial and error

  13. Results up to August 1st Agnostic learning best ranked entries as of August 1st, 2007 Best ave. BER still held by Reference (Gavin Cawley) with “the bad”. Note that the best entry for each dataset is not necessarily the best entry overall. The blue shaded entries did not count towards the prize (participant part of a group or not wishing to be identified).

  14. Results up to August 1st [Plots of AUC and BER]

  15. Conclusions • Competitive and simple models are obtained with PSMS • No knowledge of the problem at hand nor of machine learning is required • PSMS is easy to implement • It suffers from the same problems as other search algorithms
