
Particle Swarm Optimization


Presentation Transcript


  1. 03/26/2008 Particle Swarm Optimization

  2. Particle Swarm Optimization (PSO) • Kennedy, J., Eberhart, R. C. (1995). Particle swarm optimization. Proc. IEEE International Conference on Neural Networks (Perth, Australia), IEEE Service Center, Piscataway, NJ, pp. IV: 1942-1948.

  3.–7. Behavior of Flock of Birds (a sequence of figure-only slides illustrating the flocking behavior)

  8. Behavior of Flock of Birds • Each bird combines two influences: Self-Experience (its own best position found so far) and the Success of Others (the flock's best position). These correspond to the two attraction terms of the update rule: v_id = w*v_id + c1*rand()*(p_id - x_id) + c2*Rand()*(p_gd - x_id); x_id = x_id + v_id.

  9. PSO Equation
  v_id = w*v_id + c1*rand()*(p_id - x_id) + c2*Rand()*(p_gd - x_id)
  x_id = x_id + v_id
  The three velocity terms are inertia, self-experience, and the success of others. For the ith particle: velocity v_i, position x_i, previous best position p_i; p_g is the global best position.
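The update rule on this slide can be sketched in Python. The parameter values (w = 0.7, c1 = c2 = 2.0) and the two-dimensional example are illustrative choices, not taken from the slides:

```python
import random

def pso_update(x, v, p_best, g_best, w=0.7, c1=2.0, c2=2.0):
    """One PSO step for a single particle (x, v, p_best, g_best are
    lists of equal length):

        v_id = w*v_id + c1*rand()*(p_id - x_id) + c2*Rand()*(p_gd - x_id)
        x_id = x_id + v_id
    """
    new_v = [
        w * v[d]
        + c1 * random.random() * (p_best[d] - x[d])   # self-experience
        + c2 * random.random() * (g_best[d] - x[d])   # success of others
        for d in range(len(x))
    ]
    new_x = [x[d] + new_v[d] for d in range(len(x))]
    return new_x, new_v

# A particle resting at the global best with zero velocity stays put:
x, v = pso_update([1.0, 2.0], [0.0, 0.0], [1.0, 2.0], [1.0, 2.0])
```

Note that rand() and Rand() are drawn independently per dimension, so the two attraction terms are scaled by different random numbers.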

  10. Optimization Problem • Parameter adjustment: each candidate parameter set defines a system mapping input to output; n candidate systems (System_1, System_2, ..., System_n) are evaluated by n particles.

  11. Particle Swarm Optimization • v_id = w*v_id + c1*rand()*(p_id - x_id) + c2*Rand()*(p_gd - x_id); x_id = x_id + v_id. (Figure: cost vs. position over iterations; a particle at x_k moves under the inertia of its previous step from x_{k-1}, plus the pull Vp toward its previous best and Vg toward the global best, giving x_{k+1}.)

  12. Inertia Weight • W: inertia weight in v_id = w*v_id + c1*rand()*(p_id - x_id) + c2*Rand()*(p_gd - x_id); x_id = x_id + v_id. (Figure: with a large inertia weight, x_{k+1} continues along the previous direction x_{k-1} to x_k; with a small inertia weight, the pulls Vp and Vg dominate.)

  13. Inertia Weight • A large inertia weight favors global search; a small inertia weight favors local search. (Figure: cost landscapes contrasting broad exploration with fine local refinement.)

  14. Fuzzy Adaptive PSO • Shi, Y., Eberhart, R. C. (2001). "Fuzzy adaptive particle swarm optimization," in Proc. IEEE Congress on Evolutionary Computation, vol. 1, 2001, pp. 101-106. The Normalized Current Best Performance Evaluation, NCBPE = (CBPE - CBPEmin) / (CBPEmax - CBPEmin), measures how far the current best cost is from the best achievable; a fuzzy system maps it to an inertia weight between large (global search) and small (local search).

  15. Fuzzy Adaptive PSO • A description of a fuzzy system for adapting the inertia weight of PSO: the inputs are the NCBPE and the current inertia weight, each with three membership functions (L, M, H); the fuzzy rules produce the weight change (W_Change), again with L/M/H memberships, steering the swarm between global search (large weight) and local search (small weight).
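The NCBPE normalization is straightforward; the adaptation below is a deliberately crude threshold stand-in for the fuzzy rules (the thresholds, step size, and weight bounds are invented for illustration, not taken from the paper):

```python
def ncbpe(cbpe, cbpe_min, cbpe_max):
    """Normalized Current Best Performance Evaluation, in [0, 1]."""
    return (cbpe - cbpe_min) / (cbpe_max - cbpe_min)

def adapt_weight(w, e, low=0.3, high=0.7, step=0.1, w_min=0.2, w_max=1.2):
    """Crude stand-in for the fuzzy rules: far from the optimum
    (high NCBPE) -> raise w for global search; close to it
    (low NCBPE) -> lower w for local search."""
    if e > high:
        w += step
    elif e < low:
        w -= step
    return min(max(w, w_min), w_max)

e = ncbpe(cbpe=9.0, cbpe_min=1.0, cbpe_max=11.0)  # (9-1)/(11-1) = 0.8
w = adapt_weight(0.9, e)                          # high NCBPE -> w = 1.0
```

The actual fuzzy system replaces the two hard thresholds with overlapping L/M/H membership functions and rule-based interpolation, so the weight change varies smoothly with NCBPE.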

  16. Experimental Results • Minimization benchmarks comparing a linearly decreasing inertia weight with the fuzzy adaptive inertia weight. The performance of PSO is not sensitive to the population size, and the scalability of PSO is acceptable.
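The linearly-decreasing-inertia baseline can be sketched as a complete minimizer. The 0.9-to-0.4 schedule is a commonly used range; the swarm size, iteration count, c1 = c2 = 1.5, and search bounds are illustrative choices, and the sphere function stands in for the benchmark suite:

```python
import random

def pso_minimize(f, dim, n_particles=20, iters=200,
                 w_start=0.9, w_end=0.4, c1=1.5, c2=1.5,
                 lo=-5.0, hi=5.0, seed=0):
    """Minimize f with PSO; the inertia weight decreases linearly
    from w_start to w_end over the run."""
    rng = random.Random(seed)
    xs = [[rng.uniform(lo, hi) for _ in range(dim)]
          for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    p_best = [x[:] for x in xs]          # each particle's best position
    p_cost = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: p_cost[i])
    g_best, g_cost = p_best[g][:], p_cost[g]   # swarm's best position
    for t in range(iters):
        w = w_start + (w_end - w_start) * t / (iters - 1)
        for i in range(n_particles):
            for d in range(dim):
                vs[i][d] = (w * vs[i][d]
                            + c1 * rng.random() * (p_best[i][d] - xs[i][d])
                            + c2 * rng.random() * (g_best[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            c = f(xs[i])
            if c < p_cost[i]:
                p_best[i], p_cost[i] = xs[i][:], c
                if c < g_cost:
                    g_best, g_cost = xs[i][:], c
    return g_best, g_cost

# Sphere function: global minimum 0 at the origin.
best, cost = pso_minimize(lambda x: sum(xi * xi for xi in x), dim=3)
```

Large early weights spread the swarm for global search; the shrinking weight then hands control to the attraction terms for local refinement.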

  17. Application Example 1 • Feature Training for Face Detection. (Figure: detected features refined over Iteration 1, Iteration 2, ..., Iteration k.)

  18. Application Example 2 • Neural Network Training. V.G. Gudise, G.K. Venayagamoorthy, Comparison of particle swarm optimization and backpropagation as training algorithms for neural networks, in: IEEE Swarm Intelligence Symposium 2003 (SIS 2003), Indianapolis, IN, 2003, pp. 110-117.

  19. Introduction of Neural Network
  a_i = W_ij X_j for i = 1 to 4, j = 1, 2, where X = [x 1]^T (the input plus a bias term)
  d_i = 1 / (1 + e^(-a_i))
  y = [V1 V2 V3 V4][d1 d2 d3 d4]^T
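The forward pass on this slide can be sketched directly, assuming the standard sigmoid activation for d_i and a linear output layer:

```python
import math

def forward(x, W, V):
    """Forward pass of the slide's network: input X = [x, 1] (bias
    folded in), four sigmoid hidden units, linear output.
    W is 4x2 (hidden weights W_ij), V has length 4 (output weights)."""
    X = [x, 1.0]
    a = [sum(W[i][j] * X[j] for j in range(2)) for i in range(4)]
    d = [1.0 / (1.0 + math.exp(-ai)) for ai in a]   # sigmoid
    return sum(V[i] * d[i] for i in range(4))

# With all-zero hidden weights every d_i = 0.5, so y = 0.5 * sum(V):
y = forward(0.3, [[0.0, 0.0]] * 4, [1.0, 2.0, 3.0, 4.0])
```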

  20.–22. Neural Network Training • Backpropagation vs. PSO (figure-only slides comparing the two training algorithms; slide 22 also lists the parameter set of PSO).

  23. Training Results • Training a 2x4x1 neural network to fit y = 2x^2 + 1. (Figures: mean square error curves of the neural networks during training with BP and PSO for bias 1; test curves for the trained neural networks with fixed weights obtained from the BP and PSO training algorithms with bias 1.)
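Slide 23's PSO side can be sketched end to end: each particle is one candidate set of the 12 weights of the 2x4x1 network from slide 19, and the cost is the mean square error on samples of y = 2x^2 + 1. The swarm settings (30 particles, 300 iterations, c1 = c2 = 1.5, velocity clamping, x in [-1, 1]) are illustrative choices, not the paper's:

```python
import math
import random

def sig(a):
    """Numerically safe sigmoid."""
    if a < -60.0:
        return 0.0
    if a > 60.0:
        return 1.0
    return 1.0 / (1.0 + math.exp(-a))

def net(x, p):
    # 2x4x1 network as on slide 19: input [x, 1] (bias), four sigmoid
    # hidden units, linear output. p packs 4x2 hidden + 4 output weights.
    h = [sig(p[2 * i] * x + p[2 * i + 1]) for i in range(4)]
    return sum(p[8 + i] * h[i] for i in range(4))

def mse(p, data):
    return sum((net(x, p) - y) ** 2 for x, y in data) / len(data)

rng = random.Random(1)
data = [(x / 10.0, 2 * (x / 10.0) ** 2 + 1) for x in range(-10, 11)]

n, iters, dim, vmax = 30, 300, 12, 1.0
xs = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n)]
vs = [[0.0] * dim for _ in range(n)]
pb = [x[:] for x in xs]
pc = [mse(x, data) for x in xs]
g = min(range(n), key=lambda i: pc[i])
gb, gc = pb[g][:], pc[g]
initial_error = gc
for t in range(iters):
    w = 0.9 - 0.5 * t / (iters - 1)      # linearly decreasing inertia
    for i in range(n):
        for d in range(dim):
            v = (w * vs[i][d]
                 + 1.5 * rng.random() * (pb[i][d] - xs[i][d])
                 + 1.5 * rng.random() * (gb[d] - xs[i][d]))
            vs[i][d] = max(-vmax, min(vmax, v))  # clamp the velocity
            xs[i][d] += vs[i][d]
        c = mse(xs[i], data)
        if c < pc[i]:
            pb[i], pc[i] = xs[i][:], c
            if c < gc:
                gb, gc = xs[i][:], c
final_error = gc
```

Unlike backpropagation, this needs no gradient of the error: the swarm only evaluates the cost, which is why PSO drops in so easily as an alternative trainer.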

  24. Conclusions • The concept of PSO is introduced. • PSO is an extremely simple algorithm for global optimization problems, with low memory cost and low computational cost. • A fuzzy system is implemented to dynamically adjust the inertia weight and improve the performance of PSO.
