
Six Degrees of Separation to Darrell Whitley




  1. Six Degrees of Separation to Darrell Whitley

  2. Everything is connected

  3. Improved Squeaky Wheel Optimisation for Driver Scheduling, Uwe Aickelin, Edmund K. Burke, Jingpeng Li
  • SWO is an algorithm based on a construction-analysis-prioritisation cycle.
  • The improved version (ISWO) introduces selection and mutation within the solution: each component must prove its fitness.
  • Driver scheduling involves partitioning blocks of work (one vehicle each) into legal shifts.
  • It is formulated as a set-covering integer linear programming problem (a sketch of the SWO cycle follows below).
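A minimal sketch of the construct-analyse-prioritise loop the slide describes, on an invented toy packing model rather than the paper's set-covering formulation of driver scheduling (ISWO's extra selection and mutation steps are not shown):

```python
import random

def construct(order, capacity=10):
    """Greedily pack work blocks into shifts of a fixed capacity, in priority order."""
    shifts, current = [], []
    for block in order:
        if sum(b[1] for b in current) + block[1] > capacity:
            shifts.append(current)
            current = []
        current.append(block)
    if current:
        shifts.append(current)
    return shifts

def analyse(shifts, capacity=10):
    """Blame every block in a poorly filled shift (toy criterion)."""
    blame = {}
    for shift in shifts:
        waste = capacity - sum(b[1] for b in shift)
        for block in shift:
            blame[block] = waste
    return blame

def swo(blocks, iterations=50):
    order, best, best_len = list(blocks), None, float("inf")
    for _ in range(iterations):
        shifts = construct(order)                 # construct
        blame = analyse(shifts)                   # analyse
        order.sort(key=lambda b: -blame[b])       # prioritise the squeaky wheels
        if len(shifts) < best_len:
            best, best_len = shifts, len(shifts)
    return best

blocks = [(f"block{i}", random.randint(2, 7)) for i in range(20)]
print(len(swo(blocks)), "shifts")
```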

  4. Search bias in ant colony optimization: on the role of competition-balanced systems, Blum, C.; Dorigo, M.

  5. An Evolutionary Approach to the Inference of Phylogenetic Networks, Juan Diego Trujillo and Carlos Cotta
  • Tries to find a phylogenetic network that models a set of molecular sequence data using EAs.
  • Networks include reticulation events (horizontal transfer, recombination, hybridization).
  • Heuristics-based genetic operators.
  • Fitness function based on likelihood.
  • The network models found are close to the evolutionary model hidden in the data.

  6. Yao, X. Evolving artificial neural networks (1999) Proceedings of the IEEE, 87 (9), pp. 1423-1447

  7. Genetic Algorithm based on Independent Component Analysis for Global Optimization, Gang Li, Kin Hong Lee, Kwong Sak Leung
  • ICA projects an n-dimensional set onto a lower-dimensional space.
  • Since the components are independent of each other, they can be maximized independently (see the sketch below).
  • Solutions are comparable to those of other algorithms, but with fewer fitness evaluations.
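As an illustration of the projection idea only (the paper's actual algorithm differs; the objective, sampling scheme, and line search here are invented): estimate independent components from a cloud of good solutions, then optimise along each component separately.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n = 4
mix = rng.normal(size=(n, n))                  # hidden mixing of independent factors

def fitness(x):
    s = np.linalg.solve(mix, x)                # separable in the unmixed coordinates
    return -np.sum(s ** 2)

# Sample a population and keep the better half.
pop = rng.uniform(-5, 5, size=(200, n))
elite = pop[np.argsort([fitness(x) for x in pop])[-100:]]

# Estimate independent directions from the elite cloud.
ica = FastICA(n_components=n, random_state=0)
ica.fit(elite)

# Maximise along each independent component with a crude line search.
best = elite[-1].copy()
for direction in ica.mixing_.T:
    steps = np.linspace(-3.0, 3.0, 61)
    candidates = best + np.outer(steps, direction)
    best = candidates[np.argmax([fitness(c) for c in candidates])]
print(fitness(best))
```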

  8. X. Yao and Y. Liu. Fast evolution strategies

  9. When Do Heavy-Tail Distributions Help? Hansen, Gemperle, Auger, and Koumoutsakos
  • The Cauchy distribution is an example of a heavy-tail distribution, as opposed to the Gaussian.
  • Studies the probability of sampling a better solution using different Cauchy distributions (illustrated below).
  • The anisotropic Cauchy obtains exceptionally good results on the Rastrigin function.
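A quick numerical illustration of the heavy-tail point (a toy experiment, not the paper's analysis): the fraction of mutation steps long enough to reach a distant basin, under Gaussian versus coordinate-wise (anisotropic) Cauchy sampling.

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials, radius = 10, 100_000, 8.0          # distant basin at this radius

gauss = rng.standard_normal((trials, n))       # light tails
cauchy = rng.standard_cauchy((trials, n))      # heavy tails, per coordinate

for name, steps in [("gaussian", gauss), ("cauchy", cauchy)]:
    lengths = np.linalg.norm(steps, axis=1)
    print(name, (lengths > radius).mean())     # chance of a long jump
```

The Gaussian essentially never produces a step that long, while the Cauchy does so routinely, which is exactly what makes heavy tails useful for escaping local structure.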

  10. Yao, X., Liu, Y. (1998) Towards designing artificial neural networks by evolution

  11. Neuroevolution with Analog Genetic Encoding, Peter Dürr, Claudio Mattiussi, and Dario Floreano
  • A recent paper by Banzhaf et al. in Nature Reviews Genetics calls for EC to follow known biological facts more closely, turning it into computational evolution.
  • This paper uses an encoding for neural nets closer to real genomes, based on tokens that represent problem objects, with matching operators.

  12. Castillo-Valdivieso, P.A. et al. (2002) Statistical analysis of the parameters of a neuro-genetic algorithm

  13. Assortative mating drastically alters the magnitude of error thresholds, Gabriela Ochoa and Klaus Jaffe
  • Beyond the error threshold, evolved structures cannot be reproduced in the quasispecies evolution model (Eigen and Schuster).
  • In EC, this is related to the exploration/exploitation balance.
  • Assortative mating produces the highest error threshold, whereas asexual reproduction produces the lowest (see the sketch below).
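A minimal sketch of assortative mating on bitstrings (the toy setup is mine, not the paper's quasispecies model): each parent prefers the most similar of a few candidate mates, which is the mechanism that raises the error threshold.

```python
import random

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def assortative_pair(pop, candidates=5):
    """Pick one parent at random; its mate is the most similar of a small pool."""
    p1 = random.choice(pop)
    pool = random.sample(pop, candidates)
    p2 = min(pool, key=lambda q: hamming(p1, q))
    return p1, p2

def crossover(a, b):
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(genes, rate):
    return [g ^ (random.random() < rate) for g in genes]

pop = [[random.randint(0, 1) for _ in range(32)] for _ in range(100)]
child = mutate(crossover(*assortative_pair(pop)), rate=0.05)
print(child)
```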

  14. Eiben, Hinterding, Michalewicz (1999) Parameter control in evolutionary algorithms

  15. Cumulative Step Length Adaptation on Ridge Functions, Dirk V. Arnold
  • Ridge functions are used to test ES.
  • This paper studies the performance of multirecombinative ES with cumulative step length adaptation on different ridge topologies (a toy version is sketched below).
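A compact sketch of a (μ/μ, λ)-ES with cumulative step length adaptation on a parabolic ridge; the constants follow common CSA defaults and are not necessarily the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(2)
n, lam, mu = 10, 12, 3
c = 1.0 / np.sqrt(n)                 # cumulation constant
damp = np.sqrt(n)                    # damping for the step size update

def ridge(x, d=1.0):
    """Parabolic ridge: climb along x[0] while staying near the axis."""
    return x[0] - d * np.sum(x[1:] ** 2)

x, sigma, path = np.zeros(n), 1.0, np.zeros(n)
for _ in range(200):
    z = rng.standard_normal((lam, n))
    fit = np.array([ridge(x + sigma * zi) for zi in z])
    z_avg = z[np.argsort(fit)[-mu:]].mean(axis=0)     # recombine the best mu
    x = x + sigma * z_avg
    # Accumulate selected steps; a long path means steps agree -> grow sigma.
    path = (1 - c) * path + np.sqrt(mu * c * (2 - c)) * z_avg
    sigma *= np.exp((np.dot(path, path) - n) / (2 * damp * n))
print(x[0], sigma)
```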

  16. M. Herdy. Reproductive isolation as strategy parameter in hierarchically organized evolution strategies

  17. Self-Adaptation on the Ridge Function Class: First Results for the Sharp Ridge, Hans-Georg Beyer and Silja Meyer-Nieberg
  • A different self-adaptation mechanism.
  • A different ridge: the sharp ridge, in this case.
  • A different ES: non-recombinative strategies.
  • Why could it fail?

  18. Arabas, J., Michalewicz, Z., Mulawka, J. (1994) GAVaPS, A Genetic Algorithm with Varying Population Size.

  19. Self-regulated Population Size in Evolutionary Algorithms, Carlos Fernandes and Agostinho Rosa
  • There are many self-regulated population-size algorithms: GAVaPS, APGA, ProFiGa (a funky acronym is not a requisite).
  • Some are based on age (sketched below).
  • SRP-EA combines CHC and GAVaPS.
  • It achieves better success rates, with a penalty in the number of evaluations.
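A toy sketch of the age-based idea in the GAVaPS spirit (the lifetime formula and problem are invented): each newborn receives a fitness-dependent lifetime, and removing over-age individuals is what lets the population size float.

```python
import random

def fitness(genes):
    return sum(genes)                          # onemax

def lifetime(fit, lo=1, hi=7, max_fit=20):
    """Fitter individuals live longer (invented linear rule)."""
    return lo + round((hi - lo) * fit / max_fit)

def newborn(genes):
    return {"genes": genes, "age": 0, "life": lifetime(fitness(genes))}

pop = [newborn([random.randint(0, 1) for _ in range(20)]) for _ in range(30)]
for step in range(50):
    for ind in pop:                            # everyone ages
        ind["age"] += 1
    survivors = [ind for ind in pop if ind["age"] <= ind["life"]]
    if len(survivors) < 2:                     # keep a minimal breeding pool
        survivors = sorted(pop, key=lambda i: fitness(i["genes"]))[-2:]
    pop = survivors
    for _ in range(5):                         # fixed number of offspring per step
        a, b = random.sample(pop, 2)
        cut = random.randrange(1, 20)
        genes = a["genes"][:cut] + b["genes"][cut:]
        genes = [g ^ (random.random() < 0.02) for g in genes]
        pop.append(newborn(genes))
print("population:", len(pop), "best:", max(fitness(i["genes"]) for i in pop))
```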

  20. Is Self-Adaptation of Selection Pressure and Population Size Possible? – A Case Study, A.E. Eiben, M.C. Schut, A.R. de Wilde
  • Self-adapting selection operators and population size can yield results as good as or better than self-adapting operators.
  • Selection parameters are encoded in the individuals, and a consensus value is reached (see the sketch below).
  • Self-adapting selection increases speed.
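One plausible reading of "consensus value" as a sketch (the encoding and the median rule here are my assumptions, not the paper's exact scheme): each individual carries its own tournament-size vote, and the population's median vote becomes the tournament size actually used.

```python
import random

def make_ind():
    return {"genes": [random.random() for _ in range(10)],
            "k_vote": random.randint(2, 8)}    # encoded selection parameter

def fitness(ind):
    return -sum((g - 0.5) ** 2 for g in ind["genes"])

pop = [make_ind() for _ in range(40)]
for gen in range(30):
    votes = sorted(ind["k_vote"] for ind in pop)
    k = votes[len(votes) // 2]                 # consensus tournament size
    children = []
    for _ in range(len(pop)):
        parent = max(random.sample(pop, k), key=fitness)
        children.append({
            "genes": [g + random.gauss(0, 0.05) for g in parent["genes"]],
            # the vote itself mutates, so selection pressure self-adapts
            "k_vote": max(2, parent["k_vote"] + random.choice((-1, 0, 1))),
        })
    pop = children
print("consensus k:", k, "best:", max(fitness(i) for i in pop))
```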

  21. Zitzler, E., Laumanns, M., and Thiele, L. (2001). SPEA2: Improving the strength pareto evolutionary algorithm.

  22. Solving Multi-Objective Optimisation Problems Using the Potential Pareto Regions Evolutionary Algorithm, Nasreddine Hallam, Graham Kendall and Peter Blanchfield
  • Introduces the Potential Pareto Regions Evolutionary Algorithm (PPREA).
  • The fitness of an individual is the least improvement it needs in order to reach non-dominated status (see the sketch below).
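A hedged sketch of that fitness definition (my formulation; PPREA's exact computation may differ): for a dominated point, take the smallest uniform improvement that would lift it out of domination by the current front, assuming minimisation on all objectives.

```python
import numpy as np

def least_improvement(point, front):
    """Smallest epsilon (in the limit) such that point - epsilon escapes
    domination by every member of `front`; 0 if already non-dominated."""
    eps = 0.0
    for other in front:
        if np.all(other <= point) and np.any(other < point):
            # To escape `other`, beat it on at least one objective.
            eps = max(eps, np.min(point - other))
    return eps

front = np.array([[1.0, 4.0], [2.0, 2.0], [4.0, 1.0]])
print(least_improvement(np.array([3.0, 3.0]), front))   # dominated by (2,2) -> 1.0
```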

  23. K. Deb, L. Thiele, M. Laumanns, and E. Zitzler (2002). Scalable multi-objective optimization test problems.

  24. Pareto Set and EMOA Behavior for Simple Multimodal Multiobjective Functions, Mike Preuss, Boris Naujoks, and Günter Rudolph
  • Studies the often-disregarded Pareto set.
  • Changes induced in the Pareto set alter the ability of algorithms to track the Pareto front.
  • A measure of solution quality in the solution space is needed, similar to the S-metric in objective space (sketched below).
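For reference, the S-metric the slide mentions is the hypervolume dominated in objective space; a simple two-objective version (minimisation, with a user-chosen reference point) looks like this.

```python
def hypervolume_2d(front, ref):
    """Area dominated by `front` (list of (f1, f2), minimisation) up to `ref`."""
    pts = sorted(front)                 # ascending f1 -> descending f2 on a front
    area, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        area += (ref[0] - f1) * (prev_f2 - f2)   # strip added by this point
        prev_f2 = f2
    return area

print(hypervolume_2d([(1, 4), (2, 2), (4, 1)], ref=(5, 5)))   # -> 11.0
```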

  25. About Selecting the Personal Best in Multi-objective Particle Swarm Optimization, Jürgen Branke and Sanaz Mostaghim
  • Selecting a good guide bodes well for the future of a particle.
  • Particles can also memorize all their non-dominated personal-best solutions.
  • Keeping such a personal archive yields better results than traditional methods (see the sketch below).
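A minimal sketch of the personal-archive idea: keep every mutually non-dominated personal best a particle has seen and draw the guide from that archive (guide selection here is uniform random; the paper compares several rules). Minimisation is assumed.

```python
import random

def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_archive(archive, objectives):
    """Insert a new personal best, keeping only mutually non-dominated entries."""
    if any(dominates(old, objectives) for old in archive):
        return archive                          # dominated by the archive: ignore
    archive = [old for old in archive if not dominates(objectives, old)]
    return archive + [objectives]

def pick_guide(archive):
    return random.choice(archive)

archive = []
for objs in [(3, 3), (2, 4), (1, 5), (2, 2)]:
    archive = update_archive(archive, objs)
print(archive, pick_guide(archive))             # archive: [(1, 5), (2, 2)]
```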

  26. A Particle Swarm Optimizer for Constrained Numerical Optimization, Cagnina, Esquivel, Coello
  • Uses a single method to handle all constraints.
  • Results are reported on standard test functions.

  27. Modelling Group-Foraging Behaviour with Particle Swarms, Cecilia Di Chio, Riccardo Poli, and Paolo Di Chio
  • Uses a nature-inspired technique to model a natural problem: group foraging.
  • Results are encouraging.

  28. An evolutive approach for the delineation of local labour markets, Florez, Casado, Martinez-Bernabeu
  • Uses a GA to delineate local labour markets in Valencia, Spain.
  • Uses several operators: mutation, crossover.
  • Achieves better results than classical algorithms.

  29. Thank you and enjoy the session. That's all.
