Metaheuristic Methods and Their Applications


  1. Metaheuristic Methods and Their Applications Lin-Yu Tseng (曾怜玉) Institute of Networking and Multimedia Department of Computer Science and Engineering National Chung Hsing University

  2. OUTLINE I. Optimization Problems II. Strategies for Solving NP-hard Optimization Problems III. What is a Metaheuristic Method? IV. Trajectory Methods V. Population-Based Methods VI. Multiple Trajectory Search VII. The Applications of Metaheuristic Methods VIII. Conclusions

  3. I. Optimization Problems Computer Science • Traveling Salesman Problem • Maximum Clique Problem Operational Research • Flow Shop Scheduling Problem • P-Median Problem Many optimization problems are NP-hard.

  4. Optimization problems Calculus-based method

  5. Hill climbing

  6. How about this?

  7. An example: TSP (Traveling Salesman Problem) [Figure: a weighted graph on five cities] A solution is a sequence of cities, e.g. 1 2 3 4 5 with tour length = 31, or 1 3 4 5 2 with tour length = 63. There are (n-1)! tours in total.
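
To make the example concrete, here is a minimal brute-force sketch that enumerates all (n-1)! tours of a small instance. The distance matrix D uses made-up placeholder weights, not the figure's values, so only the method matches the slide.

    from itertools import permutations

    # Hypothetical symmetric distance matrix for 5 cities (placeholder
    # values, not the edge weights from the slide's figure).
    D = [[0, 10,  5,  9,  7],
         [10, 0,  6,  8,  4],
         [5,  6,  0,  3, 11],
         [9,  8,  3,  0,  2],
         [7,  4, 11,  2,  0]]

    def tour_length(tour):
        # Sum of consecutive edges, closing the tour back to the start city.
        return sum(D[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

    # Fix city 0 as the start, so exactly (n-1)! = 24 tours are enumerated.
    best = min(permutations(range(1, 5)), key=lambda rest: tour_length((0,) + rest))
    print((0,) + best, tour_length((0,) + best))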

  8. II. Strategies for Solving NP-hard Optimization Problems • Branch-and-Bound → finds an exact solution • Approximation Algorithms, e.g. there is an approximation algorithm for TSP which can find a tour with tour length ≤ 1.5 × (optimal tour length) in O(n³) time • Heuristic Methods → deterministic • Metaheuristic Methods → heuristic + randomization

  9. III. What is a Metaheuristic Method? • Meta: in an upper level • Heuristic: to find A metaheuristic is formally defined as "an iterative generation process which guides a subordinate heuristic by combining intelligently different concepts for exploring and exploiting the search space; learning strategies are used to structure information in order to find efficiently near-optimal solutions." [Osman and Laporte 1996]

  10. Fundamental Properties of Metaheuristics [Blum and Roli 2003] • Metaheuristics are strategies that “guide” the search process. • The goal is to efficiently explore the search space in order to find (near-)optimal solutions. • Techniques which constitute metaheuristic algorithms range from simple local search procedures to complex learning processes. • Metaheuristic algorithms are approximate and usually non-deterministic.

  11. Fundamental Properties of Metaheuristics (cont.) • They may incorporate mechanisms to avoid getting trapped in confined areas of the search space. • The basic concepts of metaheuristics permit an abstract level description. • Metaheuristics are not problem-specific. • Metaheuristics may make use of domain-specific knowledge in the form of heuristics that are controlled by the upper level strategy. • Today’s more advanced metaheuristics use search experience (embodied in some form of memory) to guide the search.

  12. IV. Trajectory Methods 1. Basic Local Search: Iterative Improvement • Improve(N(S)) can be • First improvement • Best improvement • An intermediate option
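
A minimal sketch of iterative improvement in the best-improvement variant; `cost` and `neighbors` stand for problem-specific functions and are assumptions, not part of the slides.

    def local_search(s, cost, neighbors):
        # Best improvement: repeatedly move to the best neighbor
        # until no neighbor is strictly better (a local optimum).
        while True:
            best = min(neighbors(s), key=cost, default=s)
            if cost(best) >= cost(s):
                return s
            s = best

A first-improvement variant would instead accept the first neighbor found that improves on s; the intermediate option lies between the two.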

  13. Trajectory Methods [Figure: a search trajectory moving through solutions x1, x2, x3, x4, x5]

  14. 2. Simulated Annealing – Kirkpatrick (Science 1983) • A worsening move is accepted with probability exp(−Δf / T), where Δf is the increase in cost • Temperature T may be defined by a cooling schedule that decreases over time • Random walk + iterative improvement
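
A minimal simulated annealing sketch combining the two ingredients above: random-walk moves with the Metropolis acceptance rule, and a geometric cooling schedule. `cost` and `random_neighbor` are assumed problem-specific functions, and the parameter values are illustrative defaults.

    import math
    import random

    def simulated_annealing(s, cost, random_neighbor, T=1.0, alpha=0.95, steps=10000):
        best = s
        for _ in range(steps):
            t = random_neighbor(s)
            delta = cost(t) - cost(s)
            # Always accept improvements; accept worsening moves with
            # probability exp(-delta / T) (the Metropolis criterion).
            if delta <= 0 or random.random() < math.exp(-delta / T):
                s = t
                if cost(s) < cost(best):
                    best = s
            T *= alpha  # geometric cooling: T decreases toward 0
        return best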

  15. 3. Tabu Search – Glover 1986 • Simple Tabu Search • Tabu list • Tabu tenure: the length of the tabu list • Aspiration condition
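
A minimal simple tabu search sketch: recently visited solutions are kept in a fixed-length tabu list (the tabu tenure), and a tabu neighbor is still admissible if it beats the best solution found so far (the aspiration condition). `cost` and `neighbors` are assumed problem-specific functions; storing whole solutions in the tabu list is a simplification, as real implementations usually store move attributes.

    from collections import deque

    def tabu_search(s, cost, neighbors, tenure=10, iterations=1000):
        best = s
        tabu = deque(maxlen=tenure)  # tabu tenure = length of the tabu list
        for _ in range(iterations):
            candidates = [t for t in neighbors(s)
                          if t not in tabu or cost(t) < cost(best)]  # aspiration
            if not candidates:
                break
            s = min(candidates, key=cost)  # best admissible neighbor, even if worse
            tabu.append(s)
            if cost(s) < cost(best):
                best = s
        return best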

  16. Tabu Search

  17. 4. Variable Neighborhood Search – Hansen and Mladenović 1999 • Composed of three phases: ① shaking ② local search ③ move • Uses a set of neighborhood structures N1, N2, …, Nkmax

  18. x x’’ Best x’’ N1 N1 x’’ Best N2 x’ x’’ x’ x’ N1 x’’ x x x x’’ x Best x’ x Best Variable Neighborhood Search

  19. V. Population-Based Methods 1. Genetic Algorithm – Holland 1975 • Coding of a solution --- Chromosome • Fitness function --- Related to objective function • Initial population

  20. An example: TSP (Traveling Salesman Problem) [Figure: the same weighted graph on five cities] A solution is a sequence of cities, e.g. 1 2 3 4 5 with tour length = 31, or 1 3 4 5 2 with tour length = 63.

  21. Genetic Algorithm • Reproduction (Selection) • Crossover • Mutation

  22. Example 1 Maximize f(x) = x² where x ∈ I and 0 ≤ x ≤ 31

  23. Coding of a solution: a five-bit integer, e.g. 01101 • Fitness function: F(x) = f(x) = x² • Initial population (randomly generated): 01101, 11000, 01000, 10011

  24. Reproduction Roulette Wheel

  25. Reproduction

  26. Crossover

  27. Mutation The probability of mutation Pm = 0.001. With 4 chromosomes × 5 bits = 20 bits in the population, the expected number of mutated bits per generation is 20 × 0.001 = 0.02.
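
Putting slides 22 through 27 together, here is a minimal GA sketch for Example 1, with 5-bit chromosomes, roulette-wheel reproduction, one-point crossover, and bit-flip mutation at Pm = 0.001. The population and fitness function come from the slides; the generation count is an arbitrary choice.

    import random

    def fitness(chrom):           # F(x) = f(x) = x^2 on a 5-bit chromosome
        return int(chrom, 2) ** 2

    def roulette(pop):            # roulette-wheel selection
        r = random.uniform(0, sum(fitness(c) for c in pop))
        acc = 0
        for c in pop:
            acc += fitness(c)
            if acc >= r:
                return c
        return pop[-1]

    def crossover(a, b):          # one-point crossover
        p = random.randint(1, len(a) - 1)
        return a[:p] + b[p:], b[:p] + a[p:]

    def mutate(c, pm=0.001):      # flip each bit with probability pm
        return ''.join(b if random.random() > pm else '10'[int(b)] for b in c)

    pop = ['01101', '11000', '01000', '10011']   # initial population
    for _ in range(50):
        nxt = []
        while len(nxt) < len(pop):
            a, b = crossover(roulette(pop), roulette(pop))
            nxt += [mutate(a), mutate(b)]
        pop = nxt
    print(max(pop, key=fitness))  # best chromosome found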

  28. Example 2 Word matching problem • "to be or not to be" → "tobeornottobe" • The probability that a uniformly random 13-letter string matches is (1/26)^13 ≈ 4.03038 × 10^-19 • The lowercase letters in ASCII are represented by numbers in the range [97, 122]; the target string is [116, 111, 98, 101, 111, 114, 110, 111, 116, 116, 111, 98, 101]

  29. Initial population

  30. The corresponding strings rzfqdhujardbe niedwrvrjahfj cyueisosqvcb fbfvgramtekuvs kbuqrtjtjensb fwyqykktzyojh tbxblsoizggwm dtriusrgkmbg jvpbgemtpjalq
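
A plausible fitness function for this example (an assumption; the lecture may have used a different one) counts how many character positions of a candidate string match the target word:

    TARGET = "tobeornottobe"

    def fitness(candidate):
        # Number of positions where the candidate matches the target.
        return sum(a == b for a, b in zip(candidate, TARGET))

    print(fitness("rzfqdhujardbe"))  # a random string scores close to 0
    print(fitness(TARGET))           # a perfect match scores 13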

  31. 2. Ant Colony Optimization – Dorigo 1992 Pheromone

  32. Initialization
      Loop
          Loop
              Each ant applies a state transition rule to incrementally build a solution,
              and applies a local updating rule to the pheromone
          Until every ant has built a complete solution
          A global pheromone updating rule is applied
      Until End_Condition

  33. Example: TSP [Figure: the same weighted graph on five cities] A simple heuristic – a greedy method: choose the shortest edge out of the current node. Solution: 1 5 4 3 2

  34. Each ant builds a solution using a step-by-step constructive decision policy. • How does ant k choose the next city to visit? The probability of moving from city r to city s is p_k(r, s) = [τ(r, s)]·[η(r, s)]^β / Σ_{u ∈ J_k(r)} [τ(r, u)]·[η(r, u)]^β for s ∈ J_k(r), where J_k(r) is the set of cities ant k has not yet visited, η(r, s) = 1/δ(r, s), and δ(r, s) is a distance measure associated with edge (r, s).

  35. Local update of pheromone: τ(r, s) ← (1 − ρ)·τ(r, s) + ρ·τ0, where τ0 is the initial pheromone level. Global update of pheromone: τ(r, s) ← (1 − α)·τ(r, s) + α·Δτ(r, s), where Δτ(r, s) = 1/L_gb if edge (r, s) belongs to the globally best tour (with L_gb its length), and 0 otherwise.
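
A minimal sketch of the three rules above, assuming a distance matrix D and pheromone matrix tau as plain nested lists; the parameter values are common defaults, not taken from the slides. (The full ACS rule also exploits the best edge greedily with probability q0; that refinement is omitted here.)

    import random

    def next_city(r, unvisited, tau, D, beta=2.0):
        # State transition: choose s with probability proportional to
        # tau(r,s) * eta(r,s)^beta, where eta(r,s) = 1 / delta(r,s).
        weights = [tau[r][s] * (1.0 / D[r][s]) ** beta for s in unvisited]
        return random.choices(unvisited, weights=weights)[0]

    def local_update(tau, r, s, rho=0.1, tau0=0.01):
        # Local update: decay the traversed edge toward the initial level tau0.
        tau[r][s] = (1 - rho) * tau[r][s] + rho * tau0

    def global_update(tau, best_tour, best_len, alpha=0.1):
        # Global update: reinforce only the edges of the globally best tour.
        n = len(best_tour)
        for i in range(n):
            r, s = best_tour[i], best_tour[(i + 1) % n]
            tau[r][s] = (1 - alpha) * tau[r][s] + alpha * (1.0 / best_len)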

  36. 3. Particle Swarm Optimization – Kennedy and Eberhart 1995 • Population initialized by assigning random positions and velocities; potential solutions are then flown through hyperspace. • Each particle keeps track of its "best" (highest-fitness) position in hyperspace. • This is called "pBest" for an individual particle. • It is called "gBest" for the best in the population. • At each time step, each particle stochastically accelerates toward its pBest and the gBest.

  37. Particle Swarm Optimization Process • Initialize population in hyperspace, including positions and velocities. • Evaluate the fitness of each individual particle. • Modify velocities based on personal best and global best. • Terminate on some condition; otherwise go to step 2.

  38. Initialization: positions and velocities

  39. Modify velocities based on personal best and global best. [Figure: a particle at x(t) accelerates toward pBest and gBest; the resulting velocity v gives the new position x(t+1)]

  40. Particle Swarm Optimization Modification of velocities and positions: v(t+1) = v(t) + c1 · rand() · (pBest − x(t)) + c2 · rand() · (gBest − x(t)), and x(t+1) = x(t) + v(t+1)
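
A minimal sketch of this update for one particle in d dimensions; c1 = c2 = 2.0 are common defaults rather than values from the slides, and a fresh random number is drawn per dimension, a usual implementation choice.

    import random

    def step(x, v, pbest, gbest, c1=2.0, c2=2.0):
        # v(t+1) = v(t) + c1*rand()*(pBest - x(t)) + c2*rand()*(gBest - x(t))
        v = [vi + c1 * random.random() * (pi - xi)
                + c2 * random.random() * (gi - xi)
             for vi, xi, pi, gi in zip(v, x, pbest, gbest)]
        # x(t+1) = x(t) + v(t+1)
        x = [xi + vi for xi, vi in zip(x, v)]
        return x, v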

  41. VI. Multiple Trajectory Search (MTS) • MTS begins with multiple initial solutions, so multiple trajectories will be produced in the search process. • Initial solutions will be classified as foreground solutions and background solutions, and the search will be focused mainly on foreground solutions.

  42. The search neighborhood begins at a large scale, then contracts step by step until it reaches a pre-defined tiny size; at that point it is reset to its original size. • Three candidate local search methods were designed. When we want to search the neighborhood of a solution, the three methods are tested first so that we can choose the one that best fits the landscape around the solution.

  43. The MTS focuses its search mainly on foreground solutions. It does an iterated local search using one of the three candidate local search methods. By choosing a local search method that best fits the landscape of a solution's neighborhood, the MTS may find its way to a local optimum or the global optimum.

  44. Step 1: Find M initial solutions that are uniformly distributed over the solution space. • Step 2: Iteratively apply local search to all these M initial solutions. • Step 3: Choose K foreground solutions based on the grades of the solutions. • Step 4: Iteratively apply local search to the K foreground solutions. • Step 5: If the termination condition is met, stop; otherwise go to Step 3.
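
A structural sketch of the five steps for a box-constrained continuous problem; `local_search` is a stub standing in for the three MTS local search methods described next, plain uniform sampling replaces the paper's more even spread of initial solutions, and solutions are graded simply by their cost.

    import random

    def local_search(s, cost):
        # Placeholder for MTS's three local search methods (next slides).
        return s

    def mts(cost, dim, lo, hi, M=40, K=5, iterations=100):
        # Step 1: M initial solutions spread over the solution space.
        pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(M)]
        # Step 2: local search on all M initial solutions.
        pop = [local_search(s, cost) for s in pop]
        for _ in range(iterations):          # Step 5: loop until termination
            pop.sort(key=cost)               # grade the solutions
            # Steps 3 and 4: local search on the K foreground solutions.
            pop[:K] = [local_search(s, cost) for s in pop[:K]]
        return min(pop, key=cost)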

  45. Three Local Search Methods The three local search methods begin their search in a very large "neighborhood". The neighborhood then contracts step by step until it reaches a pre-defined tiny size, after which it is reset to its original size.

  46. Three Local Search Methods Local Search 1: This local search method searches along each of the n dimensions in a randomly permuted order. Local Search 2: This local search method searches along a direction determined by considering about n/4 of the dimensions.

  47. Three Local Search Methods Local Search 3: This local search method also searches along each of the n dimensions in a randomly permuted order, but it evaluates multiple solutions along each dimension within a pre-specified range.

  48. Multiple Trajectory Search for Large Scale Global Optimization Winner of the 2008 IEEE World Congress on Computational Intelligence Competition on Large Scale Global Optimization.
