
Optimization with Meta-Heuristics




  1. Optimization with Meta-Heuristics
  Question: Can you ever prove that a solution generated using a meta-heuristic is optimal?
  Answer: Yes, for a minimization problem, if the value of the solution equals a lower bound.
  Question: If the solution of a meta-heuristic for a minimization problem does not equal the lower bound, does that mean the solution is not optimal?
  Answer: Not necessarily; you just don't know.
  Observation: Developing a good lower bound is just as important as developing a good meta-heuristic algorithm.
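The comparison on this slide can be sketched as a small check (the function name and the percentage-gap report are assumptions for illustration):

```python
def optimality_status(solution_value, lower_bound):
    """For a minimization problem, compare a solution to a known lower bound."""
    if solution_value == lower_bound:
        return "provably optimal"  # value meets the lower bound, so no better solution exists
    # Otherwise the solution may still be optimal -- we just can't prove it.
    gap = (solution_value - lower_bound) / lower_bound
    return f"unknown (gap {gap:.1%})"

print(optimality_status(100, 100))  # provably optimal
print(optimality_status(105, 100))  # unknown (gap 5.0%)
```

A tight lower bound shrinks the reported gap, which is why the observation above puts bound development on equal footing with algorithm development.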

  2. DOE for Meta-Heuristics
  Question: With all the apparent randomness, and the choices of neighborhoods and algorithm parameters, how do you know whether you have developed a good approach?
  Answer: Design of Experiments (DOE)

  3. DOE for Meta-Heuristics
  Recall the classic Johnson et al. simulated annealing algorithm:
  1. Get an initial solution S.
  2. Get an initial temperature T > 0.
  3. While not yet frozen, do the following:
     3.1 Perform the following loop L times.
         3.1.1 Pick a random neighbor S' of S.
         3.1.2 Let D = cost(S') - cost(S).
         3.1.3 If D <= 0 (downhill move), set S = S'.
         3.1.4 If D > 0 (uphill move), set S = S' with probability e^(-D/T).
     3.2 Set T = rT (reduce temperature).
  4. Return S.
  What are potential experimental parameters?
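The pseudocode above translates directly into a short Python sketch; the freezing rule (T below a threshold), the default parameter values, and the toy cost function are assumptions for illustration:

```python
import math
import random

def simulated_annealing(initial, cost, random_neighbor,
                        T=10.0, r=0.95, L=100, T_min=1e-3):
    """Johnson et al.-style SA: L trials per temperature, geometric cooling T <- r*T."""
    S = initial
    while T > T_min:                      # "frozen" once T drops below T_min (an assumption)
        for _ in range(L):                # 3.1: perform the inner loop L times
            S2 = random_neighbor(S)       # 3.1.1: pick a random neighbor of S
            D = cost(S2) - cost(S)        # 3.1.2: cost difference
            if D <= 0 or random.random() < math.exp(-D / T):
                S = S2                    # 3.1.3 / 3.1.4: accept downhill moves always,
                                          # uphill moves with probability e^(-D/T)
        T *= r                            # 3.2: reduce temperature
    return S

# Toy problem (purely illustrative): minimize x^2 over the integers.
random.seed(0)
best = simulated_annealing(initial=50,
                           cost=lambda x: x * x,
                           random_neighbor=lambda x: x + random.choice([-1, 1]))
print(best)
```

Every keyword argument of `simulated_annealing` is a candidate experimental parameter, which leads to the list on the next slide.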

  4. DOE for Meta-Heuristics Design parameters for simulated annealing algorithm include: • Problem instances • Cooling approach • Starting temperature • Number of iterations at temperature • Temperature reduction rate • Termination condition • Variance from Johnson’s classic algorithm • Neighborhoods • Acceptance probability function
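A full-factorial design over a few of the parameters above can be enumerated in a couple of lines (the parameter names and level values here are assumptions, not recommendations):

```python
from itertools import product

# Three factors at 3, 2, and 3 levels -- a 3 x 2 x 3 full-factorial design.
design_space = {
    "starting_temperature": [1.0, 10.0, 100.0],
    "iterations_per_temperature": [50, 200],
    "reduction_rate": [0.90, 0.95, 0.99],
}

runs = [dict(zip(design_space, combo)) for combo in product(*design_space.values())]
print(len(runs))  # 3 * 2 * 3 = 18 experimental treatments
```

With many factors a full factorial grows quickly, so a fractional design may be preferable; the sketch only shows how treatments are enumerated.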

  5. DOE for Meta-Heuristics Obtaining Problem Instances: • Benchmark problems • www.palette.ecn.purdue.edu/~uzsoy2/spssm.html • http://w.cba.neu.edu/~msolomon/problems.htm • many others • Problem generator • How many problems • Size of problem • Problem characteristics (unique for different problem types)
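The "problem generator" bullet can be made concrete with a small sketch; random Euclidean TSP instances are an assumed example here, with problem size and replicate count as the controlled characteristics:

```python
import random

def generate_instance(n_cities, seed):
    """One random Euclidean TSP instance: city coordinates on a 100 x 100 square."""
    rng = random.Random(seed)             # seed each instance for reproducibility
    return [(rng.uniform(0, 100), rng.uniform(0, 100)) for _ in range(n_cities)]

# Vary problem size and number of replicates as experimental factors.
instances = {(n, s): generate_instance(n, seed=s)
             for n in (20, 50, 100) for s in range(5)}
print(len(instances))  # 3 sizes x 5 replicates = 15 instances
```

Seeding per instance means every algorithm variant in the experiment sees exactly the same problems, which is essential for a fair comparison.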

  6. www.palette.ecn.purdue.edu/~uzsoy2/spssm.html Shop Scheduling Benchmark Problems • General Information • C Programs For Problem Generation • Parameter Values For Problem Generation • J//Cmax Problems • J//Lmax Problems • J/2SETS/Cmax Problems • J/2SETS/Lmax Problems • F//Cmax Problems • F//Lmax Problems

  7. DOE for Meta-Heuristics How to report results: • Must evaluate against something comparable (e.g., solution value - lower bound) • Compare solution quality versus run time • Compare over some problem generation parameter (e.g., due date range)
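The first bullet above, reporting each run as a gap above the lower bound paired with its run time, can be sketched as follows (the numbers are illustrative placeholders, not real results):

```python
def percent_gap(solution_value, lower_bound):
    """Percentage by which a solution exceeds the lower bound (minimization)."""
    return 100.0 * (solution_value - lower_bound) / lower_bound

# Illustrative placeholder runs: (solution value, lower bound, run time in seconds).
runs = [
    (1040, 1000, 2.1),
    (1012, 1000, 8.7),
]
for value, lb, seconds in runs:
    print(f"gap {percent_gap(value, lb):.1f}% in {seconds:.1f}s")
```

Reporting gap and run time together makes the quality/speed trade-off visible: the second run above is closer to the bound but four times slower.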

  8. DOE for Meta-Heuristics How to report results: across different-sized problem instances

  9. DOE for Meta-Heuristics How to report results: Comparison to benchmark problems

  10. DOE for Meta-Heuristics How to report results: [chart]

  11. DOE for Meta-Heuristics How to report results: [chart]

  12. DOE for Meta-Heuristics In class assignment: Develop a DOE for the traveling salesman problem.
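As a starting point for the assignment (not a solution), the two TSP-specific pieces a DOE would exercise are the objective and a neighborhood; a tour-length function and a random 2-opt move are sketched below, with all names and data purely illustrative:

```python
import math
import random

def tour_length(tour, coords):
    """Total Euclidean length of a closed tour visiting coords in the given order."""
    return sum(math.dist(coords[tour[i]], coords[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt_neighbor(tour, rng):
    """Random 2-opt move: reverse one segment of the tour."""
    i, j = sorted(rng.sample(range(len(tour)), 2))
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

rng = random.Random(1)
coords = [(rng.random(), rng.random()) for _ in range(10)]
tour = list(range(10))
neighbor = two_opt_neighbor(tour, rng)
print(sorted(neighbor) == sorted(tour))  # a 2-opt move keeps all cities exactly once
```

The choice of neighborhood (2-opt vs. 3-opt vs. city swaps) is itself one of the experimental factors a TSP DOE should compare.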
