
Presentation Transcript


  1. OUTLINE • Questions? • Did you hear or notice anything since last class that has something to do with this class? • Quiz results • Continue scheduling with heuristics • Last Quiz on 4/28

  2. Quiz Results

  3. Theory of constraints or OPT procedure • Basis - schedule the bottleneck, then everything around it • 1. Determine the bottleneck • 2. Schedule the bottleneck • 3. Schedule back from the bottleneck • 4. Schedule forward from the bottleneck
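
As a rough illustration of step 1, the sketch below (Python, with made-up job data) treats the machine carrying the largest total processing load as the bottleneck. This load-based test and the data are assumptions for illustration only; real OPT implementations may use utilization, queue statistics, or other measures to locate the constraint.

```python
# Step 1 of the OPT procedure, sketched as "the machine with the largest
# total processing load is the bottleneck". The job data are hypothetical,
# and real implementations may use utilization or queue statistics instead.
processing_times = {
    "J1": {"M1": 4, "M2": 9, "M3": 3},
    "J2": {"M1": 2, "M2": 7, "M3": 5},
    "J3": {"M1": 6, "M2": 8, "M3": 2},
}

def find_bottleneck(times):
    load = {}
    for job in times.values():
        for machine, t in job.items():
            load[machine] = load.get(machine, 0) + t
    return max(load, key=load.get)            # most heavily loaded machine

print(find_bottleneck(processing_times))       # "M2" (total load 24)
```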

  4. Theory of constraints or OPT procedure - hypotheses • 1. Works best when there is a single strong and stable bottleneck • 3. Myopic rules perform poorest when local priorities differ from a strong bottleneck’s priorities

  5. Monte Carlo Technique • Simulate generating a schedule many times, each time making the required choices at random • Generate the distribution of the performance measure • We can then make a statement regarding the probability of attaining a given performance measure if the schedule is generated randomly • We can also save the best schedule for use
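
A minimal sketch of the idea, assuming a single-machine problem with hypothetical processing times and mean flow time as the performance measure; it records the distribution of the measure over many random schedules and saves the best schedule seen.

```python
import random

# Monte Carlo sketch: generate many random sequences for a single-machine
# problem, record the performance measure (mean flow time here, purely as an
# example), and keep the best schedule seen. Processing times are made up.
processing_times = {"J1": 5, "J2": 3, "J3": 8, "J4": 2}

def mean_flow_time(sequence):
    clock, total = 0, 0
    for job in sequence:
        clock += processing_times[job]
        total += clock
    return total / len(sequence)

def monte_carlo(n_trials=1000, seed=42):
    rng = random.Random(seed)
    jobs = list(processing_times)
    measures, best_seq, best_val = [], None, float("inf")
    for _ in range(n_trials):
        seq = jobs[:]
        rng.shuffle(seq)                    # choices made at random
        val = mean_flow_time(seq)
        measures.append(val)                # builds the distribution
        if val < best_val:                  # save the best schedule for use
            best_seq, best_val = seq, val
    return measures, best_seq, best_val

measures, best_seq, best_val = monte_carlo()
print(best_seq, best_val)                   # SPT order minimizes mean flow time
```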

  6. Weighted Random Selection • This is best explained by an example: • Suppose we have decided to use four different dispatching rules. • We now select a weight for each, adding up to 1. For example: • SPT - 0.3 • EDD - 0.4 • LWKR - 0.2 • LOPR - 0.1 • At each choice, we generate a random number between 0 and 1 and use the rule obtained by:

  7. Weighted Random Selection (continued) • Random number range → rule to use: • 0 to 0.400 → EDD • 0.401 to 0.700 → SPT • 0.701 to 0.900 → LWKR • 0.901 to 0.999 → LOPR • We need only generate one schedule • However, we can repeat the procedure multiple times to determine the distribution of the performance measure
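
The cumulative thresholds above translate directly into a lookup, as in the sketch below; the function name and the use of Python's random module are my own choices.

```python
import random

# Weighted random selection of a dispatching rule, using the ranges from the
# table above (the 0.901-0.999 band is treated as "everything above 0.900").
THRESHOLDS = [
    (0.400, "EDD"),
    (0.700, "SPT"),
    (0.900, "LWKR"),
    (1.000, "LOPR"),
]

def pick_rule(rng):
    u = rng.random()                 # uniform random number in [0, 1)
    for upper, rule in THRESHOLDS:
        if u <= upper:
            return rule
    return THRESHOLDS[-1][1]         # unreachable, kept for safety

rng = random.Random(7)
print([pick_rule(rng) for _ in range(10)])   # mostly EDD and SPT, as weighted
```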

  8. Neighborhood Searches • A very common heuristic procedure proceeds as follows: • 1. Find a schedule by whatever means - random, modified Johnson, active, non-delay • 2. Calculate the performance measure • 3. Vary the original schedule in a systematic manner (explore the “neighborhood”)* • 4. Recalculate the performance measure and keep the better schedule • *from Pinedo: “Two schedules are neighbors if one can be obtained through a well-defined modification of the other” (see pages 345-353, 427, 492)

  9. Neighborhood Searches (continued) • 5. Continue the process until: • a. You have no more time • b. No better schedule is produced • c. You have exhausted the possibilities of your approach • Needless to say, you can select a great variety of approaches to defining what the “neighborhood” is
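
A generic skeleton of this procedure might look like the following; the function names, the time limit, and the plug-in style for the neighborhood and the performance measure are illustrative assumptions, not prescribed by the slides.

```python
import time

# Skeleton of a neighborhood search. The performance measure ("evaluate",
# lower is better) and the neighborhood generator ("neighbors") are plug-ins.
def neighborhood_search(start, evaluate, neighbors, time_limit_s=1.0):
    deadline = time.time() + time_limit_s        # stop (a): no more time
    current, current_val = start, evaluate(start)
    while time.time() < deadline:
        candidates = list(neighbors(current))
        if not candidates:                       # stop (c): neighborhood exhausted
            break
        best = min(candidates, key=evaluate)
        best_val = evaluate(best)
        if best_val >= current_val:              # stop (b): no better schedule found
            break
        current, current_val = best, best_val
    return current, current_val
```

Any schedule obtained by the means listed in step 1 (random, modified Johnson, active, non-delay) can serve as the starting point, and any neighborhood definition can be passed in as the generator.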

  10. Neighborhood Searches (continued) • One of the simplest approaches is to use a pairwise exchange • Suppose we have a 4/1//R problem with no known algorithm • Start with a random sequence, e.g., 1324 • Let’s use what is called a single adjacent pairwise exchange • Then the neighborhood consists of: • 3124 1234 1342 • Suppose that the last of these is better than 1324

  11. Neighborhood Searches (continued) • We then explore the neighborhood of 1342: • 3142 1432 1324 etc.
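
A small helper can enumerate the single adjacent pairwise exchange neighborhood and reproduce the neighborhoods listed above; treating the sequences as strings is just a convenience.

```python
# Single adjacent pairwise exchange neighborhood, reproducing the example:
# swapping each adjacent pair of 1324 gives 3124, 1234 and 1342.
def adjacent_pairwise_exchanges(sequence):
    for i in range(len(sequence) - 1):
        nb = list(sequence)
        nb[i], nb[i + 1] = nb[i + 1], nb[i]
        yield "".join(nb)

print(list(adjacent_pairwise_exchanges("1324")))   # ['3124', '1234', '1342']
print(list(adjacent_pairwise_exchanges("1342")))   # ['3142', '1432', '1324']
```

This generator can be plugged in directly as the neighbors argument of the search skeleton sketched earlier.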

  12. Genetic algorithms • Simulate the natural evolution process • We start with a population - a set of schedules • We keep the size of the population constant • We generate an “offspring” for each member of the population - by some type of exchange • We select the best of the offspring and replace the worst of the previous population with it • We keep repeating until we do not get an improvement

  13. Genetic algorithms - continued • An example with a Tbar (mean tardiness) problem: • Let’s use a population size of 3, seeded with these three schedules (the sum of tardiness for each schedule is in parentheses; the population total is in brackets): • Generation 1: 123456(27), 132456(27), 312456(28) [82] • We create offspring by selecting a random number between 1 and 5 and doing an adjacent pairwise exchange at that position • Our first three random numbers: 4, 5, 5

  14. Genetic algorithms - continued • First set of offspring: 123546(26), 132465(26), 312465(27) • Offspring 1 and 2 tie as best (26); we select one at random (#2) and use it to replace the worst member of generation 1 (the third, at 28) • Generation 2: 123456(27), 132456(27), 132465(26) [80] • Our second three random numbers: 4, 2, 2 • Second set of offspring: 123546(26), 123456(27), 123465(26) • Offspring 1 and 3 tie as best (26); one is chosen at random (#3) and replaces a randomly chosen one of the tied worst members of generation 2 (#1 and #2, both at 27; #2 is replaced) • Generation 3: 123456(27), 123465(26), 132465(26) [79]

  15. Genetic algorithms - continued • Our third three random numbers: 5, 3, 1 • Third set of offspring: 123465(26), 124365(25), 312465(27) • Offspring 2 (25) replaces member 1, the worst of generation 3 (27) • Generation 4: 124365(25), 123465(26), 132465(26) [77] • Our fourth three random numbers: 3, 4, 2 • Fourth set of offspring: 123465(26), 123645(26), 123465(26) • None are better than our population • We stop with 124365(25) • Notice that our population as a whole kept improving
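
The sketch below follows the slides' scheme: a fixed population, one offspring per member via an adjacent pairwise exchange at a random position, and the best offspring replacing the worst member until no offspring improves on the population. The six-job processing times and due dates are invented for illustration; they are not the data behind the tardiness values above.

```python
import random

# Genetic-algorithm sketch following the slides' scheme: fixed population,
# one offspring per member via an adjacent pairwise exchange at a random
# position, best offspring replaces the worst member, stop when no offspring
# improves on the population. The six-job data below are invented for
# illustration; they are NOT the data behind the tardiness values above.
processing = {1: 4, 2: 3, 3: 7, 4: 2, 5: 5, 6: 6}
due_date = {1: 5, 2: 6, 3: 10, 4: 12, 5: 14, 6: 20}

def total_tardiness(seq):
    clock, total = 0, 0
    for job in seq:
        clock += processing[job]
        total += max(0, clock - due_date[job])
    return total

def make_offspring(parent, rng):
    k = rng.randint(0, len(parent) - 2)      # random position 1..n-1, zero-based
    child = list(parent)
    child[k], child[k + 1] = child[k + 1], child[k]
    return child

def genetic_search(seed_population, seed=0):
    rng = random.Random(seed)
    population = [list(p) for p in seed_population]
    while True:
        children = [make_offspring(p, rng) for p in population]
        best_child = min(children, key=total_tardiness)
        worst_i = max(range(len(population)),
                      key=lambda i: total_tardiness(population[i]))
        if total_tardiness(best_child) >= total_tardiness(population[worst_i]):
            break                            # no improvement: stop
        population[worst_i] = best_child
    return min(population, key=total_tardiness)

seeds = [[1, 2, 3, 4, 5, 6], [1, 3, 2, 4, 5, 6], [3, 1, 2, 4, 5, 6]]
best = genetic_search(seeds)
print(best, total_tardiness(best))
```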
