
LINEAR PROGRAMMING


Presentation Transcript


  1. LINEAR PROGRAMMING These visual aids will assist in class participation for the section on Linear Programming. They contain no new information and less detail than the Word document. • Concepts and geometric interpretations • Basic assumptions and limitations • Post-optimal results analysis • Model formulations • (Details in Notes, textbook, and references)

  2. LINEAR PROGRAMMING Multiple choice question: Why study Linear Programming? 1. The professor cannot formulate non-linear (N-L) models 2. Linear programming is the most frequently used optimization method. 3. Linear programming was developed in the 1940s 4. Formulation and solution knowledge will also help when addressing non-linear models 5. Excellent software is available

  3. LINEAR PROGRAMMING Models must conform to these restrictions • Linearity (the objective and constraints are linear functions of the variables) • Divisibility (variables may take fractional values) • Certainty (all model parameters are known constants)

  6. LINEAR PROGRAMMING Linear programming is formulated using the general problem statement. How do we satisfy the modelling limitations? • Objective function • Equality constraints • Inequality constraints • Variable bounds

  7. LINEAR PROGRAMMING Linear programming is formulated using the general problem statement. We must learn some formulation “tricks”. • Objective function • Equality constraints • Inequality constraints • Variable bounds
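
A minimal sketch (not from the slides) of how the four elements of the general problem statement map onto a solver call, assuming SciPy's linprog and toy numbers chosen only for illustration:

    # Sketch: mapping the general LP statement to a solver call.
    # The solver choice (scipy.optimize.linprog) and the numbers are illustrative only.
    from scipy.optimize import linprog

    c = [-3.0, -5.0]                 # objective: maximize 3x1 + 5x2 -> minimize -3x1 - 5x2
    A_ub = [[1.0, 0.0],              # inequality constraints  A_ub @ x <= b_ub
            [0.0, 2.0],
            [3.0, 2.0]]
    b_ub = [4.0, 12.0, 18.0]
    A_eq, b_eq = None, None          # no equality constraints in this toy problem
    bounds = [(0, None), (0, None)]  # variable bounds: x1 >= 0, x2 >= 0

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=bounds, method="highs")
    print(res.x, -res.fun)           # optimal x and the maximized objective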

  8. LINEAR PROGRAMMING Linear programming requires inequality constraints. Remember that they require the lhs (left-hand side) to be less than or equal to, or greater than or equal to, the rhs (right-hand side).

  9. LINEAR PROGRAMMING Corner Point: A solution for a linear program is a corner point, located at the intersection of the boundaries of inequality constraints. A point p is a corner point if every line segment in the set (feasible region) containing p has p as an endpoint. When explaining linear programming, various references use the following terms, all having the same meaning: corner point, extreme point, and vertex.

  10. LINEAR PROGRAMMING The best corner-point solution must give the optimal value of the objective function! Thus, if the problem has one optimal solution, it must be a corner point (vertex); if it has multiple optimal solutions, at least two must be located at corner points (vertices).

  11. LINEAR PROGRAMMING Linear programming is a convex optimization problem; therefore, in linear programming a local optimum is a global optimum!

  12. LINEAR PROGRAMMING Using the Corner Point concept as the foundation for a solution method. Let’s just calculate the objective function value at all corner points and select the best. With 20 variables and ten constraints, roughly 185,000 potential corner points exist! And that is a small problem. Back to the drawing board!
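
The figure quoted on the slide appears to be the binomial coefficient C(20, 10): the number of ways to choose which 10 of the 20 variables are basic, each choice giving a candidate basic solution. A quick check, under that assumption:

    # Upper bound on the number of basic solutions (candidate corner points)
    # for n = 20 variables and m = 10 constraints: choose which m variables are basic.
    import math
    print(math.comb(20, 10))   # 184756, roughly the 185,000 quoted on the slide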

  13. LINEAR PROGRAMMING Using the Corner Point concept as the foundation for a solution method. We want to convert this general formulation to a system of linear equations - we know how to solve these!

  14. LINEAR PROGRAMMING Using the Corner Point concept as the foundation for a solution method. We will add “slack” variables to all inequalities to convert them to equalities. What do the slack variables measure?

  15. LINEAR PROGRAMMING Using the Corner Point concept as the foundation for a solution method. We have the LP problem in “Standard Form”. Note that the system has more variables than equations.
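
A minimal sketch of the slack-variable conversion (the matrices are illustrative, not taken from the slides): each inequality a_i·x <= b_i becomes a_i·x + s_i = b_i with s_i >= 0, so each slack measures how much of that constraint's right-hand side is unused.

    import numpy as np

    # Illustrative inequality system A x <= b with n = 2 variables and m = 3 constraints.
    A = np.array([[1.0, 0.0],
                  [0.0, 2.0],
                  [3.0, 2.0]])
    b = np.array([4.0, 12.0, 18.0])

    m, n = A.shape
    # Standard form: [A | I] [x; s] = b, with x >= 0 and the slacks s >= 0.
    A_std = np.hstack([A, np.eye(m)])
    print(A_std.shape)   # (3, 5): more variables (5) than equations (3), as the slide notes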

  16. LINEAR PROGRAMMING

  17. LINEAR PROGRAMMING

  18. LINEAR PROGRAMMING Using the Corner Point concept as the foundation for a solution algorithm - the Simplex Tableau. The Simplex method moves among feasible corner points to find the best. But, it requires an initial feasible corner point. The “Big-M” method adds artificial variables. Po = c1x1 + … + cnxn (the original objective) Pa = c1x1 + … + cnxn + Mxa1 + … + Mxam (the modified objective) The artificial variables get us started, but they must not appear in the optimal solution.

  19. LINEAR PROGRAMMING The artificial variables give us an initial solution in canonical form.
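
A short sketch of how the Big-M objective and the starting canonical system can be assembled (minimization form assumed; the value of M and the matrices are my own illustration, not from the slides):

    import numpy as np

    # Original (minimization) objective c and an equality system A x = b already in
    # standard form; the dimensions and numbers are illustrative only.
    c = np.array([2.0, 3.0, 0.0, 0.0])
    A = np.array([[1.0, 1.0, 1.0, 0.0],
                  [2.0, 1.0, 0.0, 1.0]])
    b = np.array([10.0, 16.0])

    M = 1e6                              # a "big" penalty that dominates the real costs
    m = A.shape[0]
    # Append one artificial variable per equation: Pa = c.x + M * (sum of artificials).
    c_bigM = np.concatenate([c, M * np.ones(m)])
    A_bigM = np.hstack([A, np.eye(m)])   # the artificials form an obvious starting basis
    # Any artificial variable still positive at the optimum signals infeasibility.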

  20. LINEAR PROGRAMMING Using the Corner Point concept as the foundation for a solution method - the Simplex Algorithm. Start here 1. Are we optimal? If yes, we are done. 2. If not optimal, move along an edge to an adjacent feasible corner point. Select the adjacent corner point reached along the edge with the largest rate of change of the objective.

  21. LINEAR PROGRAMMING Using the Corner Point concept as the foundation for a solution method - the Simplex Algorithm. Start here 1. Are we optimal? 2. Move along an edge to an adjacent corner point. 3. This adds one variable to the basis and removes another variable from the basis.

  22. LINEAR PROGRAMMING Using the Corner Point concept as the foundation for a solution method - the Simplex Algorithm. (Graphic: the feasible region with the path of Iterations 1, 2, and 3.)
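
A compact, teaching-only tableau implementation of these steps, assuming a maximization with all <= constraints and a non-negative rhs so that the slack variables form the starting corner point; no anti-cycling safeguards are included:

    import numpy as np

    def simplex_max(c, A, b):
        """Naive tableau simplex for: max c.x  s.t.  A x <= b, x >= 0, b >= 0."""
        m, n = A.shape
        # Tableau: constraint rows plus an objective row; columns = x, slacks, rhs.
        T = np.zeros((m + 1, n + m + 1))
        T[:m, :n] = A
        T[:m, n:n + m] = np.eye(m)
        T[:m, -1] = b
        T[-1, :n] = -c
        basis = list(range(n, n + m))        # start at the all-slack corner point
        while True:
            j = int(np.argmin(T[-1, :-1]))   # entering column: best rate of improvement
            if T[-1, j] >= -1e-9:
                break                        # optimal: no improving edge remains
            ratios = [T[i, -1] / T[i, j] if T[i, j] > 1e-9 else np.inf for i in range(m)]
            i = int(np.argmin(ratios))       # leaving row: minimum-ratio test
            if ratios[i] == np.inf:
                raise ValueError("unbounded")
            T[i, :] /= T[i, j]               # pivot: move to the adjacent corner point
            for r in range(m + 1):
                if r != i:
                    T[r, :] -= T[r, j] * T[i, :]
            basis[i] = j                     # one variable enters the basis, one leaves
        x = np.zeros(n + m)
        x[basis] = T[:m, -1]
        return x[:n], T[-1, -1]

    # Same toy problem used earlier in these notes: max 3x1 + 5x2.
    A = np.array([[1.0, 0.0], [0.0, 2.0], [3.0, 2.0]])
    b = np.array([4.0, 12.0, 18.0])
    c = np.array([3.0, 5.0])
    print(simplex_max(c, A, b))   # expect x = (2, 6) with objective 36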

  23. LINEAR PROGRAMMING The LP Simplex Algorithm Extensions • Variables can be positive, zero, or negative • x = x’ – x’’ with x’ ≥ 0 and x’’ ≥ 0 • Variables can have upper bounds • (We can do this without adding constraints) • Efficient Calculations • - Do not recalculate a big matrix every time • - Restart from a previous solution
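
A small sketch of the variable-splitting trick with toy numbers of my own (modern solvers such as SciPy's linprog also accept free variables directly via bounds=(None, None)):

    from scipy.optimize import linprog

    # Free variable x written as x = xp - xn with xp, xn >= 0.
    # Toy problem: minimize x subject to x >= -3, i.e. -x <= 3.
    c = [1.0, -1.0]                      # objective x  ->  xp - xn
    A_ub = [[-1.0, 1.0]]                 # constraint -x <= 3  ->  -xp + xn <= 3
    b_ub = [3.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)], method="highs")
    print(res.x[0] - res.x[1])           # recover the original variable: expect -3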

  24. LINEAR PROGRAMMING The LP Simplex Algorithm Weird Events • NO FEASIBLE SOLUTION • Diagnosis - • Remedial Action -

  25. LINEAR PROGRAMMING The LP Simplex Algorithm Weird Events • NO FEASIBLE SOLUTION • Diagnosis - At least one artificial variable in optimal basis - software reports this as infeasible. • Remedial Action - reformulate, if appropriate

  26. LINEAR PROGRAMMING The LP Simplex Algorithm Weird Events • UNBOUNDED SOLUTION • Diagnosis - • Remedial Action -

  27. LINEAR PROGRAMMING The LP Simplex Algorithm Weird Events • UNBOUNDED SOLUTION • Diagnosis - The distance to the next corner point is infinite - software will report it. • Remedial Action - Reformulate, which is always possible - realistic variables never go to ∞
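
Both of these outcomes show up directly in a solver's status report; a small illustration, assuming SciPy's linprog (the status codes 2 = infeasible and 3 = unbounded are SciPy's, and the toy problems are my own):

    from scipy.optimize import linprog

    # Infeasible: x <= 1 and x >= 2 cannot both hold.
    res = linprog(c=[1.0], A_ub=[[1.0], [-1.0]], b_ub=[1.0, -2.0], method="highs")
    print(res.status, res.message)    # status 2: no feasible solution

    # Unbounded: minimize -x with x >= 1 and no upper limit on x.
    res = linprog(c=[-1.0], A_ub=[[-1.0]], b_ub=[-1.0], bounds=[(0, None)], method="highs")
    print(res.status, res.message)    # status 3: unbounded solution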

  28. LINEAR PROGRAMMING The LP Simplex Algorithm Weird Events • ALTERNATIVE OPTIMA • Diagnosis - • Remedial Action -

  29. LINEAR PROGRAMMING The LP Simplex Algorithm Weird Events • ALTERNATIVE OPTIMA • Diagnosis 1 - The basis can change with no change in objective. • One or more non-basic variables have a zero marginal (reduced) cost. • Software does not report a warning

  30. LINEAR PROGRAMMING The LP Simplex Algorithm Weird Events • ALTERNATIVE OPTIMA • Diagnosis 2 - The rhs of one or more active constraints can be changed without affecting the objective. • Such an active constraint has a zero marginal value. • Software does not report a warning

  31. LINEAR PROGRAMMING The LP Simplex Algorithm Weird Events • ALTERNATIVE OPTIMA • Remedial action- We have found the best value of the objective function! • We likely prefer one of the different sets of x values. We would like to know all solutions and select the “best”, using other criteria.
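
Solvers do not flag alternative optima, but one workable approach (my own illustration, not from the slides) is to re-solve with the objective fixed at its optimal value and optimize a secondary criterion over the optimal face; the toy problem below is constructed so that its optimal face is an edge:

    from scipy.optimize import linprog

    # Toy problem with alternative optima: maximize x1 + x2
    # subject to x1 + x2 <= 4, x1 <= 3, x2 <= 3, x >= 0.
    c = [-1.0, -1.0]
    A_ub = [[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]]
    b_ub = [4.0, 3.0, 3.0]
    base = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs")

    # Hold x1 + x2 at its optimal value (4), then push x1 down and up to expose
    # two distinct optimal corner points with the same objective value.
    A_eq, b_eq = [[1.0, 1.0]], [-base.fun]
    lo = linprog([1.0, 0.0], A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, method="highs")
    hi = linprog([-1.0, 0.0], A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, method="highs")
    print(lo.x, hi.x)   # e.g. (1, 3) and (3, 1): different x, same objective of 4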

  32. LINEAR PROGRAMMING The LP Simplex Algorithm Weird Events • CONSTRAINT • DEGENERACY • Diagnosis - • Remedial Action -

  33. LINEAR PROGRAMMING The LP Simplex Algorithm Weird Events • CONSTRAINT • DEGENERACY • Diagnosis - The marginal value of an active constraint rhs depends upon the direction of change. • The range of the (non-zero) marginal value is 0.0 in one direction.

  34. LINEAR PROGRAMMING The LP Simplex Algorithm Weird Events • CONSTRAINT • DEGENERACY • Remedial action- The solution is correct. • The sensitivity information is not reliable! • If you need sensitivity information, introduce the change (rhs, cost, etc.) and rerun the optimization.

  35. LINEAR PROGRAMMING The LP Simplex Algorithm - Great News - Sensitivity We always want to answer “what if” questions. Much valuable information is available with the optimal solution! 1. Sensitivity = ∂z/∂(parameter) with all x held constant 2. Sensitivity = ∂z/∂(parameter) with all basic variables, xB, allowed to change so that the results represent an optimal solution, i.e., corner point, for the modified problem. The sensitivity is reported using the optimal basis and evaluates the range and effects of parameter changes within the basis, i.e., without requiring a basis change.

  36. LINEAR PROGRAMMING The LP Simplex Algorithm - Great News - Sensitivity Change to rhs, e.g., max production rate, min flow, etc. One-at-a-time changes to a rhs parameter, Δbi ≠ 0. Resulting change in basic variables; non-basic variables are not changed if Δbi is small

  37. LINEAR PROGRAMMING The LP Simplex Algorithm - Great News - Sensitivity One-at-a-time changes to a rhs parameter: How much can we change the rhs of constraint 1 without changing the basis?

  38. LINEAR PROGRAMMING The LP Simplex Algorithm - Great News - Sensitivity One-at-a-time changes to a rhs parameter - Software products report the value and range of the sensitivity for every constraint! Outside of the range, the basis changes; what do we do? Remember to check for constraint degeneracy! Check the units!
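
As one illustration of what the software reports (assuming SciPy's HiGHS interface and the same toy problem used earlier; this simple call returns the marginal values but not their ranges):

    from scipy.optimize import linprog

    # Maximize 3x1 + 5x2 subject to three <= constraints (toy data from earlier).
    c = [-3.0, -5.0]
    A_ub = [[1.0, 0.0], [0.0, 2.0], [3.0, 2.0]]
    b_ub = [4.0, 12.0, 18.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs")

    # Marginal (dual) value of each inequality: change in the minimized objective per
    # unit increase in its rhs; flip the sign to read it against the maximization.
    print(res.ineqlin.marginals)   # e.g. [0, -1.5, -1]: only binding constraints have nonzero duals
    print(res.ineqlin.residual)    # slack in each constraint (0 where the constraint is active)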

  39. LINEAR PROGRAMMING The LP Simplex Algorithm - Great News - Sensitivity Sensitivity to one change in cost - basic variable

  40. LINEAR PROGRAMMING Formulating Models for LPs

  41. LINEAR PROGRAMMING Formulating Models for LPs A straightforward model represents the effects of the dominant variables and ignores all other effects.

  42. LINEAR PROGRAMMING Formulating Models for LPs Base-Delta models include the effects of secondary variables over a limited range Fi = α(F) + β(T-T0) + γ(P-P0), where α is the “yield” at T0 and P0

  43. LINEAR PROGRAMMING Formulating Models for LPs Disjunctive Programming provides alternative models for the same system.

  44. LINEAR PROGRAMMING Formulating Models for LPs Separable Programming extends the range over which the linear models are accurate by using piecewise linear models.
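
A separable-programming sketch in the lambda (convex-combination) form, with toy data of my own: a convex cost f(x) = x² is replaced by a piecewise-linear interpolation over breakpoints; because a convex cost is being minimized, the adjacent-weight condition holds automatically and an ordinary LP suffices:

    from scipy.optimize import linprog

    xk = [0.0, 1.0, 2.0, 3.0, 4.0]      # breakpoints for x
    fk = [v * v for v in xk]            # the non-linear cost evaluated at the breakpoints

    c = fk                              # minimize sum(lambda_k * f_k)
    A_eq = [[1.0] * len(xk)]            # sum(lambda_k) = 1
    b_eq = [1.0]
    A_ub = [[-v for v in xk]]           # x = sum(lambda_k * x_k) >= 2.5
    b_ub = [-2.5]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, method="highs")

    x = sum(l * v for l, v in zip(res.x, xk))
    print(x, res.fun)                   # about x = 2.5 with cost 6.5 (true x**2 = 6.25)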

  45. LINEAR PROGRAMMING Formulating Models for LPs Goal Programming provides a method for “getting close” to desired conditions that are not possible. This is a penalty function. We want to achieve three properties and the total flow by mixing two flows. What do we do?

  46. LINEAR PROGRAMMING Formulating Models for LPs Goal Programming Extra variables are added to ensure “math” feasibility; they measure the amount of infeasibility related to the original problem. They are penalized to minimize infeasibility.
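
A goal-programming sketch with toy numbers of my own (two streams with qualities 0.80 and 0.95 blended to a total flow of 10, chasing an unreachable quality target of 0.97): the deviation variables keep the goal row mathematically feasible and are penalized in the objective:

    from scipy.optimize import linprog

    # Variables: [F1, F2, d_plus, d_minus]
    c = [0.0, 0.0, 1.0, 1.0]                 # penalize the total deviation from the goal
    A_eq = [[1.0, 1.0, 0.0, 0.0],            # F1 + F2 = 10  (total flow)
            [0.80, 0.95, -1.0, 1.0]]         # quality goal: 0.80 F1 + 0.95 F2 - d_plus + d_minus = 9.7
    b_eq = [10.0, 9.7]
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, method="highs")
    print(res.x)   # expect F1 = 0, F2 = 10, d_minus = 0.2: the goal is missed by 0.2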

  47. LINEAR PROGRAMMING Formulating Models for LPs Linearizing Transformations - Some properties can be transformed to behave linearly.

  48. LINEAR PROGRAMMING Formulating Models for LPs Flow-property relationships - When properties combine linearly and the properties do not depend on the variables (flows), a special formulation can be used. (The slide contrasts the non-linear and linear forms.)
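
A small sketch of that special formulation (illustrative stream qualities of my own): the blended-quality spec (sum of Fi·qi) / (sum of Fi) ≥ qmin is a ratio of flows and therefore non-linear, but multiplying through by the positive total flow gives sum of Fi·(qi - qmin) ≥ 0, which is linear when the qi are fixed constants:

    # Fixed stream properties (assumed data) and a minimum blended quality.
    q = [0.80, 0.95, 1.10]
    q_min = 0.90
    # Row for A_ub @ F <= b_ub in a minimization solver (sign flipped for the >= form):
    row = [-(qi - q_min) for qi in q]   # -(q_i - q_min) * F_i <= 0
    b = 0.0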

  49. LINEAR PROGRAMMING Formulating Models for LPs Mini-Max Problem - Often, we have a range of outcomes possible from alternative systems. These systems could be - Alternative decisions (e.g., investments) - Possible outcomes from an uncertain system Here, we would like to consider every system and minimize the worst case, i.e., the maximum of the minimums.

  50. LINEAR PROGRAMMING Formulating Models for LPs Mini-Max Problem - Minimize the worst case, i.e., the maximum of the minimums from all systems. fi = a set of linear equations and inequalities with the parameters θi yielding an objective function fi θi = the parameters associated with outcome i x = the optimization variables, which are used in every model i
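
The standard way to keep this linear is the epigraph form: introduce a worst-case variable z, require z ≥ fi(x) for every system i, and minimize z. A sketch with scenario data of my own:

    from scipy.optimize import linprog

    # Variables: [x1, x2, z]; two scenario cost vectors c_i (assumed data).
    scenarios = [[4.0, 2.0],
                 [1.0, 5.0]]
    c = [0.0, 0.0, 1.0]                          # minimize z only
    A_ub = [ci + [-1.0] for ci in scenarios]     # c_i . x - z <= 0 for each scenario
    b_ub = [0.0, 0.0]
    A_eq = [[1.0, 1.0, 0.0]]                     # e.g. the decisions must sum to 1
    b_eq = [1.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, method="highs")
    print(res.x)   # expect x = (0.5, 0.5) with worst-case cost z = 3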
