Inexact Methods for PDE-Constrained Optimization
Frank Edward Curtis, Northwestern University
Joint work with Richard Byrd and Jorge Nocedal
Emory University, February 12, 2007
Nonlinear Optimization • "One" problem: minimize an objective f(x) subject to equality and inequality constraints, a single formulation covering applications such as the following
Circuit Tuning • Building blocks: • Transistors (switches) and Gates (logic units) • Improve aspects of the circuit – speed, area, power – by choosing transistor widths • Formulate an optimization problem [circuit schematic: transistor widths w1, w2; arrival times AT1, AT2, AT3; delays d1, d2] (A. Wächter, C. Visweswariah, and A. R. Conn, 2005)
Strategic Bidding in Electricity Markets • Independent operator collects bids and sets production schedule and "spot price" to minimize cost to consumers • Electricity production companies "bid" on how much they will charge for one unit of electricity • Bilevel problem • Equivalent to an MPCC (mathematical program with complementarity constraints) • Hard geometry! (Pereira, Granville, Dix, and Barroso, 2004)
Challenges for NLP algorithms • Very large problems • Numerical noise • Availability of derivatives • Degeneracies • Difficult geometries • Expensive function evaluations • Real-time solutions needed • Integer variables • Negative curvature
Outline • Problem Formulation • Equality constrained optimization • Sequential Quadratic Programming • Inexact Framework • Unconstrained optimization and nonlinear equations • Stopping conditions for linear solver • Global Behavior • Merit function and sufficient decrease • Satisfying first order conditions • Numerical Results • Model inverse problem • Accuracy tradeoffs • Final Remarks • Future work • Negative curvature
Equality constrained optimization Goal: solve the problem of minimizing an objective subject to equality constraints – e.g., minimize the difference between observed and expected behavior, subject to atmospheric flow equations (Navier–Stokes) Define: the derivatives Define: the Lagrangian Goal: solve the KKT conditions
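Written out, a standard rendering of the slide's formulas (notation as in the usual equality constrained setting):

```latex
% The problem
\min_{x \in \mathbb{R}^n} \; f(x) \quad \text{subject to} \quad c(x) = 0
% Derivatives: objective gradient and constraint Jacobian
g(x) = \nabla f(x), \qquad A(x) = \nabla c(x)^T
% Lagrangian, with multipliers lambda
\mathcal{L}(x,\lambda) = f(x) + \lambda^T c(x)
% First-order (KKT) conditions: a nonlinear system in (x, lambda)
\nabla \mathcal{L}(x,\lambda) =
  \begin{bmatrix} g(x) + A(x)^T \lambda \\ c(x) \end{bmatrix} = 0
```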
Sequential Quadratic Programming (SQP) • Two "equivalent" step computation techniques Algorithm: Newton's method applied to the KKT conditions Algorithm: the SQP subproblem • KKT matrix: for PDE-constrained problems it cannot be formed or factored • Linear system solve: an iterative method must be used, which introduces inexactness
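The two "equivalent" step computations, written out; W_k denotes (an approximation to) the Hessian of the Lagrangian at (x_k, λ_k):

```latex
% Newton's method applied to the KKT system:
\begin{bmatrix} W_k & A_k^T \\ A_k & 0 \end{bmatrix}
\begin{bmatrix} d_k \\ \delta_k \end{bmatrix}
= - \begin{bmatrix} g_k + A_k^T \lambda_k \\ c_k \end{bmatrix}
% The SQP subproblem, whose optimality conditions are the same linear system:
\min_{d} \; g_k^T d + \tfrac{1}{2}\, d^T W_k d
\quad \text{subject to} \quad c_k + A_k d = 0
```

The equivalence holds when W_k is positive definite on the null space of A_k.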
Outline • Next: Inexact Framework • Unconstrained optimization and nonlinear equations • Stopping conditions for linear solver
Unconstrained optimization Goal: minimize a nonlinear objective Algorithm: Newton's method, with the conjugate gradient (CG) method applied to the Newton system Note: choosing any intermediate CG iterate as the step ensures global convergence to a local solution of the NLP (Steihaug, 1983)
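A minimal sketch of the truncated CG idea in the spirit of Steihaug (1983), in its trust-region form; names and tolerances are illustrative, and only Hessian-vector products are needed, so the Hessian is never formed:

```python
import numpy as np

def steihaug_cg(hess_vec, grad, radius, tol=1e-8, max_iter=100):
    """Approximately minimize m(d) = g^T d + 0.5 d^T H d over ||d|| <= radius,
    stopping CG early at negative curvature or at the trust-region boundary."""
    d = np.zeros_like(grad)
    r = grad.copy()          # residual r = g + H d (here d = 0)
    p = -r                   # first search direction: steepest descent
    if np.linalg.norm(r) < tol:
        return d
    for _ in range(max_iter):
        Hp = hess_vec(p)
        pHp = p @ Hp
        if pHp <= 0:
            # Negative curvature: follow p to the trust-region boundary.
            return _to_boundary(d, p, radius)
        alpha = (r @ r) / pHp
        d_next = d + alpha * p
        if np.linalg.norm(d_next) >= radius:
            return _to_boundary(d, p, radius)
        r_next = r + alpha * Hp
        if np.linalg.norm(r_next) < tol:
            return d_next    # residual small enough: accept the inexact step
        beta = (r_next @ r_next) / (r @ r)
        d, r, p = d_next, r_next, -r_next + beta * p
    return d

def _to_boundary(d, p, radius):
    """Return d + tau*p with tau >= 0 chosen so that ||d + tau*p|| = radius."""
    a, b, c = p @ p, 2 * (d @ p), d @ d - radius ** 2
    tau = (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)
    return d + tau * p
```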
Nonlinear equations Goal: solve a nonlinear system F(x) = 0 Algorithm: Newton's method, solving J(x_k) d_k = −F(x_k) inexactly Any step d_k with ‖F(x_k) + J(x_k) d_k‖ ≤ η_k ‖F(x_k)‖ and η_k ∈ [0, η], η < 1, ensures descent on ‖F‖ (Dembo, Eisenstat, and Steihaug, 1982) (Eisenstat and Walker, 1994)
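A sketch of the corresponding inexact Newton loop, illustrative only; the relative-residual keyword of SciPy's GMRES is `rtol` in recent releases (`tol` in older ones):

```python
import numpy as np
from scipy.sparse.linalg import gmres

def inexact_newton(F, J, x, eta=0.5, tol=1e-8, max_iter=50):
    """Inexact Newton: solve J(x) d = -F(x) only approximately, stopping the
    inner solver once ||F(x) + J(x) d|| <= eta * ||F(x)|| with eta < 1."""
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        # Inner Krylov solve, stopped early by the relative residual eta.
        d, _ = gmres(J(x), -Fx, rtol=eta)
        x = x + d   # a globalized method would add a line search here
    return x
```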
Line Search SQP Framework • Define the "exact" penalty (merit) function φ(x; π) = f(x) + π ‖c(x)‖ with penalty parameter π > 0; exact in the sense that, for sufficiently large π, minimizers of the NLP are minimizers of φ
Algorithm Outline (exact steps) • for k = 0, 1, 2, … • Compute step by solving the Newton/SQP system • Set penalty parameter to ensure descent on the merit function • Perform backtracking line search to satisfy a sufficient decrease condition • Update iterate
Exact Case Exact step minimizes the objective on the linearized constraints … which may lead to an increase in the model objective
Quadratic/linear model of merit function • Create a model m(d; π) of φ: quadratic in the objective, linear in the constraint violation • Quantify the reduction Δm(d; π) obtained from step d
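In symbols, one standard choice of this quadratic/linear model and its reduction (the quantities here follow the slide's notation):

```latex
% Local model of phi(.; pi) at x_k:
m_k(d;\pi) \;=\; f_k + g_k^T d + \tfrac{1}{2}\, d^T W_k d \;+\; \pi\,\|c_k + A_k d\|
% Model reduction produced by a step d:
\Delta m_k(d;\pi) \;=\; m_k(0;\pi) - m_k(d;\pi)
  \;=\; -g_k^T d - \tfrac{1}{2}\, d^T W_k d \;+\; \pi\big(\|c_k\| - \|c_k + A_k d\|\big)
```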
Exact Case The exact step minimizes the objective on the linearized constraints, which may increase the model objective … but this is ok, since we can account for the conflict between objective and constraints by increasing the penalty parameter
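For exact steps c_k + A_k d_k = 0, so one standard rule (with a fixed fraction τ ∈ (0,1)) is:

```latex
% Increase pi, when the left side would otherwise fail, until
\pi \;\ge\; \frac{g_k^T d_k + \tfrac{1}{2}\, d_k^T W_k d_k}{(1-\tau)\,\|c_k\|}
% (no increase is needed if the numerator is nonpositive); this guarantees
\Delta m_k(d_k;\pi) \;\ge\; \tau\,\pi\,\|c_k\| \;>\; 0
```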
First attempt… not robust • Proposition: accept any step with sufficiently small residual • Test: 61 problems from the CUTEr test set • … not enough for complete robustness • We have multiple goals (feasibility and optimality) • Lagrange multipliers may be completely off • … may not have descent!
Second attempt • Step computation: inexact SQP step • Recall the line search condition on the merit function • We can show that the directional derivative of the merit function is bounded above by the negated model reduction ... but how negative should this be?
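In the notation above, a standard derivation (the norm-based penalty from the merit function slide is assumed) gives the line search condition and the descent bound:

```latex
% Backtracking (Armijo) line search condition on the merit function:
\phi(x_k + \alpha_k d_k;\,\pi) \;\le\; \phi(x_k;\pi) \;-\; \eta\,\alpha_k\,\Delta m_k(d_k;\pi),
\qquad \eta \in (0,1)
% Descent bound: by convexity of the norm, the directional derivative satisfies
D\phi_k(d_k;\pi) \;\le\; g_k^T d_k - \pi\big(\|c_k\| - \|c_k + A_k d_k\|\big)
  \;=\; -\Delta m_k(d_k;\pi) - \tfrac{1}{2}\, d_k^T W_k d_k
```

So when W_k is positive definite, a step with Δm_k(d_k; π) sufficiently positive is a descent direction for the merit function.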
Algorithm Outline (exact steps) • for k = 0, 1, 2, … • Compute step • Set penalty parameter to ensure descent • Perform backtracking line search • Update iterate
Algorithm Outline (inexact steps) • for k = 0, 1, 2, … • Compute step and set penalty parameter to ensure descent and a stable algorithm • Perform backtracking line search • Update iterate
Inexact Case A step is acceptable if it yields a sufficient reduction in the model of the merit function for the current penalty parameter, or if it makes sufficient progress toward feasibility, in which case the penalty parameter may be increased
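Conditions roughly of the following form, paraphrasing the step acceptance tests in the authors' inexact SQP analysis; σ, β, θ ∈ (0,1) are algorithm parameters:

```latex
% Test I: sufficient reduction in the merit model for the current pi_k
\Delta m_k(d_k;\pi_k) \;\ge\; \max\{\tfrac{1}{2}\, d_k^T W_k d_k,\ \theta \|d_k\|^2\}
  \;+\; \sigma\,\pi_k \max\{\|c_k\| - \|c_k + A_k d_k\|,\ 0\}
% Test II: sufficient progress toward feasibility (pi_k is then increased as needed)
\|c_k + A_k d_k\| \;\le\; \beta\,\|c_k\|
```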
Algorithm Outline (inexact steps) • for k = 0, 1, 2, … • Iteratively solve the primal-dual (KKT) system • Until termination test I or termination test II is satisfied • Update penalty parameter • Perform backtracking line search • Update iterate
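As a loop, the outline might look like the sketch below; every `problem.*` method (`grad`, `cons`, `jac`, `kkt_iterates`, `step_tests`, `model_reduction`, `merit`) is a hypothetical interface standing in for quantities defined on the slides, not an actual library API:

```python
import numpy as np

def inexact_sqp(problem, x, lam, pi=1.0, eta=1e-4, tol=1e-6, max_iter=100):
    """Illustrative inexact line-search SQP loop (hypothetical interface)."""
    for _ in range(max_iter):
        g, c, A = problem.grad(x), problem.cons(x), problem.jac(x)
        if np.linalg.norm(g + A.T @ lam) < tol and np.linalg.norm(c) < tol:
            return x, lam                    # KKT conditions satisfied
        # Inner Krylov solve: take successively better approximate solutions
        # (d, delta) of the primal-dual system until a step test passes.
        for d, delta in problem.kkt_iterates(x, lam):
            ok, pi = problem.step_tests(x, d, pi)   # tests I/II; may raise pi
            if ok:
                break
        # The tests guarantee a positive model reduction, so backtracking below terminates.
        dm = problem.model_reduction(x, d, pi)
        alpha = 1.0
        while problem.merit(x + alpha * d, pi) > problem.merit(x, pi) - eta * alpha * dm:
            alpha *= 0.5
        x, lam = x + alpha * d, lam + alpha * delta
    return x, lam
```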
Termination Test • Observe the KKT conditions: the residual of the primal-dual system involves exactly the quantities appearing in them, so the inner solver's progress can be measured against first-order optimality
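Concretely, the residual of the primal-dual system at an inner iterate (d, δ) stacks the same quantities as the KKT conditions:

```latex
\rho_k \;=\;
\begin{bmatrix} W_k & A_k^T \\ A_k & 0 \end{bmatrix}
\begin{bmatrix} d \\ \delta \end{bmatrix}
+ \begin{bmatrix} g_k + A_k^T \lambda_k \\ c_k \end{bmatrix}
```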
Outline • Next: Global Behavior • Merit function and sufficient decrease • Satisfying first order conditions
Assumptions • The sequence of iterates is contained in a convex set and the following conditions hold: • the objective and constraint functions and their first and second derivatives are bounded • the multiplier estimates are bounded • the constraint Jacobians have full row rank and their smallest singular values are bounded below by a positive constant • the Hessian of the Lagrangian is positive definite with smallest eigenvalue bounded below by a positive constant
Sufficient Reduction to Sufficient Decrease • A Taylor expansion of the merit function bounds the actual decrease in terms of the model reduction • The accepted step therefore satisfies a sufficient (Armijo-type) decrease condition
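Schematically, with γ a constant coming from the boundedness assumptions above:

```latex
% Taylor expansion of phi along d_k, using the descent bound on D phi:
\phi(x_k + \alpha d_k;\pi) - \phi(x_k;\pi)
  \;\le\; -\alpha\,\Delta m_k(d_k;\pi) \;+\; \gamma\,\alpha^2 \|d_k\|^2
% so backtracking terminates with alpha_k bounded away from zero, and
\phi(x_{k+1};\pi) \;\le\; \phi(x_k;\pi) \;-\; \eta\,\alpha_k\,\Delta m_k(d_k;\pi)
```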
Intermediate Results • the penalty parameter is bounded above • the step norms are bounded above • the line search step sizes are bounded below by a positive constant