
MINLP: Theory, Algorithms, Applications: Lecture 3, Basics of Algorithms

Jeff Linderoth, Industrial and Systems Engineering

University of Wisconsin-Madison

Jonas Schweiger, Friedrich-Alexander-Universität Erlangen-Nürnberg

June 28, 2016


Outline

Algorithms for solving convex MINLP problems...

NLP branch-and-bound

Linearization-based methods

Outer approximation
LP/NLP branch-and-bound
Extended cutting plane (ECP) algorithm


Algorithms – Convex

Resetting Notation

zminlp = min_x  f(x)
s.t.  gi(x) ≤ 0,  i = 1, . . . , m
      x ∈ X,  xI ∈ Z^|I|

We will sometimes switch to a “vector” notation and denote gi(x) ≤ 0, i = 1, . . . , m simply as g(x) ≤ 0.


Algorithms – Convex NLP-Based Branch and Bound

Natural Relaxation (Convex MINLP)

For a convex MINLP

(x1 − 2)2 + (x2 + 1)2 ≤ 36

3x1 − x2 ≤ 6

0 ≤ x1, x2 ≤ 5

x2 ∈ Z

Dropping integrality results in a convex, nonlinear relaxation

Ideal relaxation is convex hull of feasible points

Optimizing a linear function over this convex set solves the problem!

[Figure: feasible region in the (x1, x2) plane]


Nonlinear Branch and Bound

Solve NLP relaxation: (Relax xI ∈ Z|I|).

min f(x)   s.t.   gi(x) ≤ 0 ∀ i ∈ [m],   x ∈ X

If f(·), g(·) are convex (differentiable), then there exist solvers that (should) find the (globally) optimal solution to the relaxation

If xi ∈ Z ∀ i ∈ I, then solved MINLP

If relaxation is infeasible, then MINLP infeasible

Otherwise (recursively) search a tree whose nodes are NLPs:

min_x  f(x)
s.t.   g(x) ≤ 0,
       x ∈ X,
       li ≤ xi ≤ ui, ∀ i ∈ I.        (NLP(l, u))

NLP relaxation is NLP(−∞,∞)


Branching

Solution x′ of (NLP(l, u)) feasible but not integral:

Find a nonintegral variable, say x′i, i ∈ I. Introduce two child nodes with bounds (l−, u−) = (l+, u+) = (l, u) and set:

u−i := ⌊x′i⌋  and  l+i := ⌈x′i⌉

⇒ create two problems NLP(l−, u−), NLP(l+, u+) (down/up branch)

Pruning Rules

1 (NLP(l, u)) infeasible ⇒ NLPs in subtree also infeasible

2 Integer feasible solution x(l,u) of (NLP(l, u)):

If f(x(l,u)) < U, then new x∗ = x(l,u) and U = f(x(l,u)). Prune node: no better solution in subtree

3 Optimal value of (NLP(l, u)), f(x(l,u)) ≥ U ⇒ prune node: no better integer solution in subtree


Nonlinear Branch and Bound

Solve relaxed NLP (0 ≤ x ≤ 1) . . . solution gives lower bound

1 Solve NLPs & branch on xi until:

2 Node infeasible

3 Node integer feasible ⇒ get upper bound (U)

4 Lower bound ≥ U: node dominated by the upper bound

Search until no unexplored nodes

It Works Theorem

Assume that:

X bounded polyhedral set;

NLP solver returns global min.

⇒ BnB terminates at optimal solution

[Figure: branch-and-bound tree with nodes marked infeasible, integer feasible (UBD), or dominated by UBD; branching on x = 0 and x = 1]


Branch and Bound (1 of 2)

1 Select an NLP Relaxation

2 Solve the selected NLP relaxation. If the NLP relaxation is infeasible, go to 1.

3 Otherwise, let x∗ be the optimal solution to the NLP relaxation and let z∗ be its objective value.

4 If x∗ satisfies integer restrictions (x∗I ∈ Z^|I|), then z∗ is the optimal value for the restricted IP problem. If z∗ < zU, then zU := z∗. (Also keep track of x∗ as candidate solution). Go to 1.


Branch and Bound (2 of 2)

5 Otherwise, x∗ does not satisfy the integer restrictions. z∗ is a lower bound on the optimal value of the restricted IP problem.

6 If z∗ ≥ zU, (Fathom) Go to 1. (There is no way that an optimal solution can be in this restricted region).

7 Otherwise, Divide. Choose some index k ∈ I such that xk does not satisfy the integer restriction (x∗k ∉ Z). Add two new restricted problems to the list and go to 1.

1 Include constraint xk ≤ ⌊x∗k⌋
2 Include constraint xk ≥ ⌈x∗k⌉
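The steps above fit in a few lines of code. Below is a minimal Python sketch of NLP-based branch and bound; it is only an illustration, and solve_nlp(l, u) is an assumed (hypothetical) interface that solves the node problem NLP(l, u) and reports feasibility, an optimal point, and its objective value.

import math

def nlp_branch_and_bound(solve_nlp, I, l, u, tol=1e-6):
    # solve_nlp(l, u) is assumed to return (feasible, x, z): whether NLP(l, u)
    # is feasible, an optimal point x (dict: index -> value), and its value z.
    # I is the set of indices of the integer variables.
    x_best, z_U = None, math.inf                     # incumbent and upper bound
    nodes = [(dict(l), dict(u))]                     # list of open nodes (l, u)
    while nodes:
        l_k, u_k = nodes.pop()                       # 1. select a node
        feasible, x, z = solve_nlp(l_k, u_k)         # 2./3. solve the node NLP
        if not feasible or z >= z_U:                 # 6. fathom: infeasible or dominated
            continue
        frac = [i for i in I if abs(x[i] - round(x[i])) > tol]
        if not frac:                                 # 4. integer feasible: new incumbent
            x_best, z_U = x, z
            continue
        k = frac[0]                                  # 7. divide on a fractional variable
        u_down = dict(u_k); u_down[k] = math.floor(x[k])   # x_k <= floor(x*_k)
        l_up = dict(l_k);  l_up[k] = math.ceil(x[k])       # x_k >= ceil(x*_k)
        nodes.append((l_k, u_down))
        nodes.append((l_up, u_k))
    return x_best, z_U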


Algorithms – Convex Performance Guarantees

Bounds and performance guarantees

We saw in the description of the algorithm how to get an upper bound zU on the optimal objective function value

We can also get a lower bound on the optimal objective value. zL is the minimum of all of the lower bounds of all of the remaining restricted problems that must be evaluated

If we only want to solve a problem to within α% of optimality, we can!!!

This is really what is great about optimization. Not only do you get a solution, but you can tell your boss that if there is a better solution it can be at most α% better.


GAMS and Tolerances

option optcr = 0.0

option optca = 0.0

optcr: Stop if (zU − zL)/max{|zL|, 1} ≤ optcr

optca: Stop if (zU − zL) ≤ optca
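For example, if the best remaining lower bound is zL = 100 and the incumbent gives zU = 105, the relative gap is (105 − 100)/max{|100|, 1} = 0.05, so optcr = 0.05 would let the solver stop, while the settings above (optcr = optca = 0) require closing the gap completely.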


Implementations of NLP-based B&B (GAMS)

BONMIN, KNITRO, SBB

Bonmin also has other algorithms – check manual

CPLEX, Gurobi, XPRESS (only for quadratic)


Setting Options

To change the solver, use option <problemtype>=<solvername>;

e.g.: option minlp=bonmin;

Each solver has lots of options that might help improve solver performance. See documentation!

(Mac: /Applications/GAMS24.7/sysdir/docs/solvers)

Options are mostly for advanced users, but can improve performance, especially for MINLP

To ensure GAMS reads the options, put the line modelname.optfile=1;

after the model statement but before the solve statement (where modelname is the name of your model)

Then build a file in the GAMS project directory, called <solvername>.opt, which lists the option names followed by the chosen value.

A “simple”/self-documenting way to create options files is with the GAMS $onecho, $offecho directives


Example GAMS Setting Options

option miqcp = cplex;

m1.optfile = 1;

* This sets the MIQCP algorithm in cplex to NLP-BB,

* and will print log output every 10 nodes of BB tree

$onecho > cplex.opt

miqcpstrat 1

mipinterval 10

$offecho

solve m1 using miqcp minimizing cost;


Algorithms – Convex QG Algorithm Background

Linearization Based Methods

For Convex MINLP

(x1 − 2)2 + (x2 + 1)2 ≤ 36

3x1 − x2 ≤ 6

0 ≤ x1, x2 ≤ 5

x2 ∈ Z

“Natural” relaxation—dropping integrality—is nonlinear

If you don’t like NLP (like me), then relax (convex) nonlinear constraints with first-order approximations

[Figure: feasible region in the (x1, x2) plane and its first-order (outer) approximation]


Linearization Based Methods

MILP Master Problem

Goal

Avoid solving NLP subproblems so frequently.

To do so, we’ll derive a MILP reformulation of MINLP

To simplify notation, we’ll assume the objective is linear (remember trick from last time)

zminlp = min_x  c^T x
s.t.  gi(x) ≤ 0,  i = 1, . . . , m
      x ∈ X,  xI ∈ Z^|I|


First step: MILP relaxation

Implication of convexity (over X)

For any x̄ ∈ X, the following inequality holds:

gi(x) ≥ gi(x̄) + ∇gi(x̄)^T (x − x̄),   ∀ x ∈ X

Then, if K ⊆ X is a finite set, the following is an MILP relaxation:

zR(K) = min_x  c^T x
s.t.  gi(x̄) + ∇gi(x̄)^T (x − x̄) ≤ 0,   i = 1, . . . , m,  ∀ x̄ ∈ K
      x ∈ X,  xI ∈ Z^|I|
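To make a single linearization concrete, here is a small Python sketch (using NumPy; the function names are illustrative, not from any particular solver) that turns the gradient inequality at a point x̄ into a linear cut a^T x ≤ b, applied to the circle constraint (x1 − 2)^2 + (x2 + 1)^2 − 36 ≤ 0 from the earlier example.

import numpy as np

def linearization_cut(g, grad_g, x_bar):
    # Return (a, b) such that a^T x <= b is equivalent to
    # g(x_bar) + grad_g(x_bar)^T (x - x_bar) <= 0.
    a = grad_g(x_bar)
    b = a @ x_bar - g(x_bar)
    return a, b

# Circle constraint from the example: (x1 - 2)^2 + (x2 + 1)^2 - 36 <= 0
g = lambda x: (x[0] - 2) ** 2 + (x[1] + 1) ** 2 - 36
grad_g = lambda x: np.array([2 * (x[0] - 2), 2 * (x[1] + 1)])

a, b = linearization_cut(g, grad_g, np.array([5.0, 3.0]))
print(a, b)   # a = [6, 8], b = 65: the cut 6*x1 + 8*x2 <= 65 is valid for all feasible x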

Question

Does there exist a finite set K such that zR(K) = zminlp?


A finite set of solutions

Define the following set and NLP problems:

K = {y ∈ Z|I| : ∃x ∈ X with xI = y}.

If X is bounded, then K is a finite set

NLP(y):
   min_x  c^T x
   s.t.   gi(x) ≤ 0,  i = 1, . . . , m
          x ∈ X,  xI = y

FNLP(y):
   min_x  ‖xI − y‖
   s.t.   gi(x) ≤ 0,  i = 1, . . . , m
          x ∈ X

For each y ∈ K, define xy by:

If NLP(y) is feasible: xy is an optimal solution of NLP(y)

Else: xy is an optimal solution of FNLP(y)


MILP formulation

Theorem

Assume a constraint qualification holds at xy for NLP(y) or FNLP(y), as relevant, for each y ∈ K. Then the following MILP formulation, MILP(K), solves the convex MINLP:

z∗(K) = min_x  c^T x
s.t.  gi(xy) + ∇gi(xy)^T (x − xy) ≤ 0,   ∀ i = 1, . . . , m,  y ∈ K
      x ∈ X,  xI ∈ Z^|I|


Constraint qualification?

These ensure that Lagrange multipliers satisfying KKT conditions exist

Alternatively, it means the problem can be approximated by linear constraints

Not always satisfied!

Examples:

Linear-independence constraint qualification: gradients of active constraints are linearly independent at the solution

Mangasarian–Fromovitz constraint qualification


Just a Little Nonlinear Programming Theory

Consider a nonlinear program with f convex:

min_x  f(x)
s.t.   gi(x) ≤ 0,  i = 1, . . . , m

Theorem

Assume x∗ is a local solution of the NLP and a constraint qualification holds at x∗. Then there exists a Lagrange multiplier vector λ∗ such that:

0 = ∇f(x∗) + Σ_{i=1}^{m} λ∗i ∇gi(x∗)

gi(x∗) ≤ 0,   λ∗i ≥ 0,   λ∗i gi(x∗) = 0,   i = 1, . . . , m


Outer Approximation (Duran and Grossmann, 1986)

0. Initialize set K (e.g., solve NLP relaxation, or ∅).

1. Solve MILP(K).
   If infeasible: MINLP is infeasible.
   Else: let x be an optimal solution and let y = xI. Note that z∗(K) provides a lower bound on the optimal objective value.

2. Solve NLP(y).
   If infeasible: solve FNLP(y) and let xy be an optimal solution.
   Else: let xy be an optimal solution to NLP(y). Note that c^T xy provides an upper bound on the optimal objective value.
   If c^T xy = c^T x: Stop. xy is an optimal solution.

3. Add y to K and go to step 1.
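A minimal Python sketch of this loop follows; the three routines passed in are assumed (hypothetical) interfaces for the master MILP(K), NLP(y), and FNLP(y) subproblems, and K here collects the points xy at which the linearizations are generated.

import math

def outer_approximation(solve_milp_master, solve_nlp_fixed, solve_fnlp, tol=1e-6):
    # Assumed interfaces:
    #   solve_milp_master(K) -> (feasible, x, y, z_L)  solves MILP(K), y = x_I
    #   solve_nlp_fixed(y)   -> (feasible, x_y, z)     solves NLP(y)
    #   solve_fnlp(y)        -> x_y                    solves FNLP(y)
    K, x_best, z_U = [], None, math.inf
    while True:
        feasible, x, y, z_L = solve_milp_master(K)     # step 1: gives a lower bound
        if not feasible:
            return x_best, z_U                         # MINLP infeasible if no incumbent
        nlp_feasible, x_y, z = solve_nlp_fixed(y)      # step 2: gives an upper bound
        if nlp_feasible:
            if z < z_U:
                x_best, z_U = x_y, z
            if z_U - z_L <= tol:                       # bounds meet: stop with optimum
                return x_best, z_U
        else:
            x_y = solve_fnlp(y)                        # linearize at the FNLP solution
        K.append(x_y)                                  # step 3: add the point and repeat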


Outer-Approximation

Implemented in the GAMS solver DICOPT.

Uses CONOPT to solve NLP(y)
Uses CPLEX to solve MILP(K).

Questions

1 Do you really want to solve lots of MILPs to solve one MINLP?

2 Do you have to solve lots of MILPs to solve one MINLP?

Answers

1 No

2 No


The Master

MP(K): Outer-Approximation MILP Master Problem

zmp(K) = min  η
s.t.  f(xk) + ∇f(xk)^T (x − xk) ≤ η    ∀ xk ∈ K            (MP(K))
      gj(xk) + ∇gj(xk)^T (x − xk) ≤ 0    ∀ xk ∈ K, ∀ j
      x ∈ X,  xI ∈ Z^|I|

Thm: zmp(K) ≤ zminlp
Thm: If K contains the “right” points, then zminlp = zmp(K)
Solve this (one) MILP with branch-and-cut


LP/NLP-BB (Quesada-Grossmann)

Start solving the master MILP (MP(K)) ... using MILP branch-and-cut.

If xkI ∈ Z^|I|, then interrupt the MILP. Solve NLP(xkI) to get xk

Linearize f, gj about xk ⇒ add linearization to tree

continue MILP tree-search

... until entire tree is fathomed
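A compact Python sketch of this single-tree scheme is below; solve_lp and solve_nlp are assumed (hypothetical) interfaces, and cuts is a shared pool of outer-approximation cuts that every node LP includes.

import math

def lp_nlp_branch_and_bound(solve_lp, solve_nlp, I, l, u, tol=1e-6):
    # Assumed interfaces:
    #   solve_lp(l, u, cuts) -> (feasible, x, z)             node LP of the master problem
    #   solve_nlp(y)         -> (feasible, x_y, z, new_cuts) NLP with x_I fixed to y, plus
    #                           linearizations of f and g at its solution
    cuts = []                                  # shared pool of OA cuts
    x_best, z_U = None, math.inf
    nodes = [(dict(l), dict(u))]
    while nodes:                               # ordinary MILP tree search ...
        l_k, u_k = nodes.pop()
        while True:
            feasible, x, z = solve_lp(l_k, u_k, cuts)
            if not feasible or z >= z_U:       # fathom the node
                break
            frac = [i for i in I if abs(x[i] - round(x[i])) > tol]
            if frac:                           # fractional: branch as usual
                k = frac[0]
                u_down = dict(u_k); u_down[k] = math.floor(x[k])
                l_up = dict(l_k);  l_up[k] = math.ceil(x[k])
                nodes.append((l_k, u_down))
                nodes.append((l_up, u_k))
                break
            # ... interrupted at integer points: solve the NLP, add linearizations,
            # and re-solve the same node LP with the new cuts
            y = tuple(round(x[i]) for i in I)
            nlp_feasible, x_y, z_nlp, new_cuts = solve_nlp(y)
            if nlp_feasible and z_nlp < z_U:
                x_best, z_U = x_y, z_nlp
            cuts.extend(new_cuts)
    return x_best, z_U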


LP/NLP Branch and Bound

LP/NLP-based branch-and-bound

Branch-and-cut algorithm with cuts from NLP solves

Create MILP relaxation of MINLP & refine linearizations

[Figure: nonlinear feasible region 0 ≥ g(x) and its polyhedral outer approximation 0 ≥ g(x(k)) + ∇g(x(k))^T (x − x(k))]

Search MILP-tree ⇒ faster re-solves

Interrupt MILP tree-search to create new linearizations


LP/NLP-Based Branch and Bound

Algorithmic refinements, e.g. [1]

Advanced MILP search and cut management techniques... remove “old” OA cuts from LP relaxation ⇒ faster LP

Generate cuts at non-integer points (ECP cuts)... generate cuts early (near root) of tree

Strong branching, adaptive node selection & cut management

Fewer nodes, if we add more cuts (e.g. ECP cuts)
More cuts make the LP harder to solve ⇒ remove outdated/inactive cuts from the LP relaxation

... balance OA accuracy with LP solvability

Compressing OA cuts into Benders cuts can be OK

... interpret as hybrid algorithm, [3]


Extended Cutting Plane (ECP) Method

ECP is variation of OA

Does not solve any NLPs

Linearize f, g around the solution of the MILP, x(k):
If x(k) is feasible for the nonlinear constraints, then we have solved the MINLP
Otherwise, pick a linearization violated by x(k) and add it to the MILP

Properties of ECP

Convergence follows from OA & Kelley’s cutting plane method

NLP convergence rate is linear

Can visit same integer more than once ...

... single-tree methods use ECP cuts to speed up convergence
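A minimal Python sketch of the ECP loop, assuming a linear objective, a hypothetical solve_milp(cuts) routine that returns an optimal point of the current MILP relaxation, and the nonlinear constraints given as (g, grad_g) pairs of callables:

import numpy as np

def extended_cutting_plane(solve_milp, constraints, tol=1e-6, max_iter=1000):
    # solve_milp(cuts) is assumed to return an optimal point x (NumPy array) of the
    # MILP relaxation with the given cuts, each cut (a, b) meaning a^T x <= b.
    # constraints is a list of (g, grad_g) callables describing g_i(x) <= 0.
    cuts = []
    for _ in range(max_iter):
        x = solve_milp(cuts)                    # solve the current MILP relaxation
        viols = [g(x) for g, _ in constraints]
        worst = int(np.argmax(viols))
        if viols[worst] <= tol:                 # x satisfies all g_i: MINLP solved
            return x
        g, grad_g = constraints[worst]          # linearize the most violated
        a = grad_g(x)                           # constraint at x and add the cut
        cuts.append((a, a @ x - g(x)))
    raise RuntimeError("ECP did not converge within max_iter iterations")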


LP/NLP-BB = Branch and Cut for MINLP

This really is just a branch-and-cut method for solving MINLP

One slight difference: At integer feasible points, we must solve an NLP and also branch

Branch-and-cut frameworks (like MINTO) have this functionality.

We need an NLP solver: Filter-SQP. Sven Leyffer’s award-winning filter sequential quadratic programming (active set) code.

A Very Poor Acronym

FilMINT = Filter + MINTO

Why FilMINT Could Be Good

1 Use MINTO’s “advanced” MIP Features for free.

2 Does Exploiting MIP Technology make any difference?


Computational Experiments

Convex MINLPs from a variety of sources: GAMS MINLP World, MacMINLP, IBM-CMU Team

≈ 50% easy: solved by B&B solver in < 1 min. (ignored)
37 moderate: solved by B&B solver in < 1 hour
85 hard: took > 1 hour

Performance Profile

An empirical CDF of relative solver performance

The “probability” that a solver will be at most x times worse (slower) than the best solver for an instance

“High” lines denote more effective solvers
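For concreteness, a small Python sketch that computes the points of such a profile from a table of solve times; the numbers in the example are toy values for illustration only, not results from the experiments above.

def performance_profile(times):
    # times[solver][instance] = solve time.  For each solver, return the sorted
    # performance ratios r = t(solver, instance) / best time on that instance;
    # the profile value at x is the fraction of instances with r <= x.
    instances = list(next(iter(times.values())).keys())
    best = {p: min(times[s][p] for s in times) for p in instances}
    return {s: sorted(times[s][p] / best[p] for p in instances) for s in times}

# Toy example with two solvers and three instances
times = {"solverA": {"i1": 10.0, "i2": 50.0, "i3": 3.0},
         "solverB": {"i1": 20.0, "i2": 40.0, "i3": 3.0}}
print(performance_profile(times))
# solverA: [1.0, 1.0, 1.25] -> within a factor 1 of the best on 2 of 3 instances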


MINTO v3.1 MILP Features

Preprocessing

Cutting planes

Knapsack covers, flow covers, clique inequalities, implication cuts

Primal heuristic: Diving-based

Branching

Pseudo-cost based branching.

Node selection strategies

Adaptive (Depth first + best estimate).

The MILP Force

Do fancy-pants MILP techniques make a difference for the LP/NLP-BB (QG) Algorithm?


YES! Performance Profile: Moderate Instances

[Figure: performance profile on the moderate instances; x-axis: time increase factor, y-axis: % instances; legend includes QG]


(Old) Comparison of Solvers

[Figure: performance profile comparing QG/FilMINT, BONMIN, BONMIN-v2, and MINLP-BB; x-axis: time increase factor, y-axis: % instances]


“Better” Linearization-Based Method for MISOCP

For problems with SOC constraints:

‖Ai x + bi‖2 − pi^T x + qi ≤ 0

there exist “better” linearization mechanisms that rely on the power of projection

Use the compact extended LP relaxation of second-order cone constraints from Ben-Tal and Nemirovski [2]
Used by Vielma, Ahmed, and Nemhauser [4] to solve some instances quite effectively

When an integer-feasible solution xI = y is obtained at a node, solve NLP(y)

SCIP has such a linearization mechanism for SOC constraints


(Convex) Solver Types

α-ECP: Extended Cutting Plane

Bonmin: Both NLP-BB and Linearization-Based

DICOPT: Outer Approximation

KNITRO: Both NLP-BB and Linearization-Based

SCIP: Linearization-Based. (?) (constraints/nonlinear/assumeconvex)


Conclusions

The two main algorithms for convex MINLP are NLP-based branch and bound and “linearization”-based methods such as Outer-Approximation or LP/NLP-BB

If the problem is “mostly” linear, then I have found that linearization-based methods are most effective.

If the problem is “highly nonlinear”, then probably NLP-BB is a better choice of algorithm family

Often NLP-based B&B is used as a “heuristic” even if the problem is not convex.

The NLP solver at the nodes may find some feasible solutions
But there is no guarantee about the quality of these solutions.


References

[1] K. Abhishek, S. Leyffer, and J. T. Linderoth. FilMINT: An outer-approximation-based solver for nonlinear mixed integer programs. INFORMS Journal on Computing, 22:555–567, 2010.

[2] A. Ben-Tal and A. Nemirovski. On polyhedral approximations of the second-order cone. Mathematics of Operations Research, 26:193–205, 2001.

[3] P. Bonami, G. Cornuéjols, A. Lodi, and F. Margot. A feasibility pump for mixed integer nonlinear programs. Mathematical Programming, 119:331–352, 2008.

[4] J. P. Vielma, S. Ahmed, and G. L. Nemhauser. A lifted linear programming branch-and-bound algorithm for mixed integer conic quadratic programs. INFORMS Journal on Computing, 2008. To appear.
