http://www.mosek.com
Solving Linear Optimization Problems with MOSEK.
Bo Jensen ∗
MOSEK ApS, Fruebjergvej 3, Box 16, 2100 Copenhagen, Denmark.
Email: [email protected]
INFORMS Annual Meeting Seattle Nov. 7, 2007
∗Erling D. Andersen
Introduction
Topics
Introduction
Topics
The linear optimizer
The simplex optimizers
Computational results
Conclusions
■ The problem:

      (P)   min  cᵀx
            st   Ax = b,
                 x ≥ 0.
■ The linear optimizers.
◆ Interior-point optimizer (not the main focus in this talk).
◆ Simplex optimizer.
■ What are the recent improvements?
■ What is the (relative) performance?
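For readers who want to experiment, the standard-form problem (P) can be reproduced with any LP solver. A minimal sketch using SciPy's HiGHS interface follows; the data is invented for illustration and is not from the talk.

```python
import numpy as np
from scipy.optimize import linprog

# Standard-form LP (P): min c^T x  s.t.  A x = b, x >= 0.
# Tiny invented instance: only x3 has zero cost, so it absorbs all of b.
c = np.array([1.0, 2.0, 0.0])
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([4.0])

# method="highs-ds" selects HiGHS's dual simplex, one of the two
# simplex variants discussed in this talk.
res = linprog(c, A_eq=A, b_eq=b, bounds=[(0, None)] * 3, method="highs-ds")
assert res.status == 0
print(res.x, res.fun)  # optimum: x = [0, 0, 4], objective 0
```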
The linear optimizer
The general flow:

■ Presolve.
■ Form the reduced primal or dual.
■ Scale (optimizer specific).
■ Optimize (interior-point or simplex).
■ Basis identification (interior-point only).
■ Undo scaling and dualizing.
■ Postsolve.
The simplex optimizers
What makes a good simplex optimizer?
■ Exploit sparsity (i.e. LU, FTRAN and BTRAN routines).
■ Exploit problem-dependent structure.
■ Choose the right path (i.e. a good pricing strategy).
■ Long steps (i.e. avoid degeneracy).
■ Numerical stability (i.e. reliable and consistent results).
■ Fast hot starts (i.e. MIP and other hot-start applications).
■ Other tricks.
MOSEK simplex overview
■ Primal and dual simplex optimizer.
◆ Efficient cold start and warm start.
◆ Crashes an initial basis.
◆ Multiple pricing options:
  ■ Full (Dantzig).
  ■ Partial.
  ■ Approximate/exact steepest edge.
  ■ Hybrid.
◆ Degeneracy handling.
■ Revised simplex algorithm + many enhancements.
■ Many enhancements still possible!
Exploiting sparsity aggressively
■ Simplex algorithms require the solution of the linear equation systems

        Bf = A:j   and   Bᵀg = eᵢ

  in each iteration.
■ Assume a sparse LU factorization of the basis B = LU.
■ f can be computed as follows. Solve

        Lf̄ = A:j

  and then

        Uf = f̄.

■ A simple implementation requires O(nz(L) + nz(U)) flops.
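As a concrete toy illustration of the two triangular solves, here is a sketch with SciPy's sparse triangular solver; the L and U factors are built by hand rather than produced by a real basis factorization.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import spsolve_triangular

# Two-stage FTRAN: solve L fbar = A[:, j], then U f = fbar,
# so that (L U) f = B f = A[:, j]. Hand-built toy factors.
L = csr_matrix(np.array([[1.0, 0.0, 0.0],
                         [0.5, 1.0, 0.0],
                         [0.0, 0.0, 1.0]]))
U = csr_matrix(np.array([[2.0, 0.0, 1.0],
                         [0.0, 3.0, 0.0],
                         [0.0, 0.0, 4.0]]))
rhs = np.zeros(3)
rhs[0] = 1.0                                    # a very sparse column A[:, j]

fbar = spsolve_triangular(L, rhs, lower=True)   # forward solve
f = spsolve_triangular(U, fbar, lower=False)    # backward solve
assert np.allclose((L @ U) @ f, rhs)
```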
Exploiting sparsity aggressively (continued)
■ Consider the simple example:

        [ 1         ] [ f̄1 ]   [ 0 ]
        [ 0   1     ] [ f̄2 ] = [ x ]
        [ x   0   1 ] [ f̄3 ]   [ 0 ]
■ Clearly sparsity in the RHS can be exploited! (done extensively in MOSEK).
■ Gilbert and Peierls [GIL:88] demonstrate how to solve the triangular system in O(minimal number of flops).
■ Aim: solves with L and U, and updates to the LU, should run in O(minimal number of flops) and not, for instance, in O(m).
■ Drawback: both L and U must be stored row-wise and column-wise because solves with Lᵀ and Uᵀ are required too.
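The Gilbert and Peierls idea can be sketched in a few lines: a symbolic DFS over the columns reachable from the RHS nonzeros predicts the pattern of the solution, so the numeric phase touches only those entries. The data layout below (a dict of strictly-lower column entries with an implicit unit diagonal) is invented for illustration.

```python
def sparse_lower_solve(L_cols, b_nz, b_val):
    """Solve L x = b for unit lower-triangular L, touching only the
    nonzero pattern, in the spirit of Gilbert and Peierls [GIL:88].
    L_cols[j] = list of (i, v) with i > j; diagonal is implicitly 1.
    b is given by its nonzero indices b_nz and values b_val."""
    # Symbolic phase: DFS from the RHS nonzeros over column dependencies.
    reach, seen = [], set()

    def dfs(j):
        if j in seen:
            return
        seen.add(j)
        for i, _ in L_cols[j]:
            dfs(i)
        reach.append(j)              # post-order

    for j in b_nz:
        dfs(j)
    # Numeric phase: scatter b, then eliminate in reverse post-order.
    x = {}
    for j, v in zip(b_nz, b_val):
        x[j] = x.get(j, 0.0) + v
    for j in reversed(reach):
        xj = x.get(j, 0.0)
        if xj == 0.0:
            continue
        for i, v in L_cols[j]:
            x[i] = x.get(i, 0.0) - v * xj
    return x

# Toy L = [[1,0,0],[0,1,0],[5,0,1]] and b = e0 (first unit vector):
x = sparse_lower_solve({0: [(2, 5.0)], 1: [], 2: []}, [0], [1.0])
print(x)  # column 1 is never touched: {0: 1.0, 2: -5.0}
```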
Primal (dual) Degeneracy
The simplex optimizer may take very small or zero step sizes. Why?

■ Primal step size δp:

      lB ≤ xB − δp B⁻¹aq ≤ uB

■ Basic variables on a bound may imply a zero primal step.
■ Dual step size δd:

      cj − yᵀAj − (+) δd (eᵢᵀB⁻¹N)j ≥ 0   ∀j ∈ NL
      cj − yᵀAj − (+) δd (eᵢᵀB⁻¹N)j ≤ 0   ∀j ∈ NU

■ Nonbasic variables with zero reduced cost may imply a zero dual step.

Degeneracy poses both a theoretical and a practical problem for the simplex optimizer!

What are our options?

One approach is to perturb lj and uj (cj).
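The primal min-ratio test above can be sketched directly, and the zero step caused by a basic variable sitting on its bound is then easy to see (toy data invented here):

```python
import numpy as np

def primal_step(xB, lB, uB, w):
    """Largest step d satisfying lB <= xB - d * w <= uB, where
    w = B^-1 a_q. A sketch of the textbook ratio test only."""
    steps = []
    for xi, li, ui, wi in zip(xB, lB, uB, w):
        if wi > 1e-12:               # x_i decreases toward l_i
            steps.append((xi - li) / wi)
        elif wi < -1e-12:            # x_i increases toward u_i
            steps.append((ui - xi) / -wi)
    return min(steps, default=np.inf)

# Nondegenerate: the blocking variable is at distance 2 from its bound.
assert primal_step([2.0], [0.0], [10.0], [1.0]) == 2.0
# Degenerate: a basic variable already on its lower bound forces d = 0.
assert primal_step([0.0, 5.0], [0.0, 0.0], [10.0, 10.0], [1.0, 1.0]) == 0.0
```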
Primal (dual) Degeneracy (continued)
MOSEK 5 has been improved on degenerate problems:

■ Better and more aggressive perturbation scheme.
■ Sparsity issues are important (very tricky).
■ Clean up perturbations with the dual (primal) simplex.
■ Many examples where "tailed" solves are substantially reduced.
■ Still room for improvement.
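A perturbation scheme of the kind mentioned above can be sketched as follows; the exact rule MOSEK uses is not stated in the talk, so this is only a generic illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def perturb_bounds(l, u, eps=1e-7):
    """Relax every bound by a tiny random amount so ties in the
    ratio test (the source of degenerate pivots) become unlikely.
    Generic sketch; not MOSEK's actual scheme."""
    lp = l - eps * (1.0 + rng.random(l.shape))
    up = u + eps * (1.0 + rng.random(u.shape))
    return lp, up

l, u = np.zeros(4), np.ones(4)
lp, up = perturb_bounds(l, u)
assert np.all(lp < l) and np.all(up > u)   # every bound strictly relaxed
```

After solving the perturbed problem, the perturbations are removed and the remaining infeasibilities are cleaned up, as the slide describes, with the other simplex variant.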
Dual bound flipping idea used more aggressively
Dual step size δd:

      cj − yᵀAj − (+) δd (eᵢᵀB⁻¹N)j ≥ 0   ∀j ∈ NL
      cj − yᵀAj − (+) δd (eᵢᵀB⁻¹N)j ≤ 0   ∀j ∈ NU

A ranged variable, i.e. −∞ < lj < xj < uj < ∞, may not be binding in the dual min-ratio test if it is profitable to flip it.

■ This involves flipping nonbasic variables to the opposite bound to remain dual feasible, and costs one extra solve.
■ Longer dual step lengths.
■ Reduces degeneracy.
■ Fewer iterations.
■ More flexibility in pivot choice (i.e. potentially more stable).
■ Improves sparsity of the basis when degenerate! (i.e. if xBi becomes feasible, no basis exchange is needed).
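A long-step ("bound flipping") dual ratio test in the style of [KOS:02] and [MAR:03] can be sketched as follows; the data and the simplified bookkeeping are invented for illustration:

```python
import numpy as np

def dual_ratio_with_flips(d, alpha, is_ranged, flip_gain, infeas):
    """Sketch of the long-step dual ratio test. Candidate j blocks at
    ratio d[j]/alpha[j]; a ranged candidate may be flipped to its
    opposite bound instead of entering, as long as the leaving
    variable's primal infeasibility stays positive."""
    order = np.argsort(np.asarray(d) / np.asarray(alpha))
    flips = []
    for j in order:
        if is_ranged[j] and infeas - flip_gain[j] > 0:
            infeas -= flip_gain[j]         # flip j and pass its ratio
            flips.append(int(j))
        else:
            return d[j] / alpha[j], flips  # j enters the basis
    return np.inf, flips

step, flips = dual_ratio_with_flips(
    d=[0.1, 0.2, 0.9], alpha=[1.0, 1.0, 1.0],
    is_ranged=[True, True, False],
    flip_gain=[0.3, 0.3, 0.0], infeas=1.0)
assert flips == [0, 1] and step == 0.9   # two flips, a longer dual step
```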
Dual bound flipping idea used more aggressively (continued)
Bound flipping examples:

                                    Iter             Time
Problem    Rows     Cols        NB      WB       NB      WB
osa-60    10280   232966     6938    5111    58.12    8.84
world     34506    32734    54566   32606   218.81   50.03
pds-40    66844   212859    34274   26599    96.51   18.48
ken-18   105127   154699   151203   51452   258.18   13.92
client    27216    20567    80555   63660   208.40   84.09

WB = MOSEK 5 dual simplex with bound flips
NB = MOSEK 5 dual simplex with no bound flips
Numerical stability
■ Improving numerical stability.
  ◆ Moved the LU update before updating the solution.
    ■ Saves one solve with L in eᵢᵀB⁻¹ [GOL:77].
    ■ A more stable approach.
  ◆ Better handling of singularities (singular variables are temporarily fixed).
  ◆ Switch to a safe mode if deemed unstable.
Network optimizer
MOSEK 5 features a network simplex optimizer.
■ Solves pure network flow problems (i.e. LPs with two nonzeros in each column, either 1 or −1).
■ Can extract embedded network structure in a model (i.e. a network with side constraints).
■ Using the standard interface, only one parameter has to be set.
■ Huge problems can be solved in limited time; for instance, a problem with 8 million variables can be solved in less than 200 seconds.
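A pure network column pattern (a node-arc incidence matrix) is easy to test for. A minimal sketch follows, noting that MOSEK's extraction of embedded networks is more general than this check:

```python
import numpy as np
import scipy.sparse as sp

def is_pure_network(A):
    """True if every column of A has exactly two nonzeros,
    one +1 and one -1 (a node-arc incidence matrix)."""
    A = sp.csc_matrix(A)
    for j in range(A.shape[1]):
        col = A.data[A.indptr[j]:A.indptr[j + 1]]
        if len(col) != 2 or sorted(col) != [-1.0, 1.0]:
            return False
    return True

# Incidence matrix of a 3-node path with arcs 0->1 and 1->2:
A = np.array([[ 1.0,  0.0],
              [-1.0,  1.0],
              [ 0.0, -1.0]])
assert is_pure_network(A)
assert not is_pure_network(np.ones((2, 2)))
```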
Computational results
Test setup
■ 577 problems (mixed size).
■ A dual-core server with 4GB RAM running Windows 2003 (Intel CPU).
■ A quad-core server with 8GB RAM running Windows 2003 (Intel CPU).
■ See [HM:07] for a benchmark comparing MOSEK with other solvers.

All results presented in one table are obtained using only one of the two computers.
Network vs. standard simplex
                    small                        medium
              netw    psim    dsim       netw      psim     dsim
Num.            30      30      30         43        43       43
Firsts          30       0       1         43         0        0
Total time    13.7   114.8    27.8      589.9   10676.6   3015.2
G. avg.       0.39    2.42    0.70       6.30     91.74    19.70

                    large
              netw    psim    dsim
Num.             2       2       2
Firsts           2       0       0
Total time   366.3  2905.8   968.9
G. avg.     182.98 1115.71  468.76
Table 1: Performance of the network flow, primal simplex and dual simplex optimizers on pure network problems.
Primal Simplex
                  small           medium             large
                 5       4       5       4        5        4
Num.           399     399     148     148       30       30
Firsts         329     245      91      62       22       11
Total time   100.4   101.7  2425.3  8962.3  29905.2  39333.2
G. avg.       0.06    0.07    7.49    9.24   591.39   746.01
Table 2: Performance of the version 4 and version 5 primal simplex optimizers.
Dual Simplex
                  small           medium             large
                 5       4       5       4        5        4
Num.           412     412     150     150       21       21
Firsts         198     286     133      22       18        5
Total time    84.8   106.4  1852.9  7611.3  23678.9  38994.3
G. avg.       0.10    0.08    4.65    8.70   544.44  1065.24
Table 3: Performance of the version 4 and version 5 dual simplex optimizers.
Numerically difficult problems: primal simplex
                  small           medium             large
                 5       4       5       4        5        4
Num.             9       9      19      19        2        2
Firsts           5       5      13       6        2        0
Total time     2.7     2.8   235.9   319.6   1297.7   1503.3
G. avg.       0.19    0.18    7.19    9.54   413.26   464.04
Fails            0       0       0       3        0        3
Table 4: Performance of versions 4 and 5 of the primal simplex optimizer on numerically difficult problems.
Numerically difficult problems: dual simplex
                  small           medium             large
                 5       4       5       4        5        4
Num.            11      11      19      19        4        4
Firsts           7       6      13       6        4        0
Total time     3.9     6.6  3198.3   345.9   4736.3  12820.5
G. avg.       0.24    0.31    8.44    9.67   802.24  2525.35
Fails            0       0       0       1        0        1
Table 5: Performance of versions 4 and 5 of the dual simplex optimizer on numerically difficult problems.
Conclusions
Conclusions
■ Simplex:
◆ MOSEK 5 is substantially faster than MOSEK 4.
◆ MOSEK 5 is more stable than MOSEK 4.
◆ The dual simplex is faster than the primal.
A number of open issues exist
■ Simplex:
◆ Degeneracy (a non-perturbation method might be needed in extreme cases).
◆ Improve primal pricing.
◆ Better crashing on special problems.
◆ Choose a sparser path.
References
[HM:07] H. Mittelmann, http://plato.la.asu.edu/bench.html

[GIL:88] J. R. Gilbert and T. Peierls, "Sparse partial pivoting in time proportional to arithmetic operations", SIAM J. Sci. Statist. Comput., 9, 1988, pp. 862–874.

[GOL:77] D. Goldfarb, "On the Bartels-Golub decomposition for linear programming bases", Mathematical Programming, 13, 1977, pp. 272–279.

[KOS:02] E. Kostina, "The Long Step Rule in the Bounded-Variable Dual Simplex Method: Numerical Experiments", Mathematical Methods of Operations Research, 55, 2002, Issue 3.

[MAR:03] I. Maros, "A Generalized Dual Phase-2 Simplex Algorithm", European Journal of Operational Research, 149, 2003, pp. 1–16.