A Tale of Two Topics: (i) SAA Review and (ii) Testbed Update

Shane Henderson and Raghu Pasupathy

13th May 2010




PART I


Problem Statement

minimize g(x)

subject to h(x) ≥ 0,

x ∈ D ⊆ IR^q,

where

– g : D → IR can only be estimated using the “black box” estimator G_m, where G_m(x) ⇒ g(x) for all x ∈ D and m is some measure of simulation effort;

– h : D → IR^n can only be estimated using the “black box” estimator H_m, where H_m(x) ⇒ h(x) for all x ∈ D and m is some measure of simulation effort;

– D ⊆ IR^q is a known set, e.g., the non-negative orthant.
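In code, a “black box” estimator G_m is just a routine that spends m units of simulation effort and returns a noisy value converging to g(x). A minimal sketch, assuming a hypothetical objective g(x) = (x − 1)² estimated by a sample mean of noisy replications (the objective and noise model here are illustrative, not from the talk):

```python
import numpy as np

def G_m(x, m, rng=None):
    """Sample-average "black box" estimator of g(x) = E[Y(x)].

    Hypothetical example: Y(x) = (x - 1)^2 + N(0, 1) noise, so that
    g(x) = (x - 1)^2; m is the simulation effort (number of replications).
    """
    rng = np.random.default_rng(0) if rng is None else rng
    samples = (x - 1.0) ** 2 + rng.normal(0.0, 1.0, size=m)
    return samples.mean()
```

As m grows, G_m(x) concentrates around g(x) at the canonical m^(−1/2) Monte Carlo rate.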

Shane Henderson and Raghu Pasupathy A Tale of Two Topics:(i) SAA Review and (ii) Testbed

Page 4: A Tale of Two Topics: (i) SAA Review and (ii) Testbed Update.users.iems.northwestern.edu/~nelsonb/workshop/Henderson.pdfβ(m)(V∗ m −v ∗) ⇒ minx∈π∗Y(x). – Rate of convergence

Introduction SAA Consistency Rate Results Algorithmic Results References

Notes and Some Notation

– The case of known h has been studied far more.

– The feasible region resulting from the constraints h and the region D are usually assumed to be closed and convex.

– Denote (π∗, v∗) as the set of global minima and the global minimum value corresponding to the problem. Denote λ∗ as the set of local minima (appropriately defined) of the problem.

– Usually an element of π∗ or an element of λ∗ is requested.


Sample Average Approximation (SAA)

Logic:

1. “Generate” a sample-path problem with sample size m.

2. Use a procedure to “solve” the sample-path problem

minimize G_m(x)

subject to h(x) ≥ 0,

x ∈ D ⊆ IR^q.

Algorithm Parameters:

(i) procedure for solving the sample-path problems;

(ii) sample size m;

(iii) if the sample-path problem can only be solved numerically, the error tolerance ε to within which the sample-path problem should be solved.
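The SAA logic above can be sketched end to end. This is a minimal illustration, assuming a hypothetical problem minimize g(x) = E[(x − Z)²] with Z ~ N(1, 1), so x∗ = 1 and v∗ = 1; the “procedure” for the sample-path problem is a crude grid search:

```python
import numpy as np

def saa_solve(m, seed=1):
    """One SAA run: fix a sample path of size m, then minimize the
    resulting *deterministic* function G_m by grid search.
    """
    rng = np.random.default_rng(seed)
    z = rng.normal(1.0, 1.0, size=m)        # step 1: generate the sample path
    grid = np.linspace(-2.0, 4.0, 2001)     # step 2: "solve" the sample-path
    vals = np.array([np.mean((x - z) ** 2) for x in grid])  # problem
    i = int(np.argmin(vals))
    return grid[i], vals[i]                 # (X*_m, V*_m)
```

As m grows, the returned pair approaches (x∗, v∗) = (1, 1); the key point of SAA is that the sample is fixed first, so the inner minimization is an ordinary deterministic problem.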

Shane Henderson and Raghu Pasupathy A Tale of Two Topics:(i) SAA Review and (ii) Testbed

Page 6: A Tale of Two Topics: (i) SAA Review and (ii) Testbed Update.users.iems.northwestern.edu/~nelsonb/workshop/Henderson.pdfβ(m)(V∗ m −v ∗) ⇒ minx∈π∗Y(x). – Rate of convergence

Introduction SAA Consistency Rate Results Algorithmic Results References

SAA Refinement — Retrospective Approximation (RA)

Logic:

1. “Generate” the kth sample-path problem with sample size m_k.

2. Use a procedure to solve the kth sample-path problem to within error tolerance ε_k. Obtain a retrospective solution X_k.

3. Weight the obtained solutions to get

X̄_k = ∑_{j=1}^{k} w_j X_j,  where w_j ≥ 0 and ∑_{j=1}^{k} w_j = 1.

4. Update k = k + 1 and go to Step 1.

Algorithm Parameters:

(i) procedure for solving the sample-path problems;

(ii) sample-size sequence {m_k};

(iii) error-tolerance sequence {ε_k}.
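The RA loop can be sketched as follows. This is a toy illustration on the same hypothetical problem as before (minimize E[(x − Z)²], Z ~ N(1, 1), x∗ = 1), where each sample-path problem happens to be solvable exactly (its minimizer is the sample mean) and solutions are weighted by sample size; the growth factor and weighting rule are illustrative choices, not prescriptions from the talk:

```python
import numpy as np

def ra_solve(n_iters=6, m0=100, growth=1.5, seed=2):
    """Retrospective Approximation sketch: solve sample-path problems of
    growing size m_k, then return the sample-size-weighted solution."""
    rng = np.random.default_rng(seed)
    m_k, sols, sizes = m0, [], []
    for _ in range(n_iters):
        z = rng.normal(1.0, 1.0, size=m_k)  # k-th sample path
        sols.append(z.mean())               # X_k: exact sample-path solution
        sizes.append(m_k)
        m_k = int(np.ceil(growth * m_k))    # geometric sample-size growth
    w = np.array(sizes, dtype=float)
    w /= w.sum()                            # weights w_j >= 0, summing to 1
    return float(w @ np.array(sols))        # weighted solution X̄_k
```

Early iterations are cheap and position the iterate; later, larger sample paths refine it, and the weighting pools all the work done.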


Retrospective Approximation

[Figure, shown incrementally over four slides: RA iterates X_1, X_2, X_3, X_4 plotted in the (z_1, z_2) plane, approaching x∗.]

X_i : approximate solution to the ith sample-path problem
X∗_i : true solution to the ith sample-path problem
x∗ : true solution to the problem


SAA and RA — When?

Advantages

1. When sample-path problems have structure so that generic search procedures are guaranteed to work well.

2. When sample-path problems have special structure that is known and can be utilized for efficiency. (Surprising counterexample provided by Nemirovski et al. [9].)

3. Advances in deterministic mathematical programming are at our disposal, in principle.

Disadvantages

1. When the user cannot be expected to choose an appropriate procedure to solve sample-path problems.

2. Sample paths are poorly behaved or have no known structure, so the choice of procedure is unclear.

3. Incorporation of variance-reduction techniques can be difficult.


An Outline of Key Results

1. Consistency

– Convergence of optimal value (SAA and RA).
– Convergence of optimal solution (SAA and RA).

2. Speed of Convergence

– CLT-type results for optimal value (SAA and RA).
– CLT-type results for optimal solutions (SAA and RA).
– Results under special conditions.

3. Algorithmic Results

– Minimum sample-size results (SAA).
– Quality of solution / confidence-interval-type results (SAA).
– Parameter-choice results (RA).

4. Results relating to stochastic feasible regions.


Consistency (SAA and RA)

Theorem (Shapiro [14])

1. If lim_{m→∞} sup_{x∈D} |g(x) − G_m(x)| = 0 wp1, then V∗_m → v∗ wp1.

2. If lim_{m→∞} sup_{x∈D} |g(x) − G_m(x)| = 0 wp1, D is compact, and g is continuous, then dist(π∗, Π∗_m) → 0 wp1 as m → ∞.

– The results carry over to the RA context in a straightforward manner.

– Corresponding results for local minima are provided by Bastin et al. [1].

– Results on epiconvergence are provided by Dupačová and Wets [4], and Robinson [12].


Consistency — Optimal Value

g(x) = 1 + (1 − x)^2, 0 < x < 1,

G_m(x) = 0.5 · I_(0,1/m)(x) + g(x) · I_[1/m,1)(x), 0 < x < 1.

Here G_m(x) → g(x) pointwise for every x, yet V∗_m = 0.5 for all m while v∗ = 1: pointwise convergence of G_m does not ensure V∗_m → v∗, which is why the uniform-convergence hypothesis is needed.
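This counterexample is deterministic and easy to check numerically. A small sketch (the finite grid standing in for the open interval (0, 1) is of course only illustrative):

```python
import numpy as np

def g(x):
    return 1.0 + (1.0 - x) ** 2

def G_m(x, m):
    """Counterexample: G_m agrees with g except on (0, 1/m), where it is 0.5."""
    return np.where(x < 1.0 / m, 0.5, g(x))

x = np.linspace(1e-6, 1.0 - 1e-6, 100001)
for m in (10, 1000, 10000):
    print(m, float(G_m(x, m).min()))  # V*_m stays at 0.5, while inf g = 1
```

No matter how large m becomes, the dip near 0 keeps the sample-path minimum at 0.5, so the optimal values never converge.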


Consistency — Optimal Solution

G_m(x) = g(x) · I_(0,m)(x) + v∗ · I_[m,∞)(x), 0 < x < ∞.

Here G_m agrees with g on (0, m) but is flat at the level v∗ on [m, ∞), so minimizers of G_m can drift off to infinity even though V∗_m → v∗: convergence of the optimal value does not imply convergence of the optimal solution without compactness.


Speed of Convergence (SAA and RA) — Optimal Value

Theorem (Pflug [11], Shapiro [14])

Suppose β(m) is a function satisfying lim_{m→∞} β(m) = ∞ such that β(m)(G_m − g) ⇒ Y ∈ C(D), where C(D) is the linear space of continuous functions on D. Then

β(m)(V∗_m − v∗) ⇒ min_{x∈π∗} Y(x).

– The rate of convergence of the “black box” estimator is transferred over to the optimal value.

– When g is an expectation, the functional CLT condition is satisfied under a Lipschitz condition on G_m, where the Lipschitz constant has finite second moment.

– Corresponding CLTs for the optimal solution can be found in King and Rockafellar [5], and Shapiro [13].


Speed of Convergence — Important Special Cases

Theorem (Kleywegt et al. [6])

Let D be finite, g(x) = E[Y(x)], and G_m(x) = m^{−1} ∑_{i=1}^{m} Y_i(x), where Y_1(x), Y_2(x), . . . are iid copies of a random variable Y(x). Denote π∗(ε) = {x : g(x) − v∗ ≤ ε} and Π∗_m(δ) = {x : G_m(x) − V∗_m ≤ δ}. Then, if δ ≤ ε,

Pr{Π∗_m(δ) ⊄ π∗(ε)} ≤ |D \ π∗(ε)| exp{−m γ(δ, ε)},

where γ(δ, ε) = min_{x∈D\π∗(ε)} I_x(−δ), I_x(·) being the rate function associated with the sequence {G_m(x)}.

– The probability of not obtaining an ε-optimal solution drops exponentially in m.

– This result forms the essence of most minimum sample-size results [7, 14].
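The exponential decay is easy to observe empirically. A sketch for a hypothetical three-point problem D = {0, 1, 2} with g(x) = 0.2x and standard normal noise (all of these choices are illustrative), estimating the probability that the sample-path minimizer misses the true optimum x = 0:

```python
import numpy as np

def prob_miss(m, n_reps=2000, seed=4):
    """Estimate P(argmin G_m != argmin g) for a finite toy problem."""
    rng = np.random.default_rng(seed)
    g = np.array([0.0, 0.2, 0.4])           # true objective on D = {0, 1, 2}
    misses = 0
    for _ in range(n_reps):
        # G_m(x): sample mean of m noisy observations at each x in D
        gm = g + rng.normal(0.0, 1.0, size=(m, 3)).mean(axis=0)
        misses += int(np.argmin(gm) != 0)
    return misses / n_reps

for m in (10, 40, 160):
    print(m, prob_miss(m))  # the miss probability decays rapidly in m
```

Each fourfold increase in m halves the noise scale relative to the optimality gaps, so the miss probability shrinks at an exponential rate in m, as the theorem predicts.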


Speed of Convergence — Important Special Cases

Theorem (Shapiro and Homem-de-Mello [15])

Let g be a finite-valued function having a sharp minimum, i.e., g satisfies g(x) ≥ g(x∗) + c‖x − x∗‖ for x ∈ D, where c is a positive constant and x∗ is the unique minimum. Let g(x) = E[Y(x)] and G_m(x) = m^{−1} ∑_{i=1}^{m} Y_i(x), where Y_1(x), Y_2(x), . . . are iid copies of a random variable Y(x) having fixed finite support. Then, if G_m is convex and the set D is closed and convex, Π∗_m = {x∗} for all large enough m wp1.

– The result is easily extended to contexts where π∗ is not a singleton.

– The earlier result on exponential convergence holds in this special case as well.


Results for Solution Quality

Theorem (Mak et al. [8])

Let g(x) = E[Y(x)] and G_m(x) = m^{−1} ∑_{i=1}^{m} Y_i(x), where Y_1(x), Y_2(x), . . . are iid copies of a random variable Y(x). Then,

1. E[V∗_m] ≤ v∗;

2. E[V∗_{m+1}] ≥ E[V∗_m];

3. if x ∈ D, then 0 ≤ g(x) − v∗ ≤ g(x) − E[V∗_m].

– The result is very general; e.g., D need not be convex. (See also Birge [3] for related results.)

– Mak et al. [8] use the above result to construct confidence intervals on the optimality gap of a candidate solution.

– Bayraksan and Morton [2] extend this to a sequential procedure for constructing confidence intervals.
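Property 1 suggests a simple recipe for a statistical lower bound on v∗, the idea behind the bounds of Mak et al. [8]: replicate V∗_m over independent batches and average. A sketch for the hypothetical problem minimize E[(x − Z)²], Z ~ N(1, 1), where v∗ = 1 and V∗_m is simply the batch sample variance (batch counts and sizes are illustrative):

```python
import numpy as np

def lower_bound_ci(n_batches=30, m=500, seed=3):
    """Average independent replicates of V*_m; since E[V*_m] <= v*, the
    resulting confidence interval sits at or below the true v*."""
    rng = np.random.default_rng(seed)
    v_stars = []
    for _ in range(n_batches):
        z = rng.normal(1.0, 1.0, size=m)
        v_stars.append(np.mean((z - z.mean()) ** 2))  # V*_m for this batch
    v_stars = np.array(v_stars)
    half = 1.96 * v_stars.std(ddof=1) / np.sqrt(n_batches)
    return v_stars.mean() - half, v_stars.mean() + half
```

Pairing such a lower-bound interval with an upper bound g(x̂) evaluated at a candidate x̂ yields a confidence interval on the optimality gap, as in Mak et al. [8].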


Parameter Choice in RA

How should one choose the sequence of sample sizes {m_k} and the sequence of error tolerances {ε_k} in RA? Consider the following three conditions.

C.1. When the numerical procedure used to solve the sample-path problems exhibits

(a) linear convergence: lim inf_{k→∞} ε_k √(m_{k−1}) > 0;

(b) polynomial convergence: lim inf_{k→∞} log(1/√(m_{k−1})) / log(ε_k) > 0.

C.2. lim sup_{k→∞} (∑_{j=1}^{k} m_j) ε_k^2 < ∞.

C.3. lim sup_{k→∞} (∑_{j=1}^{k} m_j) m_k^{−1} < ∞.


Theorem (Pasupathy [10])

If the sequences {ε_k}, {m_k} satisfy conditions C.1, C.2, and C.3, and π∗ = {x∗}, then W_k ‖X_k − x∗‖^2 = O_p(1).

Theorem (Pasupathy [10])

If even one of the conditions C.1, C.2, or C.3 is violated, and π∗ = {x∗}, then W_k ‖X_k − x∗‖^2 →_p ∞.

Do C.1–C.3 admit valid choices? (Rows: convergence rate of the sample-path procedure; columns: sample-size growth.)

                 Exp. Growth               Pol. Growth               Lin. Growth
                 (m_k = e^{1.1} m_{k−1})   (m_k = m_{k−1}^{1.1})     (m_k = 1.1 m_{k−1})
Pol. Conv.       N                         Y                         Y
Lin. Conv.       N                         N                         Y
S-Lin. Conv.     N                         N                         NA


References

[1] F. Bastin, C. Cirillo, and P. L. Toint. Convergence theory for nonconvex stochastic programming with an application to mixed logit. Mathematical Programming, 108:207–234, 2006.

[2] G. Bayraksan and D. P. Morton. A sequential sampling procedure for stochastic programming. Operations Research, 2009. To appear.

[3] J. R. Birge. The value of the stochastic solution in stochastic linear programs with recourse. Mathematical Programming, 24:314–325, 1982.

[4] J. Dupačová and R. J. B. Wets. Asymptotic behavior of statistical estimators and of optimal solutions of stochastic optimization problems. The Annals of Statistics, 16:1517–1549, 1988.

[5] A. J. King and R. T. Rockafellar. Asymptotic theory for solutions in statistical estimation and stochastic programming. Mathematics of Operations Research, 18:148–162, 1993.

[6] A. J. Kleywegt, A. Shapiro, and T. Homem-de-Mello. The sample average approximation method for stochastic discrete optimization. SIAM Journal on Optimization, 12:479–502, 2001.

[7] J. Luedtke and S. Ahmed. A sample approximation approach for optimization with probabilistic constraints. SIAM Journal on Optimization, 2008. To appear.

[8] W. K. Mak, D. P. Morton, and R. K. Wood. Monte Carlo bounding techniques for determining solution quality in stochastic programs. Operations Research Letters, 24:47–56, 1999.

[9] A. Nemirovski, A. Juditsky, G. Lan, and A. Shapiro. Robust stochastic approximation approach to stochastic programming. SIAM Journal on Optimization, 19(4), 2009.

[10] R. Pasupathy. On choosing parameters in retrospective-approximation algorithms for stochastic root finding and simulation optimization. Operations Research, 2010. To appear.

[11] G. C. Pflug. Optimization of Stochastic Models: The Interface Between Simulation and Optimization. Kluwer, Boston, MA, 1996.

[12] S. Robinson. Analysis of sample-path optimization. Mathematics of Operations Research, 21:513–528, 1996.

[13] A. Shapiro. Statistical inference of stochastic optimization problems. In S. P. Uryasev, editor, Probabilistic Constrained Optimization: Methodology and Applications, pages 282–304. Kluwer Academic Publishers, 2000.

[14] A. Shapiro. Monte Carlo sampling methods. In A. Ruszczynski and A. Shapiro, editors, Stochastic Programming, Handbooks in Operations Research and Management Science, pages 353–426. Elsevier, 2004.

[15] A. Shapiro and T. Homem-de-Mello. On the rate of convergence of optimal solutions of Monte Carlo approximations of stochastic programs. SIAM Journal on Optimization, 11(1):70–86, 2000.


PART II


Introducing ... an SO Testbed


Recall Objectives

– Fill the stated and yet unfulfilled need for a carefully designed testbed of SO problems. (The stochastic programming community has a few of its own libraries, e.g., SIPLIB for stochastic integer programs, POSTS for linear recourse problems.)

– Actively draw attention to the finite-time performance of algorithms, through the use of finite-time performance measures.

– Identify particular problem types that defy efficient solution.

– Increase the visibility and usage of SO formulation and solution.


Problem Organization Within Testbed

[Figure: problem classification tree. All Problems split first by variable type (Continuous, Integer-Ordered, Categorical), then into Constrained vs. Unconstrained, and then into Smooth vs. Non-Smooth. Examples: Call Center Staffing, Ranking & Selection, the Newsvendor Problem, Parameter Estimation, (s,S) Inventory, the Rosenbrock Function, Ambulance Location.]


Where do we stand?

[Figure: current problem counts. All Problems (23): Continuous Variables (10) — Unconstrained (7): Smooth (4), Non-Smooth (3); Constrained (3): Smooth (0), Non-Smooth (3). Integer-Ordered Variables (12) — Unconstrained (3), Constrained (9). Categorical Variables (1). Examples as before: Call Center Staffing, Ranking & Selection, the Newsvendor Problem, Parameter Estimation, (s,S) Inventory, the Rosenbrock Function, Ambulance Location.]


Where do we stand and what’s next?

1. Good: The testbed is close to having a critical mass of problems.

2. Bad: Not as much variety as we would like to see.

3. Bad: Small numbers of continuous-variable and categorical-variable problems.

What's next?

1. One PhD student and one undergrad comb past WSC proceedings.

2. A large source of potential continuous-variable problems: approximate DP.


Group Discussion

1. What does the apparent dearth of problems tell us?

2. Should particular categories be coalesced?

3. Is it time yet to launch the testbed?

4. To submit or simply view, point your browser to www.simopt.org.
