The Cross-Entropy Method
A Unified Approach to Rare Event Simulation and Stochastic Optimization

Dirk P. Kroese∗    Reuven Y. Rubinstein

∗Department of Mathematics, The University of Queensland, Australia
Faculty of Industrial Engineering and Management, Technion, Israel
The Cross-Entropy Method – p. 1/37
Contents
1. Introduction
2. CE Methodology
3. Application: Max-Cut Problem, etc.
4. Some Theory on CE
5. Conclusion
CE Matters

Book: R.Y. Rubinstein and D.P. Kroese. The Cross-Entropy Method: A Unified Approach to Combinatorial Optimization, Monte Carlo Simulation and Machine Learning, Springer-Verlag, New York, 2004.

Special Issue: Annals of Operations Research (Jan 2005).

The CE home page: http://www.cemethod.org
Introduction

The Cross-Entropy Method was originally developed as a simulation method for the estimation of rare-event probabilities:

Estimate P(S(X) ≥ γ)

X: random vector/process taking values in some set X. S: real-valued function on X.

It was soon realised that the CE Method could also be used as an optimization method:

Determine max_{x ∈ X} S(x)
Some Applications
Combinatorial Optimization (e.g., Travelling Salesman, Maximal Cut and Quadratic Assignment Problems)
Noisy Optimization (e.g., Buffer Allocation, Financial Engineering)
Multi-Extremal Continuous Optimization
Pattern Recognition, Clustering and Image Analysis
Production Lines and Project Management
Network Reliability Estimation
Vehicle Routing and Scheduling
DNA Sequence Alignment
A Multi-extremal function
[Surface plot of a multi-extremal function on [−1, 1] × [−1, 1], with values ranging from 0 to 25.]
A Maze Problem
The Optimal Trajectory
[20 × 20 maze grid showing the optimal trajectory.]
A Maze Problem
Iteration 1:
[20 × 20 maze grid: trajectories at iteration 1.]
A Maze Problem
Iteration 2:
[20 × 20 maze grid: trajectories at iteration 2.]
A Maze Problem
Iteration 3:
[20 × 20 maze grid: trajectories at iteration 3.]
A Maze Problem
Iteration 4:
[20 × 20 maze grid: trajectories at iteration 4.]
Example 1: Rare Event Simulation
Consider a randomly weighted graph:
[Figure: bridge network with terminal nodes A and B joined by five weighted edges.]
The random weights X1, . . . , X5 are independent and exponentially distributed with means u1, . . . , u5.

Find the probability that the length of the shortest path from A to B is greater than or equal to γ.
Crude Monte Carlo (CMC)

Define X = (X1, . . . , X5) and u = (u1, . . . , u5). Let S(X) be the length of the shortest path from node A to node B. We wish to estimate

ℓ = P(S(X) ≥ γ) = E I{S(X)≥γ} .

This can be done via Crude Monte Carlo: sample independent vectors X1, . . . , XN from the density f(x; u) = ∏_{j=1}^{5} exp(−xj/uj)/uj, and estimate ℓ via

(1/N) ∑_{i=1}^{N} I{S(Xi)≥γ} .
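The CMC estimator above can be sketched as follows. This is illustrative only: the transcript does not show the figure, so the assumed A-to-B paths {1,4}, {1,3,5}, {2,5}, {2,3,4} of the bridge network are an assumption, and the means u are taken from the t = 0 row of the later table.

```python
import numpy as np

rng = np.random.default_rng(0)

# Means u1..u5 (from the t = 0 row of the example table).
u = np.array([0.25, 0.4, 0.1, 0.3, 0.2])

def S(X):
    """Length of the shortest A-to-B path; X has one row per sample.
    The four paths below are an assumed edge-to-path mapping."""
    x1, x2, x3, x4, x5 = X.T
    paths = np.stack([x1 + x4, x1 + x3 + x5, x2 + x5, x2 + x3 + x4])
    return paths.min(axis=0)

def cmc_estimate(gamma, N):
    # Independent exponential edge weights with means u.
    X = rng.exponential(u, size=(N, 5))
    return np.mean(S(X) >= gamma)

# A moderate gamma; for gamma = 2 the event is far too rare for CMC.
print(cmc_estimate(gamma=1.0, N=10**5))
```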
Importance Sampling (IS)

However, for small ℓ this requires a very large simulation effort.

A better way is to use Importance Sampling: draw X1, . . . , XN from a different density g, and estimate ℓ via the estimator

(1/N) ∑_{i=1}^{N} I{S(Xi)≥γ} W(Xi) ,

where W(X) = f(X)/g(X) is called the likelihood ratio.
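A minimal sketch of the IS estimator, with the sampling density g and the weight function W = f/g supplied by the caller (both names are illustrative):

```python
import numpy as np

def is_estimate(S, sampler, weight, gamma, N):
    """IS estimate of l = P(S(X) >= gamma)."""
    X = sampler(N)          # X1, ..., XN drawn from g
    W = weight(X)           # likelihood ratios f(Xi)/g(Xi)
    return np.mean((S(X) >= gamma) * W)
```

With g = f (all weights equal to 1) this reduces to the CMC estimator.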
Which Change of Measure?

If we restrict ourselves to g such that X1, . . . , X5 are independent and exponentially distributed with means v1, . . . , v5, then

W(x; u, v) := f(x; u)/f(x; v) = exp( − ∑_{j=1}^{5} xj (1/uj − 1/vj) ) ∏_{j=1}^{5} vj/uj .

In this case the “change of measure” is determined by the reference vector v = (v1, . . . , v5).
Question: How do we find the optimal v = v∗?
Answer: Let CE find it adaptively!
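The likelihood ratio W(x; u, v) for independent exponential components can be checked numerically against the ratio of densities; a small sketch (the vectors x, u, v below are illustrative):

```python
import numpy as np

def W(x, u, v):
    """W(x; u, v) = f(x; u)/f(x; v) for independent exponentials."""
    x, u, v = (np.asarray(a, float) for a in (x, u, v))
    return np.exp(-np.sum(x * (1/u - 1/v))) * np.prod(v / u)

def logpdf(x, m):
    # log of f(x; m) = prod_j exp(-xj/mj)/mj
    x, m = np.asarray(x, float), np.asarray(m, float)
    return np.sum(-x / m - np.log(m))

x = np.array([0.5, 1.0, 0.2, 0.7, 0.3])
u = np.array([0.25, 0.4, 0.1, 0.3, 0.2])
v = np.array([1.7, 1.9, 0.13, 0.71, 0.56])
print(np.isclose(W(x, u, v), np.exp(logpdf(x, u) - logpdf(x, v))))  # True
```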
CE Algorithm

1. Define v0 := u. Set t := 1 (iteration counter).
2. Update γt: Generate X1, . . . , XN according to f(·; vt−1). Let γt be the worst of the ρ × N best performances, provided this is less than γ. Else γt := γ.
3. Update vt: Use the same sample to calculate, for j = 1, . . . , n,

vt,j = ∑_{i=1}^{N} I{S(Xi)≥γt} W(Xi; u, vt−1) Xij / ∑_{i=1}^{N} I{S(Xi)≥γt} W(Xi; u, vt−1) .

THIS UPDATING IS BASED ON CE.
4. If γt = γ then proceed to step 5; otherwise set t := t + 1 and reiterate from step 2.
5. Estimate ℓ via the LR estimator, using the final vT.
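Steps 1–5 can be sketched as below for the bridge-network example. The four A-to-B paths are an assumption about the slides' (illegible) figure, so the numeric result is only indicative; under that assumption the final estimate should land near the 1.34 · 10⁻⁵ reported on the next slide.

```python
import numpy as np

rng = np.random.default_rng(3)

def S(X):
    # Shortest A-to-B path length (assumed path structure).
    x1, x2, x3, x4, x5 = X.T
    return np.minimum.reduce([x1 + x4, x1 + x3 + x5, x2 + x5, x2 + x3 + x4])

def lr(X, u, v):
    # W(x; u, v) for independent exponentials, vectorised over rows of X.
    return np.exp(-(X * (1/u - 1/v)).sum(axis=1)) * np.prod(v / u)

def ce_rare_event(u, gamma, rho=0.1, N=1000, N1=10**5):
    u = np.asarray(u, float)
    v, gamma_t = u.copy(), -np.inf           # step 1: v0 = u
    while gamma_t < gamma:
        X = rng.exponential(v, size=(N, len(u)))
        s = S(X)
        # Step 2: worst of the rho*N best performances, capped at gamma.
        gamma_t = min(np.quantile(s, 1 - rho), gamma)
        # Step 3: CE update of v.
        w = lr(X, u, v) * (s >= gamma_t)
        v = w @ X / w.sum()
    # Step 5: LR estimate of l with the final v.
    X = rng.exponential(v, size=(N1, len(u)))
    return np.mean((S(X) >= gamma) * lr(X, u, v))

print(ce_rare_event([0.25, 0.4, 0.1, 0.3, 0.2], gamma=2.0))
```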
Example
Level: γ = 2. Fraction of best performances: ρ = 0.1. Sample size in steps 2–4: N = 1000. Final sample size: N1 = 10⁵.

t    γt       vt
0             0.250  0.400  0.100  0.300  0.200
1    0.575    0.513  0.718  0.122  0.474  0.335
2    1.032    0.873  1.057  0.120  0.550  0.436
3    1.502    1.221  1.419  0.121  0.707  0.533
4    1.917    1.681  1.803  0.132  0.638  0.523
5    2.000    1.692  1.901  0.129  0.712  0.564
Example (cont.)

The estimate was 1.34 · 10⁻⁵, with an estimated relative error (that is, Std(ℓ̂)/Eℓ̂) of 0.03.

The simulation time was only 3 seconds (1/2 second for the table).

CMC with N1 = 10⁸ samples gave an estimate 1.30 · 10⁻⁵ with the same RE (0.03). The simulation time was 1875 seconds.

With minimal effort we reduced our simulation time by a factor of 625.
Example 2: The Max-Cut Problem
Consider a weighted graph G with node set V = {1, . . . , n}. Partition the nodes of the graph into two subsets V1 and V2 such that the sum of the weights of the edges going from one subset to the other is maximised.
[Example: a weighted graph on six nodes 1, . . . , 6.]
Cost matrix:

C = [ 0    c12  c13  0    0    0
      c21  0    c23  c24  0    0
      c31  c32  0    c34  c35  0
      0    c42  c43  0    c45  c46
      0    0    c53  c54  0    c56
      0    0    0    c64  c65  0   ] .

{V1, V2} = {{1, 3, 4}, {2, 5, 6}} is a possible cut. The cost of the cut is

c12 + c32 + c35 + c42 + c45 + c46.
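Computing the cost of a cut from the cost matrix can be sketched as follows. The 6-node weights below are illustrative (the slides do not give numeric weights), and nodes are 0-based in code:

```python
import numpy as np

def cut_cost(C, x):
    """Cost of the cut given cut vector x: x[i] = 1 if node i is in V1."""
    x = np.asarray(x)
    in1 = x == 1
    # Sum of weights of edges crossing from V1 to V2.
    return C[np.ix_(in1, ~in1)].sum()

# Illustrative symmetric cost matrix on six nodes (0-based labels).
C = np.zeros((6, 6))
edges = {(0, 1): 1, (0, 2): 3, (1, 2): 2, (1, 3): 4, (2, 3): 5,
         (2, 4): 2, (3, 4): 1, (3, 5): 3, (4, 5): 2}
for (i, j), w in edges.items():
    C[i, j] = C[j, i] = w

x = np.array([1, 0, 1, 1, 0, 0])   # the cut {{1,3,4},{2,5,6}} in 0-based form
print(cut_cost(C, x))              # c12 + c32 + c35 + c42 + c45 + c46 = 13.0 here
```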
Random Cut Vector

We can represent a cut via a cut vector x = (x1, . . . , xn), where xi = 1 if node i belongs to the same partition as node 1, and 0 otherwise.

For example, the cut {{1, 3, 4}, {2, 5, 6}} can be represented via the cut vector (1, 0, 1, 1, 0, 0).

Let X be the set of all cut vectors x = (1, x2, . . . , xn).

Let S(x) be the corresponding cost of the cut.

We wish to maximise S(x) via the CE method.
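The correspondence between cut vectors and partitions can be sketched directly (1-based node labels, as in the slides; the helper name is illustrative):

```python
def partitions(x):
    """Map a cut vector x = (x1, ..., xn), with x[0] == 1, to (V1, V2)."""
    V1 = {i + 1 for i, xi in enumerate(x) if xi == 1}
    V2 = {i + 1 for i, xi in enumerate(x) if xi == 0}
    return V1, V2

print(partitions((1, 0, 1, 1, 0, 0)))   # the slides' cut {{1,3,4},{2,5,6}}
```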
General CE Procedure

First, cast the original optimization problem of S(x) into an associated rare-event estimation problem: the estimation of

ℓ = P(S(X) ≥ γ) = E I{S(X)≥γ} .

Second, formulate a parameterized random mechanism to generate objects X ∈ X. Then, iterate the following steps:

• Generate a random sample of objects X1, . . . , XN ∈ X (e.g., cut vectors).
• Update the parameters of the random mechanism (obtained via CE minimization), in order to produce a better sample in the next iteration.
Generation and Updating Formulas

Generation of cut vectors: The most natural and easiest way to generate the cut vectors is to let X2, . . . , Xn be independent Bernoulli random variables with success probabilities p2, . . . , pn.

Updating formulas: From CE minimization, the updated probabilities are the maximum likelihood estimates of the ρN best samples:

pt,j = ∑_{i=1}^{N} I{S(Xi)≥γt} I{Xij=1} / ∑_{i=1}^{N} I{S(Xi)≥γt} ,   j = 2, . . . , n .
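The generation step and the updating formula above can be sketched as two small helpers (names are illustrative). Note the update is just the elite-sample mean of each coordinate, i.e. the fraction of elite samples with Xij = 1:

```python
import numpy as np

rng = np.random.default_rng(4)

def draw_cuts(p, N):
    """N cut vectors with x1 fixed to 1 and Xj ~ Ber(pj) independently."""
    X = (rng.random((N, len(p))) < p).astype(int)
    X[:, 0] = 1
    return X

def update_p(X, s, gamma_t):
    """CE / maximum-likelihood update of p from the elite samples."""
    elite = s >= gamma_t            # indicator of the rho*N best samples
    return X[elite].mean(axis=0)    # fraction of elite samples with Xij = 1
```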
Algorithm

1. Start with p0 = (1/2, . . . , 1/2). Let t := 1.
2. Update γt: Draw X1, . . . , XN from Ber(pt−1). Let γt be the worst performance of the ρ × 100% best performances.
3. Update pt: Use the same sample to calculate, for j = 1, . . . , n,

pt,j = ∑_{i=1}^{N} I{S(Xi)≥γt} I{Xij=1} / ∑_{i=1}^{N} I{S(Xi)≥γt} ,

where Xi = (Xi1, . . . , Xin).
4. If the stopping criterion is met, then stop; otherwise set t := t + 1 and reiterate from step 2.
The Cross-Entropy Method – p. 24/37
Example
Results for the case with n = 400, m = 200 nodes are given next.

Parameters: ρ = 0.1, N = 1000.

The CPU time was only 100 seconds (Matlab, Pentium III, 500 MHz).

The CE algorithm converges quickly, yielding the exact optimal solution 40000 in 22 iterations.
The Cross-Entropy Method – p. 25/37
Max-Cut
[Figure: the components of the probability vector p_t (horizontal axis: component index, 0–450; vertical axis: probability, 0–1), shown over successive iterations.]
The Cross-Entropy Method – p. 26/37
Example: Continuous Optimization
[Figure: graph of S(x) for −6 ≤ x ≤ 6, with a global maximum near x = 2 and a local maximum near x = −2.]

S(x) = e^{-(x-2)^2} + 0.8\, e^{-(x+2)^2}
The Cross-Entropy Method – p. 27/37
Matlab Program
S = inline('exp(-(x-2).^2) + 0.8*exp(-(x+2).^2)');
mu = -10; sigma = 10; rho = 0.1; N = 100; eps = 1E-3;
t = 0;                               % iteration counter
while sigma > eps
    t = t + 1;
    x = mu + sigma*randn(N,1);       % sample N points from N(mu, sigma^2)
    SX = S(x);                       % compute the performances
    sortSX = sortrows([x SX], 2);    % sort by performance (ascending)
    Xel = sortSX((1-rho)*N+1:N, 1);  % keep the rho*100% best samples
    mu = mean(Xel);                  % refit mu and sigma to the elite sample
    sigma = std(Xel);
    fprintf('%g %6.9f %6.9f %6.9f\n', t, S(mu), mu, sigma)
end
The Cross-Entropy Method – p. 28/37
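For readers without MATLAB, the same program can be written in Python (a direct translation, not from the slides; the function names are ours, and an iteration cap is added as a safeguard against a non-shrinking sigma):

```python
import math
import random

def S(x):
    """The bimodal objective from the slides."""
    return math.exp(-(x - 2) ** 2) + 0.8 * math.exp(-(x + 2) ** 2)

def ce_maximize(S, mu=-10.0, sigma=10.0, rho=0.1, N=100, eps=1e-3,
                max_iter=1000):
    """CE optimization with a normal sampling density N(mu, sigma^2):
    refit (mu, sigma) to the rho*100% best samples until sigma < eps."""
    for _ in range(max_iter):
        if sigma <= eps:
            break
        x = [random.gauss(mu, sigma) for _ in range(N)]
        # Elite sample: the rho*100% best points by performance.
        elite = sorted(x, key=S, reverse=True)[: max(1, int(rho * N))]
        mu = sum(elite) / len(elite)
        sigma = math.sqrt(sum((e - mu) ** 2 for e in elite) / len(elite))
    return mu
```

The returned mu sits at one of the two peaks (±2); in our runs it typically lands on the global maximizer x = 2, matching the slide's figure.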
Numerical Result
[Figure: four panels (Iteration 1, 4, 7, and 10) showing the normal sampling density over −5 ≤ x ≤ 5 (vertical axis 0–1); the density shrinks and concentrates as the iterations proceed.]
The Cross-Entropy Method – p. 29/37
Cross-Entropy: Some Theory
Estimate \ell := P_u(S(X) \ge \gamma) = E_u\, I_{\{S(X) \ge \gamma\}} via

\hat{\ell} = \frac{1}{N} \sum_{i=1}^{N} I_{\{S(X_i) \ge \gamma\}}\, \frac{f(X_i; u)}{g(X_i)}\,.

The best density (a zero-variance estimator!) is

g^*(x) := \frac{I_{\{S(x) \ge \gamma\}}\, f(x; u)}{\ell}\,.

Problem: g^* depends on the unknown \ell.
The Cross-Entropy Method – p. 30/37
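As a concrete instance of this estimator (our own toy choice of f and g, not from the slides): take X exponential with mean u, S(x) = x, and tilt the sampling density g to an exponential with mean w:

```python
import math
import random

def is_estimate(gamma, u=1.0, w=None, N=10_000, seed=0):
    """Importance-sampling estimate of l = P(X >= gamma) for X ~ Exp(mean u):
    draw X_i from g = Exp(mean w), average I{X_i >= gamma} f(X_i; u)/g(X_i)."""
    if w is None:
        w = gamma  # heuristic tilt: put the sampling mean at the level gamma
    rng = random.Random(seed)

    def f(x, v):  # exponential density with mean v
        return math.exp(-x / v) / v

    total = 0.0
    for _ in range(N):
        x = rng.expovariate(1.0 / w)    # expovariate takes the rate 1/w
        if x >= gamma:
            total += f(x, u) / f(x, w)  # likelihood ratio f(x; u)/g(x)
    return total / N
```

For gamma = 10 and u = 1 the true value is \ell = e^{-10} ≈ 4.5 · 10^{-5}: crude Monte Carlo with N = 10^4 samples would typically see no hits at all, while the tilted estimate is typically accurate to a few percent.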
Idea: choose g = f(·;v) such that the “distance” between thedensities g∗ and f(·;v) is minimal.
The Kullback–Leibler or cross-entropy distance is defined as

D(g, h) = E_g \log \frac{g(X)}{h(X)} = \int g(x) \log g(x)\, dx - \int g(x) \log h(x)\, dx\,.

Determine the optimal v^* from \min_v D(g^*, f(\cdot; v)).
The Cross-Entropy Method – p. 31/37
This is equivalent to solving

\max_v\; E_u\, I_{\{S(X) \ge \gamma\}} \log f(X; v)\,.

Using importance sampling again, we can rewrite this as

\max_v\; E_w\, I_{\{S(X) \ge \gamma\}}\, W(X; u, w) \log f(X; v)\,,

for any reference parameter w, where

W(x; u, w) = \frac{f(x; u)}{f(x; w)}\,.
The Cross-Entropy Method – p. 32/37
We may estimate the optimal solution v^* by solving the following stochastic counterpart:

\max_v\; \frac{1}{N} \sum_{i=1}^{N} I_{\{S(X_i) \ge \gamma\}}\, W(X_i; u, w) \log f(X_i; v)\,,

where X_1, . . . , X_N is a random sample from f(·; w). Alternatively, solve

\frac{1}{N} \sum_{i=1}^{N} I_{\{S(X_i) \ge \gamma\}}\, W(X_i; u, w)\, \nabla \log f(X_i; v) = 0\,,

where the gradient is with respect to v.
The Cross-Entropy Method – p. 33/37
The solution to the CE program can often be calculated analytically.

Note that the CE program is useful only when, under w, the event {S(X) ≥ γ} is not too rare, say of probability at least 10^{-5}.

Question: how to choose w so that this is indeed the case?

Answer: use a multi-level approach.

Introduce a sequence of reference parameters {v_t, t ≥ 0} and a sequence of levels {γ_t, t ≥ 1}, and iterate in both γ_t and v_t.
The Cross-Entropy Method – p. 34/37
Toy example 1 (continued)
Recall

f(x; v) = \exp\left( -\sum_{j=1}^{5} \frac{x_j}{v_j} \right) \prod_{j=1}^{5} \frac{1}{v_j}\,.

The optimal v follows from the system of equations

\sum_{i=1}^{N} I_{\{S(X_i) \ge \gamma\}}\, W(X_i; u, w)\, \nabla \log f(X_i; v) = 0\,.

Since

\frac{\partial}{\partial v_j} \log f(x; v) = \frac{x_j}{v_j^2} - \frac{1}{v_j}\,,
The Cross-Entropy Method – p. 35/37
we have for the jth equation

\sum_{i=1}^{N} I_{\{S(X_i) \ge \gamma\}}\, W(X_i; u, w) \left( \frac{X_{ij}}{v_j^2} - \frac{1}{v_j} \right) = 0\,,

whence

v_j = \frac{\sum_{i=1}^{N} I_{\{S(X_i) \ge \gamma\}}\, W(X_i; u, w)\, X_{ij}}{\sum_{i=1}^{N} I_{\{S(X_i) \ge \gamma\}}\, W(X_i; u, w)}\,,
which leads to the updating formula in step 3 of the Algorithm.
The Cross-Entropy Method – p. 36/37
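This derivation can be exercised end to end in Python (a sketch with our own parameter choices: equal means u, γ = 20, the sample sizes, and the iteration cap are assumptions; `math.prod` requires Python ≥ 3.8). The multi-level algorithm iterates γ_t and v = (v_1, . . . , v_n) using the updating formula above, then estimates ℓ by importance sampling under the final v:

```python
import math
import random

def ce_rare_event(gamma, n=5, u=1.0, N=2000, rho=0.1, seed=3, max_iter=100):
    """Multi-level CE estimate of l = P(X_1 + ... + X_n >= gamma) for
    independent X_j ~ Exp(mean u), using the analytic v_j updating formula."""
    rng = random.Random(seed)

    def density(x, v):  # product of exponential densities with means v_j
        return math.prod(math.exp(-xj / vj) / vj for xj, vj in zip(x, v))

    v = [u] * n
    n_elite = max(1, int(rho * N))
    for _ in range(max_iter):
        X = [[rng.expovariate(1.0 / vj) for vj in v] for _ in range(N)]
        S = [sum(x) for x in X]
        order = sorted(range(N), key=S.__getitem__, reverse=True)
        gamma_t = min(gamma, S[order[n_elite - 1]])  # level for this iteration
        # v_j update: likelihood-ratio-weighted mean of X_ij over {S >= gamma_t}
        idx = [i for i in range(N) if S[i] >= gamma_t]
        W = [density(X[i], [u] * n) / density(X[i], v) for i in idx]
        v = [sum(wi * X[i][j] for wi, i in zip(W, idx)) / sum(W)
             for j in range(n)]
        if gamma_t >= gamma:
            break
    # Final importance-sampling step with the tilted parameters v.
    X = [[rng.expovariate(1.0 / vj) for vj in v] for _ in range(N)]
    weights = [density(x, [u] * n) / density(x, v)
               for x in X if sum(x) >= gamma]
    return sum(weights) / N
```

For n = 5, u = 1 and γ = 20, the sum is Gamma(5, 1)-distributed, so the exact value ℓ = e^{-20} Σ_{k=0}^{4} 20^k/k! ≈ 1.7 · 10^{-5} is available for comparison.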
Further research
Multi-extremal constrained continuous optimization.

Noisy optimization.

Simulation with heavy-tailed distributions.

Incorporating MaxEnt (MinxEnt), e.g. MCE.

Multi-actor games.

Convergence of the CE algorithm.
The Cross-Entropy Method – p. 37/37