AdaBoost Talk (cmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf)
TRANSCRIPT
AdaBoost
Lecturer: Jan Sochman
Authors: Jan Sochman, Jiří Matas
Center for Machine Perception, Czech Technical University, Prague
http://cmp.felk.cvut.cz
2/17
Presentation
Motivation
AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)
Outline:
• AdaBoost algorithm
  • How does it work?
  • Why does it work?
• Online AdaBoost and other variants
3/17
What is AdaBoost?
AdaBoost is an algorithm for constructing a "strong" classifier as a linear combination
f(x) = Σ_{t=1}^{T} α_t h_t(x)
of "simple" "weak" classifiers h_t(x) : X → {−1, +1}.
Terminology
• h_t(x) ... "weak" or basis classifier, hypothesis, "feature"
• H(x) = sign(f(x)) ... "strong" or final classifier/hypothesis
Interesting properties
• AB is capable of reducing both the bias (e.g. stumps) and the variance (e.g. trees) of the weak classifiers
• AB has good generalisation properties (it maximises the margin)
• AB output converges to the logarithm of the likelihood ratio
• AB can be seen as a feature selector with a principled strategy (minimisation of an upper bound on the empirical error)
• AB is close to sequential decision making (it produces a sequence of gradually more complex classifiers)
4/17
The AdaBoost Algorithm
Given: (x_1, y_1), ..., (x_m, y_m); x_i ∈ X, y_i ∈ {−1, +1}
Initialise weights D_1(i) = 1/m
For t = 1, ..., T:
• Find h_t = argmin_{h_j ∈ H} ε_j, where ε_j = Σ_{i=1}^{m} D_t(i) [y_i ≠ h_j(x_i)]
• If ε_t ≥ 1/2 then stop
• Set α_t = (1/2) log((1 − ε_t)/ε_t)
• Update
  D_{t+1}(i) = D_t(i) exp(−α_t y_i h_t(x_i)) / Z_t
  where Z_t is a normalisation factor
Output the final classifier:
H(x) = sign(Σ_{t=1}^{T} α_t h_t(x))
[Figure: the evolving decision boundary and the training error per step, shown for t = 1, ..., 40; the training error drops from about 0.35 towards 0]
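The loop above can be sketched in code. Below is a minimal Python implementation using decision stumps as the weak classifiers; the stump search and all function names are illustrative choices, not part of the slides:

```python
import numpy as np

def best_stump(X, y, D):
    """Exhaustively search axis-aligned threshold stumps for the one with
    minimal weighted error eps_j = sum_i D(i) [y_i != h_j(x_i)]."""
    best_err, best_params = np.inf, None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for s in (1, -1):
                pred = s * np.where(X[:, j] <= thr, 1, -1)
                err = D[pred != y].sum()
                if err < best_err:
                    best_err, best_params = err, (j, thr, s)
    return best_params, best_err

def stump_predict(params, X):
    j, thr, s = params
    return s * np.where(X[:, j] <= thr, 1, -1)

def adaboost(X, y, T=40):
    m = len(y)
    D = np.full(m, 1.0 / m)              # D_1(i) = 1/m
    hs, alphas = [], []
    for _ in range(T):
        params, eps = best_stump(X, y, D)
        if eps >= 0.5:                   # weak learner no better than chance: stop
            break
        eps = max(eps, 1e-10)            # guard against log of zero
        alpha = 0.5 * np.log((1 - eps) / eps)
        pred = stump_predict(params, X)
        D = D * np.exp(-alpha * y * pred)
        D = D / D.sum()                  # division by the normalisation factor Z_t
        hs.append(params)
        alphas.append(alpha)
    return hs, alphas

def strong_classify(hs, alphas, X):
    f = sum(a * stump_predict(p, X) for p, a in zip(hs, alphas))
    return np.sign(f)
```

The quadratic stump search is only meant to mirror the argmin step literally; a practical implementation would sort each feature once and sweep thresholds.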
5/17
Reweighting
Effect on the training set:
D_{t+1}(i) = D_t(i) exp(−α_t y_i h_t(x_i)) / Z_t
exp(−α_t y_i h_t(x_i)) is < 1 if y_i = h_t(x_i), and > 1 if y_i ≠ h_t(x_i)
⇒ The weight of wrongly classified examples increases, the weight of correctly classified examples decreases
⇒ The weight is an upper bound on the error of a given example
⇒ All information about previously selected "features" is captured in D_t
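The two cases can be checked with concrete numbers (ε_t = 0.2 is an arbitrary example): then α_t = ½ log 4 = log 2, so a correct example's weight is multiplied by ½ and a wrong one's by 2 before normalisation.

```python
import numpy as np

eps = 0.2                                      # example weighted error of h_t
alpha = 0.5 * np.log((1 - eps) / eps)          # = log 2, positive since eps < 1/2

factor_correct = np.exp(-alpha * (+1) * (+1))  # case y_i = h_t(x_i)
factor_wrong = np.exp(-alpha * (+1) * (-1))    # case y_i != h_t(x_i)
print(factor_correct, factor_wrong)            # ~0.5 and ~2.0
```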
6/17
Upper Bound Theorem
Theorem: The following upper bound holds on the training error of H:
(1/m) |{i : H(x_i) ≠ y_i}| ≤ Π_{t=1}^{T} Z_t
Proof: By unravelling the update rule,
D_{T+1}(i) = D_T(i) exp(−α_T y_i h_T(x_i)) / Z_T
           = exp(−Σ_t α_t y_i h_t(x_i)) / (m Π_t Z_t)
           = exp(−y_i f(x_i)) / (m Π_t Z_t)
If H(x_i) ≠ y_i then y_i f(x_i) ≤ 0, implying that exp(−y_i f(x_i)) ≥ 1, thus
[H(x_i) ≠ y_i] ≤ exp(−y_i f(x_i))
(1/m) Σ_i [H(x_i) ≠ y_i] ≤ (1/m) Σ_i exp(−y_i f(x_i)) = Σ_i (Π_t Z_t) D_{T+1}(i) = Π_t Z_t
[Plot: the exponential loss exp(−y f(x)) as a function of y f(x), upper-bounding the 0/1 error]
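The bound can be checked numerically. A small sketch on a made-up 1D problem (the data and the exhaustive stump search are illustrative): after every round, the training error of H must stay below the running product of the Z_t.

```python
import numpy as np

x = np.arange(8.0)
y = np.array([-1, -1, 1, 1, 1, 1, -1, -1])   # no single stump is perfect here

def best_stump(D):
    """Return predictions and weighted error of the best threshold stump."""
    best = None
    for thr in np.arange(-0.5, 8.5):
        for s in (1, -1):
            pred = s * np.where(x <= thr, 1, -1)
            err = D[pred != y].sum()
            if best is None or err < best[1]:
                best = (pred, err)
    return best

D = np.full(8, 1 / 8)
f = np.zeros(8)
bound = 1.0
for t in range(10):
    pred, eps = best_stump(D)
    eps = np.clip(eps, 1e-10, 1 - 1e-10)
    alpha = 0.5 * np.log((1 - eps) / eps)
    f += alpha * pred
    w = D * np.exp(-alpha * y * pred)
    Z = w.sum()                    # normalisation factor Z_t
    D, bound = w / Z, bound * Z    # bound = product of Z_t so far
    train_err = np.mean(np.sign(f) != y)
    assert train_err <= bound + 1e-12   # the theorem, checked after each round
```

Note that the unravelling proof never uses the specific value of α_t, so the inequality holds round by round for any choice of coefficients.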
7/17
Consequences of the Theorem
• Instead of minimising the training error, its upper bound can be minimised
• This can be done by minimising Z_t in each training round by:
  • choosing the optimal h_t, and
  • finding the optimal α_t
• AdaBoost can be proved to maximise the margin
• AdaBoost iteratively fits an additive logistic regression model
8/17
Choosing α_t
We attempt to minimise Z_t = Σ_i D_t(i) exp(−α_t y_i h_t(x_i)):
dZ/dα = −Σ_{i=1}^{m} D(i) y_i h(x_i) e^{−α y_i h(x_i)} = 0
−Σ_{i: y_i = h(x_i)} D(i) e^{−α} + Σ_{i: y_i ≠ h(x_i)} D(i) e^{α} = 0
−e^{−α}(1 − ε) + e^{α} ε = 0
α = (1/2) log((1 − ε)/ε)
⇒ The minimiser of the upper bound is α_t = (1/2) log((1 − ε_t)/ε_t)
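The closed form can be sanity-checked against a grid search over α (the value ε = 0.3 is an arbitrary example). Grouping the sum by correct and wrong examples as above gives Z(α) = (1 − ε) e^{−α} + ε e^{α}, which is the function minimised here:

```python
import numpy as np

eps = 0.3
alphas = np.linspace(0.01, 2.0, 2000)
Z = (1 - eps) * np.exp(-alphas) + eps * np.exp(alphas)  # Z as a function of alpha
alpha_star = 0.5 * np.log((1 - eps) / eps)              # closed-form minimiser
assert abs(alphas[np.argmin(Z)] - alpha_star) < 1e-2    # grid minimum agrees
```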
9/17
Choosing h_t
Weak classifier examples:
• decision tree (or stump), perceptron – H infinite
• selecting the best one from a given finite set H
Justification of the weighted error minimisation:
Having α_t = (1/2) log((1 − ε_t)/ε_t),
Z_t = Σ_{i=1}^{m} D_t(i) e^{−α_t y_i h_t(x_i)}
    = Σ_{i: y_i = h_t(x_i)} D_t(i) e^{−α_t} + Σ_{i: y_i ≠ h_t(x_i)} D_t(i) e^{α_t}
    = (1 − ε_t) e^{−α_t} + ε_t e^{α_t}
    = 2 √(ε_t (1 − ε_t))
⇒ Z_t is minimised by selecting the h_t with minimal weighted error ε_t
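The last equality, and the fact that 2√(ε(1 − ε)) grows with ε on (0, 1/2), are easy to verify numerically (the values 0.3 and 0.4 are arbitrary examples):

```python
import math

eps = 0.3
alpha = 0.5 * math.log((1 - eps) / eps)
Z = (1 - eps) * math.exp(-alpha) + eps * math.exp(alpha)

# Z at the optimal alpha collapses to 2 * sqrt(eps * (1 - eps))
assert abs(Z - 2 * math.sqrt(eps * (1 - eps))) < 1e-12

# Z is increasing in eps on (0, 1/2): a smaller weighted error gives a smaller Z_t
assert Z < 2 * math.sqrt(0.4 * (1 - 0.4))
```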
10/17
Generalisation (Schapire & Singer 1999)
Maximising margins in AdaBoost:
P_{(x,y)∼S}[y f(x) ≤ θ] ≤ 2^T Π_{t=1}^{T} √(ε_t^{1−θ} (1 − ε_t)^{1+θ}), where f(x) = (α · h(x)) / ‖α‖_1
with α and h(x) the vectors of coefficients and weak outputs.
• Choosing the h_t(x) with minimal ε_t in each step minimises this upper bound
• Margins in SVMs use the L2 norm instead: (α · h(x)) / ‖α‖_2
Upper bounds based on margin:
With probability 1 − δ over the random choice of the training set S,
P_{(x,y)∼D}[y f(x) ≤ 0] ≤ P_{(x,y)∼S}[y f(x) ≤ θ] + O( (1/√m) · (d log²(m/d)/θ² + log(1/δ))^{1/2} )
where D is a distribution over X × {+1, −1}, and d is the pseudodimension of H.
Problem: the upper bound is very loose; in practice AdaBoost works much better.
11/17
Convergence (Friedman et al. 1998)
Proposition 1: The discrete AdaBoost algorithm minimises J(f(x)) = E(e^{−y f(x)}) by adaptive Newton updates.
Lemma: J(f(x)) is minimised at
f(x) = Σ_{t=1}^{T} α_t h_t(x) = (1/2) log( P(y = 1|x) / P(y = −1|x) )
Hence
P(y = 1|x) = e^{f(x)} / (e^{−f(x)} + e^{f(x)})  and  P(y = −1|x) = e^{−f(x)} / (e^{−f(x)} + e^{f(x)})
Additive logistic regression model:
Σ_{t=1}^{T} a_t(x) = log( P(y = 1|x) / P(y = −1|x) )
Proposition 2: By minimising J(f(x)), discrete AdaBoost fits (up to a factor of 2) an additive logistic regression model.
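The lemma gives a probabilistic reading of the raw AdaBoost score: the posterior implied by f(x) is a logistic sigmoid with slope 2. A tiny sketch:

```python
import math

def posterior(f):
    """P(y = 1 | x) implied by the AdaBoost output f(x), per the lemma."""
    return math.exp(f) / (math.exp(-f) + math.exp(f))

# Dividing numerator and denominator by e^{f} gives the equivalent
# closed form 1 / (1 + e^{-2f}), i.e. a sigmoid of 2 f(x).
print(posterior(0.0))   # 0.5: a score of zero means maximal uncertainty
```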
12/17
The Algorithm Recapitulation
Given: (x_1, y_1), ..., (x_m, y_m); x_i ∈ X, y_i ∈ {−1, +1}
Initialise weights D_1(i) = 1/m
For t = 1, ..., T:
• Find h_t = argmin_{h_j ∈ H} ε_j, where ε_j = Σ_{i=1}^{m} D_t(i) [y_i ≠ h_j(x_i)]
• If ε_t ≥ 1/2 then stop
• Set α_t = (1/2) log((1 − ε_t)/ε_t)
• Update D_{t+1}(i) = D_t(i) exp(−α_t y_i h_t(x_i)) / Z_t
Output the final classifier:
H(x) = sign(Σ_{t=1}^{T} α_t h_t(x))
[Figure: decision boundary and training error per step, again shown for t = 1, ..., 40]
13/17
AdaBoost Variants
Freund & Schapire 1995:
• Discrete (h : X → {0, 1})
• Multiclass AdaBoost.M1 (h : X → {0, 1, ..., k})
• Multiclass AdaBoost.M2 (h : X → [0, 1]^k)
• Real-valued AdaBoost.R (Y = [0, 1], h : X → [0, 1])
Schapire & Singer 1999:
• Confidence-rated prediction (h : X → R, two-class)
• Multilabel AdaBoost.MR, AdaBoost.MH (different formulations of the minimised loss)
Oza 2001:
• Online AdaBoost
Many other modifications since then: cascaded AB, WaldBoost, probabilistic boosting tree, ...
14/17
Online AdaBoost
Offline
Given:
� Set of labeled training samplesX = {(x1, y1), ..., (xm, ym)|y = ±1}
� Weight distribution over XD0 = 1/m
For t = 1, . . . , T
� Train a weak classifier using samplesand weight distribution
ht(x) = L(X , Dt−1)
� Calculate error εt
� Calculate coeficient αt from εt
� Update weight distribution Dt
Output:F (x) = sign(
∑Tt=1 αtht(x))
Online
Given:
For t = 1, . . . , T
Output:F (x) = sign(
∑Tt=1 αtht(x))
![Page 45: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/45.jpg)
14/17
Online AdaBoost
Offline
Given:
� Set of labeled training samplesX = {(x1, y1), ..., (xm, ym)|y = ±1}
� Weight distribution over XD0 = 1/m
For t = 1, . . . , T
� Train a weak classifier using samplesand weight distribution
ht(x) = L(X , Dt−1)
� Calculate error εt
� Calculate coeficient αt from εt
� Update weight distribution Dt
Output:F (x) = sign(
∑Tt=1 αtht(x))
Online
Given:
For t = 1, . . . , T
Output:F (x) = sign(
∑Tt=1 αtht(x))
� One labeled training sample(x, y)|y = ±1
� Strong classifier to update
![Page 46: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/46.jpg)
14/17
14/17
Online AdaBoost
Offline

Given:
• Set of labeled training samples X = {(x1, y1), ..., (xm, ym)}, y = ±1
• Weight distribution over X: D0 = 1/m

For t = 1, ..., T:
• Train a weak classifier on the samples under the current weight distribution: ht(x) = L(X, Dt−1)
• Calculate the error εt
• Calculate the coefficient αt from εt
• Update the weight distribution Dt

Output: F(x) = sign(∑_{t=1}^{T} αt ht(x))

Online

Given:
• One labeled training sample (x, y), y = ±1
• The strong classifier to update
• Initial importance λ = 1

For t = 1, ..., T:
• Update the weak classifier with the sample and its importance: ht(x) = L(ht, (x, y), λ)
• Update the error estimate εt
• Update the weight αt based on εt
• Update the importance weight λ

Output: F(x) = sign(∑_{t=1}^{T} αt ht(x))
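The offline column can be sketched in code with one-dimensional decision stumps as the weak learner L. The stump learner, the toy data set, and all names below are illustrative assumptions, not from the talk:

```python
import numpy as np

def train_stump(X, y, D):
    """Weak learner L(X, D): pick the threshold/polarity minimising weighted error."""
    best = None
    for thr in np.unique(X):
        for pol in (1, -1):
            pred = np.where(pol * (X - thr) >= 0, 1, -1)
            err = np.sum(D[pred != y])               # weighted error eps_t
            if best is None or err < best[0]:
                best = (err, thr, pol)
    return best

def adaboost(X, y, T=10):
    m = len(X)
    D = np.full(m, 1.0 / m)                          # D_0 = 1/m
    stumps = []
    for t in range(T):
        eps, thr, pol = train_stump(X, y, D)         # h_t = L(X, D_{t-1})
        eps = max(eps, 1e-10)                        # avoid division by zero
        alpha = 0.5 * np.log((1 - eps) / eps)        # coefficient alpha_t from eps_t
        pred = np.where(pol * (X - thr) >= 0, 1, -1)
        D = D * np.exp(-alpha * y * pred)            # update weight distribution D_t
        D /= D.sum()
        stumps.append((alpha, thr, pol))
    return stumps

def predict(stumps, X):
    # F(x) = sign(sum_t alpha_t h_t(x))
    f = sum(a * np.where(p * (X - thr) >= 0, 1, -1) for a, thr, p in stumps)
    return np.sign(f)

# toy usage on a separable 1-D set
X = np.array([-2.0, -1.0, -0.5, 0.5, 1.0, 2.0])
y = np.array([-1, -1, -1, 1, 1, 1])
model = adaboost(X, y, T=5)
print(predict(model, X))  # matches y on this separable toy set
```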
![Page 51: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/51.jpg)
15/17
Online AdaBoost
Converges to the offline result on the same training set as the number of iterations N → ∞.

N. Oza and S. Russell. Online Bagging and Boosting. Artificial Intelligence and Statistics, 2001.
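The online column can be sketched along the lines of Oza and Russell's online boosting: each arriving sample is presented to every weak learner k ~ Poisson(λ) times, and its importance λ grows when a learner misclassifies it. The `MeanStump` weak learner, the clipping constants, and the importance cap below are illustrative assumptions, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

class MeanStump:
    """Toy updatable weak classifier: label a 1-D point by the nearer class mean."""
    def __init__(self):
        self.s = {1: 0.0, -1: 0.0}    # running sums per class
        self.n = {1: 1e-9, -1: 1e-9}  # running counts per class
    def update(self, x, y):
        self.s[y] += x
        self.n[y] += 1
    def predict(self, x):
        return 1 if abs(x - self.s[1] / self.n[1]) <= abs(x - self.s[-1] / self.n[-1]) else -1

class OnlineBooster:
    def __init__(self, n_weak=3):
        self.weak = [MeanStump() for _ in range(n_weak)]
        self.sc = np.zeros(n_weak)    # importance mass classified correctly
        self.sw = np.zeros(n_weak)    # importance mass classified wrongly
    def _eps(self, t):
        # running error estimate eps_t, clipped for numerical robustness
        return float(np.clip(self.sw[t] / (self.sc[t] + self.sw[t] + 1e-12), 0.05, 0.95))
    def update(self, x, y):
        lam = 1.0                                  # initial importance lambda = 1
        for t, h in enumerate(self.weak):
            for _ in range(rng.poisson(lam)):      # present the sample k ~ Poisson(lam) times
                h.update(x, y)
            correct = h.predict(x) == y
            if correct:
                self.sc[t] += lam
            else:
                self.sw[t] += lam
            eps = self._eps(t)
            # raise the importance of samples the current learner gets wrong
            lam *= 1 / (2 * (1 - eps)) if correct else 1 / (2 * eps)
            lam = min(lam, 10.0)                   # cap (illustrative safeguard)
    def predict(self, x):
        # weighted vote: sign(sum_t alpha_t h_t(x)), alpha_t = log((1 - eps_t)/eps_t)
        f = sum(np.log((1 - self._eps(t)) / self._eps(t)) * h.predict(x)
                for t, h in enumerate(self.weak))
        return 1 if f >= 0 else -1

booster = OnlineBooster()
for _ in range(10):                                # stream the toy samples repeatedly
    for x, y in [(-2.0, -1), (-1.0, -1), (1.0, 1), (2.0, 1)]:
        booster.update(x, y)
print(booster.predict(-1.5), booster.predict(1.5))
```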
16/17
Pros and Cons of AdaBoost
Advantages
• Very simple to implement
• General learning scheme - can be used for various learning tasks
• Feature selection on very large sets of features
• Good generalisation
• Seems not to overfit in practice (probably due to margin maximisation)

Disadvantages

• Suboptimal solution (greedy learning)
17/17
Selected references
• Y. Freund, R.E. Schapire. A Decision-Theoretic Generalization of On-line Learning and an Application to Boosting. Journal of Computer and System Sciences, 1997.
• R.E. Schapire, Y. Freund, P. Bartlett, W.S. Lee. Boosting the Margin: A New Explanation for the Effectiveness of Voting Methods. The Annals of Statistics, 1998.
• R.E. Schapire, Y. Singer. Improved Boosting Algorithms Using Confidence-rated Predictions. Machine Learning, 1999.
• J. Friedman, T. Hastie, R. Tibshirani. Additive Logistic Regression: A Statistical View of Boosting. Technical report, 1998.
• N.C. Oza. Online Ensemble Learning. PhD thesis, 2001.
• http://www.boosting.org
Thank you for your attention
[Figure: err as a function of the margin yf(x)]
![Page 109: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/109.jpg)
![Page 110: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/110.jpg)
![Page 111: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/111.jpg)
![Page 112: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/112.jpg)
![Page 113: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/113.jpg)
![Page 114: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/114.jpg)
![Page 115: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/115.jpg)
![Page 116: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/116.jpg)
![Page 117: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/117.jpg)
![Page 118: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/118.jpg)
0 5 10 15 20 25 30 35 400
0.05
0.1
0.15
0.2
0.25
0.3
0.35
![Page 119: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/119.jpg)
![Page 120: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/120.jpg)
0 5 10 15 20 25 30 35 400
0.05
0.1
0.15
0.2
0.25
0.3
0.35
![Page 121: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/121.jpg)
![Page 122: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/122.jpg)
0 5 10 15 20 25 30 35 400
0.05
0.1
0.15
0.2
0.25
0.3
0.35
![Page 123: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/123.jpg)
![Page 124: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/124.jpg)
0 5 10 15 20 25 30 35 400
0.05
0.1
0.15
0.2
0.25
0.3
0.35
![Page 125: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/125.jpg)
![Page 126: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/126.jpg)
0 5 10 15 20 25 30 35 400
0.05
0.1
0.15
0.2
0.25
0.3
0.35
![Page 127: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/127.jpg)
![Page 128: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/128.jpg)
0 5 10 15 20 25 30 35 400
0.05
0.1
0.15
0.2
0.25
0.3
0.35
![Page 129: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/129.jpg)
![Page 130: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/130.jpg)
0 5 10 15 20 25 30 35 400
0.05
0.1
0.15
0.2
0.25
0.3
0.35
![Page 131: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/131.jpg)
![Page 132: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/132.jpg)
0 5 10 15 20 25 30 35 400
0.05
0.1
0.15
0.2
0.25
0.3
0.35
![Page 133: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/133.jpg)
![Page 134: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/134.jpg)
![Page 135: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/135.jpg)
![Page 136: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/136.jpg)
![Page 137: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/137.jpg)
![Page 138: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/138.jpg)
![Page 139: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/139.jpg)
![Page 140: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/140.jpg)
![Page 141: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/141.jpg)
![Page 142: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/142.jpg)
![Page 143: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/143.jpg)
![Page 144: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/144.jpg)
![Page 145: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/145.jpg)
![Page 146: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/146.jpg)
![Page 147: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/147.jpg)
![Page 148: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/148.jpg)
![Page 149: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/149.jpg)
![Page 150: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/150.jpg)
![Page 151: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/151.jpg)
![Page 152: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/152.jpg)
![Page 153: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/153.jpg)
![Page 154: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/154.jpg)
![Page 155: AdaBoost Talk - CMPcmp.felk.cvut.cz/~sochmj1/adaboost_talk.pdf · 2/17 Presentation Motivation AdaBoost with trees is the best off-the-shelf classifier in the world. (Breiman 1998)](https://reader033.vdocuments.site/reader033/viewer/2022051407/5ae63d3c7f8b9acc268d1dc2/html5/thumbnails/155.jpg)