a new guaranteed adaptive trapezoidal rule...
TRANSCRIPT
Motivation New algorithm integral Computational Cost of integral Discussion References
A New Guaranteed Adaptive Trapezoidal Rule Algorithm

Fred J. Hickernell
Department of Applied Mathematics, Illinois Institute of Technology
[email protected] · www.iit.edu/~hickernell

Joint work with Martha Razo (IIT BS student) and Sunny Yun (Stevenson High School 2014 graduate)

Supported by NSF-DMS-1115392

February 18, 2015
[email protected] New Adaptive Trapezoidal Rule Meshfree Methods 1 / 30
We Need Adaptive Algorithms
We Need Adaptive Numerical Algorithms
- We rely on numerical software to solve mathematical and statistical problems: the NAG library (The Numerical Algorithms Group, 2013), MATLAB (The MathWorks, 2014), Mathematica (Wolfram Research Inc., 2014), and R (R Development Core Team, 2014).
- Functions like cos and erf give us the answer with the desired accuracy automatically.
- Many numerical algorithms that we use are adaptive: MATLAB's integral, fminbnd, and ode45, and the Chebfun MATLAB toolbox (Hale et al., 2014). They determine how much effort is needed to satisfy the error tolerance.
We Need Better Adaptive Numerical Algorithms
Most adaptive algorithms use heuristics. There are no guarantees that they actually do what they claim. Exceptions are

- guaranteed algorithms for finding one zero of a function and for finding minima of unimodal functions, dating from the early 1970s (Brent, 2013),
- guaranteed adaptive multivariate integration algorithms using Monte Carlo (Hickernell et al., 2014) and quasi-Monte Carlo methods (Hickernell and Jimenez Rugama, 2014; Jimenez Rugama and Hickernell, 2014),
- guaranteed adaptive algorithms for univariate function approximation (Clancy et al., 2014) and optimization of multimodal univariate functions (Tong, 2014) using linear splines, and
- a guaranteed adaptive trapezoidal rule for univariate integration (Clancy et al., 2014).
We Need a Better Adaptive Trapezoidal Rule
$$T_n(f) := \frac{b-a}{2n}\bigl[f(t_0) + 2f(t_1) + \cdots + 2f(t_{n-1}) + f(t_n)\bigr], \qquad t_i = a + \frac{i(b-a)}{n}, \quad i = 0, \dots, n, \quad n \in \mathbb{N} := \{1, 2, \dots\}.$$

$$\mathrm{err}(f, n) := \left| \int_a^b f(x)\,dx - T_n(f) \right| \le \frac{(b-a)^2\,\mathrm{Var}(f')}{8n^2} =: \overline{\mathrm{err}}(f, n), \qquad n \in \mathbb{N}.$$
The adaptive trapezoidal rule in (Clancy et al., 2014) takes $[a,b] = [0,1]$ and works for integrands in
$$\mathcal{C} := \left\{ f \in \mathcal{V} : \mathrm{Var}(f') \le \tau \int_0^1 |f'(x) - f(1) + f(0)|\,dx \right\},$$
where $1/\tau \approx$ the width of the spike that you want to capture. The computational cost to ensure that $\mathrm{err}(f, n) \le \varepsilon$ is at most
$$\sqrt{\tau\,\mathrm{Var}(f')/(4\varepsilon)} + \tau + 4.$$
As $\tau$ increases there is an additive and a multiplicative penalty. We want to remove the latter.
Three Algorithms
$$\mathrm{err}(f, n) := \left| \int_a^b f(x)\,dx - T_n(f) \right| \le \frac{(b-a)^2\,\mathrm{Var}(f')}{8n^2} =: \overline{\mathrm{err}}(f, n), \qquad n \in \mathbb{N}.$$

ballint: Taught in calculus courses. Uses $(b-a)^2 \sigma / (8n^2)$ to bound $\mathrm{err}(f, n)$. Non-adaptive. Works for integrands in $\mathcal{B}_\sigma := \{ f : \mathrm{Var}(f') \le \sigma \}$.

flawint: Taught in numerical analysis courses. Uses $\overline{\mathrm{err}}(f, n) := |T_n(f) - T_{n/2}(f)|/3$ to estimate $\mathrm{err}(f, n)$. Adaptive. A bad idea according to James Lyness (1983). Works for what kind of integrands?

integral: Our new algorithm. Adaptive. Need not know $\mathrm{Var}(f')$, but needs to know the spikiness of $f$. Details to follow.

Disclaimer: we are not pursuing interval arithmetic approaches (Rump, 1999; Moore et al., 2009; Rump, 2010).
Four Typical Integrands
An Easy Integrand
Algorithm   feasy   fbig   ffluky   fspiky
ballint       ✓       ✗       ✗        ✗
flawint       ✓       ✓       ✗        ✗
integral      ✓       ✓       ✓        ✗

$$f_{\text{easy}}(x) = \sqrt{\frac{2}{\pi}}\, e^{-2x^2}, \qquad \int_0^1 f_{\text{easy}}(x)\,dx = 0.4772, \qquad T_4(f_{\text{easy}}) = 0.4750, \qquad \mathrm{Var}(f'_{\text{easy}}) = 1.5038.$$

[Figure: f_easy and its trapezoidal approximation T_4(f_easy) on [0, 1].]

$$\mathrm{err}(f_{\text{easy}}, 4) = 0.0022 \le 0.0117 = \overline{\mathrm{err}}(f_{\text{easy}}, 4).$$
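These values can be reproduced numerically. The sketch below is in Python rather than the talk's MATLAB; the exact integral is $\tfrac12\,\mathrm{erf}(\sqrt{2})$, and the bound uses the slide's value $\mathrm{Var}(f'_{\text{easy}}) = 1.5038$.

```python
import math

def trapezoid(f, a, b, n):
    """Composite trapezoidal rule T_n(f) on [a, b] with n trapezoids."""
    h = (b - a) / n
    nodes = [a + i * h for i in range(n + 1)]
    return h / 2 * (f(nodes[0]) + 2 * sum(f(x) for x in nodes[1:-1]) + f(nodes[-1]))

f_easy = lambda x: math.sqrt(2 / math.pi) * math.exp(-2 * x**2)

T4 = trapezoid(f_easy, 0.0, 1.0, 4)       # ≈ 0.4750
exact = 0.5 * math.erf(math.sqrt(2))      # ∫₀¹ f_easy dx = ½ erf(√2) ≈ 0.4772
err = abs(exact - T4)                     # ≈ 0.0022
bound = 1.5038 / (8 * 4**2)               # (b−a)² Var(f′_easy)/(8n²) ≈ 0.0117
```

As the slide states, the actual error stays below the guaranteed bound.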
A Big Integrand
Algorithm   feasy   fbig   ffluky   fspiky
ballint       ✓       ✗       ✗        ✗
flawint       ✓       ✓       ✗        ✗
integral      ✓       ✓       ✓        ✗

$$f_{\text{big}}(x; m) := 1 + \frac{15m^4}{2}\left[\frac{1}{30} - x^2(1-x)^2\right], \qquad \int_0^1 f_{\text{big}}(x; m)\,dx = 1,$$
$$T_n(f_{\text{big}}(\cdot; m)) = 1 + \frac{m^4}{4n^4}, \qquad \mathrm{Var}(f'_{\text{big}}(\cdot; m)) = \frac{10m^4}{\sqrt{3}}.$$

[Figure: f_big(x; 16) on [0, 1]; vertical scale ×10⁴.]

$$\mathrm{err}(f_{\text{big}}(\cdot; m), n) = \frac{m^4}{4n^4} \le \frac{5m^4}{4n^4} = \overline{\mathrm{err}}(f_{\text{big}}(\cdot; m), n).$$
A Fluky Integrand (Inspired by Lyness (1983))
Algorithm   feasy   fbig   ffluky   fspiky
ballint       ✓       ✗       ✗        ✗
flawint       ✓       ✓       ✗        ✗
integral      ✓       ✓       ✓        ✗

$$f_{\text{fluky}}(x; m) := f_{\text{big}}(x; m) + \frac{15m^2}{2}\left[-\frac{1}{6} + x(1-x)\right], \qquad \int_0^1 f_{\text{fluky}}(x; m)\,dx = 1,$$
$$T_n(f_{\text{fluky}}(\cdot; m)) = 1 + \frac{m^2(m^2 - 5n^2)}{4n^4}.$$

[Figure: f_fluky(x; 16) on [0, 1]; vertical scale ×10⁴.]

$$\mathrm{err}(f_{\text{fluky}}(\cdot; n), n) = 1 > 0 = \overline{\mathrm{err}}(f_{\text{fluky}}(\cdot; n), n).$$
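A quick numerical check (an illustrative Python sketch, not from the talk) shows how flawint is fooled: with $m = n = 16$, both $T_{16}$ and $T_8$ vanish, so the estimate $|T_n - T_{n/2}|/3$ is zero while the true error is 1.

```python
def ffluky(x, m):
    """f_fluky(x; m) = f_big(x; m) + (15 m²/2)[−1/6 + x(1 − x)]."""
    f_big = 1 + 7.5 * m**4 * (1 / 30 - x**2 * (1 - x)**2)
    return f_big + 7.5 * m**2 * (-1 / 6 + x * (1 - x))

def trapezoid(f, a, b, n):
    """Composite trapezoidal rule T_n(f)."""
    h = (b - a) / n
    nodes = [a + i * h for i in range(n + 1)]
    return h / 2 * (f(nodes[0]) + 2 * sum(f(x) for x in nodes[1:-1]) + f(nodes[-1]))

m = 16
T16 = trapezoid(lambda x: ffluky(x, m), 0.0, 1.0, 16)
T8 = trapezoid(lambda x: ffluky(x, m), 0.0, 1.0, 8)
flaw_estimate = abs(T16 - T8) / 3    # flawint's error "estimate": essentially 0
true_error = abs(1.0 - T16)          # the exact integral is 1
```

By the slide's formula, $T_n(f_{\text{fluky}}(\cdot; m)) = 1 + m^2(m^2 - 5n^2)/(4n^4)$ equals 0 at both $n = 16$ and $n = 8$ when $m = 16$.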
A Spiky Integrand
Algorithm   feasy   fbig   ffluky   fspiky
ballint       ✓       ✗       ✗        ✗
flawint       ✓       ✓       ✗        ✗
integral      ✓       ✓       ✓        ✗

$$f_{\text{spiky}}(x; m) = 30\,[\{mx\}(1 - \{mx\})]^2, \qquad \{x\} := x \bmod 1, \qquad \int_0^1 f_{\text{spiky}}(x; m)\,dx = 1,$$
$$\mathrm{Var}(f'_{\text{spiky}}(\cdot; m)) = \frac{40m^2}{\sqrt{3}}, \qquad T_n(f_{\text{spiky}}(\cdot; m)) = 0 \ \text{ for } \frac{m}{n} \in \mathbb{N}.$$

[Figure: f_spiky(x; 16) on [0, 1].]

$$\mathrm{err}(f_{\text{spiky}}(\cdot; m), n) = 1 > 0 = \overline{\mathrm{err}}(f_{\text{spiky}}(\cdot; m), n) \quad \text{for } \frac{m}{n} \in \mathbb{N}.$$
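The same failure can be seen numerically (again an illustrative Python check, not from the talk): when $m/n \in \mathbb{N}$, every node lands on a zero between spikes, so the trapezoidal rule returns exactly 0, while an unaligned $n$ gives a value close to the true integral 1.

```python
def fspiky(x, m=16):
    """f_spiky(x; m) = 30[{mx}(1 − {mx})]², where {x} := x mod 1."""
    u = (m * x) % 1.0
    return 30.0 * (u * (1.0 - u)) ** 2

def trapezoid(f, a, b, n):
    """Composite trapezoidal rule T_n(f)."""
    h = (b - a) / n
    nodes = [a + i * h for i in range(n + 1)]
    return h / 2 * (f(nodes[0]) + 2 * sum(f(x) for x in nodes[1:-1]) + f(nodes[-1]))

T16 = trapezoid(fspiky, 0.0, 1.0, 16)    # m/n = 1 ∈ N: every node is a zero, so T16 = 0
T163 = trapezoid(fspiky, 0.0, 1.0, 163)  # unaligned n: close to the true value 1
```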
Cone of Integrands
For Which f Can Var(f′) Be Well Approximated?
$$\mathrm{Var}(f') := \sup\left\{ \sum_{i=1}^n |f'(x_i) - f'(x_{i-1})| : \{x_i\}_{i=0}^n \text{ is a partition},\ n \in \mathbb{N} \right\},$$
where a partition satisfies $a = x_0 \le x_1 \le \cdots \le x_n = b$, with $\mathrm{size}(\{x_i\}_{i=0}^n) := \max_{i=1,\dots,n}(x_i - x_{i-1})$, $f'(x) := f'(x^+)$ for $a \le x < b$, $f'(b) := f'(b^-)$, and $\mathcal{V} = \{f : \mathrm{Var}(f') < \infty\}$.

Define an approximation to $\mathrm{Var}(f')$ as follows:
$$V(f', \{x_i\}_{i=0}^n, \{\Delta_i\}_{i=1}^{n-1}) := \sum_{i=2}^{n-1} |\Delta_i - \Delta_{i-1}| \le \mathrm{Var}(f'), \qquad \Delta_i \text{ between } f'(x_i^-) \text{ and } f'(x_i^+).$$

Define the cone of integrands for which this does not underestimate $\mathrm{Var}(f')$ by much:
$$\mathcal{C} := \{ f \in \mathcal{V} : \mathrm{Var}(f') \le \mathfrak{C}(\mathrm{size}(\{x_i\}_{i=0}^n))\, V(f', \{x_i\}_{i=0}^n, \{\Delta_i\}_{i=1}^{n-1}) \ \text{for all } n \in \mathbb{N},\ \{\Delta_i\}_{i=1}^{n-1}, \text{ and } \{x_i\}_{i=0}^n \text{ with } \mathrm{size}(\{x_i\}_{i=0}^n) < \overline{h} \},$$
with cut-off $\overline{h} \in (0, b-a]$ and non-decreasing inflation factor $\mathfrak{C} : [0, \overline{h}) \to [1, \infty)$.
How Spiky Can f Be?
Recall:
$$\mathrm{Var}(f') := \sup\left\{ \sum_{i=1}^n |f'(x_i) - f'(x_{i-1})| : \{x_i\}_{i=0}^n \text{ is a partition},\ n \in \mathbb{N} \right\},$$
$$V(f', \{x_i\}_{i=0}^n, \{\Delta_i\}_{i=1}^{n-1}) := \sum_{i=2}^{n-1} |\Delta_i - \Delta_{i-1}| \le \mathrm{Var}(f'), \qquad \Delta_i \text{ between } f'(x_i^\pm),$$
$$\mathcal{C} := \{ f \in \mathcal{V} : \mathrm{Var}(f') \le \mathfrak{C}(h)\, V(f', \{x_i\}_{i=0}^n, \{\Delta_i\}_{i=1}^{n-1}),\ h = \mathrm{size}(\{x_i\}_{i=0}^n) < \overline{h} \}.$$

[Figures: peak(x, 0.25, 0.2) and twopk(x, 0.65, 0.1, +) on [0, 1].]

$$\mathrm{peak}(x, t, h) := (h - |x - t|)_+ \in \mathcal{C} \quad \text{if } h \ge \overline{h},\ a + h \le t \le b - 3h,$$
$$\mathrm{twopk}(x, t, h, \pm) := \mathrm{peak}(x, 0, \overline{h}) \pm \frac{3[\mathfrak{C}(h) - 1]}{4}\,\mathrm{peak}(x, t, h) \in \mathcal{C}.$$
Practically Bounding Var(f ′)
$$\mathrm{Var}(f') := \sup\left\{ \sum_{i=1}^n |f'(x_i) - f'(x_{i-1})| : \{x_i\}_{i=0}^n \text{ is a partition},\ n \in \mathbb{N} \right\},$$
$$V(f', \{x_i\}_{i=0}^n, \{\Delta_i\}_{i=1}^{n-1}) := \sum_{i=2}^{n-1} |\Delta_i - \Delta_{i-1}| \le \mathrm{Var}(f'), \qquad \Delta_i \text{ between } f'(x_i^\pm),$$
$$\mathcal{C} := \{ f \in \mathcal{V} : \mathrm{Var}(f') \le \mathfrak{C}(h)\, V(f', \{x_i\}_{i=0}^n, \{\Delta_i\}_{i=1}^{n-1}),\ h = \mathrm{size}(\{x_i\}_{i=0}^n) < \overline{h} \}.$$

But $V$ relies on derivative values. In practice we may use
$$V_n(f) := \frac{n}{b-a} \sum_{i=1}^{n-1} |f(t_{i+1}) - 2f(t_i) + f(t_{i-1})|, \qquad t_i = a + \frac{i(b-a)}{n},$$
which equals $V(f', \{x_i\}_{i=0}^{n+1}, \{\Delta_i\}_{i=1}^{n})$ for some $\{x_i\}_{i=0}^{n+1}$ and $\{\Delta_i\}_{i=1}^{n}$.

So $V_n(f) \le \mathrm{Var}(f') \le \mathfrak{C}(2(b-a)/n)\, V_n(f)$ for $n > 2(b-a)/\overline{h}$ and $f \in \mathcal{C}$.
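$V_n(f)$ requires only function values, so it is straightforward to compute; here is a minimal Python sketch (the talk's implementation is in MATLAB). For $f(x) = x^2$ on $[0, 1]$, $\mathrm{Var}(f') = 2$ and $V_n(f) = 2(n-1)/n$, so the estimate approaches the true value from below as $n$ grows.

```python
def V_n(f, n, a=0.0, b=1.0):
    """V_n(f): n/(b−a) times the sum of absolute second differences of f
    at the equally spaced nodes t_i = a + i(b−a)/n."""
    t = [a + i * (b - a) / n for i in range(n + 1)]
    y = [f(x) for x in t]
    return n / (b - a) * sum(abs(y[i + 1] - 2 * y[i] + y[i - 1]) for i in range(1, n))

# For f(x) = x² on [0, 1], Var(f′) = 2 and V_n(f) = 2(n − 1)/n exactly.
approx = V_n(lambda x: x * x, 100)    # ≈ 1.98
```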
New, Guaranteed Adaptive Algorithm integral
Given an interval $[a, b]$, an inflation function $\mathfrak{C}$, a positive cut-off mesh size $\overline{h}$, and a positive error tolerance $\varepsilon$, set $j = 1$, $n_1 = \lfloor 2(b-a)/\overline{h} \rfloor + 1$, and $\overline{V}_0 = \infty$.

Step 1. Compute $V_{n_j}(f)$ and $\overline{V}_j = \min\bigl(\overline{V}_{j-1},\ \mathfrak{C}(2(b-a)/n_j)\, V_{n_j}(f)\bigr)$. If $V_{n_j}(f) > \overline{V}_j$, then widen $\mathcal{C}$ and repeat this step. Otherwise, proceed.

Step 2. If $(b-a)^2 \overline{V}_j \le 8 n_j^2 \varepsilon$, then return $T_{n_j}(f)$ as the answer.

Step 3. Otherwise, increase the number of trapezoids to $n_{j+1} = \max(2, m)\, n_j$, where
$$m = \min\{ r \in \mathbb{N} : \eta(r n_j)\, V_{n_j}(f) \le \varepsilon \}, \qquad \eta(n) := \frac{(b-a)^2\, \mathfrak{C}(2(b-a)/n)}{8n^2},$$
increase $j$ by one, and go to Step 1.
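The three steps above can be sketched in code. The following Python sketch (the talk's implementation is in MATLAB) makes two simplifying assumptions: the inflation function is a constant, $\mathfrak{C}(h) \equiv$ `inflation`, and "widen $\mathcal{C}$" is implemented as doubling that constant, which is just one possible choice. With constant $\mathfrak{C}$, the Step 3 multiplier has a closed form.

```python
import math

def integral_sketch(f, a, b, eps, h_cut=0.1, inflation=2.0, max_iter=50):
    """Sketch of Steps 1-3.  Assumes a constant inflation function
    C(h) = inflation and implements "widen C" by doubling that constant."""
    n = int(2 * (b - a) / h_cut) + 1          # n_1 = floor(2(b−a)/h_cut) + 1
    V_bar = math.inf                          # running upper bound on Var(f′)
    T_n = math.nan
    for _ in range(max_iter):
        t = [a + i * (b - a) / n for i in range(n + 1)]
        y = [f(x) for x in t]
        # Step 1: data-based variation estimate V_n(f) and bound V_bar
        V_n = n / (b - a) * sum(abs(y[i + 1] - 2 * y[i] + y[i - 1]) for i in range(1, n))
        V_bar = min(V_bar, inflation * V_n)
        if V_n > V_bar:                       # cone condition violated:
            inflation *= 2.0                  # widen the cone and redo Step 1
            V_bar = math.inf
            continue
        T_n = (b - a) / (2 * n) * (y[0] + 2 * sum(y[1:-1]) + y[-1])
        # Step 2: stop once the guaranteed error bound meets the tolerance
        if (b - a) ** 2 * V_bar <= 8 * n ** 2 * eps:
            return T_n
        # Step 3: smallest r with eta(r n) V_n(f) <= eps (closed form, constant C)
        m = max(1, math.ceil((b - a) * math.sqrt(inflation * V_n / (8 * eps)) / n))
        n *= max(2, m)
    return T_n

answer = integral_sketch(math.exp, 0.0, 1.0, 1e-6)    # ∫₀¹ eˣ dx = e − 1
```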
integral Works as Advertised
Theorem
Algorithm integral is successful, i.e.,
$$\left| \int_a^b f(x)\,dx - \mathrm{integral}(f, a, b, \varepsilon) \right| \le \varepsilon \qquad \forall f \in \mathcal{C}.$$
Bounds on the Computational Cost of integral
Theorem
Let $N(f, \varepsilon)$ denote the final number of trapezoids required by $\mathrm{integral}(f, a, b, \varepsilon)$. Then this number is bounded below and above in terms of the true, yet unknown, $\mathrm{Var}(f')$:
$$\max\left( \left\lfloor \frac{2(b-a)}{\overline{h}} \right\rfloor + 1,\ \left\lceil (b-a)\sqrt{\frac{\mathrm{Var}(f')}{8\varepsilon}} \right\rceil \right) \le N(f, \varepsilon) \le 2 \min_{0 < \alpha \le 1} \max\left( \left\lfloor \frac{2(b-a)}{\alpha\overline{h}} \right\rfloor + 1,\ \left\lceil (b-a)\sqrt{\frac{\mathfrak{C}(\alpha\overline{h})\,\mathrm{Var}(f')}{8\varepsilon}} \right\rceil \right).$$
The number of function values required by $\mathrm{integral}(f, a, b, \varepsilon)$ is $N(f, \varepsilon) + 1$.
Proof of Lower Bound on Computational Cost
The number of trapezoids must be at least $n_1 = \lfloor 2(b-a)/\overline{h} \rfloor + 1$.

The number of trapezoids is increased until $(b-a)^2 \overline{V}_j \le 8 n_j^2 \varepsilon$, which implies that
$$\frac{(b-a)^2\,\mathrm{Var}(f')}{8 n_j^2} \le \frac{(b-a)^2\, \overline{V}_j}{8 n_j^2} \le \varepsilon.$$
This implies the lower bound on $N(f, \varepsilon)$.
Proof of Upper Bound on Computational Cost
Let $J$ be the value of $j$ at which integral terminates, so $N(f, \varepsilon) = n_J$. Since $n_1$ satisfies the upper bound, we may assume that $J \ge 2$.

Let $m^* = \max(2, m)$, where $m$ comes from Step 3. Note that $\eta((m^* - 1) n_{J-1})\,\mathrm{Var}(f') > \varepsilon$. For $m^* = 2$ this follows because
$$\eta(n_{J-1})\,\mathrm{Var}(f') \ge \frac{(b-a)^2\, \mathfrak{C}(2(b-a)/n_{J-1})\, V_{n_{J-1}}(f)}{8 n_{J-1}^2} \ge \frac{(b-a)^2\, \overline{V}_{J-1}}{8 n_{J-1}^2} > \varepsilon.$$
For $m^* = m > 2$ this follows from the definition of $m$ in Step 3.

Since $\eta$ is a decreasing function, this implies that
$$(m^* - 1)\, n_{J-1} < n^* := \min\left\{ n \in \mathbb{N} : n \ge \left\lfloor \frac{2(b-a)}{\overline{h}} \right\rfloor + 1,\ \eta(n)\,\mathrm{Var}(f') \le \varepsilon \right\}.$$
Proof of Upper Bound on Computational Cost cont’d
Since
$$(m^* - 1)\, n_{J-1} < n^* := \min\left\{ n \in \mathbb{N} : n \ge \left\lfloor \frac{2(b-a)}{\overline{h}} \right\rfloor + 1,\ \eta(n)\,\mathrm{Var}(f') \le \varepsilon \right\},$$
it follows that
$$n_J = m^* n_{J-1} < \frac{m^*}{m^* - 1}\, n^* \le 2 n^*.$$
Now we need an upper bound on $n^*$.
Proof of Upper Bound on Computational Cost cont’d
So far we have
$$N(f, \varepsilon) \le 2 n^*, \qquad n^* := \min\left\{ n \in \mathbb{N} : n \ge \left\lfloor \frac{2(b-a)}{\overline{h}} \right\rfloor + 1,\ \eta(n)\,\mathrm{Var}(f') \le \varepsilon \right\}.$$
For fixed $\alpha \in (0, 1]$, we need only consider the case where $n^* > \lfloor 2(b-a)/(\alpha\overline{h}) \rfloor + 1$, so $n^* - 1 \ge \lfloor 2(b-a)/(\alpha\overline{h}) \rfloor + 1 > 2(b-a)/(\alpha\overline{h})$. By the minimality of $n^*$, $\eta(n^* - 1)\,\mathrm{Var}(f') > \varepsilon$, so
$$n^* - 1 < (n^* - 1)\sqrt{\frac{\eta(n^* - 1)\,\mathrm{Var}(f')}{\varepsilon}} = (n^* - 1)\sqrt{\frac{(b-a)^2\, \mathfrak{C}(2(b-a)/(n^* - 1))\,\mathrm{Var}(f')}{8 (n^* - 1)^2\, \varepsilon}} \le (b-a)\sqrt{\frac{\mathfrak{C}(\alpha\overline{h})\,\mathrm{Var}(f')}{8\varepsilon}},$$
which completes the proof of the upper bound on $n^*$.
Lower Complexity Bound for Integration on C
Theorem
Let int be any (possibly adaptive) algorithm that succeeds for all integrands in $\mathcal{C}$ and uses only function values. For any error tolerance $\varepsilon > 0$ and any arbitrary value of $\mathrm{Var}(f')$, there will be some $f \in \mathcal{C}$ for which int must use at least
$$-\frac{3}{2} + (b - a - 3\overline{h})\sqrt{\frac{[\mathfrak{C}(0) - 1]\,\mathrm{Var}(f')}{32\varepsilon}}$$
function values. As $\mathrm{Var}(f')/\varepsilon \to \infty$, the asymptotic rate of increase is the same as the computational cost of integral, provided $\mathfrak{C}(0) > 1$.
Proof of Lower Bound on Complexity
Suppose that $\mathrm{int}(\cdot, a, b, \varepsilon)$ evaluates $\alpha\,\mathrm{peak}(\cdot; 0, \overline{h})$ at $n$ nodes, with
$$a + 3\overline{h} = x_0 \le x_1 \le \cdots \le x_m \le x_{m+1} = b - h, \qquad h := \frac{b - a - 3\overline{h}}{2n + 3}, \qquad m \le n.$$
There must be at least one of these $x_i$ with $i = 0, \dots, m$ for which
$$\frac{x_{i+1} - x_i}{2} \ge \frac{x_{m+1} - x_0}{2(m+1)} \ge \frac{x_{m+1} - x_0}{2n + 2} = \frac{b - a - 3\overline{h} - h}{2n + 2} = \frac{b - a - 3\overline{h}}{2n + 3} = h.$$
Choose one such $x_i$ and call it $t$.

$\mathrm{int}(\cdot, a, b, \varepsilon)$ cannot distinguish between $\alpha\,\mathrm{peak}(\cdot; 0, \overline{h})$ and $\alpha\,\mathrm{twopk}(\cdot; t, h, \pm)$. Since they all belong to $\mathcal{C}$, int is successful for them all.
Proof of Lower Bound on Complexity cont'd

By the definitions of peak and twopk,
$$\varepsilon \ge \frac{1}{2}\left[ \left| \int_a^b \alpha\,\mathrm{twopk}(x; t, h, -)\,dx - \mathrm{int}(\alpha\,\mathrm{twopk}(\cdot; t, h, -), a, b, \varepsilon) \right| + \left| \int_a^b \alpha\,\mathrm{twopk}(x; t, h, +)\,dx - \mathrm{int}(\alpha\,\mathrm{twopk}(\cdot; t, h, +), a, b, \varepsilon) \right| \right]$$
$$\ge \frac{1}{2}\left[ \left| \mathrm{int}(\alpha\,\mathrm{peak}(\cdot; 0, \overline{h}), a, b, \varepsilon) - \int_a^b \alpha\,\mathrm{twopk}(x; t, h, -)\,dx \right| + \left| \int_a^b \alpha\,\mathrm{twopk}(x; t, h, +)\,dx - \mathrm{int}(\alpha\,\mathrm{peak}(\cdot; 0, \overline{h}), a, b, \varepsilon) \right| \right]$$
$$\ge \frac{1}{2}\left| \int_a^b \alpha\,\mathrm{twopk}(x; t, h, +)\,dx - \int_a^b \alpha\,\mathrm{twopk}(x; t, h, -)\,dx \right| = \frac{3\alpha\,[\mathfrak{C}(\overline{h}) - 1]\, h^2}{8} = \frac{[\mathfrak{C}(\overline{h}) - 1]\, h^2\, \mathrm{Var}(\alpha\,\mathrm{peak}'(\cdot; 0, \overline{h}))}{8}.$$
Proof of Lower Bound on Complexity cont’d
$$\varepsilon \ge \frac{[\mathfrak{C}(\overline{h}) - 1]\, h^2\, \mathrm{Var}(\alpha\,\mathrm{peak}'(\cdot; 0, \overline{h}))}{8}.$$

Substituting for $h$ in terms of $n$ gives a lower bound on $n$:
$$2n + 3 = \frac{b - a - 3\overline{h}}{h} \ge (b - a - 3\overline{h})\sqrt{\frac{[\mathfrak{C}(\overline{h}) - 1]\,\mathrm{Var}(\alpha\,\mathrm{peak}'(\cdot; 0, \overline{h}))}{8\varepsilon}} \ge (b - a - 3\overline{h})\sqrt{\frac{[\mathfrak{C}(0) - 1]\,\mathrm{Var}(\alpha\,\mathrm{peak}'(\cdot; 0, \overline{h}))}{8\varepsilon}}.$$
Since $\alpha$ is an arbitrary positive number, the value of $\mathrm{Var}(\alpha\,\mathrm{peak}'(\cdot; 0, \overline{h}))$ is arbitrary as well.
Why Is This New and Improved?
Why Is Our New integral Improved?
- ballint is non-adaptive and requires $\sigma = \max_f \mathrm{Var}(f')$, which may be affected by both the vertical and horizontal scales of $f$.
- flawint has a flawed error estimate, as pointed out by Lyness (1983).
- Clancy et al.'s (2014) adaptive quadrature rule has a computational cost of at most
$$\sqrt{\tau\,\mathrm{Var}(f')/(4\varepsilon)} + \tau + 4,$$
which goes up multiplicatively in $\tau$ as $\tau$ increases.
- Our new adaptive quadrature algorithm has a computational cost of at most
$$2 \min_{0 < \alpha \le 1} \max\left( \left\lfloor \frac{2(b-a)}{\alpha\overline{h}} \right\rfloor + 1,\ (b-a)\sqrt{\frac{\mathfrak{C}(\alpha\overline{h})\,\mathrm{Var}(f')}{8\varepsilon}} + 1 \right),$$
which goes up only additively in $1/\overline{h}$ as $\overline{h} \to 0$.
Old integral g.m Vs. New integralNoPenalty g.m
[Two figures comparing the old integral_g.m with the new integralNoPenalty_g.m on [0, 1].]
Old integral g.m Vs. New integralNoPenalty g.m
>> NewOldIntegral
Ordinary peaky function
Old integral_g
Elapsed time is 0.169305 seconds.
Tol = 1e-10, Error = 3.3784e-13, ErrEst = 2.5001e-11
Npts = 1719037
New integralNoPenalty_g
Elapsed time is 0.013951 seconds.
Tol = 1e-10, Error = 3.7074e-11, ErrEst = 8.346e-11
Npts = 164242
But should use = [75%, 91%] Npts if we knew Var(f’)
Old integral g.m Vs. New integralNoPenalty g.m
>> NewOldIntegral
Very peaky function
nlo=1e4; nhi=nlo;
Old integral_g
Elapsed time is 0.427659 seconds.
Tol = 1e-08, Error = 1.5667e-12, ErrEst = 9.9963e-09
Npts = 6129388
New integralNoPenalty_g
Elapsed time is 0.075723 seconds.
Tol = 1e-08, Error = 8.8407e-11, ErrEst = 8.2902e-09
Npts = 1169884
But should use = [74%, 91%] Npts if we knew Var(f’)
What Comes Next?
- Simpson's rule
- Relative error
- Other problems
References I
Brent, R. P. 2013. Algorithms for minimization without derivatives, Dover Publications, Inc., Mineola, NY. Republication of the 1973 edition by Prentice-Hall, Inc.

Clancy, N., Y. Ding, C. Hamilton, F. J. Hickernell, and Y. Zhang. 2014. The cost of deterministic, adaptive, automatic algorithms: Cones, not balls, J. Complexity 30, 21–45.

Hale, N., L. N. Trefethen, and T. A. Driscoll. 2014. Chebfun version 5.

Hickernell, F. J., L. Jiang, Y. Liu, and A. B. Owen. 2014. Guaranteed conservative fixed width confidence intervals via Monte Carlo sampling, Monte Carlo and Quasi-Monte Carlo Methods 2012, pp. 105–128.

Hickernell, F. J. and Ll. A. Jimenez Rugama. 2014. Reliable adaptive cubature using digital sequences. Submitted for publication, arXiv:1410.8615 [math.NA].

Jimenez Rugama, Ll. A. and F. J. Hickernell. 2014. Adaptive multidimensional integration based on rank-1 lattices. Submitted for publication, arXiv:1411.1966.

Lyness, J. N. 1983. When not to use an automatic quadrature routine, SIAM Rev. 25, 63–87.

Moore, R. E., R. B. Kearfott, and M. J. Cloud. 2009. Introduction to interval analysis, Cambridge University Press, Cambridge.

R Development Core Team. 2014. The R Project for Statistical Computing.
References II
Rump, S. M. 1999. INTLAB – INTerval LABoratory, Developments in Reliable Computing, pp. 77–104. http://www.ti3.tuhh.de/rump/.

Rump, S. M. 2010. Verification methods: Rigorous results using floating-point arithmetic, Acta Numer. 19, 287–449.

The MathWorks, Inc. 2014. MATLAB 8.4, Natick, MA.

The Numerical Algorithms Group. 2013. The NAG library, Mark 23, Oxford.

Tong, X. 2014. A guaranteed, adaptive, automatic algorithm for univariate function minimization, Master's Thesis.

Wolfram Research Inc. 2014. Mathematica 10, Champaign, IL.