Rates of Convergence and Newton's Method
burke/crs/408 · 2012-01-31
Outline

Rates of Convergence
Newton's Method
Rates of Convergence
We compare the performance of algorithms by their rate of convergence. That is, if x^k → x̄, we are interested in how fast this happens. We consider only quotient rates, or Q-rates, of convergence.
Let {x^ν} ⊂ R^n and x̄ ∈ R^n be such that x^ν → x̄.

We say that x^ν → x̄ at a linear rate if

    lim sup_{ν→∞} ‖x^{ν+1} − x̄‖ / ‖x^ν − x̄‖ < 1 .

The convergence is said to be superlinear if this lim sup is 0.

The convergence is said to be quadratic if

    lim sup_{ν→∞} ‖x^{ν+1} − x̄‖ / ‖x^ν − x̄‖² < ∞ .
Rates of Convergence: Example
Let γ ∈ (0, 1).

{γ^n} converges linearly to zero, but not superlinearly.
{γ^{n²}} converges superlinearly to zero, but not quadratically.
{γ^{2^n}} converges quadratically to zero.

Superlinear convergence is much faster than linear convergence, but quadratic convergence is much, much faster than superlinear convergence.

For example, γ = 1/2 gives γ^n = 2^{−n}, γ^{n²} = 2^{−n²}, and γ^{2^n} = 2^{−2^n}.
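These three rates can be checked numerically by forming the quotients in the definitions. A minimal sketch with γ = 1/2 (the sequence lengths are chosen only to stay above floating-point underflow):

```python
gamma = 0.5
lin  = [gamma ** n        for n in range(1, 10)]  # gamma^n
sup  = [gamma ** (n * n)  for n in range(1, 10)]  # gamma^(n^2)
quad = [gamma ** (2 ** n) for n in range(1, 10)]  # gamma^(2^n)

def q_ratios(seq, p):
    # successive quotients e_{k+1} / e_k^p from the Q-rate definitions
    return [seq[k + 1] / seq[k] ** p for k in range(len(seq) - 1)]

# linear: e_{k+1}/e_k is constant at 1/2 (bounded away from 0, so not superlinear)
# superlinear: e_{k+1}/e_k = 2^{-(2k+1)} tends to 0
# quadratic: e_{k+1}/e_k^2 is constant at 1, hence bounded
```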
Example
Let f(x) = x² + e^x. Then f is a strongly convex function with

    f(x)    = x² + e^x
    f'(x)   = 2x + e^x
    f''(x)  = 2 + e^x > 2
    f'''(x) = e^x .

We apply the steepest descent algorithm with backtracking (γ = 1/2, c = 0.01), initialized at x0 = 1.
Example: Steepest Descent
k     x_k           f(x_k)       f'(x_k)       s
0      1            3.7182818    4.7182818     0
1      0            1            1             0
2     −.5           .8565307     −.3934693     1
3     −.25          .8413008     .2788008      2
4     −.375         .8279143     −.0627107     3
5     −.34075       .8273473     .0297367      5
6     −.356375      .8272131     −.01254       6
7     −.3485625     .8271976     .0085768      7
8     −.3524688     .8271848     −.001987      8
9     −.3514922     .8271841     .0006528     10
10    −.3517364     .827184      −.0000072    12
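One way to reproduce this table is the following sketch. The slides do not spell out the line-search details, so two assumptions are inferred from the table: the search direction is the normalized negative gradient, and the step length starts at 1 and is halved (γ = 1/2) until the Armijo sufficient-decrease condition with c = 0.01 holds.

```python
import math

def f(x):  return x * x + math.exp(x)
def df(x): return 2 * x + math.exp(x)

def steepest_descent(x, iters=10, gamma=0.5, c=0.01):
    # normalized steepest-descent direction with Armijo backtracking
    xs = [x]
    for _ in range(iters):
        d = -df(x) / abs(df(x))                     # unit descent direction
        t = 1.0
        while f(x + t * d) > f(x) + c * t * df(x) * d:
            t *= gamma                              # halve until sufficient decrease
        x = x + t * d
        xs.append(x)
    return xs

xs = steepest_descent(1.0)   # xs[1:] tracks the x_k column of the table
```

Note how slowly the gradient is driven to zero: ten iterations of this (linearly convergent) scheme leave |f'(x10)| around 10⁻⁵, while Newton's method below reaches 10⁻¹⁰ in four.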
Example: Newton’s Method
min f(x) := x² + e^x

    x_{k+1} = x_k − f'(x_k) / f''(x_k)

x              f'(x)
 1              4.7182818
 0              1
−1/3            .0498646
−.3516893       .00012
−.3517337       .00000000064

In addition, one more iteration gives |f'(x5)| ≤ 10^{−20}.
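The Newton iteration above can be run directly; each step applies the stated update x_{k+1} = x_k − f'(x_k)/f''(x_k):

```python
import math

def df(x):  return 2 * x + math.exp(x)   # f'(x) for f(x) = x**2 + exp(x)
def d2f(x): return 2 + math.exp(x)       # f''(x)

x = 1.0
xs = [x]
for _ in range(4):
    x = x - df(x) / d2f(x)               # Newton step for f'(x) = 0
    xs.append(x)
# xs reproduces the x column: 1, 0, -1/3, -.3516893, -.3517337
```

The roughly squaring number of correct digits per step is the quadratic convergence defined earlier.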
Newton’s Method: the Gold Standard
Newton’s method is an algorithm for solving nonlinear equations.
Given g : R^n → R^n, find x ∈ R^n for which g(x) = 0.

Linearize and Solve: given a current estimate of a solution x0, obtain a new estimate x1 as the solution to the linearized equation

    0 = g(x0) + g'(x0)(x − x0) ,

and repeat.
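In R^n, each linearize-and-solve step amounts to solving the linear system g'(x_k) s = −g(x_k) and setting x_{k+1} = x_k + s. A self-contained sketch on a hypothetical 2×2 system (the circle–line example and all names here are illustrative, not from the slides):

```python
import math

def g(v):
    # toy system: unit circle intersected with the line x = y
    x, y = v
    return (x * x + y * y - 1.0, x - y)

def jac(v):
    # Jacobian g'(v) of the system above
    x, y = v
    return ((2 * x, 2 * y), (1.0, -1.0))

def newton(v, iters=8):
    for _ in range(iters):
        (a, b), (c, d) = jac(v)
        r1, r2 = (-t for t in g(v))      # right-hand side -g(v)
        det = a * d - b * c
        s1 = (r1 * d - b * r2) / det     # solve the 2x2 system by Cramer's rule
        s2 = (a * r2 - r1 * c) / det
        v = (v[0] + s1, v[1] + s2)
    return v

rx, ry = newton((1.0, 0.0))              # converges to (1/sqrt(2), 1/sqrt(2))
```

For larger n one would replace the Cramer's-rule solve with a general linear solver; the structure of the iteration is unchanged.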
Newton-Like Methods
Newton's Method:

    x_{k+1} := x_k − [g'(x_k)]^{−1} g(x_k)

Newton-Like Methods:

    x_{k+1} := x_k − J_k g(x_k) ,  where J_k ≈ g'(x_k)^{−1} .
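A minimal Newton-like sketch in one dimension, applied to g(x) = 2x + e^x (the gradient from the earlier example): freezing J_k at the single scalar 1/g'(x0), often called the chord method, keeps the error in the inverse-derivative approximation merely bounded, and the iteration converges only linearly rather than quadratically. The chord variant and this choice of example are illustrative, not from the slides.

```python
import math

def g(x):  return 2 * x + math.exp(x)    # gradient of f(x) = x**2 + exp(x)
def dg(x): return 2 + math.exp(x)        # its derivative

# chord method: freeze the inverse-derivative approximation at x0
x = 1.0
J = 1.0 / dg(x)
errs = []
for _ in range(25):
    x = x - J * g(x)                     # Newton-like step with frozen J
    errs.append(abs(g(x)))
# errs shrinks by a roughly constant factor per step: linear convergence
```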
Convergence of Newton’s Method
Let g : R^n → R^n be differentiable, x0 ∈ R^n, and J0 ∈ R^{n×n}. Suppose that there exist x̄ ∈ R^n and ε > 0 with ‖x0 − x̄‖ < ε such that

1. g(x̄) = 0,
2. g'(x)^{−1} exists for x ∈ B(x̄; ε) := {x ∈ R^n : ‖x − x̄‖ < ε} with sup{‖g'(x)^{−1}‖ : x ∈ B(x̄; ε)} ≤ M1,
3. g' is Lipschitz continuous on cl B(x̄; ε) with Lipschitz constant L, and
4. θ0 := (L M1 / 2)‖x0 − x̄‖ + M0 K < 1, where K ≥ ‖(g'(x0)^{−1} − J0)y0‖, y0 := g(x0)/‖g(x0)‖, and M0 = max{‖g'(x)‖ : x ∈ B(x̄; ε)}.
Further suppose that the iteration is initiated at x0 and the Jk's are chosen to satisfy one of the following conditions:

(i) ‖(g'(xk)^{−1} − Jk)yk‖ ≤ K,
(ii) ‖(g'(xk)^{−1} − Jk)yk‖ ≤ θ1^k K for some θ1 ∈ (0, 1),
(iii) ‖(g'(xk)^{−1} − Jk)yk‖ ≤ min{M2‖xk − xk−1‖, K} for some M2 > 0, or
(iv) ‖(g'(xk)^{−1} − Jk)yk‖ ≤ min{M3‖g(xk)‖, K} for some M3 > 0,

where for each k = 1, 2, …, yk := g(xk)/‖g(xk)‖.
These hypotheses on the accuracy of the approximations Jk yield the following conclusions about the rate of convergence of the iterates xk:

(a) ‖(g'(xk)^{−1} − Jk)yk‖ ≤ K ⟹ xk → x̄ linearly.
(b) ‖(g'(xk)^{−1} − Jk)yk‖ ≤ θ1^k K ⟹ xk → x̄ superlinearly.
(c) ‖(g'(xk)^{−1} − Jk)yk‖ ≤ min{M2‖xk − xk−1‖, K} ⟹ xk → x̄ two-step quadratically.
(d) ‖(g'(xk)^{−1} − Jk)yk‖ ≤ min{M3‖g(xk)‖, K} ⟹ xk → x̄ quadratically.
Newton’s Method for Minimization: ∇f (x) = 0
Let f : R^n → R be twice continuously differentiable, x0 ∈ R^n, and H0 ∈ R^{n×n}. Suppose that

1. there exist x̄ ∈ R^n and ε > ‖x0 − x̄‖ such that f(x̄) ≤ f(x) whenever ‖x − x̄‖ ≤ ε,
2. there is a δ > 0 such that δ‖z‖₂² ≤ zᵀ∇²f(x)z for all x ∈ B(x̄; ε),
3. ∇²f is Lipschitz continuous on cl B(x̄; ε) with Lipschitz constant L, and
4. θ0 := (L / 2δ)‖x0 − x̄‖ + M0 K < 1, where M0 > 0 satisfies zᵀ∇²f(x)z ≤ M0‖z‖₂² for all x ∈ B(x̄; ε) and K ≥ ‖(∇²f(x0)^{−1} − H0)y0‖ with y0 = ∇f(x0)/‖∇f(x0)‖.
![Page 42: Rates of Covergence and Newton's Methodburke/crs/408/... · 2012-01-31 · OutlineRates of ConvergenceNewton’s Method Rates of Convergence We compare the performance of algorithms](https://reader033.vdocuments.site/reader033/viewer/2022041710/5e477b66403e8261b122fb3f/html5/thumbnails/42.jpg)
Outline Rates of Convergence Newton’s Method
Newton’s Method for Minimization: ∇f (x) = 0
Let f : Rn → R be twice continuously differentiable, x0 ∈ Rn, andH0 ∈ Rn×n. Suppose that
1. there exists x ∈ Rn and ε > ‖x0 − x̄‖ such that f (x) ≤ f (x)whenever ‖x − x̄‖ ≤ ε,
2. there is a δ > 0 such that δ‖z‖22 ≤ zT∇2f (x)z for all x ∈ B(x , ε),
3. ∇2f is Lipschitz continuous on clB(x ; ε) with Lipschitz constant L,and
4. θ0 := L2δ‖x
0 − x‖+ M0K < 1 where M0 > 0 satisfieszT∇2f (x)z ≤ M0‖z‖22 for all x ∈ B(x , ε) andK ≥ ‖(∇2f (x0)−1 − H0)y0‖ with y0 = ∇f (x0)/‖∇f (x0)‖.
Rates of Covergence and Newton’s Method
![Page 43: Rates of Covergence and Newton's Methodburke/crs/408/... · 2012-01-31 · OutlineRates of ConvergenceNewton’s Method Rates of Convergence We compare the performance of algorithms](https://reader033.vdocuments.site/reader033/viewer/2022041710/5e477b66403e8261b122fb3f/html5/thumbnails/43.jpg)
Outline Rates of Convergence Newton’s Method
Newton’s Method for Minimization: ∇f (x) = 0
Further, suppose that the iteration

xk+1 := xk − Hk∇f(xk)

is initiated at x0, where the Hk’s are chosen to satisfy one of the following conditions:

(i) ‖(∇²f(xk)⁻¹ − Hk)yk‖ ≤ K,

(ii) ‖(∇²f(xk)⁻¹ − Hk)yk‖ ≤ θ₁ᵏK for some θ₁ ∈ (0, 1),

(iii) ‖(∇²f(xk)⁻¹ − Hk)yk‖ ≤ min{M2‖xk − xk−1‖, K} for some M2 > 0, or

(iv) ‖(∇²f(xk)⁻¹ − Hk)yk‖ ≤ min{M3‖∇f(xk)‖, K} for some M3 > 0,

where yk := ∇f(xk)/‖∇f(xk)‖ for each k = 1, 2, . . . .
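As an illustrative sketch (not part of the theorem), the iteration xk+1 = xk − Hk∇f(xk) can be coded directly. The test function, its derivatives, and the rule for Hk below are hypothetical stand-ins chosen so the hypotheses hold near the minimizer; taking Hk to be the exact inverse Hessian corresponds to the ideal case K = 0.

```python
import numpy as np

def newton_like(grad, hess, x0, make_H, tol=1e-10, max_iter=50):
    """Iterate x_{k+1} = x_k - H_k * grad f(x_k) for a supplied rule for H_k."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        H = make_H(x, k)              # approximation to inv(∇²f(x_k))
        x = x - H @ g
    return x

# Hypothetical strongly convex test function: f(x, y) = x² + x⁴ + y²,
# whose Hessian satisfies ∇²f ⪰ 2I, with minimizer x̄ = (0, 0).
grad = lambda v: np.array([2 * v[0] + 4 * v[0]**3, 2 * v[1]])
hess = lambda v: np.diag([2 + 12 * v[0]**2, 2.0])

# Exact inverse Hessian (pure Newton step).
sol = newton_like(grad, hess, [1.0, 1.0], lambda x, k: np.linalg.inv(hess(x)))
```

With the exact inverse the iterates reach the minimizer to machine precision in a handful of steps; a cruder choice of Hk trades per-step cost for a slower rate, which is exactly the trade-off conditions (i)–(iv) quantify.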
These hypotheses on the accuracy of the approximations Hk yield the following conclusions about the rate of convergence of the iterates xk.

(a) If (i) holds, then xk → x̄ linearly.

(b) If (ii) holds, then xk → x̄ superlinearly.

(c) If (iii) holds, then xk → x̄ two-step quadratically.

(d) If (iv) holds, then xk → x̄ quadratically.
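A small numerical check of conclusions (a) and (d), on an assumed one-dimensional test problem rather than anything from the slides: with the exact inverse Hessian the error collapses superlinearly, while a frozen H0 (which only satisfies the constant bound (i)) contracts it by a roughly constant factor each step.

```python
import numpy as np

# Hypothetical test problem: f(x) = x²/2 + x⁴/4, minimizer x̄ = 0.
grad = lambda x: x + x**3
hess = lambda x: 1.0 + 3.0 * x**2

def run(make_H, x0=0.5, steps=6):
    """Record |x_k - x̄| along x_{k+1} = x_k - H_k * grad f(x_k)."""
    x, errs = x0, []
    for _ in range(steps):
        x = x - make_H(x) * grad(x)
        errs.append(abs(x))           # error, since x̄ = 0
    return errs

newton_errs = run(lambda x: 1.0 / hess(x))    # exact inverse Hessian
fixed_errs  = run(lambda x: 1.0 / hess(0.5))  # frozen H_0: linear rate only
```

After six steps the Newton errors are far below machine-display precision, while the frozen-H0 errors still sit near 10⁻³ with a step-to-step ratio of about 0.43, the constant-factor signature of linear convergence.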