Numerical Methods: Multidimensional Gradient Methods in Optimization - Theory
http://nm.mathforcollege.com



TRANSCRIPT

Page 1

Numerical Methods: Multidimensional Gradient Methods in Optimization - Theory
http://nm.mathforcollege.com

Page 2

For more details on this topic:
Go to http://nm.mathforcollege.com
Click on Keyword
Click on Multidimensional Gradient Methods in Optimization

Page 3

You are free:
to Share – to copy, distribute, display and perform the work
to Remix – to make derivative works

Page 4

Under the following conditions:
Attribution — You must attribute the work in the manner specified by the author or licensor (but not in any way that suggests that they endorse you or your use of the work).
Noncommercial — You may not use this work for commercial purposes.
Share Alike — If you alter, transform, or build upon this work, you may distribute the resulting work only under the same or similar license to this one.

Page 5

Multidimensional Gradient Methods - Overview

Use information from the derivatives of the objective function to guide the search
Find solutions more quickly than direct search methods
A good initial estimate of the solution is required
The objective function needs to be differentiable

Page 6

Gradients

The gradient is a vector operator denoted by $\nabla$ (referred to as "del"). When applied to a function $f$, it represents the function's directional derivatives. The gradient is the special case in which the direction of the derivative is the direction of steepest ascent or descent: $\nabla f$ points in the direction of steepest ascent, and $-\nabla f$ in the direction of steepest descent.

For a function of two variables, the gradient is calculated by

$\nabla f = \dfrac{\partial f}{\partial x}\,\mathbf{i} + \dfrac{\partial f}{\partial y}\,\mathbf{j}$

Page 7

Gradients - Example

Calculate the gradient to determine the direction of the steepest slope at point (2, 1) for the function

$f(x, y) = x^2 y^2$

Solution: To calculate the gradient we need the partial derivatives

$\dfrac{\partial f}{\partial x} = 2xy^2 = 2(2)(1)^2 = 4$

$\dfrac{\partial f}{\partial y} = 2x^2 y = 2(2)^2(1) = 8$

which give the gradient at point (2, 1) as

$\nabla f = 4\,\mathbf{i} + 8\,\mathbf{j}$
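The arithmetic above is easy to verify in a few lines of Python. This is a minimal sketch, not part of the original slides; the helper `gradient` is a hypothetical name that approximates the partial derivatives by central differences:

```python
# Quick numerical check (not from the slides): approximate the gradient of
# f(x, y) = x**2 * y**2 at (2, 1) by central differences.

def f(x, y):
    return x**2 * y**2

def gradient(f, x, y, eps=1e-6):
    """Approximate (df/dx, df/dy) by central differences."""
    dfdx = (f(x + eps, y) - f(x - eps, y)) / (2 * eps)
    dfdy = (f(x, y + eps) - f(x, y - eps)) / (2 * eps)
    return dfdx, dfdy

print(gradient(f, 2.0, 1.0))  # ~ (4.0, 8.0), matching grad f = 4i + 8j
```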

Page 8

Hessians

The Hessian matrix, or just the Hessian, is the Jacobian matrix of second-order partial derivatives of a function. The determinant of the Hessian matrix is also referred to as the Hessian. For a two-dimensional function the Hessian matrix is simply

$H = \begin{bmatrix} \dfrac{\partial^2 f}{\partial x^2} & \dfrac{\partial^2 f}{\partial x\,\partial y} \\ \dfrac{\partial^2 f}{\partial y\,\partial x} & \dfrac{\partial^2 f}{\partial y^2} \end{bmatrix}$

Page 9

Hessians cont.

The determinant of the Hessian matrix, denoted by $|H|$, can have three cases:
1. If $|H| > 0$ and $\partial^2 f / \partial x^2 > 0$, then $f(x, y)$ has a local minimum.
2. If $|H| > 0$ and $\partial^2 f / \partial x^2 < 0$, then $f(x, y)$ has a local maximum.
3. If $|H| < 0$, then $f(x, y)$ has a saddle point.
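The three cases translate directly into code. Below is a minimal sketch (my own, not from the slides) of this second-derivative test; the final branch, for $|H| = 0$, is an added note that the test is inconclusive there:

```python
# Sketch of the three cases above, from the entries of the 2x2 Hessian.

def classify_critical_point(fxx, fyy, fxy):
    """Second-derivative test: fxx = d2f/dx2, fyy = d2f/dy2, fxy = d2f/dxdy."""
    det_H = fxx * fyy - fxy**2          # determinant of the Hessian
    if det_H > 0 and fxx > 0:
        return "local minimum"
    if det_H > 0 and fxx < 0:
        return "local maximum"
    if det_H < 0:
        return "saddle point"
    return "inconclusive (|H| = 0)"     # added case: the test says nothing here

# With the values from the example on the next slide, |H| = 2*8 - 8**2 = -48:
print(classify_critical_point(fxx=2.0, fyy=8.0, fxy=8.0))  # "saddle point"
```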

Page 10

Hessians - Example

Calculate the Hessian matrix at point (2, 1) for the function

$f(x, y) = x^2 y^2$

Solution: To calculate the Hessian matrix, the partial derivatives must be evaluated as

$\dfrac{\partial^2 f}{\partial x^2} = 2y^2 = 2(1)^2 = 2$

$\dfrac{\partial^2 f}{\partial y^2} = 2x^2 = 2(2)^2 = 8$

$\dfrac{\partial^2 f}{\partial x\,\partial y} = \dfrac{\partial^2 f}{\partial y\,\partial x} = 4xy = 4(2)(1) = 8$

resulting in the Hessian matrix

$H = \begin{bmatrix} 2 & 8 \\ 8 & 8 \end{bmatrix}$
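As a check (a sketch, not part of the slides; the helper `hessian` is a hypothetical name), the same matrix can be approximated numerically with central-difference formulas for the second derivatives:

```python
# Numerical check: approximate the Hessian of f(x, y) = x**2 * y**2 at (2, 1).

def f(x, y):
    return x**2 * y**2

def hessian(f, x, y, eps=1e-4):
    """Approximate the 2x2 Hessian by central differences."""
    fxx = (f(x + eps, y) - 2 * f(x, y) + f(x - eps, y)) / eps**2
    fyy = (f(x, y + eps) - 2 * f(x, y) + f(x, y - eps)) / eps**2
    fxy = (f(x + eps, y + eps) - f(x + eps, y - eps)
           - f(x - eps, y + eps) + f(x - eps, y - eps)) / (4 * eps**2)
    return [[fxx, fxy], [fxy, fyy]]

print(hessian(f, 2.0, 1.0))  # ~ [[2, 8], [8, 8]]
```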

Page 11

Steepest Ascent/Descent Method

Step 1: Start from an initial guessed point $x^{(0)}$ and look for a local optimal solution along the gradient.

Step 2: Calculate the gradient at the current solution $x^{(i)}$ (this gives the direction to travel); compute

$\nabla f = \left[\dfrac{\partial f}{\partial x_1},\; \dfrac{\partial f}{\partial x_2},\; \ldots,\; \dfrac{\partial f}{\partial x_k}\right]$

[Figure: successive iterates x^(0), x^(1), x^(2), ... approaching the optimum x_opt along gradient directions.]

Page 12

Steepest Ascent/Descent Method

Step 3: Find the step size $h$ along the calculated gradient direction (using the Golden Section Method or an analytical method).

Step 4: A new solution is found at the local optimum along the gradient; compute

$x^{(i+1)} = x^{(i)} + h\,\nabla f$

Step 5: If the iteration has converged, for example when $\lVert \nabla f(x^{(i+1)}) \rVert$ falls below a small tolerance (e.g. $10^{-5}$), then stop. Else, return to Step 2 using the newly computed point $x^{(i+1)}$.
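Steps 1-5 fit in a short program. The sketch below is one possible reading of the method, not the authors' code: the golden-section line search for Step 3, the search bracket $[-2, 2]$ for $h$, and the gradient-norm stopping test for Step 5 are my assumptions.

```python
# Minimal sketch of the steepest ascent/descent loop described above.
from math import sqrt

def golden_section_min(phi, a, b, tol=1e-8):
    """Golden-section search for the minimizer of phi on [a, b] (Step 3)."""
    r = (sqrt(5) - 1) / 2
    c, d = b - r * (b - a), a + r * (b - a)
    while abs(b - a) > tol:
        if phi(c) < phi(d):
            b, d = d, c
            c = b - r * (b - a)
        else:
            a, c = c, d
            d = a + r * (b - a)
    return (a + b) / 2

def steepest_descent(f, grad, x, tol=1e-5, max_iter=100):
    for _ in range(max_iter):
        g = grad(x)                                # Step 2: direction to travel
        if sqrt(sum(gi * gi for gi in g)) <= tol:  # Step 5: converged?
            break
        # Step 3: step size h minimizing f(x + h*grad f); h may be negative,
        # which is what turns a move along +grad f into a descent step.
        h = golden_section_min(
            lambda s: f([xi + s * gi for xi, gi in zip(x, g)]), -2.0, 2.0)
        x = [xi + h * gi for xi, gi in zip(x, g)]  # Step 4: new solution
    return x

# The worked example from the slides that follow:
# f(x, y) = x^2 + y^2 + 2x + 4, starting at (2, 1).
f = lambda v: v[0]**2 + v[1]**2 + 2*v[0] + 4
grad = lambda v: [2*v[0] + 2, 2*v[1]]
print(steepest_descent(f, grad, [2.0, 1.0]))  # ~ [-1.0, 0.0]
```

Run from (2, 1), the first step lands on (-1, 0), where the gradient vanishes, matching the worked example later in the deck.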

Page 13

THE END


Page 14

Acknowledgement

This instructional PowerPoint is brought to you by Numerical Methods for STEM Undergraduates (http://nm.mathforcollege.com), committed to bringing numerical methods to the undergraduate.

Page 15

For instructional videos on other topics, go to

http://nm.mathforcollege.com

This material is based upon work supported by the National Science Foundation under Grant # 0717624. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Page 16

The End - Really

Page 17

Numerical Methods: Multidimensional Gradient Methods in Optimization - Example
http://nm.mathforcollege.com

Page 18

For more details on this topic:
Go to http://nm.mathforcollege.com
Click on Keyword
Click on Multidimensional Gradient Methods in Optimization

Page 19

You are free:
to Share – to copy, distribute, display and perform the work
to Remix – to make derivative works

Page 20

Under the following conditions:
Attribution — You must attribute the work in the manner specified by the author or licensor (but not in any way that suggests that they endorse you or your use of the work).
Noncommercial — You may not use this work for commercial purposes.
Share Alike — If you alter, transform, or build upon this work, you may distribute the resulting work only under the same or similar license to this one.

Page 21

Example

Determine the minimum of the function

$f(x, y) = x^2 + y^2 + 2x + 4$

Use the point (2, 1) as the initial estimate of the optimal solution, i.e. $\left(x^{(0)},\, y^{(0)}\right) = (2, 1)$.

Page 22

Solution

Iteration 1: To calculate the gradient, the partial derivatives must be evaluated as

$\dfrac{\partial f}{\partial x} = 2x + 2 = 2(2) + 2 = 6$

$\dfrac{\partial f}{\partial y} = 2y = 2(1) = 2$

$\nabla f = 6\,\mathbf{i} + 2\,\mathbf{j}$

Recall that $f(x, y) = x^2 + y^2 + 2x + 4$, so the next point along the gradient is

$x^{(1)} = x^{(0)} + h\,\nabla f = (2, 1) + h\,(6, 2) = (2 + 6h,\; 1 + 2h)$

Page 23

Solution

Now the function can be expressed along the direction of the gradient as

$f\left(x^{(0)} + h\,\nabla f\right) = f(2 + 6h,\; 1 + 2h) = (2 + 6h)^2 + (1 + 2h)^2 + 2(2 + 6h) + 4$

so that

$g(h) = 40h^2 + 40h + 13$

To find $g_{\min}$, we set

$\dfrac{dg}{dh} = 80h + 40 = 0 \quad\Rightarrow\quad h = -0.5$
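This line search is easy to reproduce symbolically (a sketch, not from the slides), for example with sympy:

```python
# Symbolic check of the line search: expand g(h) and solve dg/dh = 0.
import sympy as sp

h = sp.symbols('h')
g = (2 + 6*h)**2 + (1 + 2*h)**2 + 2*(2 + 6*h) + 4

print(sp.expand(g))                # 40*h**2 + 40*h + 13
print(sp.solve(sp.diff(g, h), h))  # [-1/2]
```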

Page 24

Solution cont.

Iteration 1 continued: $g(h)$ is a simple function, so it is easy to determine $h^\ast = -0.5$ by taking the first derivative and solving for its root.

This means that traveling a step size of $h = -0.5$ along the gradient reaches a minimum value for the function in this direction. Substituting this value back gives the new values of $x$ and $y$:

$x = 2 + 6(-0.5) = -1$

$y = 1 + 2(-0.5) = 0$

Note that $f(2, 1) = 13$ and $f(-1, 0) = 3.0$.

Page 25

Solution cont.

Iteration 2: The new initial point is $(-1, 0)$. We calculate the gradient at this point as

$\dfrac{\partial f}{\partial x} = 2x + 2 = 2(-1) + 2 = 0$

$\dfrac{\partial f}{\partial y} = 2y = 2(0) = 0$

$\nabla f = 0\,\mathbf{i} + 0\,\mathbf{j}$

Page 26

Solution cont.

This indicates that the current location is a local optimum along this gradient: the gradient is zero, so no improvement can be gained by moving in any direction. The minimum of the function is at point $(-1, 0)$, and

$f_{\min} = (-1)^2 + (0)^2 + 2(-1) + 4 = 3$
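A final sanity check (again a sketch, not from the slides) confirms that the gradient vanishes at $(-1, 0)$ and that the minimum value is 3:

```python
# Verify the stationary point and the minimum value of f.
f = lambda x, y: x**2 + y**2 + 2*x + 4
grad = lambda x, y: (2*x + 2, 2*y)

print(grad(-1.0, 0.0))  # (0.0, 0.0): the gradient vanishes
print(f(-1.0, 0.0))     # 3.0, matching f_min = 3
```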

Page 27

THE END


Page 28

Acknowledgement

This instructional PowerPoint is brought to you by Numerical Methods for STEM Undergraduates (http://nm.mathforcollege.com), committed to bringing numerical methods to the undergraduate.

Page 29

For instructional videos on other topics, go to

http://nm.mathforcollege.com

This material is based upon work supported by the National Science Foundation under Grant # 0717624. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Page 30

The End - Really