Chapter 7: Optimization


Page 1: Chapter 7 Optimization

Chapter 7

Optimization

Page 2: Chapter 7 Optimization

Content

• Introduction
• One-dimensional unconstrained optimization
• Multidimensional unconstrained optimization
• Example

Page 3: Chapter 7 Optimization

Introduction (1)

Root finding and optimization are closely related!

[Figure: optimization vs. root finding]

Page 4: Chapter 7 Optimization

Introduction (2)

An optimization problem:

"Find x, which minimizes or maximizes f(x), subject to

    d_i(x) ≤ a_i,   i = 1, 2, ..., m
    e_i(x) = b_i,   i = 1, 2, ..., p

where

x is an n-dimensional design vector,
f(x) is the objective function,
d_i(x) are inequality constraints,
e_i(x) are equality constraints, and
a_i and b_i are constants."
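As a concrete, made-up instance of this template (not from the slides): minimize f(x_1, x_2) = x_1² + x_2² subject to the single inequality constraint x_1 + x_2 ≥ 4. In the form above this is d_1(x) = −x_1 − x_2 ≤ a_1 with a_1 = −4 (so m = 1 and p = 0), and the constrained minimum is at x_1 = x_2 = 2 with f = 8.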

Page 5: Chapter 7 Optimization

Introduction (3)

Classification

Constrained optimization problems:
1) Linear programming: f(x) and the constraints are linear.
2) Quadratic programming: f(x) is quadratic and the constraints are linear.
3) Nonlinear programming: f(x) is not linear or quadratic, and/or the constraints are nonlinear.

Unconstrained optimization problems

Page 6: Chapter 7 Optimization

One-dimensional (1)

Multimodal functions have more than one local optimum; we are interested in finding the absolute highest or lowest value of the function (the global optimum).

Page 7: Chapter 7 Optimization

One-dimensional (2)

To distinguish the global minimum (or maximum) from local ones we can try:

• Graphing, to gain insight into the behavior of the function.
• Using randomly generated starting guesses and picking the best of the optima found as the global one (a sketch of this multi-start idea follows below).
• Perturbing the starting point to see if the routine returns a better point or the same local optimum.
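A minimal Python sketch of the multi-start strategy; the multimodal test function and the crude derivative-free local descent are illustrative assumptions, not taken from the slides:

    import numpy as np

    def local_minimize(f, x0, step=0.1, tol=1e-8, max_iter=10000):
        # Crude local descent: step left/right, halve the step when no move improves f.
        x = x0
        for _ in range(max_iter):
            best = min((x - step, x + step), key=f)
            if f(best) < f(x):
                x = best
            else:
                step /= 2.0
                if step < tol:
                    break
        return x

    # Hypothetical multimodal function (illustrative only).
    f = lambda x: 0.1 * x**2 + np.sin(3.0 * x)

    rng = np.random.default_rng(0)
    starts = rng.uniform(-5.0, 5.0, size=20)     # randomly generated starting guesses
    minima = [local_minimize(f, x0) for x0 in starts]
    x_best = min(minima, key=f)                  # keep the lowest optimum found as the global one
    print(x_best, f(x_best))

Each start may settle into a different local minimum; keeping the best of the batch approximates the global minimum but does not guarantee it.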

Page 8: Chapter 7 Optimization

One-dimensional (3)

Golden-section search (GSS), for a unimodal function. A unimodal function has a single maximum or minimum in a given interval.

Steps of the GSS:
• Pick two points [x_l, x_u] that bracket the extremum.
• Pick an additional third point within this interval to determine whether a maximum (or minimum) has occurred.
• Then pick a fourth point to determine whether the extremum occurred within the first three or the last three points.

The key to making this approach efficient is choosing the intermediate points wisely, so that old function values can be reused instead of recomputed and the number of function evaluations is minimized.

Page 9: Chapter 7 Optimization

One-dimensional (4)

The GSS principle is to keep the ratio of the section lengths the same at each iteration. If the bracketing interval of length l_0 is divided into sections l_1 and l_2,

    l_0 = l_1 + l_2
    l_1 / l_0 = l_2 / l_1

Letting R = l_2 / l_1, the second condition becomes 1 / (1 + R) = R, i.e.

    R² + R − 1 = 0

hence

    R = (−1 + √(1 + 4)) / 2 = (√5 − 1) / 2 = 0.61803...

which is the golden ratio.

Page 10: Chapter 7 Optimization

One-dimensional (5)

Step I:
Start with two initial guesses (x_l, x_u).

Step II:
Calculate the interior points x_1, x_2 according to the golden ratio:

    d = ((√5 − 1) / 2) (x_u − x_l)
    x_1 = x_l + d
    x_2 = x_u − d

Step III:
Evaluate the function at x_1 and x_2.
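For example, with the bracket x_l = 0 and x_u = 4 used in the example on Page 12, the first interior points are d = 0.61803 × (4 − 0) ≈ 2.472, so x_1 ≈ 2.472 and x_2 ≈ 1.528.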

Page 11: Chapter 7 Optimization

One-dimensional (6)

Step IV (for a minimum):
If f(x_1) > f(x_2), set x_u = x_1.
If f(x_1) < f(x_2), set x_l = x_2.

Step V:
Recalculate the interior point for the new interval, e.g. the new

    x_1 = x_l + ((√5 − 1) / 2) (x_u − x_l)

and return to Step III, repeating until the interval is sufficiently small.

Try it on the following example!

Page 12: Chapter 7 Optimization

One-dimensional (7)

Example: use the GSS to find the minimum of

    f(x) = x² / 10 − 2 sin x

assuming x_l = 0 and x_u = 4.

Answer: x = 1.4276
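A minimal Python sketch of the golden-section steps above, applied to this example (an illustrative implementation, not code from the slides):

    import math

    def golden_section_min(f, xl, xu, tol=1e-6, max_iter=200):
        # Golden-section search for the minimum of a unimodal f on [xl, xu].
        R = (math.sqrt(5.0) - 1.0) / 2.0        # golden ratio, ~0.61803
        x1, x2 = xl + R * (xu - xl), xu - R * (xu - xl)
        f1, f2 = f(x1), f(x2)
        for _ in range(max_iter):
            if f1 > f2:                         # minimum lies in [xl, x1]
                xu, x1, f1 = x1, x2, f2
                x2 = xu - R * (xu - xl)
                f2 = f(x2)
            else:                               # minimum lies in [x2, xu]
                xl, x2, f2 = x2, x1, f1
                x1 = xl + R * (xu - xl)
                f1 = f(x1)
            if xu - xl < tol:
                break
        return 0.5 * (xl + xu)

    f = lambda x: x**2 / 10.0 - 2.0 * math.sin(x)
    print(golden_section_min(f, 0.0, 4.0))      # ~1.4276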

Page 13: Chapter 7 Optimization

Multidimensional… (1)

Techniques to find the minimum and maximum of a function of several variables. Classified as:

• Gradient (descent or ascent) methods: require derivative evaluation.
• Non-gradient, or direct, methods: do not require derivative evaluation.

Page 14: Chapter 7 Optimization

Multidimensional… (2)

Page 15: Chapter 7 Optimization

Multidimensional… (3)

DIRECT METHODS: Random Search

Based on evaluating the function at randomly selected values of the independent variables. If a sufficient number of samples are taken, the optimum will eventually be located.

Ex: Locate the maximum of the function

    f(x, y) = y − x − 2x² − 2xy − y²
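A minimal Python sketch of a random search for this maximum; the sampling ranges for x and y are assumptions chosen for illustration:

    import numpy as np

    def random_search(f, x_range, y_range, n_samples=100000, seed=0):
        # Evaluate f at uniformly random (x, y) samples and keep the best one.
        rng = np.random.default_rng(seed)
        x = rng.uniform(x_range[0], x_range[1], n_samples)
        y = rng.uniform(y_range[0], y_range[1], n_samples)
        vals = f(x, y)
        i = np.argmax(vals)                     # searching for a maximum
        return x[i], y[i], vals[i]

    f = lambda x, y: y - x - 2 * x**2 - 2 * x * y - y**2
    print(random_search(f, x_range=(-2.0, 2.0), y_range=(1.0, 3.0)))
    # The analytical maximum is f = 1.25 at (x, y) = (-1, 1.5).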

Page 16: Chapter 7 Optimization

Multidimensional… (4)

Page 17: Chapter 7 Optimization

Multidimensional… (5)

Advantages:
• Works even for discontinuous and nondifferentiable functions.
• Always finds the global optimum rather than a local optimum.

Disadvantages:
• As the number of independent variables grows, the task can become onerous.
• Not efficient; it does not account for the behavior of the underlying function.

Page 18: Chapter 7 Optimization

Multidimensional… (6)

Univariate and Pattern Searches

More efficient than random search, and still do not require derivative evaluation.

The basic strategy is to change one variable at a time while the other variables are held constant. The problem is thus reduced to a sequence of one-dimensional searches that can be solved by a variety of methods (see the sketch below).

The search becomes less efficient as you approach the maximum.
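A minimal Python sketch of a cyclic univariate search, reusing the function from the random-search example; the one-dimensional subproblems are solved here with SciPy's bounded scalar minimizer, and the bounds and cycle count are illustrative assumptions:

    import numpy as np
    from scipy.optimize import minimize_scalar

    def univariate_search(f, x0, bounds, n_cycles=20):
        # Optimize one variable at a time while the others are held constant.
        x = np.array(x0, dtype=float)
        for _ in range(n_cycles):
            for i in range(len(x)):
                def g(t, i=i):
                    trial = x.copy()
                    trial[i] = t
                    return -f(*trial)           # minimize -f to maximize f
                x[i] = minimize_scalar(g, bounds=bounds[i], method="bounded").x
        return x, f(*x)

    f = lambda x, y: y - x - 2 * x**2 - 2 * x * y - y**2
    print(univariate_search(f, x0=[0.0, 1.0], bounds=[(-2.0, 2.0), (1.0, 3.0)]))
    # Should approach the maximum near (x, y) = (-1, 1.5), f = 1.25.

Each inner call is a one-dimensional search of the kind covered earlier in the chapter; the golden-section routine above could be substituted for minimize_scalar.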

Page 19: Chapter 7 Optimization

Multidimensional… (7)

Univariate and Pattern Searches

Page 20: Chapter 7 Optimization

Multidimensional… (8)

Gradient method

If f(x, y) is a two-dimensional function, the gradient vector tells us
• what direction is the steepest ascent, and
• how much we will gain by taking that step.

    ∇f = (∂f/∂x) i + (∂f/∂y) j

Directional derivative of f(x, y) at the point x = a and y = b.
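The slide does not reproduce the directional-derivative formula itself. In the usual formulation, the slope of f along a direction making an angle θ with the x axis, with the partial derivatives evaluated at x = a and y = b, is

    g′(0) = (∂f/∂x) cos θ + (∂f/∂y) sin θ

and the gradient is the direction θ for which this slope is largest.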

Page 21: Chapter 7 Optimization

Multidimensional… (9)

Gradient method (cont'd)

For n dimensions:

    ∇f(x) = [ ∂f/∂x_1 (x), ∂f/∂x_2 (x), ..., ∂f/∂x_n (x) ]^T

Page 22: Chapter 7 Optimization

Multidimensional… (10)

Steepest ascent method

Use the gradient vector to change x: from the current point x^(i−1), move a distance h along the axis defined by the gradient direction,

    x^(i) = x^(i−1) + ∇f(x^(i−1)) h

The gradient vector represents the direction of maximum rate of increase of the function f(x) at x.
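A minimal Python sketch of this update using a fixed step size h (an illustrative simplification; the full method would choose h at each iteration by a one-dimensional search along the gradient direction). The gradient used is that of the example function on Page 24:

    import numpy as np

    def steepest_ascent(grad, x0, h=0.1, tol=1e-6, max_iter=1000):
        # Repeatedly step a fixed distance h along the gradient direction.
        x = np.array(x0, dtype=float)
        for _ in range(max_iter):
            g = np.asarray(grad(x), dtype=float)
            if np.linalg.norm(g) < tol:          # gradient ~ 0: stationary point reached
                break
            x = x + h * g                        # x(i) = x(i-1) + grad f(x(i-1)) * h
        return x

    # Gradient of f(x, y) = 2xy + 2x - x^2 - 2y^2 (the example on Page 24)
    grad_f = lambda v: (2.0 * v[1] + 2.0 - 2.0 * v[0], 2.0 * v[0] - 4.0 * v[1])
    print(steepest_ascent(grad_f, x0=[-1.0, 1.0]))   # converges toward (2, 1)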

Page 23: Chapter 7 Optimization

Multidimensional… (11)

Example: minimize the function

    f(x, y) = 25 − x² − y²

starting from x = 0.6, y = 4.

Page 24: Chapter 7 Optimization

Multidimensional… (12)

Example: maximize the function

    f(x, y) = 2xy + 2x − x² − 2y²

starting from x = −1, y = 1.

Answer: x = 2, y = 1
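As a check on this answer: at the starting point (−1, 1) the gradient is ∇f = (2y + 2 − 2x, 2x − 4y) = (6, −6), so steepest ascent initially moves toward larger x and smaller y, and at (2, 1) the gradient is (0, 0), confirming a stationary point with f(2, 1) = 2.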

Page 25: Chapter 7 Optimization

Multidimensional… (13)

Example: maximize the function

    f(x, y) = 25 − x² − y²

starting from x = 0.6, y = 4.