Prof. (Dr.) Rajib Kumar Bhattacharjya

Post on 10-Nov-2021


Multivariable problem with equality and inequality constraints

Prof. (Dr.) Rajib Kumar Bhattacharjya

Professor, Department of Civil Engineering
Indian Institute of Technology Guwahati, India

Room No. 005, M Block
Email: rkbc@iitg.ernet.in, Ph. No 2428

Rajib Bhattacharjya, IITG CE 602: Optimization Method

General formulation

Min/Max $f(X)$

Subject to $g_j(X) = 0, \quad j = 1, 2, 3, \ldots, m$

where $X = (x_1, x_2, x_3, \ldots, x_n)^T$

[Figure: the unconstrained minimum of the function may lie off the curve $g(X) = 0$; once the constraint is imposed, that point is no longer the minimum of the constrained function, and the constrained minimum shifts to a new point on $g(X) = 0$.]


Consider a two-variable problem:

Min/Max $f(x_1, x_2)$

Subject to $g(x_1, x_2) = 0$

Take the total derivative of the function at $(x_1, x_2)$:

$$df = \frac{\partial f}{\partial x_1} dx_1 + \frac{\partial f}{\partial x_2} dx_2 = 0$$

If $(x_1, x_2)$ is the solution of the constrained problem, then

$$g(x_1, x_2) = 0$$

Now any variation $dx_1$ and $dx_2$ is admissible only when

$$g(x_1 + dx_1, x_2 + dx_2) = 0$$


Expanding $g(x_1 + dx_1, x_2 + dx_2) = 0$ to first order:

$$g(x_1 + dx_1, x_2 + dx_2) = g(x_1, x_2) + \frac{\partial g(x_1, x_2)}{\partial x_1} dx_1 + \frac{\partial g(x_1, x_2)}{\partial x_2} dx_2 = 0$$

Since $g(x_1, x_2) = 0$, this gives

$$dg = \frac{\partial g}{\partial x_1} dx_1 + \frac{\partial g}{\partial x_2} dx_2 = 0$$

and hence

$$dx_2 = -\frac{\partial g/\partial x_1}{\partial g/\partial x_2}\, dx_1$$


Putting $dx_2 = -\dfrac{\partial g/\partial x_1}{\partial g/\partial x_2}\, dx_1$ into $df = \dfrac{\partial f}{\partial x_1} dx_1 + \dfrac{\partial f}{\partial x_2} dx_2 = 0$:

$$df = \left(\frac{\partial f}{\partial x_1} - \frac{\partial f}{\partial x_2}\,\frac{\partial g/\partial x_1}{\partial g/\partial x_2}\right) dx_1 = 0$$

Multiplying through by $\partial g/\partial x_2$:

$$\left(\frac{\partial f}{\partial x_1}\frac{\partial g}{\partial x_2} - \frac{\partial f}{\partial x_2}\frac{\partial g}{\partial x_1}\right) dx_1 = 0$$

This is the necessary condition for optimality for an optimization problem with equality constraints:

$$\frac{\partial f}{\partial x_1}\frac{\partial g}{\partial x_2} - \frac{\partial f}{\partial x_2}\frac{\partial g}{\partial x_1} = 0$$


Lagrange Multipliers

Min/Max $f(x_1, x_2)$

Subject to $g(x_1, x_2) = 0$

We have already obtained the condition that

$$\frac{\partial f}{\partial x_1} - \frac{\partial f}{\partial x_2}\,\frac{\partial g/\partial x_1}{\partial g/\partial x_2} = 0$$

By defining

$$\lambda = -\frac{\partial f/\partial x_2}{\partial g/\partial x_2}$$

we have

$$\frac{\partial f}{\partial x_1} + \lambda \frac{\partial g}{\partial x_1} = 0$$

We can also write

$$\frac{\partial f}{\partial x_2} + \lambda \frac{\partial g}{\partial x_2} = 0$$

Together with $g(x_1, x_2) = 0$, these are the necessary conditions for optimality.


Lagrange Multipliers

๐ฟ ๐‘ฅ1, ๐‘ฅ2, ๐œ† = ๐‘“ ๐‘ฅ1, ๐‘ฅ2 + ๐œ†๐‘” ๐‘ฅ1, ๐‘ฅ2

Let us define

By applying necessary condition of optimality, we can obtain

๐œ•๐ฟ

๐œ•๐‘ฅ1=๐œ•๐‘“

๐œ•๐‘ฅ1+ ๐œ†๐œ•๐‘”

๐œ•๐‘ฅ1= 0

๐œ•๐ฟ

๐œ•๐‘ฅ2=๐œ•๐‘“

๐œ•๐‘ฅ2+ ๐œ†๐œ•๐‘”

๐œ•๐‘ฅ2= 0

๐œ•๐ฟ

๐œ•๐œ†= ๐‘” ๐‘ฅ1, ๐‘ฅ2 = 0

Necessary conditions for optimality
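The three necessary conditions can be generated and solved mechanically from the Lagrange function. A sketch in SymPy, again with an illustrative problem of my choosing ($f = x_1^2 + x_2^2$, $g = x_1 + x_2 - 1$):

```python
import sympy as sp

x1, x2, lam = sp.symbols('x1 x2 lam', real=True)
f = x1**2 + x2**2            # example objective (illustrative assumption)
g = x1 + x2 - 1              # example equality constraint g = 0

L = f + lam*g                # Lagrange function L(x1, x2, lambda)

# Necessary conditions: all three partial derivatives of L vanish
eqs = [sp.diff(L, v) for v in (x1, x2, lam)]
sol = sp.solve(eqs, [x1, x2, lam], dict=True)[0]
print(sol)   # {x1: 1/2, x2: 1/2, lam: -1}
```

The solver returns the stationary point of $L$ together with the multiplier value in one step.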


Lagrange Multipliers

๐ฟ ๐‘ฅ1, ๐‘ฅ2, ๐œ† = ๐‘“ ๐‘ฅ1, ๐‘ฅ2 + ๐œ†๐‘” ๐‘ฅ1, ๐‘ฅ2

Sufficient condition for optimality of the Lagrange function can be written as

๐ป =

๐œ•2๐ฟ

๐œ•๐‘ฅ1๐œ•๐‘ฅ1

๐œ•2๐ฟ

๐œ•๐‘ฅ1๐œ•๐‘ฅ2

๐œ•2๐ฟ

๐œ•๐‘ฅ1๐œ•๐œ†

๐œ•2๐ฟ

๐œ•๐‘ฅ2๐œ•๐‘ฅ1

๐œ•2๐ฟ

๐œ•๐‘ฅ2๐œ•๐‘ฅ2

๐œ•2๐ฟ

๐œ•๐‘ฅ2๐œ•๐œ†

๐œ•2๐ฟ

๐œ•๐œ†๐œ•๐‘ฅ1

๐œ•2๐ฟ

๐œ•๐œ†๐œ•๐‘ฅ2

๐œ•2๐ฟ

๐œ•๐œ†๐œ•๐œ†

๐ป =

๐œ•2๐ฟ

๐œ•๐‘ฅ1๐œ•๐‘ฅ1

๐œ•2๐ฟ

๐œ•๐‘ฅ1๐œ•๐‘ฅ2

๐œ•๐‘”

๐œ•๐‘ฅ1๐œ•2๐ฟ

๐œ•๐‘ฅ2๐œ•๐‘ฅ1

๐œ•2๐ฟ

๐œ•๐‘ฅ2๐œ•๐‘ฅ2

๐œ•๐‘”

๐œ•๐‘ฅ2๐œ•๐‘”

๐œ•๐‘ฅ1

๐œ•๐‘”

๐œ•๐‘ฅ20

If ๐ป is positive definite , the optimal solution is a minimum point

If ๐ป is negative definite , the optimal solution is a maximum point

Else it is neither minima nor maxima
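The bordered Hessian can be formed directly as the Hessian of $L$ in $(x_1, x_2, \lambda)$. A sketch with an illustrative problem of my choosing ($f = x_1^2 + x_2^2$, $g = x_1 + x_2 - 1$); note that in the two-variable, one-constraint case the quantity commonly examined in practice is the sign of $\det H$, with a negative value indicating a minimum:

```python
import sympy as sp

x1, x2, lam = sp.symbols('x1 x2 lam', real=True)
f = x1**2 + x2**2            # example objective (illustrative assumption)
g = x1 + x2 - 1              # example equality constraint g = 0
L = f + lam*g                # Lagrange function

# The Hessian of L with respect to (x1, x2, lam) is exactly the bordered
# form: second derivatives of L bordered by grad g, with a zero corner.
H = sp.hessian(L, (x1, x2, lam))
print(H)        # Matrix([[2, 0, 1], [0, 2, 1], [1, 1, 0]])
print(H.det())  # -4
```

Here the candidate point is indeed the constrained minimum of the example problem.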


Lagrange Multipliers

Necessary conditions for the general problem:

Min/Max $f(X)$

Subject to $g_j(X) = 0, \quad j = 1, 2, 3, \ldots, m$

where $X = (x_1, x_2, x_3, \ldots, x_n)^T$

$$L(x_1, x_2, \ldots, x_n, \lambda_1, \lambda_2, \lambda_3, \ldots, \lambda_m) = f(X) + \lambda_1 g_1(X) + \lambda_2 g_2(X) + \cdots + \lambda_m g_m(X)$$

Necessary conditions:

$$\frac{\partial L}{\partial x_i} = \frac{\partial f}{\partial x_i} + \sum_{j=1}^{m} \lambda_j \frac{\partial g_j}{\partial x_i} = 0, \quad i = 1, 2, 3, \ldots, n$$

$$\frac{\partial L}{\partial \lambda_j} = g_j(X) = 0, \quad j = 1, 2, 3, \ldots, m$$


Lagrange Multipliers

Sufficient condition for the general problem: the Hessian matrix is

$$H = \begin{bmatrix}
L_{11} & L_{12} & \cdots & L_{1n} & g_{11} & g_{21} & \cdots & g_{m1} \\
L_{21} & L_{22} & \cdots & L_{2n} & g_{12} & g_{22} & \cdots & g_{m2} \\
\vdots & \vdots & & \vdots & \vdots & \vdots & & \vdots \\
L_{n1} & L_{n2} & \cdots & L_{nn} & g_{1n} & g_{2n} & \cdots & g_{mn} \\
g_{11} & g_{12} & \cdots & g_{1n} & 0 & 0 & \cdots & 0 \\
g_{21} & g_{22} & \cdots & g_{2n} & 0 & 0 & \cdots & 0 \\
\vdots & \vdots & & \vdots & \vdots & \vdots & & \vdots \\
g_{m1} & g_{m2} & \cdots & g_{mn} & 0 & 0 & \cdots & 0
\end{bmatrix}$$

where $L_{ij} = \dfrac{\partial^2 L}{\partial x_i \partial x_j}$ and $g_{ij} = \dfrac{\partial g_i}{\partial x_j}$.


Lagrange Multipliers

Min/Max $f(X)$

Subject to $g(X) = b$, or $b - g(X) = 0$

where $X = (x_1, x_2, x_3, \ldots, x_n)^T$

Applying the necessary conditions:

$$\frac{\partial f}{\partial x_i} - \lambda \frac{\partial g}{\partial x_i} = 0, \quad i = 1, 2, 3, \ldots, n$$

$$b - g(X) = 0$$

Further, $db - dg = 0$, so

$$db = dg = \sum_{i=1}^{n} \frac{\partial g}{\partial x_i} dx_i$$

From the first condition, $\dfrac{\partial g}{\partial x_i} = \dfrac{\partial f/\partial x_i}{\lambda}$, so

$$db = \sum_{i=1}^{n} \frac{1}{\lambda} \frac{\partial f}{\partial x_i} dx_i = \frac{df}{\lambda}$$

Hence

$$\lambda = \frac{df}{db}, \qquad df = \lambda\, db$$

There may be three conditions: $\lambda^* > 0$, $\lambda^* < 0$, $\lambda^* = 0$.
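The interpretation $\lambda = df/db$ can be checked symbolically by solving the problem for a general right-hand side $b$. A sketch with an illustrative problem of my choosing ($f = x_1^2 + x_2^2$ subject to $x_1 + x_2 = b$):

```python
import sympy as sp

b, x1, x2, lam = sp.symbols('b x1 x2 lam', real=True)
f = x1**2 + x2**2            # example objective (illustrative assumption)
g = x1 + x2                  # constraint g(X) = b

# Necessary conditions for the b - g(X) = 0 formulation:
# df/dx_i - lam * dg/dx_i = 0, together with g = b
eqs = [sp.diff(f, x1) - lam*sp.diff(g, x1),
       sp.diff(f, x2) - lam*sp.diff(g, x2),
       g - b]
sol = sp.solve(eqs, [x1, x2, lam], dict=True)[0]

f_star = f.subs(sol)          # optimal value as a function of b
print(sol[lam])               # b
print(sp.diff(f_star, b))     # b  -> lambda equals df/db
```

The multiplier returned by the solver equals the derivative of the optimal value with respect to the constraint level, as derived above.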


Multivariable problem with inequality constraints

Minimize $f(X)$

Subject to $g_j(X) \le 0, \quad j = 1, 2, 3, \ldots, m$

where $X = (x_1, x_2, x_3, \ldots, x_n)^T$

Introducing a slack variable $y_j$ for each constraint, we can write $g_j(X) + y_j^2 = 0$. Thus the problem can be written as

Minimize $f(X)$

Subject to $G_j(X, Y) = g_j(X) + y_j^2 = 0, \quad j = 1, 2, 3, \ldots, m$

where $Y = (y_1, y_2, y_3, \ldots, y_m)^T$
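The slack-variable transformation can be carried out symbolically. A one-variable sketch with an illustrative problem of my choosing (minimize $f = (x-2)^2$ subject to $x - 1 \le 0$):

```python
import sympy as sp

x, y, lam = sp.symbols('x y lam', real=True)
f = (x - 2)**2               # example objective (illustrative assumption)
g = x - 1                    # inequality constraint g(x) <= 0
G = g + y**2                 # slack-variable form G(x, y) = 0

L = f + lam*G                # Lagrange function of the transformed problem
eqs = [sp.diff(L, v) for v in (x, y, lam)]
sols = sp.solve(eqs, [x, y, lam], dict=True)
print(sols)   # the real solution has x = 1, y = 0, lam = 2
```

With $y = 0$ at the solution, the constraint is active: the unconstrained minimizer $x = 2$ is infeasible, so the optimum sits on the boundary $x = 1$.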


Multivariable problem with inequality constraints

Minimize $f(X)$

Subject to $G_j(X, Y) = g_j(X) + y_j^2 = 0, \quad j = 1, 2, 3, \ldots, m$

where $X = (x_1, x_2, x_3, \ldots, x_n)^T$

The Lagrange function can be written as

$$L(X, Y, \lambda) = f(X) + \sum_{j=1}^{m} \lambda_j G_j(X, Y)$$

The necessary conditions of optimality can be written as

$$\frac{\partial L(X, Y, \lambda)}{\partial x_i} = \frac{\partial f(X)}{\partial x_i} + \sum_{j=1}^{m} \lambda_j \frac{\partial g_j(X)}{\partial x_i} = 0, \quad i = 1, 2, 3, \ldots, n$$

$$\frac{\partial L(X, Y, \lambda)}{\partial \lambda_j} = G_j(X, Y) = g_j(X) + y_j^2 = 0, \quad j = 1, 2, 3, \ldots, m$$

$$\frac{\partial L(X, Y, \lambda)}{\partial y_j} = 2 \lambda_j y_j = 0, \quad j = 1, 2, 3, \ldots, m$$


๐œ•๐ฟ ๐‘‹,๐‘Œ,๐œ†

๐œ•๐‘ฆ๐‘—= 2๐œ†๐‘—๐‘ฆ๐‘—=0

Multivariable problem with inequality constraints

From equation

Either ๐œ†๐‘—= 0 Or, ๐‘ฆ๐‘—= 0

If ๐œ†๐‘—= 0, the constraint is not active, hence can be ignored

If ๐‘ฆ๐‘—= 0, the constraint is active, hence have to consider

Now, consider all the active constraints, Say set ๐ฝ1 is the active constraintsAnd set ๐ฝ2 is the active constraints

The optimality condition can be written as๐œ•๐‘“ ๐‘‹

๐œ•๐‘ฅ๐‘–+

๐‘—โˆˆ๐ฝ1

๐œ†๐‘—๐œ•๐‘”๐‘— ๐‘‹

๐œ•๐‘ฅ๐‘–= 0 ๐‘– = 1,2,3, โ€ฆ , ๐‘›

๐‘”๐‘— ๐‘‹ = 0

๐‘”๐‘— ๐‘‹ + ๐‘ฆ๐‘—2 = 0

๐‘— โˆˆ ๐ฝ1

๐‘— โˆˆ ๐ฝ2


Multivariable problem with inequality constraints

โˆ’๐œ•๐‘“

๐œ•๐‘ฅ๐‘–= ๐œ†1๐œ•๐‘”1๐œ•๐‘ฅ๐‘–+ ๐œ†2๐œ•๐‘”2๐œ•๐‘ฅ๐‘–+ ๐œ†3๐œ•๐‘”3๐œ•๐‘ฅ๐‘–+โ‹ฏ+ ๐œ†๐‘

๐œ•๐‘”๐‘๐œ•๐‘ฅ๐‘–

๐‘– = 1,2,3, โ€ฆ , ๐‘›

โˆ’๐›ป๐‘“ = ๐œ†1๐›ป๐‘”1 + ๐œ†2๐›ป๐‘”2 + ๐œ†3๐›ป๐‘”3 +โ‹ฏ+ ๐œ†๐‘š๐›ป๐‘”๐‘š

๐›ป๐‘“ =

๐œ•๐‘“๐œ•๐‘ฅ1

๐œ•๐‘“๐œ•๐‘ฅ2โ‹ฎ

๐œ•๐‘“๐œ•๐‘ฅ๐‘›

๐›ป๐‘”๐‘— =

๐œ•๐‘”๐‘—๐œ•๐‘ฅ1

๐œ•๐‘”๐‘—๐œ•๐‘ฅ2โ‹ฎ

๐œ•๐‘”๐‘—๐œ•๐‘ฅ๐‘›

This indicates that negative of the gradient of the objective function can be expressed as a linear combination of the gradients of the active constraints at optimal point.

โˆ’๐›ป๐‘“ = ๐œ†1๐›ป๐‘”1 + ๐œ†2๐›ป๐‘”2

Let ๐‘† be a feasible direction, then we can write

โˆ’๐‘†๐‘‡๐›ป๐‘“ = ๐œ†1๐‘†๐‘‡๐›ป๐‘”1 + ๐œ†2๐‘†

๐‘‡๐›ป๐‘”2

Since ๐‘† is a feasible direction ๐‘†๐‘‡๐›ป๐‘”1 < 0 and ๐‘†๐‘‡๐›ป๐‘”2 < 0

If ๐œ†1, ๐œ†2 > 0Then the term ๐‘†๐‘‡๐›ป๐‘“ is +ve

This indicates that ๐‘† is a direction of increasing function value

Thus we can conclude that if ๐œ†1, ๐œ†2 > 0, we will not get any better solution than the current solution


๐‘”1 ๐‘‹ = 0

๐‘”2 ๐‘‹ = 0

๐‘”1 ๐‘‹ < 0

๐‘”2 ๐‘‹ < 0

๐‘”1 ๐‘‹ > 0

๐‘”2 ๐‘‹ > 0

๐›ป๐‘”1

๐›ป๐‘”2

๐‘†

Infeasible region

Infeasible region

Feasible region


Multivariable problem with inequality constraints

The necessary conditions to be satisfied at a constrained minimum point $X^*$ are

$$\frac{\partial f(X)}{\partial x_i} + \sum_{j \in J_1} \lambda_j \frac{\partial g_j(X)}{\partial x_i} = 0, \quad i = 1, 2, 3, \ldots, n$$

$$\lambda_j \ge 0, \quad j \in J_1$$

These conditions are called the Kuhn-Tucker conditions, the necessary conditions to be satisfied at a relative minimum of $f(X)$.

These conditions are in general not sufficient to ensure a relative minimum. However, for a convex problem, they are necessary and sufficient conditions for a global minimum.


Multivariable problem with inequality constraints

If the set of active constraints is not known, the Kuhn-Tucker conditions can be stated as

$$\frac{\partial f(X)}{\partial x_i} + \sum_{j=1}^{m} \lambda_j \frac{\partial g_j(X)}{\partial x_i} = 0, \quad i = 1, 2, 3, \ldots, n$$

$$\lambda_j g_j = 0, \qquad \lambda_j \ge 0, \qquad g_j \le 0, \quad j = 1, 2, 3, \ldots, m$$


Multivariable problem with equality and inequality constraints

For the problem

Minimize $f(X)$

Subject to $g_j(X) \le 0, \quad j = 1, 2, 3, \ldots, m$

$h_k(X) = 0, \quad k = 1, 2, 3, \ldots, p$

where $X = (x_1, x_2, x_3, \ldots, x_n)^T$,

the Kuhn-Tucker conditions can be written as

$$\frac{\partial f(X)}{\partial x_i} + \sum_{j=1}^{m} \lambda_j \frac{\partial g_j(X)}{\partial x_i} + \sum_{k=1}^{p} \beta_k \frac{\partial h_k(X)}{\partial x_i} = 0, \quad i = 1, 2, 3, \ldots, n$$

$$\lambda_j g_j = 0, \quad j = 1, 2, 3, \ldots, m$$

$$\lambda_j \ge 0, \quad j = 1, 2, 3, \ldots, m$$

$$g_j \le 0, \quad j = 1, 2, 3, \ldots, m$$

$$h_k = 0, \quad k = 1, 2, 3, \ldots, p$$
