
Page 1:

Lecture 1: An Introduction to Optimization – Classification and Case Study

An Introduction to Metaheuristics: Produced by Qiangfu Zhao (Since 2012), All rights reserved (C)

Page 2:

Unconstrained Optimization

• Generally speaking, an optimization problem has an objective function f(x).

• The problem is represented by

  min (max) f(x), for all x

• This is called an unconstrained optimization problem (無制約最適化問題).


Page 3:

Unconstrained Optimization

• Usually, x is a "point" in an N-dimensional Euclidean space R^N, and f(x) is a point in R^M.

• In this course, we study only the case in which M = 1. That is, we have only one objective to optimize (a small numerical sketch is given below).

• Some special considerations are needed to extend the results obtained here to "multiple objective" cases.

• Interested students may also study optimization in "non-Euclidean" spaces (e.g. manifolds).
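
As a concrete illustration of the single-objective Euclidean case (M = 1), the sketch below minimizes a simple function f: R^2 -> R numerically. The test function, the starting point, and the use of SciPy's general-purpose minimizer are all illustrative assumptions.

import numpy as np
from scipy.optimize import minimize

# Illustrative single-objective function f: R^2 -> R (N = 2, M = 1).
# This quadratic has a unique minimum at the point (1, -2).
def f(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

x0 = np.zeros(2)             # an arbitrary starting point in R^N
result = minimize(f, x0)     # unconstrained minimization: min f(x) over all x
print(result.x, result.fun)  # approximately [1, -2] and 0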


Page 4:

Constrained Optimization

• The domain of x can be a subset D of R^N.

• We then have a constrained optimization problem:

  min (max) f(x)
  s.t. x ∈ D

• D again can be defined by some functions, e.g.

  – x_i > 0, i = 1, 2, ...

  – g_j(x) > 0, j = 1, 2, ...

Page 5:

Linear programming (線型計画法)

• If both f(x) and g_j(x) are linear functions, we have a linear optimization problem, and this is usually called linear programming (LP).

• For LP, very efficient algorithms already exist, and meta-heuristics are not needed.


Page 6:

Non-linear programming (非線形計画法)

• If f(x) or any g_j(x) is non-linear, we have a non-linear optimization problem, and this is often called non-linear programming (NLP).

• Many methods have been proposed to solve this class of problems.

• However, conventional methods usually find only locally optimal solutions. Meta-heuristic methods are useful for finding global solutions.


Page 7:

Local optima and global optima

• For a minimization problem:

  – A solution x* is a local optimum if f(x*) ≤ f(x) for all x in the ε-neighborhood of x*, where ε > 0 is a real number giving the radius of the neighborhood.

  – A solution x* is a global optimum if f(x*) ≤ f(x) for all x in the search space (problem domain).

• Meta-heuristics are useful for obtaining globally optimal solutions efficiently.
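
To see why the distinction matters, consider an assumed one-dimensional test function with one local and one global minimum. A gradient-based local search ends up in whichever minimum its starting point leads to; restarting it from several points is a crude way to improve the chance of reaching the global optimum.

import numpy as np
from scipy.optimize import minimize

# Assumed test function: global minimum near x = -1.30, local minimum near x = 1.13.
def f(x):
    x = x[0]
    return x ** 4 - 3 * x ** 2 + x

print(minimize(f, x0=[2.0]).x)    # gets trapped in the local minimum near 1.13
print(minimize(f, x0=[-2.0]).x)   # reaches the global minimum near -1.30

# Crude multi-start strategy: run the local search from several random points.
starts = np.random.default_rng(0).uniform(-3.0, 3.0, size=10)
best = min((minimize(f, x0=[s]) for s in starts), key=lambda r: r.fun)
print(best.x, best.fun)           # the best run corresponds to the global minimum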


Page 8:

Example 1: Linear Programming

• Two materials are used for making two products.

• The prices of the products are 25 and 31 (in million yen), and those of the materials are 0.5 and 0.8 (in million yen).

• Suppose that we produce x1 units of product 1, and x2 units of product 2.

• We can get 25*x1 + 31*x2 million yen by selling the products.

• On the other hand, we must pay (7*x1 + 5*x2)*0.5 + (4*x1 + 8*x2)*0.8 million yen to buy the materials.

               Material used in Product 1   Material used in Product 2
  Material 1               7                            5
  Material 2               4                            8

Page 9:

Example 1: Linear Programming

• The problem can be formulated as follows:

  max f(x1, x2) = 18.3*x1 + 22.1*x2
  s.t. x1 > 0; x2 > 0;
       6.7*x1 + 8.9*x2 < B

• The first set of constraints means that both products should be produced to satisfy social needs; and the second constraint is the budget limitation.

• This is a typical linear programming problem, and can be solved efficiently using the well-known simplex algorithm.
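
As a quick check, the formulation can be handed to an off-the-shelf LP solver such as SciPy's linprog. The budget B is not given in the lecture, so an assumed value of 100 (million yen) is used here; linprog minimizes, so the profit coefficients are negated.

from scipy.optimize import linprog

B = 100.0              # assumed budget in million yen
c = [-18.3, -22.1]     # negated, because linprog minimizes c @ x
A_ub = [[6.7, 8.9]]    # material cost per unit of product 1 and product 2
b_ub = [B]             # total material cost must not exceed the budget

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x)           # optimal x1, x2
print(-res.fun)        # maximum profit in million yen

With these coefficients the whole budget goes to product 1, which earns more profit per yen of material cost; enforcing "both products should be produced" would require explicit lower bounds on x1 and x2.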


Page 10:

Example 2: Non-linear programming

• Given N observations: (x1, p(x1)), (x2, p(x2)), ..., (xN, p(xN)) of an unknown function p(x).

• Find a polynomial q(x) = a0 + a1*x + a2*x^2 such that

  min f(a0, a1, a2) = Σ (i = 1 to N) [p(xi) - q(xi)]^2 + λ ||q||^2

• Note that in this problem q(x) is also a function of a0, a1, and a2.

• The first term is the approximation error, and the second term is a regularization factor that can make the solution better (e.g. smoother).
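
A small numerical sketch of this fitting problem is given below. The data (a noisy sample of an assumed p(x)), the value of λ, and the use of the squared coefficient norm as the regularization term are illustrative assumptions.

import numpy as np
from scipy.optimize import minimize

# Assumed "unknown" function p(x), sampled at N = 20 points with a little noise.
rng = np.random.default_rng(0)
xs = np.linspace(-1.0, 1.0, 20)
ps = np.sin(2.0 * xs) + 0.05 * rng.standard_normal(xs.size)

lam = 0.01  # regularization weight (illustrative choice)

def q(a, x):
    # quadratic polynomial q(x) = a0 + a1*x + a2*x^2
    return a[0] + a[1] * x + a[2] * x ** 2

def objective(a):
    error = np.sum((ps - q(a, xs)) ** 2)  # approximation error
    penalty = lam * np.sum(a ** 2)        # regularization term
    return error + penalty

res = minimize(objective, x0=np.zeros(3))
print(res.x)  # fitted coefficients a0, a1, a2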


Page 11:

Combinatorial optimization problems

• If f(x) or g_j(x) cannot be given analytically (in closed form), we have combinatorial problems.

• For example, if x takes k discrete values (e.g. integers), and if there are K variables, the number of all possible solutions will be k^K.

• It is difficult to check all possible solutions in order to find the best one(s).

• In such cases, meta-heuristics can provide efficient ways for obtaining good solutions using limited resources (e.g. time and memory space).
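
To see how quickly exhaustive search becomes impractical, the sketch below brute-forces an assumed problem with K = 8 variables over k = 4 values each (65,536 candidates); at K = 20 the same search space already exceeds 10^12 candidates, which is exactly the regime where meta-heuristics sample only a tiny fraction of the solutions.

import itertools

k, K = 4, 8       # assumed problem size: K variables, each taking k values
values = range(k)

def f(x):
    # stand-in objective over discrete solutions, for illustration only
    return sum((v - 1.5) ** 2 for v in x)

# exhaustive search: k**K = 65,536 candidates for this tiny instance
best = min(itertools.product(values, repeat=K), key=f)
print(best, f(best))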


Page 12:

Example 3: Traveling salesman problem (TSP)

• Given N users located in N different places (cities).

• The problem is to find a route so that the salesman can visit all users once (and only once), starting from and returning to his own place (i.e., to find a Hamiltonian cycle).

[Figure: an example TSP tour (from Wikipedia)]

Page 13:

Example 3: Traveling salesman problem (TSP)

• In TSP, we have a route map which can be represented by a graph.

• Each node is a user, and the edge between each pair of nodes has a cost (distance or time).

• The evaluation function to be minimized is the total cost of the route.

[Figure: a TSP route map represented as a graph (from Wikipedia)]

For TSP, the number of all possible solutions is N!, and this is a well-known NP-hard combinatorial problem.
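
A brute-force solver makes this growth tangible: enumerating every tour is easy for five cities and hopeless for fifty. The sketch below uses assumed city coordinates and fixes city 0 as the salesman's own place.

import itertools
import math

# Assumed city coordinates; city 0 is the salesman's own place.
cities = [(0, 0), (1, 5), (4, 3), (6, 1), (3, 0)]

def tour_cost(order):
    # total cost of visiting the cities in the given order and returning home
    return sum(math.dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

# enumerate all (N-1)! orderings of the remaining cities
best = min(itertools.permutations(range(1, len(cities))),
           key=lambda p: tour_cost((0,) + p))
print((0,) + best, tour_cost((0,) + best))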

Page 14:

NP-hard and NP-complete

• Problems that can be solved by a deterministic algorithm in polynomial time form the class P.

• NP is the class of decision problems that can be solved by a non-deterministic algorithm in polynomial time.

• A problem H is NP-hard if it is at least as hard as every problem in NP.

• Decision problems that are both NP-hard and in NP are NP-complete.

• For NP-complete and NP-hard problems, meta-heuristics can often find good solutions more efficiently than exact methods.


[Figure: Euler diagram showing the relation between P, NP, NP-complete, and NP-hard]

Page 15:

Example 4: The Knapsack problem

• The knapsack problem is another NP-hard problem, defined as follows:

  – There are N objects;

  – Each object has a weight and a value;

  – The knapsack has a capacity;

  – The user has a quota (minimum desired value).


The problem is to find a subset of the objects that fits into the knapsack (does not exceed its capacity) and maximizes the total value.

Page 16:

Example 4: The Knapsack problem

KNAPSACK(in  OS : set of objects; QUOTA : number; CAPACITY : number;
         out S : set of objects; FOUND : boolean)
begin
  S := empty;
  total_value := 0; total_weight := 0; FOUND := false;
  pick an order L over the objects;
  loop
    choose an object O in L;
    add O to S;
    total_value := total_value + O.value;
    total_weight := total_weight + O.weight;
    if total_weight > CAPACITY then
      fail
    else if total_value >= QUOTA then
      FOUND := true; succeed;
    end
    delete all objects up to O from L;
  end
end


This is a non-deterministic algorithm. Each time we run the program, we get a different answer. By chance, we may get the best answer.
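
A runnable Python rendering of the procedure is sketched below; the non-deterministic "choose" step is simulated with random picks, and the object list, quota, and capacity are illustrative values.

import random

def knapsack(objects, quota, capacity):
    # Randomized rendering of the non-deterministic procedure above: each
    # "choose" is simulated by a random pick, so different runs can return
    # different subsets, and a single run may simply fail.
    L = list(objects)                    # "pick an order L over the objects"
    random.shuffle(L)
    S, total_value, total_weight = [], 0, 0
    while L:
        i = random.randrange(len(L))     # "choose an object O in L"
        value, weight = L[i]
        S.append(L[i])                   # "add O to S"
        total_value += value
        total_weight += weight
        if total_weight > capacity:
            return S, False              # "fail"
        if total_value >= quota:
            return S, True               # "succeed" (FOUND = true)
        L = L[i + 1:]                    # "delete all objects up to O from L"
    return S, False                      # ran out of objects before reaching the quota

# Illustrative objects as (value, weight) pairs; quota and capacity are assumed.
items = [(60, 10), (100, 20), (120, 30), (40, 5)]
for _ in range(3):
    print(knapsack(items, quota=150, capacity=50))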

Page 17:

Example 5: Learning problems

• Many optimization problems related to machine learning (learning from a given set of training data) are NP-hard or NP-complete.

• Examples include:

  – Finding the smallest feature subset;

  – Finding the most informative training data set;

  – Finding the smallest decision tree;

  – Finding the best clusters;

  – Finding the best neural network;

  – Interpreting a learned neural network.


Page 18:

Homework

• Try to find some other examples of optimization problems (at least two) from the Internet.

• Tell whether each problem is NP-hard, NP-complete, in NP, or in P.

• Provide a solution (not necessarily the best one) for each of the problems.

• Summarize your answers in a PDF file, and submit a printed copy before next week's class.
