Optimization methods Review


Optimization Methods Review

Mateusz Sztangret

Faculty of Metal Engineering and Industrial Computer Science
Department of Applied Computer Science and Modelling

Krakow, 03-11-2010

Outline of the presentation

Basic concepts of optimization
Review of optimization methods
• gradientless methods,
• gradient methods,
• linear programming methods,
• non-deterministic methods
Characteristics of selected methods
• method of steepest descent
• genetic algorithm

Basic concepts of optimization

Man’s longing for perfection finds expression in the theory of optimization. It studies how to describe and attain what is Best, once one knows how to measure and alter what is Good and Bad… Optimization theory encompasses the quantitative study of optima and methods for finding them.

Beightler, Phillips, Wilde, Foundations of Optimization

Basic concepts of optimization

Optimization (finding the optimum) – the process of finding the best solution.

Usually the aim of optimization is to find a better solution than the one obtained previously.

Basic concepts of optimization

Specification of the optimization problem:
• definition of the objective function,
• selection of optimization variables,
• identification of constraints.

Mathematical definition

min f(x), x ∈ Rⁿ
subject to: g_i(x) ≥ 0, i = 1, …, k,
            h_i(x) = 0, i = 1, …, k

where:
• x is the vector of variables, also called unknowns or parameters;
• f is the objective function, a (scalar) function of x that we want to maximize or minimize;
• g_i and h_i are constraint functions, which are scalar functions of x that define certain equations and inequalities that the unknown vector x must satisfy.
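The general problem statement above can be sketched in code. This is a minimal illustration, not a method from the slides: f, g, and h below are made-up example functions standing in for the objective and the constraint functions.

```python
# Illustrative problem (an assumption, not from the slides):
#   minimize f(x) = x1^2 + x2^2
#   subject to g(x) = x1 + x2 - 1 >= 0   (inequality constraint)
#              h(x) = x1 - x2     == 0   (equality constraint)

def f(x):
    return x[0] ** 2 + x[1] ** 2

def g(x):
    return x[0] + x[1] - 1.0

def h(x):
    return x[0] - x[1]

def is_feasible(x, tol=1e-9):
    """x satisfies the constraints if every g_i(x) >= 0 and every h_i(x) == 0."""
    return g(x) >= -tol and abs(h(x)) <= tol

# [0.5, 0.5] satisfies both constraints (and happens to be the optimum here).
print(is_feasible([0.5, 0.5]), f([0.5, 0.5]))
print(is_feasible([0.0, 0.0]))
```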

Set of allowed solutions

Constraint functions define the set of allowed solutions X_d ⊆ X, that is, the set of points considered in the optimization process; every candidate solution must satisfy x ∈ X_d.

Obtained solution

A solution x* is called a global minimum if f(x*) ≤ f(x) for all x ∈ X_d.
A solution x* is called a local minimum if there is a neighbourhood N of x* such that f(x*) ≤ f(x) for all x ∈ N.

Neither the global nor a local minimum is ever found exactly, due to the limited accuracy of numerical methods and round-off errors.

Local and global solutions

[Figure: plot of f(x) marking a local minimum and the global minimum]

Problems with multimodal objective function

[Figure: plot of a multimodal f(x); started from different points, the search can end in different minima]

Discontinuous objective function

[Figure: plot of a piecewise-defined objective function f(x) with a discontinuity at x = 3]

Discontinuous function

Minimum or maximum

[Figure: maximization reduces to minimization: f(x) and −f(x) attain their maximum and minimum at the same point x*, with max f = c and min(−f) = −c]

General optimization flowchart

Start
1. Set the starting point x(0), i = 0.
2. Calculate f(x(i)).
3. Compute the next point: x(i+1) = x(i) + Δx(i), i = i + 1.
4. If the stop condition is satisfied (YES) – Stop; otherwise (NO) – return to step 2.
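The flowchart above can be sketched as a generic loop. The update rule `step` below is a trivial stand-in (an assumption); in a real method Δx(i) comes from the particular algorithm.

```python
def optimize(f, x0, max_iter=100, tol=1e-6):
    x = x0
    fx = f(x)                       # calculate f(x(i))
    for i in range(max_iter):       # stop condition: iteration limit
        step = -0.5 * x             # Δx(i): method-specific in general
        x_new = x + step            # x(i+1) = x(i) + Δx(i)
        fx_new = f(x_new)
        if abs(fx - fx_new) < tol:  # stop condition: lack of progress
            break
        x, fx = x_new, fx_new
    return x, fx

x_best, f_best = optimize(lambda x: x * x, x0=8.0)
print(x_best, f_best)
```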

Stop conditions

Commonly used stop conditions are as follows:
• a sufficiently good solution has been obtained,
• lack of progress,
• the maximum number of iterations has been reached.

Classification of optimization methods


Optimization methods

There are several types of optimization algorithms:
• gradientless methods,
 – line search methods,
 – multidimensional methods,
• gradient methods,
• linear programming methods,
• non-deterministic methods.

Gradientless methods

• Line search methods
 – Expansion method
 – Golden ratio method
 – Fibonacci method
 – Method based on Lagrange interpolation
• Multidimensional methods
 – Hooke-Jeeves method
 – Rosenbrock method
 – Nelder-Mead simplex method
 – Powell method
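As an example of the line search methods listed above, here is a sketch of the golden ratio (golden section) method: the bracket [a, b] is shrunk while keeping the two interior points at the golden ratio, so one function value can be reused per step.

```python
import math

def golden_section(f, a, b, tol=1e-8):
    inv_phi = (math.sqrt(5) - 1) / 2          # 1/phi ≈ 0.618
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    while b - a > tol:
        if f(c) < f(d):                       # minimum lies in [a, d]
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:                                 # minimum lies in [c, b]
            a, c = c, d
            d = a + inv_phi * (b - a)
    return (a + b) / 2

# Unimodal example function on [0, 5]; the true minimum is at x = 2.
x_min = golden_section(lambda x: (x - 2) ** 2, 0.0, 5.0)
print(x_min)
```

Note the method relies on unimodality of f on [a, b], which is exactly the limitation listed under "Features of gradientless methods".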

Features of gradientless methods

Advantages:
• simplicity,
• they do not require computing derivatives of the objective function.

Disadvantages:
• they find only the first minimum encountered,
• they require unimodality and continuity of the objective function.

Gradient methods

• Method of steepest descent• Conjugate gradients method• Newton method• Davidon-Fletcher-Powell method• Broyden-Fletcher-Goldfarb-Shanno method


Features of gradient methods

Advantages:
• simplicity,
• greater efficiency in comparison with gradientless methods.

Disadvantages:
• they find only the first minimum encountered,
• they require unimodality, continuity and differentiability of the objective function.

Linear programming

If both the objective function and the constraints are linear, we can use one of the linear programming methods:

• Graphical method
• Simplex method
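The idea behind the graphical method for two variables can be sketched in code: the optimum of a linear program lies at a vertex of the feasible polygon, so we enumerate intersections of constraint boundaries and keep the best feasible one. The example LP below is a made-up illustration: maximize 3x + 2y subject to x + y ≤ 4, x ≤ 2, x ≥ 0, y ≥ 0.

```python
from itertools import combinations

# Each constraint a*x + b*y <= c is stored as (a, b, c).
constraints = [(1, 1, 4), (1, 0, 2), (-1, 0, 0), (0, -1, 0)]

def intersect(c1, c2):
    """Intersection point of two constraint boundary lines, or None if parallel."""
    a1, b1, d1 = c1
    a2, b2, d2 = c2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None
    return ((d1 * b2 - d2 * b1) / det, (a1 * d2 - a2 * d1) / det)

def feasible(p, tol=1e-9):
    return all(a * p[0] + b * p[1] <= c + tol for a, b, c in constraints)

vertices = [p for c1, c2 in combinations(constraints, 2)
            if (p := intersect(c1, c2)) and feasible(p)]
best = max(vertices, key=lambda p: 3 * p[0] + 2 * p[1])
print(best)
```

For more than two variables this enumeration becomes impractical, which is where the simplex method comes in.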

Non-deterministic methods

• Monte Carlo method• Genetic algorithms• Evolutionary algorithms

– strategy (1 + 1)– strategy (μ + λ)– strategy (μ, λ)

• Particle swarm optimization• Simulated annealing method• Ant colony optimization• Artificial immune system


Features of non-deterministic methods

Advantages:
• the objective function may be of any nature,
• they do not require computing derivatives of the objective function.

Disadvantages:
• a high number of objective function calls.

Optimization with constraints

Ways of incorporating constraints:

• External penalty function method
• Internal penalty function method

Multicriteria optimization

In some cases the solved problem is defined by several objective functions. Usually, when we improve one of them, the others get worse.

• weighted criteria method
• ideal point method

Weighted criteria method

The method transforms the multicriteria problem into a single-criterion problem by summing the weighted objective functions:

f_S(x) = Σ_{k=1}^{m} w_k f_k(x)

where w_k are the weights assigned to the particular objective functions f_1(x), …, f_m(x).

Ideal point method

In this method we choose an ideal solution which lies outside the set of allowed solutions, and then search for the optimal solution inside the set of allowed solutions that is closest to the ideal point. The distance can be measured using various metrics, e.g.

d₂(z) = √(Σᵢ zᵢ²) (Euclidean metric) or d_∞(z) = maxᵢ |zᵢ| (Chebyshev metric),

where zᵢ is the difference between the solution and the ideal point on the i-th criterion.

[Figure: the ideal point lying outside the set of allowed solutions]
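A minimal sketch of the ideal point method, scoring each allowed solution by its distance to the ideal point under either metric named above. The ideal point and candidate solutions are made-up illustrations.

```python
import math

ideal = (0.0, 0.0)                        # ideal (usually unattainable) point
candidates = [(1.0, 2.0), (2.0, 1.5), (0.5, 3.0)]

def euclidean(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def chebyshev(p, q):
    return max(abs(a - b) for a, b in zip(p, q))

# Pick the allowed solution closest to the ideal point under each metric.
best_euclid = min(candidates, key=lambda p: euclidean(p, ideal))
best_cheb = min(candidates, key=lambda p: chebyshev(p, ideal))
print(best_euclid, best_cheb)
```

Different metrics can select different compromise solutions, which is a design choice of the method.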

Method of steepest descent

The algorithm consists of the following steps:
1. Set the input data:
 – u0 – starting point,
 – maxit – maximum number of iterations,
 – e – required accuracy of the solution,
 – i = 0 – iteration number.
2. Compute the gradient at ui:

∇Q(u_i) = [∂Q(u_i)/∂u_1, …, ∂Q(u_i)/∂u_n]ᵀ

Method of steepest descent

3. Choose the search direction: d_i = −∇Q(u_i).

4. Find the optimal solution along the chosen direction (using any line search method).

5. If the stop conditions are not satisfied, increase i and go to step 2.

Zigzag effect

Let's consider the problem of finding the minimum of the function f(u) = u₁² + 3u₂² from the starting point u₀ = [−2, 3].

[Figure: isolines of f(u) with the zigzag path taken by steepest descent]
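The steepest descent steps above can be sketched on the slide's example f(u) = u₁² + 3u₂² from u₀ = [−2, 3]. The line search along the direction is done with a simple fixed grid of step lengths, which is an assumption standing in for "any line search method".

```python
def q(u):
    return u[0] ** 2 + 3 * u[1] ** 2

def grad(u):
    return [2 * u[0], 6 * u[1]]          # gradient of Q(u) = u1^2 + 3*u2^2

u = [-2.0, 3.0]                          # u0: starting point
for i in range(50):                      # maxit: iteration limit
    g = grad(u)                          # step 2: gradient at u_i
    d = [-g[0], -g[1]]                   # step 3: d_i = -grad Q(u_i)
    # step 4: crude grid line search for the best step length along d
    alphas = [k / 1000 for k in range(1, 1001)]
    alpha = min(alphas, key=lambda a: q([u[0] + a * d[0], u[1] + a * d[1]]))
    u = [u[0] + alpha * d[0], u[1] + alpha * d[1]]
    if q(u) < 1e-12:                     # step 5: stop condition
        break
print(u, q(u))
```

Printing u each iteration would show the zigzag path from the slide: successive directions are nearly perpendicular, which slows convergence on elongated isolines.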

Genetic algorithm

The algorithm consists of the following steps:
1. Creation of a baseline population.
2. Computation of the fitness of the whole population.
3. Selection.
4. Crossing.
5. Mutation.
6. If the stop conditions are not satisfied, go to step 2.

Creation of a baseline population

Genotype | Objective function value (f(x) = x²)
1 0 1 0 1 0 1 0 (x = 170) | 28900
0 1 0 1 0 1 0 1 (x = 85) | 7225
1 1 0 1 0 1 0 0 (x = 212) | 44944
1 0 1 1 0 1 1 0 (x = 182) | 33124
0 0 1 0 1 0 1 1 (x = 43) | 1849
1 1 1 0 0 1 0 0 (x = 228) | 51984
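Evaluating the baseline population above can be sketched directly: each 8-bit genotype encodes an integer x, and the objective is f(x) = x².

```python
population = ["10101010", "01010101", "11010100",
              "10110110", "00101011", "11100100"]

def decode(genotype):
    return int(genotype, 2)              # binary string -> integer x

def fitness(genotype):
    return decode(genotype) ** 2         # f(x) = x^2

for g in population:
    print(g, decode(g), fitness(g))      # reproduces the table above
```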

Selection

Baseline population
1 0 1 0 1 0 1 0
0 1 0 1 0 1 0 1
1 1 0 1 0 1 0 0
1 0 1 1 0 1 1 0
0 0 1 0 1 0 1 1
1 1 1 0 0 1 0 0

Parents' population
1 1 1 0 0 1 0 0
1 1 0 1 0 1 0 0
1 1 1 0 0 1 0 0
0 1 0 1 0 1 0 1
1 0 1 1 0 1 1 0
1 0 1 0 1 0 1 0

Roulette wheel method

[Figure: roulette wheel with sectors proportional to the individuals' fitness values]
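The roulette wheel idea can be sketched as follows: each individual is picked with probability proportional to its fitness (f(x) = x² for the 8-bit genotypes above). The random seed is arbitrary.

```python
import random

population = ["10101010", "01010101", "11010100",
              "10110110", "00101011", "11100100"]

def fitness(genotype):
    return int(genotype, 2) ** 2

def roulette_select(population, rng=random):
    total = sum(fitness(g) for g in population)
    pick = rng.uniform(0, total)          # spin the wheel
    running = 0.0
    for g in population:
        running += fitness(g)             # sector width = fitness
        if pick <= running:
            return g
    return population[-1]

random.seed(1)
parents = [roulette_select(population) for _ in range(6)]
print(parents)
```

Fitter individuals occupy wider sectors, so they tend to appear more often in the parents' population, as in the selection example above.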

Crossing

Parent individual no 1: 1 0 1 0 1 | 0 1 0
Parent individual no 2: 0 1 0 1 0 | 1 0 1
(| marks the crossing point)

The descendants exchange the parts lying after the crossing point:

Descendant individual no 1: 1 0 1 0 1 1 0 1
Descendant individual no 2: 0 1 0 1 0 0 1 0
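One-point crossing as illustrated above can be sketched in a few lines: the parents swap their tails after the crossing point.

```python
def cross(parent1, parent2, point):
    # Swap the tails after the crossing point.
    child1 = parent1[:point] + parent2[point:]
    child2 = parent2[:point] + parent1[point:]
    return child1, child2

# The two parents from the slide, crossed after position 5.
c1, c2 = cross("10101010", "01010101", point=5)
print(c1, c2)   # 10101101 01010010
```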

Mutation

For each gene a random number r is drawn and compared with the mutation probability pm; the gene is flipped when r < pm.

Parent individual: 1 0 1 0 1 0 1 0
r < pm at gene 3: 1 0 0 0 1 0 1 0
r < pm at gene 7: 1 0 0 0 1 0 0 0
Descendant individual: 1 0 0 0 1 0 0 0
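The mutation step above can be sketched as an independent bit flip per gene, each triggered when a drawn random number r falls below the mutation probability pm.

```python
import random

def mutate(genotype, pm, rng=random):
    genes = []
    for bit in genotype:
        r = rng.random()                 # draw r for this gene
        # flip the gene when r < pm, otherwise keep it
        genes.append(("1" if bit == "0" else "0") if r < pm else bit)
    return "".join(genes)

random.seed(4)
print(mutate("10101010", pm=0.1))
```

With a small pm most genes survive unchanged, as in the slide, where only genes 3 and 7 flipped.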

Genetic algorithm

After mutation, the resulting individuals are recorded in the descendant population, which becomes the baseline population for the next iteration of the algorithm.

If the obtained solution satisfies the stop condition, the procedure terminates; otherwise selection, crossing and mutation are repeated.

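The steps above can be tied together into the loop just described. The stop condition here is simply a generation limit, and the crossing point and mutation rate are illustrative assumptions; the population and fitness f(x) = x² match the earlier slides.

```python
import random

def fitness(g):
    return int(g, 2) ** 2                # f(x) = x^2

def select(pop, rng):
    # roulette wheel selection
    total = sum(fitness(g) for g in pop)
    pick = rng.uniform(0, total)
    running = 0.0
    for g in pop:
        running += fitness(g)
        if pick <= running:
            return g
    return pop[-1]

def cross(p1, p2, rng):
    # one-point crossing at a random point
    point = rng.randint(1, len(p1) - 1)
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

def mutate(g, pm, rng):
    # flip each gene independently with probability pm
    return "".join(("1" if b == "0" else "0") if rng.random() < pm else b
                   for b in g)

rng = random.Random(0)
pop = ["10101010", "01010101", "11010100", "10110110", "00101011", "11100100"]
for generation in range(30):             # stop condition: generation limit
    next_pop = []
    while len(next_pop) < len(pop):
        c1, c2 = cross(select(pop, rng), select(pop, rng), rng)
        next_pop += [mutate(c1, 0.01, rng), mutate(c2, 0.01, rng)]
    pop = next_pop                       # descendants become the new baseline
best = max(pop, key=fitness)
print(best, fitness(best))
```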

Thank you for your attention!
