AN ANALYTICAL CURVE BASED APPROACH FOR MULTI-MODAL OPTIMIZATION

T. Malik and E.H. Winer
New York State Center for Engineering Design and Industrial Innovation (NYSCEDII)

Department of Mechanical and Aerospace Engineering
University at Buffalo

Abstract

In this paper, the development of a new optimization solution algorithm for multi-modal problems is discussed. The algorithm uses analytical equations of cyclic curves to examine a design space and identify all possible regions where minima may exist. Each of these sub-regions is then further examined with analytical curves and subsequently reduced. These explorations continue until each identified sub-region converges to a solution or is discarded. Criteria for how the curves are developed, how sub-regions are located and reduced, convergence, and sub-region elimination are all controlled by the designer. This paper details the development of this concept, its implementation in a computer code, and results of tests performed on example problems. Results are also presented for a preliminary study of how changing certain algorithm-specific parameters may affect solution accuracy, quality, and efficiency.

Introduction

The ever-increasing demand to design new, better, more efficient, and less expensive products and processes has prompted engineers to look for new technologies and methodologies to improve decision making. One such area is optimization. In the most general terms, optimization theory is a body of numerical or heuristic methods for finding and identifying the best solution among various design points without evaluating every alternative. Optimization techniques, having reached a degree of maturity over the past several years, are being used in a wide spectrum of industries. Optimization, in its broadest sense, can be applied to solve any engineering problem, such as the optimum design of chemical processing equipment and plants1; the optimum design of linkages, cams, gears, machine tools, and other mechanical components2; and shape optimization in the steel and concrete industries3.

Background

In this section, a brief review of various optimization methods is provided. Since the aim of this work was to develop an algorithm for finding the global optimum of multimodal problems, the section focuses mostly on research performed for these kinds of problems. Extensive literature surveys of early research work in global optimization provide a background for most of the original deterministic and stochastic global optimization methods4,5.

Numerical Optimization Techniques

In assessing the value of optimization techniques to engineering design, it is worthwhile to briefly review traditional optimization approaches. Numerical optimization techniques help us to devise a rational, directed design procedure. Although the methods provide a computational tool for design, there is much more to be gained: they give us an ordered approach to design decisions, where before we relied heavily on intuition and experience. Numerical optimization can reduce the design cycle time and be part of a systematic, logical design procedure. Using these techniques we can deal with a wide variety of design variables and constraints that are otherwise difficult to interact with. However, it can seldom be guaranteed that an optimization algorithm will obtain the global optimum of a multimodal problem. Therefore, it may be necessary to start the optimization process from several different initial points to obtain reasonable assurance of finding the global optimum. Many times, algorithms become trapped in sub-optimal regions of multimodal problems. In addition, as the number of design variables and constraints for a given problem increases, so does the solution time, as well as the possibility of solution divergence.

Traditional search and numerical optimization methods can be classified into two groups: direct and gradient-based methods. In direct or zero order methods, only the objective function and constraints are used to guide the search. Since gradient information is not used, these methods are typically slow and require more function evaluations to converge6. Gradient-based methods utilize first and/or second-order derivative information for the search process. Although these methods commonly show faster convergence, they are not effective for solving non-differentiable or discontinuous design problems and can be computationally expensive. There are some common drawbacks associated with most traditional direct and gradient-based methods6:

1.) An algorithm efficient in solving one optimization problem may not be efficient in solving another problem.

2.) The chosen initial design points have an effect on convergence to the optimal solution.

3.) Many algorithms cannot locate all the local minima in a multimodal problem.

9th AIAA/ISSMO Symposium on Multidisciplinary Analysis and Optimization, 4-6 September 2002, Atlanta, Georgia. AIAA 2002-5520.

Copyright © 2002 by the author(s). Published by the American Institute of Aeronautics and Astronautics, Inc., with permission.

There are many numerical optimization methods available for both unconstrained and constrained optimization problems. Powell's Method7, the Conjugate Direction Method of Fletcher and Reeves8, and the Variable Metric Method9,10 are examples of unconstrained optimization methods. Sequential Linear Programming (SLP)7, the Method of Feasible Directions8, and Sequential Quadratic Programming (SQP)8 are examples of constrained optimization methods. All these optimization algorithms work on the idea of iteratively searching in a direction in which the objective function is decreasing. The design variables are updated at each iteration using an equation such as:

X^q = X^{q-1} + α*_q S^q   (1)

where X is the design variable vector, S^q is the search direction, α*_q is a scalar multiplier determining the amount of change in the design variable, and q is the iteration number.
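The update in Equation (1) can be sketched in a few lines of Python. The function and variable names here are illustrative only; the search direction S^q and step size α*_q would come from whichever particular method is in use:

```python
import numpy as np

def update_design(x_prev, s, alpha_star):
    """One iteration of Eq. (1): X^q = X^{q-1} + alpha*_q S^q."""
    return np.asarray(x_prev) + alpha_star * np.asarray(s)

# Example: move from (1, 1) along the direction (0, 1) with step size 0.5
x_new = update_design([1.0, 1.0], [0.0, 1.0], 0.5)
print(x_new)  # [1.  1.5]
```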

Traditionally, non-linear programming methods were developed to obtain local minima. The methods stated earlier work well if the problem is unimodal. However, if the problem is multi-modal, the probability of locating only a local solution point increases drastically. Since there is no mathematical criterion that can prove that a particular local minimum is also the global minimum, finding the global minimum is even harder. Usually, a solution run locates a single solution point; to find all minima, multiple runs of the algorithm from differing initial points usually have to be performed, and even then, finding the global optimum is not guaranteed. Further, many numerical optimization methods require derivative information, which restricts the class of problems that can be solved (i.e., to continuous functions only). In addition, obtaining derivative information can be computationally expensive, decreasing the efficiency of a method. These issues illustrate why it is a considerable challenge to solve a multimodal optimization problem from both mathematical and computational viewpoints using traditional numerical optimization techniques.

Global Optimization

Global optimization was developed to deal with finding global optimal points, particularly in multimodal problems. Global optimization has been an active research area for at least the past three decades4,11. Global optimization algorithms are broadly classified as deterministic or stochastic depending on the elements used in the problem solution. Some deterministic algorithms guarantee convergence to the global minimum, but restrictions must be placed on the class of problems considered. The stochastic algorithms, on the other hand, usually guarantee convergence only in an asymptotic sense. Another distinction between global optimization algorithms is the treatment of local minima. Some analytical and numerical methods are designed to converge directly to the global minimum without concern for local minima, while others intend to find all local minima and then select the best as the global minimum.

The multistart approach4 is a very popular stochastic method that tries to find all local minima by starting a local minimization procedure from a set of random points uniformly distributed over a feasible design space. In general, the method has two phases: (a) local and (b) global. The multistart method, if used in its original form, can be quite inefficient: it launches many executions of local search procedures, which often results in the same local minima being obtained several times. Several variants have been proposed to improve this efficiency, such as Random Tunneling12 and Multistart with clustering13.

There are various other stochastic methods such as Pure Random Search7 and Controlled Random Search14, which rely on the power of computational hardware to find a solution efficiently.

Heuristic methods are used extensively in global optimization. In specialized applications, a heuristic method can embody insight about a problem. Heuristic methods can often be invoked to modify a given solution into a better solution, thereby playing the role of an improvement operator. Heuristic methods can be used to find optimal decisions for designing or managing a wide range of complex systems. They usually guarantee only asymptotic convergence to the global optimum. Heuristic methods are often effective, since they are robust, easy to implement, and require minimal programming.

Heuristic optimization algorithms have the advantage that they do not require gradient information in the search process. Also, these methods can be implemented for complex nonlinear, nonconvex, and mixed design variable problems. Heuristics derived from natural phenomena have given rise to entire search strategies: Simulated Annealing15,16, Genetic Algorithms17,18, and Tabu Search19 are some of the most popular heuristic algorithms, with complementary strengths and weaknesses.


Simulated Annealing15 is a one-phase stochastic global optimization technique, based on an analogy with statistical mechanics, that attempts to mimic the process of thermal annealing of solids. Annealing of a solid is accomplished by heating it to its melting point and then allowing the liquid to cool slowly enough that thermal equilibrium is reached at each temperature. A simulated annealing algorithm can be thought of as a sequence of runs of the Metropolis algorithm16, but with decreasing temperature. The objective function is used as the energy of the system. For each temperature, the system is allowed to reach equilibrium before the temperature is reduced. In this way, a sequence of states is obtained that is distributed according to the Boltzmann distributions for the decreasing temperatures. As the temperature approaches zero, the Boltzmann distribution converges to a distribution that is completely supported on the set of global minima of the energy function. Thus, by carefully controlling the temperature and by allowing the system to come to equilibrium at each temperature, the process finds the global minima of the energy function. Although the method is conceptually simple, finding good values for the initial temperature, cooling rate, and final temperature can be very difficult. Also, if the global minimum is discarded during the initial phase of the search, it is increasingly difficult for simulated annealing to regain it during a later stage.

Genetic Algorithms17 are a machine learning technique modeled upon the natural process of evolution. They use a stochastic, directed and highly parallel search based on principles of population genetics that artificially evolve a solution to a given problem. The main factors that make Genetic Algorithms different from traditional methods of search and optimization are:

1. Genetic Algorithms work with a coding of the design variables as opposed to the design variables themselves.

2. Genetic Algorithms work with a population of points as opposed to a single point, thus reducing the risk of obtaining local minima.

3. Genetic Algorithms require only the objective function value, not the derivatives. This aspect makes Genetic Algorithms problem-independent.

4. Genetic Algorithms are a family of probabilistic search methods, not deterministic, making the search highly exploitative.

However, genetic algorithms do not guarantee that the solution obtained is a global minimum and can be computationally expensive to run. Genetic algorithms have been widely applied in design optimization. For example, Hajela and Lin18 have implemented genetic search methods in multicriteria design optimization with a mix of continuous, integer, and discrete design variables. The convergence of the algorithm depends on crossover and mutation rates, the generation gap, linear normalization constants, and population size.

Research Issue

Optimization techniques, if used effectively, can greatly reduce engineering design time and yield improved, efficient, and economical designs. However, it is important to understand the limitations of optimization techniques. When considering a traditional numerical optimization method or a global optimization method, there are issues that must be considered prior to, during, and after obtaining a solution. Most numerical optimization techniques attempt to achieve convergence to a global minimum either by searching areas of the design space randomly or by following a path obtained from zero, first, or second order information. Although their application to practical problems is limited, they offer a good theoretical understanding of the nature of global optimization problems.

Global optimization methods, on the other hand, provide effective tools for a general global optimization problem. The drawback is that they converge only asymptotically and are often computationally expensive. In addition, both numerical optimization techniques and heuristic methods generally do not provide all minima to the designer, information which is useful for a good understanding of the problem.

In this paper the development of a zero-order global optimization algorithm is presented. Emphasis is placed on finding all minima for a given problem and returning this information to the designer. Knowledge of all minima can give a designer a better understanding of a problem; sometimes a solution obtained by a numerical optimization technique may be good theoretically, but not practically. An analytical curve based algorithm is proposed for multimodal optimization. With its use, all possible local minima can be obtained, providing better insight into the design space of a problem.

Strategy for Multimodal Unconstrained Optimization

In the previous section, the need for developing an algorithm for multimodal optimization was discussed. The algorithm proposed in this work will locate optimal points for unimodal as well as multimodal problems using only zero order information. The initial implementation has currently been developed and tested for unconstrained two-dimensional problems.

Consider a simple two-dimensional design space displayed on a rectangular coordinate system. Further, assume that multiple minima exist in this design space. Using the analytical curve based method, the design space is first searched to determine regions containing local minima. Thereafter, the regions obtained are searched for local or global minima. In the proposed algorithm, sinusoidal curves are developed and placed over the entire design space as shown in Figure 1. The curves are created by generating a pair of end points somewhere along the boundary of the space, as shown in Figure 2. The amplitude and frequency of the sinusoidal curves can be specified by the designer according to the complexity of the design problem or the level of thoroughness required. The sinusoidal curve equation is structured such that one design variable is represented as a function of the other.

Figure 1: Sinusoidal curves in design space

Figure 2: End points used to create curve

Once the curve has been created it is necessary to be able to compute the objective function and constraints at any point on the curve. This is accomplished in the following manner. A design variable value is chosen along its range (so long as the curve passes over it). For the two-dimensional case, the corresponding value of the other design variable is computed using Equation 2. Thus, if one of the design variables is put into Equation 2, the other design variable is easily computed.

Y = Y_A + (Y_B - Y_A)·(X - X_A)/(X_B - X_A) + A·sin[nπ(X - X_A)/(X_B - X_A)]   (2)

where X_A, X_B are the end points of one design variable, Y_A, Y_B are the end points of the other design variable, A is the amplitude, and n is the frequency.
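Equation (2) can be evaluated directly; the names in the sketch below are illustrative. Note that for integer frequency n the curve passes exactly through both end points, since sin(0) = sin(nπ) = 0:

```python
import math

def curve_point(x, xa, xb, ya, yb, amplitude, n):
    """Eq. (2): linear interpolation between the end points (xa, ya) and
    (xb, yb), plus a sinusoid of the given amplitude and frequency n."""
    t = (x - xa) / (xb - xa)
    return ya + (yb - ya) * t + amplitude * math.sin(n * math.pi * t)

# Sample 11 points on one curve spanning the space [-6, 6] x [-6, 6]
xs = [-6 + 12 * i / 10 for i in range(11)]
ys = [curve_point(x, -6, 6, -6, 6, amplitude=2.0, n=5) for x in xs]
print(round(ys[0], 6), round(ys[-1], 6))  # -6.0 6.0
```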

Once the design variable values are known, the objective function and constraints can be computed at this design point. This is done for multiple points along a given curve, with many curves overlaid on the design space as shown in Figure 1. This effectively blankets the design space to reveal its behavior to the designer. All the objective function values are maintained, and the minimum of all computed points is determined. Other points whose values lie within a prescribed tolerance of this minimum are then identified; they qualify only if their objective function and design variable values are different enough that they are considered regions where another local or global solution might exist. This tolerance value is left up to the designer and is effectively a crude method of clustering to determine the sub-region centers.
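This crude clustering step might look like the following sketch. The names and both tolerance definitions are assumptions: the paper leaves the exact criteria to the designer, so here the objective tolerance f_tol is taken as a fraction of the observed value range, and candidates must differ by at least x_sep in some design variable:

```python
def candidate_minima(points, f_vals, f_tol=0.3, x_sep=1.0):
    """Keep points whose objective is within f_tol of the best value
    (as a fraction of the observed range) and that lie at least x_sep
    away from every already-kept candidate."""
    best, worst = min(f_vals), max(f_vals)
    threshold = best + f_tol * (worst - best)
    kept = []
    for f, x in sorted(zip(f_vals, points)):
        if f > threshold:
            break  # points are sorted by objective, so the rest are worse
        if all(max(abs(a - b) for a, b in zip(x, c)) > x_sep for c in kept):
            kept.append(x)
    return kept

pts = [(3.0, 2.0), (3.1, 2.1), (-3.8, -3.3), (0.0, 0.0)]
fs = [0.10, 0.20, 0.15, 50.0]
print(candidate_minima(pts, fs))  # [(3.0, 2.0), (-3.8, -3.3)]
```

The near-duplicate point (3.1, 2.1) is absorbed into the first cluster, while the clearly poor point (0, 0) is cut by the tolerance.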

Now that preliminary points of possible minima have been identified, regions around each of these points are created. Using the design variable values obtained during the initial phase, upper and lower bounds are calculated by adding and subtracting a prescribed percentage of each design variable value to create sub-regions. These regions are then investigated with new curves, as was done for the entire design space. Each sub-region is blanketed with curves and examined for a new minimal point. At each subsequent iteration, the sub-region is reduced by a prescribed amount. This procedure continues until the objective function value converges; the minimum over all sub-regions is taken as the global optimum point. However, if the objective function value does not converge in a sub-region, that sub-region is discarded. Further, the number of points on a curve can be increased when the objective function is highly nonlinear.

The algorithm has some inherent advantages. First, it is a zero order method, so no derivatives need to be calculated, and it can also work on functions that are not continuous. Second, using analytical curves to examine the design space directly increases the ease of computation. Third, the designer has the ability to alter the amplitude and frequency of the curves as well as the number of design points evaluated along each curve. This provides control over how thoroughly the design space or sub-regions are examined. A flowchart of the proposed algorithm is given in Figure 3.

Figure 3: Analytical curve based method flowchart

Detailed Test Case

The best way to explain the algorithm is through an application to a multimodal optimization problem. The first example investigated was the Himmelblau function20. The optimization problem statement is expressed as follows:

Minimize f(X1, X2) = (X1^2 + X2 - 11)^2 + (X1 + X2^2 - 7)^2

subject to -6 ≤ Xi ≤ 6, i = 1, 2   (3)

This problem has four local minima, all of which are global. The actual local minima are listed in Table 1.

Table 1: Local minima for Himmelblau function

Local Min.    X1*         X2*         f*
1              3.00000     2.00000    0.0
2             -3.77931    -3.28319    0.0
3              3.58443    -1.84813    0.0
4             -2.80512     3.13131    0.0
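As a quick numerical sanity check (not part of the paper's algorithm), Equation (3) can be evaluated at the four minima of Table 1:

```python
def himmelblau(x1, x2):
    """Eq. (3): f = (x1^2 + x2 - 11)^2 + (x1 + x2^2 - 7)^2."""
    return (x1**2 + x2 - 11)**2 + (x1 + x2**2 - 7)**2

# The four minima listed in Table 1 all evaluate to (numerically) zero
minima = [(3.00000, 2.00000), (-3.77931, -3.28319),
          (3.58443, -1.84813), (-2.80512, 3.13131)]
print([round(himmelblau(a, b), 6) for a, b in minima])  # [0.0, 0.0, 0.0, 0.0]
```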

Using the analytical curve based approach (ACBM), all local minima as well as the global minima can be obtained. First, 80 pairs of points were generated, equally spaced along the boundary of the design space, and a sinusoidal curve was created between each pair of points. Along the curves, 410 points were computed using Equation 2. In the next step the code removed any points that were repeated while computing the points along each curve. At each remaining point the objective function was computed, and the values were ordered from highest to lowest. A tolerance of 30 percent from the lowest objective function value was used to obtain three additional points; hence, four points were obtained, as shown in Figure 4. After these points were found, a region around each was computed as shown in Figure 5: a specified percentage of each obtained point, in this case 20 percent, was added to and subtracted from it to bound the region. These regions were then investigated independently using additional analytical curves until the objective function converged. If a region had not converged it would have been discarded.
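The end-point generation above can be sketched as follows. The paper does not say how the equally spaced boundary points are paired into curves; pairing points half a perimeter apart is an illustrative assumption:

```python
def perimeter_points(lo, hi, count):
    """count equally spaced points on the boundary of the square
    [lo, hi] x [lo, hi], parameterized by distance around the perimeter."""
    side = hi - lo
    pts = []
    for i in range(count):
        d = 4 * side * i / count      # distance travelled along the boundary
        k, r = divmod(d, side)        # which edge, and the offset along it
        if k == 0:
            pts.append((lo + r, lo))  # bottom edge
        elif k == 1:
            pts.append((hi, lo + r))  # right edge
        elif k == 2:
            pts.append((hi - r, hi))  # top edge
        else:
            pts.append((lo, hi - r))  # left edge
    return pts

# 160 boundary points -> 80 pairs of curve end points on [-6, 6] x [-6, 6]
pts = perimeter_points(-6.0, 6.0, 160)
pairs = [(pts[i], pts[i + 80]) for i in range(80)]
print(len(pairs), pairs[0])  # 80 ((-6.0, -6.0), (6.0, 6.0))
```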

Figure 4: Local minima obtained after initial investigation


Figure 5: Reduced design space represented by small boxes

Figure 6: Objective function contours

Figure 6 shows the identified sub-regions inside a plot of the Himmelblau function. The final optimal points obtained are also shown in Figure 6 as circles. The optimum values obtained by the use of the proposed method are given in Table 2, and correspond very well to those given in Table 1. The optimum for the Himmelblau function is shown in Figure 7. With the use of the proposed algorithm, all four local minima were found on a single optimization run.

This problem was then solved using a brute force algorithm, by means of a computational grid over the 2-D design space, and using the Design Optimization Toolkit (DOT)21, specifically the Fletcher-Reeves method. These results are presented in Tables 3 and 4.

Table 2: Results for Himmelblau function when amplitude is 4 and frequency is 35

Sub-region    X1        X2        Obj. Func.
1              3.032     1.988    0.0819
2             -3.818    -3.296    0.0297
3              3.575    -1.803    0.035
4             -2.837     3.131    0.0349

Total function evaluations: 850

Table 3: Results for Himmelblau function from DOT

Initial X1   Initial X2   Final X1   Final X2   Obj. Func. Value   Func. Evals.
 1            1            2.99       2.01      1.82E-03           32
 1            1.5          2.99       2.00      7.15E-05           56
 1            2.5          2.99       2.00      2.17E-08           63
-1            1           -2.80       3.13      1.15E-04           38
 3           -1            3.58      -1.84      2.66E-05           32
 0.2          0.2          2.99       2.00      2.36E-04           34
 0            0            2.99       2.00      1.27E-04           44
-2           -1.5         -3.78      -3.28      1.63E-04           38
-0.2         -0.2          2.99       2.00      3.17E-04           52

Table 4: Results for Himmelblau function from grid-based search

Grid Size    x1    x2    f    Func. Evals.
0.1          3     2     0    14,641
0.05         3     2     0    58,081
0.01         3     2     0    1,442,401
0.005        3     2     0    5,764,801
0.001        3     2     0    144,000,000

Comparing Tables 2, 3, and 4, some interesting conclusions can be drawn. First, the ACBM performed well in terms of computation: it took 850 function evaluations to obtain all solutions to the problem, compared with over 14,000 for a grid-based search with a grid size of only 0.1, which yielded only the global solution. Using DOT from nine different initial points, all minima were obtained in 389 function evaluations. However, it is important to note that in finding the solution using a gradient-based algorithm, a designer does not know a priori how many solutions a problem may have. There is no methodology to tell a designer how many solution points exist, and as the number of solution points increases, so does the number of initial points from which a solution should be started. Using the ACBM, the solution still only needs to be run once. A designer can control how finely the design space is searched and thus have confidence that all local minima, as well as the global minimum, will be obtained.

Figure 7: Optimal solutions coincide with the calculated optima

Additional Test Problems

The next test case investigated was the three hump camel back function4. The optimization problem statement is expressed as follows:

Minimize f(X1, X2) = 2·X1^2 - 1.05·X1^4 + X1^6/6 + X1·X2 + X2^2

subject to -2.5 ≤ Xi ≤ 2.5, i = 1, 2   (4)
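As with the first example, Equation (4) and the minima values can be sanity-checked numerically:

```python
def camel3(x1, x2):
    """Eq. (4): 2*x1^2 - 1.05*x1^4 + x1^6/6 + x1*x2 + x2^2."""
    return 2 * x1**2 - 1.05 * x1**4 + x1**6 / 6 + x1 * x2 + x2**2

print(camel3(0.0, 0.0))                  # 0.0 (the global minimum)
print(round(camel3(1.75, -0.85), 4))     # 0.2993 (the published 0.2992, rounded)
print(camel3(-1.75, 0.85) == camel3(1.75, -0.85))  # True (the function is symmetric)
```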

The two-dimensional design space for the three hump camel back function is shown in Figure 8. The problem consists of two design variables, X1 and X2, and three local minima, one of which is global. The three minima (two local and the global) are given in Table 5 (from reference 4).

Table 5: Local minima for three hump camel back function

Local Minima    X1*, X2*        f*
1                0, 0           0.00
2               -1.75, 0.85     0.2992
3                1.75, -0.85    0.2992

Figure 8: 2-D Plot of three hump camel back function

According to the proposed algorithm, the solution begins by placing sinusoidal curves over the entire design space. The number of sinusoidal curves was specified to be 20 with 16 design points computed on each curve.

Figure 9 shows some of the sinusoidal curves placed in the design space; not all of the curves are shown, for clarity. After obtaining the sinusoidal curves, the values of the design variables at various points along each curve were computed.

Figure 9: Sinusoidal curves for initial investigation
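The initial investigation can be sketched as follows. The text does not give the curve equations themselves, so the parameterization below (each curve sweeps X1 linearly across its bounds while X2 oscillates about an evenly spaced baseline with designer-chosen amplitude and frequency) is an assumption made for illustration; the 20 curves and 16 points per curve follow the text.

```python
import math

# Sketch of the initial investigation: 20 sinusoidal curves, 16 design
# points per curve. The exact curve equations are not given in the text,
# so this parameterization (X1 swept linearly, X2 oscillating about an
# evenly spaced baseline) is an illustrative assumption.
LO, HI = -2.5, 2.5                  # side constraints from Eq. (4)
AMP, FREQ = 1.0, 8.0                # designer-chosen amplitude and frequency
N_CURVES, N_POINTS = 20, 16

def camel(x1, x2):                  # objective from Eq. (4)
    return 2*x1**2 - 1.05*x1**4 + x1**6/6 + x1*x2 + x2**2

samples = []                        # (f, X1, X2) at each evaluated point
for c in range(N_CURVES):
    base = LO + (HI - LO) * (c + 0.5) / N_CURVES    # curve baseline in X2
    for p in range(N_POINTS):
        t = p / (N_POINTS - 1)                      # parameter along the curve
        x1 = LO + (HI - LO) * t
        x2 = base + AMP * math.sin(FREQ * math.pi * t)
        if LO <= x2 <= HI:                          # discard out-of-bounds points
            samples.append((camel(x1, x2), x1, x2))

f_best, x1_best, x2_best = min(samples)
print(len(samples), round(f_best, 3))
```

The lowest sampled points become the seeds for the sub-region step that follows.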

Regions are identified where the objective function and design variable values are sufficiently different from the current minimal point that another solution might exist there. The designer controls this difference. This ensures that the same region is not investigated multiple times, while still providing the sub-regions that may contain local or global minima. For the three hump camel back function, three sub-regions were identified; these are shown in Figure 10. The three regions were found by taking 20% of the lowest point obtained after the initial investigation of the design space. The areas around these preliminary optimal points were then examined independently with new analytical curves. Each region was examined and subsequently reduced until the objective function converged or the region was eliminated from consideration.
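The sub-region step can be sketched as follows. The 20% rule and the grouping of nearby low points are designer-controlled and not fully specified in the text, so this sketch makes three assumptions: it samples Eq. (4) on a coarse grid as a stand-in for the curve samples, it keeps points near the bottom of the observed objective range (2% of the range is used here so the three basins separate on this coarse grid), and it merges survivors closer than a chosen radius into one sub-region.

```python
# Sketch of sub-region identification (assumed interpretation: keep the
# lowest sampled points, then group points that are close in the design
# variables into one candidate sub-region).
def camel(x1, x2):
    return 2*x1**2 - 1.05*x1**4 + x1**6/6 + x1*x2 + x2**2

def cluster(points, radius):
    """Single-linkage grouping: points within `radius` share a region."""
    regions = []
    for p in points:
        hits = [r for r in regions
                if any((p[0]-q[0])**2 + (p[1]-q[1])**2 <= radius**2 for q in r)]
        merged = [p] + [q for r in hits for q in r]
        regions = [r for r in regions if r not in hits] + [merged]
    return regions

LO, HI, STEP = -2.5, 2.5, 0.25
pts = [(LO + i*STEP, LO + j*STEP) for i in range(21) for j in range(21)]
vals = {p: camel(*p) for p in pts}
f_lo, f_hi = min(vals.values()), max(vals.values())
cutoff = f_lo + 0.02 * (f_hi - f_lo)          # keep only near-bottom points
kept = [p for p in pts if vals[p] <= cutoff]
regions = cluster(kept, radius=0.4)
print(len(regions))                           # candidate sub-regions found
```

On this function the surviving points fall into three disconnected groups, one around each minimum of Table 5.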

Figure 11 shows the final identified sub-regions inside a plot of the three hump camel back function. The numerical results are displayed in Table 6.

Figure 10: Points and Sub-Regions identified for three hump camel back function

Figure 11: Solution obtained using Analytical Curve Based Method

As can be seen from Table 6, the final points are within 2% of the published results. All of these minima were identified with a single optimization run.

The solution was also obtained using DOT and a grid based search, as was done for the previous example problem. These results are given in Tables 7 and 8, and conclusions similar to those for the first test case can be drawn from Tables 6, 7, and 8. All solutions are found by running the ACBM once, whereas multiple initial points have to be used with DOT. A typical grid based search finds only the global solution and is very computationally expensive. The ACBM performs well computationally while locating all minima at the same time.
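For comparison, a grid based search amounts to the brute-force sketch below (the code actually used in the paper is not available; this version simply reports the single best grid node, which is why only the global minimum is ever found).

```python
# Brute-force grid search sketch. Whatever the spacing, only the single
# best node is reported, so the two local minima of the camel back
# function are never identified, at considerable evaluation cost.
def camel(x1, x2):
    return 2*x1**2 - 1.05*x1**4 + x1**6/6 + x1*x2 + x2**2

def grid_search(lo=-2.5, hi=2.5, h=0.1):
    n = round((hi - lo) / h) + 1              # grid nodes per design variable
    return min((camel(lo + i*h, lo + j*h), lo + i*h, lo + j*h)
               for i in range(n) for j in range(n))

f_best, x1_best, x2_best = grid_search()
print(x1_best, x2_best, f_best)               # only the global minimum at (0, 0)
```

Halving the grid size roughly quadruples the evaluation count in two dimensions, which is the growth visible in Table 8.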

Table 6: Results for three hump camel back function (ACBM, Amplitude: 1.0, Frequency: 8.0)

  Sub-Region    X1      X2      F       X1*     X2*     F*      % Error or Var.
  1              0.03    0.04   .0045    0       0      0.00    0.0
  2              1.77   -0.83   .3047    1.75   -.85    .2992   1.8
  3             -1.74    0.84   .2998   -1.75    .85    .2992   0.2

Table 7: Solutions for three hump camel back function from DOT

  Initial X1   Initial X2   Final X1     Final X2     Obj. Func. Value   Func. Evals.
   1            1           -2.96E-04     3.86E-04    2.10E-07           32
   1.5          1.5          1.74E+00    -8.74E-01    2.98E-01           58
   2            2            5.97E-05     7.60E-06    7.67E-09           32
   1.5          2            1.74E+00    -8.73E-01    2.98E-01           45
   1            0.5          8.84E-04    -5.88E-04    1.38E-06           32
  -2            0.6         -1.74E+00     8.71E-01    2.98E-01           33
   0.8          0.5         -3.23E-05     4.73E-05    2.80E-09           31

Table 8: Results for three hump camel back function from grid based search.

Grid Size x1 x2 FFunc.Evals.

0.1 0 0 0 2500

0.05 0 0 0 10201

0.01 0 0 0 251001

0.005 0 0 0 1002001

0.001 0 0 0 25010001


Effect of Amplitude and Frequency

Amplitude and frequency play a major role in the proposed method. In the case of the three hump camel back function, amplitude and frequency were altered over various runs to analyze their effects; these values give a designer control over how the design space is searched. The results are shown in Tables 9 and 10.

Function evaluations in the initial design space to get sub-regions: 160
Number of sub-regions found: 3
Convergence criterion: 0.001

Table 9: Solution runs for three hump camel back function with varying amplitude

  ACBM Solutions     A=0.5                 A=1.0                 A=2.0
  Sub-Region         X1, X2, F             X1, X2, F             X1, X2, F             Max. % Error or Var.
  1                   0.04,  0.02, .004     0.03,  0.04, .004     0.04,  0.06, .009    0.9
  2                   1.70, -.79,  .310     1.77, -0.83, .304     1.76, -0.84, .301    3.6
  3                  -1.68,  0.81, .313    -1.74,  0.84, .299    -1.71,  0.79, .310    4.6

Table 10: Solution runs for three hump camel back function with varying frequency

  ACBM Solutions     A=0.5                 A=1.0                 A=2.0
  Sub-Region         X1, X2, F             X1, X2, F             X1, X2, F             Max. % Error or Var.
  1                   0.04,  0.02, .004     0.03,  0.04, .004     0.04,  0.02, .004    0.4
  2                   1.69, -.77,  .311     1.77, -.83,  .304     1.73, -.85,  .302    3.9
  3                  -1.78,  0.81, .307    -1.74,  0.84, .299    -1.75,  .85,  .299    2.6

In this test case, amplitude and frequency do not have a large effect on solution quality, as shown in Tables 9 and 10: no solution, whether local or global, differs by more than 5% from the published solutions. This can be attributed to the three hump camel back function being a relatively smooth, continuous function, so a relatively simple investigation with analytical curves yields all the solutions. Further, it became apparent during this investigation that the values of amplitude and frequency must be directly related to the number of points computed per curve as well as to the size of the sub-regions chosen.
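The amplitude sensitivity just discussed can be probed with a small sweep. As the paper's exact curve equations are not given, the parameterization below (X1 swept linearly, X2 oscillating about evenly spaced baselines) is an illustrative assumption; the sweep only shows that on this smooth function the best initially sampled objective value stays in the same neighborhood as the amplitude changes.

```python
import math

# Amplitude sweep sketch on the three hump camel back function of
# Eq. (4), using an assumed sinusoidal-curve parameterization. The best
# initially sampled value is fairly insensitive to the amplitude choice.
def camel(x1, x2):
    return 2*x1**2 - 1.05*x1**4 + x1**6/6 + x1*x2 + x2**2

LO, HI, FREQ = -2.5, 2.5, 8.0
N_CURVES, N_POINTS = 20, 16

def best_sample(amp):
    """Best objective value over all in-bounds sampled curve points."""
    best = float("inf")
    for c in range(N_CURVES):
        base = LO + (HI - LO) * (c + 0.5) / N_CURVES
        for p in range(N_POINTS):
            t = p / (N_POINTS - 1)
            x1 = LO + (HI - LO) * t
            x2 = base + amp * math.sin(FREQ * math.pi * t)
            if LO <= x2 <= HI:
                best = min(best, camel(x1, x2))
    return best

results = {amp: round(best_sample(amp), 3) for amp in (0.5, 1.0, 2.0)}
print(results)
```

On a rougher, less continuous function, the same sweep would be expected to show much larger swings, which is why these parameters are left under designer control.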

Conclusions

A new optimization algorithm has been developed that uses analytical curves to blanket and examine a given design space. Regions where local optima may exist are identified and then iteratively reduced to a prescribed tolerance. The best point inside each region is identified as a local or global solution, or the region is eliminated.

This method gives a designer control over how thoroughly a design space is searched, and it was developed to provide all minima (local and global) of an optimization problem. To date, the algorithm has been tested on several 2-D unconstrained example problems, two of which were presented in this paper.

Future work includes extending this algorithm to n-D constrained problems and more thorough testing of the designer parameters to determine their impact on solution efficiency, accuracy, and quality.

Acknowledgements

The authors wish to acknowledge the generous support of this work by the New York State Center for Engineering Design and Industrial Innovation (NYSCEDII) at the University at Buffalo.

References

[1] Edgar, T.F., and Himmelblau, D.M., (1988), “Optimization of Chemical Processes”, McGraw-Hill, New York.

[2] Juvinall, J.E., and Marshek, K.M., (1991), “Fundamentals of Machine Component Design”, Wiley, New York.

[3] Hernandez, S., (1997), “Application of Design Optimization in Steel and Concrete Industry”, Optimization in Industry, ASME.

[4] Dixon, L.C.W., and Szego, G.P. (eds.), (1975), “Towards Global Optimization II”, North Holland, Amsterdam.

[5] Journal of Global Optimization, Kluwer Academic Publishers, Dordrecht.

[6] Deb, K., (1998), “Genetic Algorithms in Search and Optimization: The Technique and Applications”, Proceedings of the International Workshop on Soft Computing and Intelligent Systems, Calcutta, India.

[7] Vanderplaats, G.N., (1984), “Numerical Optimization Techniques for Engineering Design with Applications”, McGraw-Hill, New York.

[8] Rao, S.S., (1998), “Engineering Optimization: Theory and Practice”, Third Edition, New Age International (P) Ltd., New Delhi.

[9] Davidon, W.C., (1959), “Variable Metric Method for Minimization”, Argonne National Laboratory, ANL-5990 Rev., University of Chicago.

[10] Goldfarb, D., (1970), “A Family of Variable Metric Methods Derived by Variational Means”, Mathematics of Computation, Vol. 24.

[11] Horst, R., and Tuy, H., (1990), “Global Optimization: Deterministic Approaches”, Springer, Berlin.

[12] Lucidi, S., and Piccioni, M., (1989), “Random Tunneling by Means of Acceptance-Rejection Sampling for Global Optimization”, JOTA, Vol. 62, No. 2.

[13] Locatelli, M., (1998), “Relaxing the Assumptions of the Multilevel Single Linkage Algorithm”, Journal of Global Optimization, Vol. 13.

[14] Price, W.L., (1978), “A Controlled Random Search Procedure for Global Optimization”, in Dixon, L.C.W., and Szego, G.P. (eds.), Towards Global Optimization II, North Holland, Amsterdam.

[15] Kirkpatrick, S., Gelatt, C.D., and Vecchi, M.P., (1983), “Optimization by Simulated Annealing”, Science, Vol. 220.

[16] Metropolis, N., Rosenbluth, A., Rosenbluth, M., Teller, A., and Teller, E., (1953), “Equation of State Calculations by Fast Computing Machines”, Journal of Chemical Physics, Vol. 21.

[17] Holland, J.H., (1975), “Adaptation in Natural and Artificial Systems”, University of Michigan Press, Ann Arbor.

[18] Hajela, P., and Lin, C.Y., (1992), “Genetic Search Strategies in Multicriterion Optimal Design”, Structural Optimization, No. 4.

[19] Borup, L., and Parkinson, A., (1992), “Comparison of Four Non-Derivative Optimization Methods on Two Problems Containing Heuristic and Analytical Knowledge”, Advances in Design Automation, ASME DE-Vol. 44-1.

[20] Reklaitis, G.V., Ravindran, A., and Ragsdell, K.M., (1983), “Engineering Optimization: Methods and Applications”, John Wiley and Sons, New York.

[21] DOT: Design Optimization Tools, DOT Users Manual, Vanderplaats Research & Development, Inc., 1995.

