8/13/2019 Artificial Intelligence Search
Artificial Intelligence Search Algorithms
Dr Rong Qu, School of Computer Science
University of Nottingham, Nottingham, NG8 1BB, UK
Local Search Algorithms
Konstanz, May 2012 AI Search Algorithms Local Search 2
Optimisation Problems
For most real-world optimisation problems:
- an exact model cannot be built easily;
- the number of feasible solutions grows exponentially with the size of the problem.
Optimisation algorithms:
- mathematical programming
- tree search
- heuristic algorithms
Optimisation Problems: Methods
Meta-heuristics guide an underlying heuristic/search to escape from being trapped in local optima and to explore better areas of the solution space.
Examples:
- single-solution approaches: Simulated Annealing, Tabu Search, etc.;
- population-based approaches: Genetic Algorithms, Memetic Algorithms, Ant Algorithms, etc.
Optimisation Problems: Methods
Meta-heuristics
+ : able to cope with inaccuracies of data and model, large problem sizes, and real-time problem solving;
+ : mechanisms to escape from local optima;
+ : ease of implementation;
+ : no need for an exact model of the problem;
- : usually no guarantee of optimality.
LOCAL SEARCH
Local Search: Method
Starts from some initial solution and moves to a better neighbour solution until it arrives at a local optimum (a solution with no better neighbour).
Examples: the k-opt algorithm for the TSP, etc.
+ : ease of implementation;
+ : guarantee of local optimality, usually in small computational time;
+ : no need for an exact model of the problem;
- : poor solution quality due to getting stuck in poor local optima.
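The k-opt idea can be made concrete with a minimal 2-opt local search for the TSP. This is a sketch under assumed conventions (cities as coordinate pairs, a tour as a list of city indices); it re-evaluates each candidate tour in full, which a practical implementation would avoid:

```python
import math

def tour_length(tour, coords):
    """Length of the closed tour coords[tour[0]] -> coords[tour[1]] -> ... -> back."""
    return sum(math.dist(coords[tour[i]], coords[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(tour, coords):
    """Move to a better neighbour (reverse a tour segment) until no
    improving 2-opt move exists, i.e. until a local optimum is reached."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 2, len(tour) + 1):
                candidate = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(candidate, coords) < tour_length(tour, coords):
                    tour, improved = candidate, True
    return tour

# Four cities on a unit square; the crossing tour 0-2-1-3 gets untangled.
coords = [(0, 0), (0, 1), (1, 0), (1, 1)]
best = two_opt([0, 2, 1, 3], coords)
print(tour_length(best, coords))  # 4.0, the perimeter of the square
```

Note that the result is only locally optimal in general; on this tiny instance the 2-opt local optimum happens to be the global optimum.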
Local Search: terminology
[Figure: f(X) plotted against solutions X, marking a local maximum solution, the global maximum solution, the global maximum value Y, and the neighbourhood of a solution.]
Neighbourhood function: uses the concept of a move, which changes one or more attributes of a given solution in order to generate another solution.
Local optimum: a solution x such that, with respect to the neighbourhood function N, f(x) ≥ f(y) for every y in N(x) (for a maximisation problem).
Local Search: elements
- Representation of the solution
- Evaluation function
- Neighbourhood function: solutions which are close to a given solution. For optimisation of real-valued functions, the neighbourhood of a current solution x0 can be defined as an interval (x0 - r, x0 + r).
- Acceptance criterion: first improvement, best improvement, best of non-improving solutions, random criteria.
Local Search: Hill Climbing
1. Pick a random point in the search space.
2. Consider all the neighbours of the current state.
3. Choose the neighbour with the best quality and move to that state.
4. Repeat steps 2 to 4 until all the neighbouring states are of lower quality.
5. Return the current state as the solution state.
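The five steps above can be sketched directly in Python. The maximisation problem (count of 1-bits, with single-bit flips as the neighbourhood) is an illustrative assumption, chosen because its landscape has no local-optimum traps:

```python
import random

def hill_climb(f, state, neighbours):
    """Steps 2-5: repeatedly move to the best-quality neighbour until
    every neighbour is no better than the current state."""
    while True:
        best = max(neighbours(state), key=f)
        if f(best) <= f(state):   # all neighbours are of lower (or equal) quality
            return state
        state = best

def flips(bits):
    """Neighbourhood: all states reachable by flipping one bit."""
    return [bits[:i] + (1 - bits[i],) + bits[i + 1:] for i in range(len(bits))]

random.seed(0)
start = tuple(random.randint(0, 1) for _ in range(8))   # step 1: random point
best = hill_climb(sum, start, flips)
print(best)  # (1, 1, 1, 1, 1, 1, 1, 1): the global maximum of this easy landscape
```

On a landscape with local optima the same code would stop at the first peak it reaches, which is exactly the weakness discussed on the next slide.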
Local Search: Hill Climbing
Problem: hill climbing halts at the first local optimum it reaches. Possible solutions:
- Try several runs, starting at different positions.
- Increase the size of the neighbourhood (e.g. 3-opt in TSP).
How can bad local optima be avoided?
SIMULATED ANNEALING
Motivated by the physical annealing process, in which a material is heated and then slowly cooled into a uniform structure.
The SA algorithm
The first SA algorithm was developed in 1953 (Metropolis); Kirkpatrick et al. (1983)* applied SA to optimisation problems.
Compared to hill climbing, SA allows downward steps:
- a move is selected at random, and SA then decides whether to accept it;
- better moves are always accepted;
- worse moves may be accepted, depending on a probability.
* Kirkpatrick, S., Gelatt, C.D., Vecchi, M.P. (1983). Optimization by Simulated Annealing. Science, vol. 220, no. 4598, pp. 671-680.
To accept or not to accept?
The laws of thermodynamics state that at temperature t, the probability of an increase in energy of magnitude δE is given by
P(δE) = exp(-δE / kt)
where k is a constant known as Boltzmann's constant.
To accept or not to accept?
In SA, a worse move is accepted if
P = exp(-c / t) > r
where c is the change in the evaluation function, t is the current temperature, and r is a random number between 0 and 1.
Change in evaluation function c, against exp(-c/t) at two temperatures:

  c     exp(-c/t), t = 100   exp(-c/t), t = 10
  10    0.904837418          0.367879441
  20    0.818730753          0.135335283
  30    0.740818221          0.049787068
  40    0.670320046          0.018315639
  50    0.60653066           0.006737947
  60    0.548811636          0.002478752
  70    0.496585304          0.000911882
  80    0.449328964          0.000335463
  90    0.40656966           0.00012341
  100   0.367879441          4.53999E-05
  110   0.332871084          1.67017E-05
  120   0.301194212          6.14421E-06
  130   0.272531793          2.26033E-06
  140   0.246596964          8.31529E-07
  150   0.22313016           3.05902E-07
  160   0.201896518          1.12535E-07
  170   0.182683524          4.13994E-08
  180   0.165298888          1.523E-08
  190   0.149568619          5.6028E-09
  200   0.135335283          2.06115E-09

[Chart: simulated annealing acceptance probability exp(-c/t) against c, for temp = 100 and temp = 10.]
* A scientific calculator is needed to calculate exp().
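The table can be reproduced with a couple of lines of Python (no scientific calculator needed):

```python
import math

def acceptance_probability(c, t):
    """P = exp(-c/t): chance of accepting a move that worsens the evaluation
    function by c when the current temperature is t."""
    return math.exp(-c / t)

print(acceptance_probability(10, 100))   # 0.904837418... (first row of the table)
print(acceptance_probability(200, 10))   # ~2.06e-09: at low t, worse moves are all but rejected
```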
The SA algorithm
For t = 1 to ∞ do
    T = Schedule[t]
    If T = 0 then return Current
    Next = a randomly selected neighbour of Current
    ΔE = VALUE[Next] - VALUE[Current]
    If ΔE > 0 then Current = Next
    else Current = Next with probability exp(ΔE / T)
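The pseudocode above translates almost line for line into Python. The geometric cooling schedule, its parameters, the temperature floor (standing in for "If T = 0 then return"), and the toy maximisation problem are all illustrative assumptions; cooling schedules are discussed next:

```python
import math
import random

def simulated_annealing(value, neighbour, current, t0=100.0, alpha=0.95,
                        t_min=0.01, steps=2000):
    """SA for maximisation: always accept improving moves, accept worse
    moves with probability exp(delta/T), cooling T geometrically from t0."""
    best = current
    for step in range(steps):
        T = max(t0 * alpha ** step, t_min)       # Schedule[t], floored at t_min
        nxt = neighbour(current)                 # randomly selected neighbour
        delta = value(nxt) - value(current)
        if delta > 0 or random.random() < math.exp(delta / T):
            current = nxt
        if value(current) > value(best):         # remember the best state seen
            best = current
    return best

# Toy example: maximise -(x - 3)^2 over the integers, stepping +-1.
random.seed(1)
best = simulated_annealing(lambda x: -(x - 3) ** 2,
                           lambda x: x + random.choice((-1, 1)),
                           current=-20)
print(best)  # settles at (or right next to) the maximum, x = 3
```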
The SA algorithm
- To implement an SA algorithm, implement hill climbing with an accept function and modify the acceptance criteria.
- The cooling schedule is hidden in this algorithm, but it is important (more later).
- The algorithm assumes that annealing will continue until the temperature is zero; this is not necessarily the case.
SA - Cooling Schedule
- Starting temperature
- Final temperature
- Temperature decrement
- Iterations at each temperature
SA - Cooling Schedule
Starting temperature:
- must be hot enough to allow moves to almost any neighbourhood state (else we are in danger of implementing hill climbing);
- must not be so hot that we conduct a random search for a period of time;
- the problem is finding a suitable starting temperature.
SA - Cooling Schedule
Starting temperature:
- If we know the maximum change in the cost function, we can use this to estimate it.
- Start high and reduce quickly until about 60% of worse moves are accepted; use this as the starting temperature.
- Alternatively, heat rapidly until a certain percentage of worse moves are accepted, then start cooling.
SA - Cooling Schedule
Final temperature:
- It is usual to decrease the temperature to 0; however, the algorithm then runs for a lot longer.
- In practice, it is not necessary to decrease the temperature all the way to 0: at a suitably low temperature, the chances of accepting a worse move are almost the same as at temperature 0.
- Therefore, the stopping criterion can be either a suitably low temperature or the system being frozen at the current temperature (i.e. no better or worse moves are being accepted).
SA - Cooling Schedule
Temperature decrement:
- Linear: temp = temp - x
- Geometric: temp = temp × a
Experience has shown that a should be between 0.8 and 0.99. Of course, the higher the value of a, the longer the algorithm will take.
SA - Cooling Schedule
Iterations at each temperature:
- A constant number of iterations at each temperature.
- Another method, first suggested by Lundy (1986), is to do only one iteration at each temperature, but to decrease the temperature very slowly:
t = t / (1 + βt)
where β is a suitably small value.
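The two decrement schemes can be compared numerically. A useful fact about the Lundy-style update t ← t/(1 + βt) is that the reciprocal grows linearly: 1/t_n = 1/t_0 + nβ. The parameter values below are illustrative:

```python
def geometric(t0, a, n):
    """Temperature after n geometric decrements t <- a * t."""
    return t0 * a ** n

def lundy(t0, beta, n):
    """Temperature after n one-iteration updates t <- t / (1 + beta * t)."""
    t = t0
    for _ in range(n):
        t = t / (1 + beta * t)
    return t

t0 = 100.0
print(geometric(t0, 0.95, 100))   # about 0.592: geometric cooling drops quickly
print(lundy(t0, 0.001, 100))      # about 9.091: 1/t has grown from 0.01 to 0.11
```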
SA - Cooling Schedule
Iterations at each temperature, an alternative: dynamically change the number of iterations as the algorithm progresses.
- At lower temperatures, a large number of iterations are done so that the local optimum can be fully explored.
- At higher temperatures, the number of iterations can be smaller.
TABU SEARCH
A meta-heuristic superimposed on another heuristic. The overall approach is to avoid entrapment in cycles by forbidding or penalising moves which take the solution, in the next iteration, to points in the solution space previously visited (hence "tabu").
Proposed independently by Glover (1986) and Hansen (1986).
The TS algorithm
- Accepts non-improving solutions deterministically, in order to escape from local optima, by guiding a steepest-descent local search (or steepest-ascent hill climbing) algorithm.
- After evaluating a number of neighbours, we accept the best one, even if it scores poorly on the cost function, i.e. we may accept a worse move.
The TS algorithm
Uses past experience (memory) to improve current decision making in two ways:
- preventing the search from revisiting previously visited solutions;
- exploring unvisited areas of the solution space.
Using memory (a tabu list) to prohibit certain moves makes tabu search a global optimiser rather than a local optimiser.
Tabu Search - uses of memory
Tabu move: what does it mean?
- Not allowed to revisit exactly the same state that we have been in before.
- Discouraging some patterns in a solution: e.g. in the TSP, tabu a state that has the towns listed in the same order that we have seen before.
- If the problem is large, a lot of time can be spent just checking whether we have been in a certain state before.
Tabu Search algorithm
Current = initial solution
While not terminate
    Next = a highest-valued neighbour of Current
    If (not Move_Tabu(H, Next)) or Aspiration(Next) then
        Current = Next
        Update BestSolutionSeen
        H = Recency(H + Current)
    Endif
End-While
Return BestSolutionSeen
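The pseudocode can be sketched as a runnable maximisation example. The problem (bitstrings with a penalised "ridge" at three ones), the attribute-based tabu list of flipped bit positions, and the tenure value are illustrative assumptions; the aspiration test accepts a tabu move that beats the best solution seen:

```python
from collections import deque

def tabu_search(value, neighbours, current, tenure=3, iterations=50):
    """Tabu search: take the highest-valued admissible neighbour (not tabu,
    or aspired), recording the move's attribute in a FIFO tabu list H."""
    best = current
    H = deque(maxlen=tenure)                  # recency: short-term memory
    for _ in range(iterations):
        admissible = [(attr, s) for attr, s in neighbours(current)
                      if attr not in H or value(s) > value(best)]  # aspiration
        if not admissible:
            break
        attr, current = max(admissible, key=lambda p: value(p[1]))
        H.append(attr)                        # this move is now tabu
        if value(current) > value(best):
            best = current
    return best

def flips(bits):
    """Neighbours as (attribute, state): the attribute is the flipped position."""
    return [(i, bits[:i] + (1 - bits[i],) + bits[i + 1:]) for i in range(len(bits))]

f = lambda b: sum(b) - 2 * (sum(b) == 3)   # ridge: states with three ones are penalised
print(tabu_search(f, flips, (0, 0, 0, 0, 0)))  # (1, 1, 1, 1, 1)
```

Plain hill climbing on this landscape stops at two ones (every neighbour is worse); tabu search is forced across the penalised ridge and reaches the all-ones global optimum.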
Elements of Tabu Search
Memory related: recency (how recently the solution has been reached)
- Tabu list (short-term memory): records a limited number of attributes of solutions (moves, selections, assignments, etc.) to be discouraged, in order to prevent revisiting a visited solution.
- Tabu tenure (length of the tabu list): the number of iterations a tabu move is considered to remain tabu.
Elements of Tabu Search
Memory related: recency
Tabu tenure:
- The list of moves must not grow forever, or it will restrict the search too much.
- Restrict the size of the list: FIFO.
- Other ways: dynamic tenure.
Elements of Tabu Search
Memory related: frequency
Long-term memory records attributes of elite solutions, to be used in:
- Diversification: discouraging attributes of elite solutions in selection functions, in order to diversify the search to other areas of the solution space;
- Intensification: giving priority to attributes of a set of elite solutions (usually in a weighted-probability manner).
Elements of Tabu Search
If a move is good but it is tabu-ed, do we still reject it?
Aspiration criteria: accept an improving solution even if it is generated by a tabu move.
Similar to SA in always accepting improving solutions, but non-improving solutions are accepted only when there is no improving solution in the neighbourhood.
Example: TSP using Tabu Search
Find the order of towns to be visited so that the travelling salesman has the shortest route.
Short-term memory:
- Maintain a list of t towns and prevent them from being selected for consideration of moves for a number of iterations.
- After a number of iterations, release those towns by FIFO.
Example: TSP using Tabu Search
Long-term memory:
- Maintain a list of t towns which have been considered in the last k best (or worst) solutions.
- Encourage (or discourage) their selection in future solutions, using their frequency of appearance in the set of elite solutions, and the quality of the solutions in which they have appeared, in our selection function.
Example: TSP using Tabu Search
Aspiration:
- If a candidate move is in the tabu list but generates a better solution than the current one:
  - accept that solution anyway;
  - put it into the tabu list.
Tabu Search Pros & Cons
Pros:
- Generally produces good solutions for optimisation problems compared with other AI methods.
Cons:
- Tabu list construction is problem-specific.
- No guarantee of globally optimal solutions.
Other Local Search
- Variable Neighbourhood Search
- Iterated Local Search
- Guided Local Search
- GRASP (Greedy Randomized Adaptive Search Procedure)
Talbi, Metaheuristics: From Design to Implementation, Wiley, 2009.
Cost Function
- The evaluation function is calculated at every iteration.
- Often the cost function is the most expensive part of the algorithm.
- Therefore, we need to evaluate the cost function as efficiently as possible:
  - use delta evaluation;
  - use partial evaluation.
Cost Function
- If possible, the cost function should also be designed so that it can lead the search.
- Avoid cost functions where many states return the same value. This can be seen as a plateau in the search space, on which the search has no knowledge of where it should proceed (e.g. bin packing).
Cost Function
Bin packing: a number of items, a number of bins.
Objectives:
- as many items as possible;
- as few bins as possible;
- other objectives, depending on the problem.
Cost function? a) number of bins; b) number of items; c) both a) and b). And what if the items have weights?
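The plateau problem can be seen directly: counting bins alone gives the same cost to very different packings. One common refinement rewards well-filled bins so that packings with the same bin count become comparable; the quadratic reward below is an illustrative choice, not the only one:

```python
def naive_cost(bins):
    """Cost a): number of bins. Many distinct packings share a value (a plateau)."""
    return len(bins)

def guided_cost(bins, capacity):
    """Number of bins minus a reward for well-filled bins, so the search can
    prefer packings that concentrate items (illustrative formulation)."""
    return len(bins) - sum((sum(b) / capacity) ** 2 for b in bins) / len(bins)

# Two packings of items [6, 5, 4, 3] into bins of capacity 10:
a = [[6, 4], [5, 3]]   # one bin completely full
b = [[6, 3], [5, 4]]   # both bins filled to 9
print(naive_cost(a), naive_cost(b))             # 2 2: a plateau
print(guided_cost(a, 10) < guided_cost(b, 10))  # True: the full bin is preferred
```

A full bin frees capacity elsewhere, so a guided cost that prefers packing a is the better search signal even though both packings use two bins.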
Cost Function
Graph colouring: an undirected graph G = (V, E), with V the vertices and E the edges connecting vertices.
Objective: colour the graph with the minimal number of colours so that no adjacent vertices are of the same colour.
Cost function? a) number of colours. But how do we compare different colourings (during the search) that use the same number of colours?
[Figure: map-colouring example showing the Australian states and territories WA, NT, SA, Q, NSW, V, T.]
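For the map above, counting colours alone cannot compare two colourings that both use, say, three colours. A common secondary term counts conflicting edges; during a search that temporarily allows conflicts, it tells "nearly proper" colourings apart. The edge list encodes the Australia map from the figure:

```python
edges = [("WA", "NT"), ("WA", "SA"), ("NT", "SA"), ("NT", "Q"),
         ("SA", "Q"), ("SA", "NSW"), ("SA", "V"), ("Q", "NSW"), ("NSW", "V")]

def num_colours(colouring):
    """Cost a): the number of colours used."""
    return len(set(colouring.values()))

def conflicts(colouring):
    """Secondary cost: number of adjacent vertices sharing a colour."""
    return sum(colouring[u] == colouring[v] for u, v in edges)

proper = {"WA": 0, "NT": 1, "SA": 2, "Q": 0, "NSW": 1, "V": 0, "T": 0}
flawed = dict(proper, V=1)          # still three colours, but NSW and V now clash
print(num_colours(proper), conflicts(proper))  # 3 0
print(num_colours(flawed), conflicts(flawed))  # 3 1: same colour count, worse colouring
```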
Cost Function
Weightings:
- Hard constraints get a large weighting, so that solutions which violate those constraints have a high cost.
- Soft constraints are weighted depending on their importance.
- Weights can be dynamically changed as the algorithm progresses. This allows hard-constraint violations to be accepted at the start of the algorithm but rejected later.
Neighbourhood
- How do you move from one state to another? When you are in a certain state, what other states are reachable?
- Examples: bin packing, timetabling.
- Some research suggests the neighbourhood structure should be symmetric: if you can move from state i to state j, then it is possible to move from state j to state i.
- However, a weaker condition suffices to ensure convergence: every state must be reachable from every other.
- Important: when thinking about your problem, ensure that this condition is met.
Neighbourhood
- The smaller the search space, the easier the search will be.
- If we define the cost function such that infeasible solutions are accepted, the search space is increased.
- As well as keeping the search space small, also keep the neighbourhood small.
Performance
What is performance?
- Quality of the solution returned.
- Time taken by the algorithm.
We already have the problem of finding suitable SA parameters (the cooling schedule).
Performance
Improving performance: initialisation
- Start with a random solution and let the annealing process improve on that.
- It might be better to start with a solution that has been heuristically built (e.g. for the TSP, start with a greedy search).
APPENDIX
SA modifications in the literature
Acceptance Probability
The probability of accepting a worse move in SA is normally based on the physical analogy (the Boltzmann distribution). But is there any reason why a different function would not perform better for all, or at least certain, problems?
Acceptance Probability
Why should we use a different acceptance criterion?
- The one proposed does not work, or we suspect we might be able to produce better solutions.
- The exponential calculation is computationally expensive: Johnson (1991) found that the acceptance calculation took about one third of the computation time.
Acceptance Probability
Johnson experimented with
P(c) = 1 - c/t
which approximates the exponential. (Please read in conjunction with the simulated annealing handout.)
Example, for a change in the evaluation function c = 20 at temperature t = 100: the classic criterion gives exp(-c/t) = 0.818730753; the approximate criterion gives 1 - c/t = 0.8.
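A quick check of the approximation (the loop values are illustrative): it tracks the exponential well for small c/t, and for c > t it goes negative, in which case the worse move is simply never accepted:

```python
import math

t = 100
for c in (10, 20, 50, 120):
    # change c, classic exp(-c/t), cheap approximation 1 - c/t
    print(c, round(math.exp(-c / t), 4), round(1 - c / t, 4))
# 10  0.9048  0.9
# 20  0.8187  0.8
# 50  0.6065  0.5
# 120 0.3012  -0.2  (negative: reject outright)
```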
Acceptance Probability
- A better approach was found by building a look-up table of values of the acceptance function over the range of c/t.
- During the course of the algorithm, c/t is rounded to the nearest integer and this value is used to access the look-up table.
- This method was found to speed up the algorithm by about a third, with no significant effect on solution quality.
The Cooling Schedule
- If you plot a typical cooling schedule, you are likely to find that at high temperatures many solutions are accepted.
- If you start at too high a temperature, a random search is emulated: until the temperature cools sufficiently, any solution can be reached and could have been used as the starting position.
The Cooling Schedule
- At lower temperatures, a plot of the cooling schedule is likely to show that very few worse moves are accepted, almost making simulated annealing emulate hill climbing.
- Taking this one stage further, we can say that simulated annealing does most of its work during the middle stages of the cooling schedule.
- Connolly (1990) suggested annealing at a constant temperature.
The Cooling Schedule
But what temperature? It must be high enough to allow movement, but not so low that the system is frozen. The optimum temperature, however, will vary from one type of problem to another, and also from one instance of a problem to another instance of the same problem.
The Cooling Schedule
One solution to this problem is to spend some time searching for the optimum temperature and then stay at that temperature for the remainder of the algorithm. The final temperature is chosen as the temperature that returned the best cost function value during the search phase.
Neighbourhood
The neighbourhood of any move is normally the same throughout the algorithm, but the neighbourhood could be changed as the algorithm progresses. For example, a different neighbourhood can be used to help the search jump out of local optima.
Cost Function
The cost function is calculated at every iteration of the algorithm. Various researchers (e.g. Burke, 1999) have shown that the cost function can be responsible for a large proportion of the execution time of the algorithm. Some techniques have been suggested which aim to alleviate this problem.
Cost Function
Rana (1996), at Coors Brewery: a GA, but the technique could be applied to SA. The evaluation function is approximated (about one tenth of a second); potentially good solutions are fully evaluated (about three minutes).
Cost Function
Ross (1994) uses delta evaluation on the timetabling problem. Instead of evaluating every timetable from scratch, since only small changes are being made between one timetable and the next, it is possible to evaluate just the changes and update the previous cost function value using the result of that calculation.
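The same idea applies to the TSP: a 2-opt move (reversing a tour segment) changes only two edges, so the change in cost can be computed in O(1) instead of re-evaluating the whole tour. A sketch, with illustrative coordinates:

```python
import math

def tour_length(tour, coords):
    """Full O(n) evaluation of the closed tour."""
    return sum(math.dist(coords[tour[i]], coords[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt_delta(tour, coords, i, j):
    """Cost change from reversing tour[i:j]: only the two boundary edges
    (a-b entering the segment, c-d leaving it) differ, so this is O(1)."""
    n = len(tour)
    a, b = coords[tour[i - 1]], coords[tour[i]]
    c, d = coords[tour[j - 1]], coords[tour[j % n]]
    return (math.dist(a, c) + math.dist(b, d)) - (math.dist(a, b) + math.dist(c, d))

coords = [(0, 0), (0, 1), (1, 0), (1, 1), (2, 2)]
tour = [0, 2, 1, 3, 4]
i, j = 1, 4
full = (tour_length(tour[:i] + tour[i:j][::-1] + tour[j:], coords)
        - tour_length(tour, coords))
print(abs(two_opt_delta(tour, coords, i, j) - full) < 1e-9)  # True
```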
Cost Function
Burke (1999) uses a cache. The cache stores cost function values (partial and complete) that have already been evaluated; they can be retrieved from the cache rather than having to go through the evaluation function again.
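The caching idea can be sketched with Python's built-in memoisation. The cost function here is a trivial stand-in; real timetabling costs are far more expensive, which is the point of the cache. One practical requirement: solutions must be hashable, e.g. tuples rather than lists.

```python
from functools import lru_cache

evaluations = 0

@lru_cache(maxsize=None)           # the cache of already-evaluated solutions
def cost(solution):
    global evaluations
    evaluations += 1               # count genuine evaluations only
    return sum(x * x for x in solution)   # stand-in for an expensive cost

for s in [(1, 2, 3), (4, 5), (1, 2, 3), (4, 5), (1, 2, 3)]:
    cost(s)
print(evaluations)  # 2: the repeated states were served from the cache
```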