Graham Kendall, Hyper-heuristics: Past, Present and Future (uploaded to Slideshare.com : 25th April 2010)
The University of Nottingham
Hyper-heuristics: Past, Present and Future
Graham Kendall
Contents
Past
• A selection of early work
Present
• Current State of the Art
Future
• Potential Research Directions for the Future

"We can't solve problems by using the same kind of thinking we used when we created them."
Albert Einstein (1879 - 1955)
Fisher H. and Thompson G.L. (1963) Probabilistic Learning
Combinations of Local Job-shop Scheduling Rules. In Muth J.F. and
Thompson G.L. (eds) Industrial Scheduling, Prentice Hall Inc., New
Jersey, 225-251
Based on (I assume)
Fisher H. and Thompson G.L. (1961) Probabilistic Learning
Combinations of Local Job-shop Scheduling Rules. In Factory
Scheduling Conference, Carnegie Institute of Technology
Good Number    Facility Order Matrix
1 3(1) 1(3) 2(6) 4(7) 6(3) 5(6)
2 2(8) 3(5) 5(10) 6(10) 1(10) 4(4)
3 3(5) 4(4) 6(8) 1(9) 2(1) 5(7)
4 2(5) 1(5) 3(5) 4(3) 5(8) 6(9)
5 3(9) 2(3) 5(5) 6(4) 1(3) 4(1)
6 2(3) 4(3) 6(9) 1(10) 5(4) 3(1)
6 x 6 Test Problem (times in brackets)
“The number of feasible active schedules is, by a conservative estimate, well over
a million, so their complete enumeration is out of the question.”
• Also 10 (jobs) x 10 (operations) and 20 (jobs)
x 5 (operations) problems
Good Number    Facility Order Matrix
1 3(1) 1(3) 2(6) 4(7) 6(3) 5(6)
2 2(8) 3(5) 5(10) 6(10) 1(10) 4(4)
3 3(5) 4(4) 6(8) 1(9) 2(1) 5(7)
4 2(5) 1(5) 3(5) 4(3) 5(8) 6(9)
5 3(9) 2(3) 5(5) 6(4) 1(3) 4(1)
6 2(3) 4(3) 6(9) 1(10) 5(4) 3(1)
6 x 6 Test Problem (times in brackets)
Job 3, 1, 2, 5, 4, 6
Two Rules
• SIO: Shortest Imminent Operation ("First on, First Off")
• LRT: Longest Remaining Time
• Only require knowledge of "your" machine
Good Number    Facility Order Matrix
1 3(1) 1(3) 2(6) 4(7) 6(3) 5(6)
2 2(8) 3(5) 5(10) 6(10) 1(10) 4(4)
3 3(5) 4(4) 6(8) 1(9) 2(1) 5(7)
4 2(5) 1(5) 3(5) 4(3) 5(8) 6(9)
5 3(9) 2(3) 5(5) 6(4) 1(3) 4(1)
6 2(3) 4(3) 6(9) 1(10) 5(4) 3(1)
6 x 6 Test Problem (times in brackets)
• Monte Carlo: 58 time units
• SIO: 67 time units
• LRT: 61 time units
• Optimal: 55 time units
• SIO should be used initially (get the
machines to start work) and LRT later
(work on the longest jobs)
• Why not combine the two heuristics?
• Four learning models, rewarding good
heuristic selection
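The idea of rewarding good heuristic selection can be sketched as follows. This is a hypothetical illustration in the spirit of Fisher and Thompson's probabilistic learning, not their actual method: at each scheduling decision we pick SIO or LRT according to a learned probability, and nudge that probability when the choice led to an improvement. The job representation and the update step size are assumptions.

```python
import random

def sio(queue):
    """Shortest Imminent Operation: job whose next operation is shortest."""
    return min(queue, key=lambda job: job["next_op_time"])

def lrt(queue):
    """Longest Remaining Time: job with the most processing time left."""
    return max(queue, key=lambda job: job["remaining_time"])

def choose_rule(p_sio):
    """Pick a rule according to the current selection probability."""
    return sio if random.random() < p_sio else lrt

def update(p_sio, used_sio, improved, step=0.05):
    """Reward (or punish) the rule just used, keeping p_sio in [0, 1]."""
    delta = step if improved else -step
    p_sio += delta if used_sio else -delta
    return min(1.0, max(0.0, p_sio))
```

A plain learning loop would repeatedly call `choose_rule`, build a schedule, and call `update` with whether the makespan improved.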
Remarks
• Not sure about reproducibility (e.g. reward/punishment functions)
• An unbiased random combination of scheduling rules is better than any of them taken separately
• "Learning is possible, but there is a question as to whether learning is desirable given the effectiveness of the random combination"
• "It is not clear what is being learnt as the original conjecture was not strongly supported"
• "It is likely that combinations of 5-10 rules would out-perform humans"
Fang H-L., Ross P. and Corne D. (1993) A Promising Genetic
Algorithm Approach to Job-Shop Scheduling, Rescheduling, and
Open-Shop Scheduling Problems. In Forrest S. (ed) Fifth International
Conference on Genetic Algorithms, Morgan Kaufmann, San Mateo,
375-383
Representation
• For a j x m problem, a string represents j x m
chunks.
• The chunk is atomic from a GA perspective.
• The chunk abc means: put the first
untackled task of the ath uncompleted job into
the earliest place it will fit in the developing
schedule, then put the bth uncompleted job into
….
• A schedule builder decodes the chromosome.
• Fairly standard GA e.g. population size of 500,
rank based selection, elitism, 300 generations,
crossover rate 0.6, adaptive mutation rate
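The decoding step above can be sketched in code. This is an illustrative rendering, not the authors' schedule builder: each gene selects the a-th uncompleted job (taken modulo the number still open), and that job's next untackled task is appended to the developing schedule. A real builder would insert the task at the earliest feasible slot rather than simply appending.

```python
def decode(chromosome, jobs):
    """jobs: one task list per job, e.g. [[(machine, time), ...], ...].
    The chromosome has one gene per task (j x m genes in total)."""
    next_task = [0] * len(jobs)           # next untackled task per job
    open_jobs = list(range(len(jobs)))    # uncompleted jobs, in order
    schedule = []
    for gene in chromosome:
        j = open_jobs[gene % len(open_jobs)]     # the a-th uncompleted job
        schedule.append((j, jobs[j][next_task[j]]))
        next_task[j] += 1
        if next_task[j] == len(jobs[j]):         # job finished: close it
            open_jobs.remove(j)
    return schedule
```

Taking genes modulo the number of open jobs keeps every chromosome decodable, which is one way such representations avoid infeasible offspring under crossover.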
Other Remarks
• Considered Job-Shop Scheduling and Open-
Shop Scheduling
• Experimented with different GA parameters
• Results compared favourably with best known
or optimal
Denzinger J. and Fuchs M. (1997) High Performance ATP Systems by
Combining Several AI Methods. In proceedings of the Fifteenth
International Joint Conference on Artificial Intelligence (IJCAI 97),
102-107
Remarks
• The first paper to use the term Hyper-heuristic
• Used in the context of an automated theorem
prover
• A hyper-heuristic stores all the information
necessary to reproduce a certain part of the
proof and is used instead of a single heuristic
O’Grady P.J. and Harrison (1985) A General Search Sequencing Rule
for Job Shop Sequencing. International Journal of Production
Research, 23(5), 961-973
Remarks
Pi = (Ai x Ti) + (Bi x Si)
where
Pi the priority index for job i at its current stage
Ai a 1 x m coefficient vector for job i
Ti an m x 1 vector which contains the remaining
operation times for job i in process order
Bi the due date priority coefficient for job i
Si the due date slack for job i
m the maximum number of processing stages
for jobs 1 to i
Remarks
A = (1,0,0,0,0,…,0), B = 0
Shortest Imminent Operation Time
A = (0,0,0,0,0,…,0), B = 1
Due Date Sequencing
Pi = (Ai x Ti) + (Bi x Si)
where
Pi the priority index for job i at its current stage
Ai a 1 x m coefficient vector for job i
Ti an m x 1 vector which contains the remaining operation
times for job i in process order
Bi the due date priority coefficient for job i
Si the due date slack for job i
m the maximum number of processing stages for jobs 1 to i
Remarks
A search is performed over Ai and Bi in order to
cause changes in the processing sequences.
Pi = (Ai x Ti) + (Bi x Si)
where
Pi the priority index for job i at its current stage
Ai a 1 x m coefficient vector for job i
Ti an m x 1 vector which contains the remaining operation
times for job i in process order
Bi the due date priority coefficient for job i
Si the due date slack for job i
m the maximum number of processing stages for jobs 1 to i
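The priority index can be computed directly, which also makes the two special cases on the previous slide concrete. The vectors and numbers below are illustrative only, not values from the paper:

```python
def priority_index(A, T, B, S):
    """P_i = A_i . T_i + B_i * S_i: coefficients over remaining operation
    times, plus a weighted due date slack."""
    return sum(a * t for a, t in zip(A, T)) + B * S

remaining_times = [4, 7, 2]   # T_i: remaining operation times, process order
slack = 5                     # S_i: due date slack for job i

# A = (1, 0, ..., 0), B = 0 reduces to Shortest Imminent Operation time:
sio_priority = priority_index([1, 0, 0], remaining_times, 0, slack)

# A = (0, ..., 0), B = 1 reduces to due date sequencing:
due_date_priority = priority_index([0, 0, 0], remaining_times, 1, slack)
```

The search described above would then perturb the entries of A and B, re-sequence by the resulting P values, and keep coefficient settings that improve the schedule.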
Norenkov I. P. and Goodman E D. (1997) Solving Scheduling
Problems via Evolutionary Methods for Rule Sequence Optimization.
In proceedings of the 2nd World Conference on Soft Computing
(WSC2)
Remarks
• Similar in idea to Fang, Ross and Corne (1994)
• The allele at the ith position is the heuristic to
be applied at the ith step of the scheduling
process.
• In a comparison with eight single heuristics, the
Heuristic Combination Method (HCM) was found
to be superior.
Other (Selected) Papers
• Crowston W.B., Glover F., Thompson G.L. and
Trawick J.D. (1963) Probabilistic and Parameter
Learning Combinations of Local Job Shop
Scheduling Rules. ONR Research Memorandum,
GSIA, Carnegie Mellon University
• Storer R.H., Wu S.D. and Vaccari R. (1992) New
Search Spaces for Sequencing Problems with
Application to Job Shop Scheduling. Management
Science, 38(10), 1495-1509
• Battiti R. (1996) Reactive Search: Toward Self-
Tuning Heuristics. In Rayward-Smith R.J., Osman
I.H., Reeves C.R. and Smith G.D. (eds) Modern
Heuristic Search Methods, John Wiley, 61-83
Contents
Past
• A selection of early work
Present (Heuristics to Choose Heuristics)
• Current State of the Art
Future
• Potential Research Directions for the Future
Heuristics to Choose Heuristics

[Diagram: a hyper-heuristic sits above a domain barrier; below the barrier are a set of low level heuristics (H1, H2, …, Hn) and an evaluation function, with data flowing in both directions across the barrier.]
Choice Function
• f1 + f2 + f3
• f1 = how well each heuristic has performed
• f2 = how well pairs of heuristics have performed
• f3 = time since the heuristic was last called
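A minimal sketch of choice-function selection, assuming simple additive scoring; the weights, data structures and greedy selection below are assumptions rather than the published formulation:

```python
def choice_score(h, prev, perf, pair_perf, last_called, now,
                 alpha=1.0, beta=1.0, delta=1.0):
    """F(h) = alpha*f1 + beta*f2 + delta*f3 for low level heuristic h."""
    f1 = perf[h]                       # how well h has performed on its own
    f2 = pair_perf.get((prev, h), 0)   # how well h performed after prev
    f3 = now - last_called[h]          # time since h was last called
    return alpha * f1 + beta * f2 + delta * f3

def select_heuristic(heuristics, prev, perf, pair_perf, last_called, now):
    """Greedy selection: apply the heuristic with the highest score."""
    return max(heuristics,
               key=lambda h: choice_score(h, prev, perf, pair_perf,
                                          last_called, now))
```

The f3 term gives idle heuristics a growing score, so every heuristic is eventually retried even if it performed poorly earlier.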
Tabu Search
• Low level heuristics compete with each other
• Recently applied heuristics are made tabu
• Rank low level heuristics based on their estimated performance potential
Case Based Heuristic Selection
• Find heuristics that worked well in previous, similar problem solving situations
• The features used in the similarity measure are a key research issue
Adaptive Ordering Strategies
• Based on Squeaky Wheel Optimisation
• Consider constructive heuristics as orderings
• Adapt the ordering by a heuristic modifier according to the penalty imposed by certain features
• Generative
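The Squeaky Wheel Optimisation loop behind adaptive ordering can be sketched as: construct a solution by applying the heuristic in the current order, analyse it to assign a penalty ("blame") to each element, and reorder so the most penalised elements are handled earlier next round. `construct` and `blame` are problem-specific callbacks assumed here for illustration.

```python
def squeaky_wheel(elements, construct, blame, iterations=10):
    """construct(order) -> (solution, cost); blame(solution) -> penalty
    per element. Returns the best solution found and its cost."""
    order = list(elements)
    best, best_cost = None, float("inf")
    for _ in range(iterations):
        solution, cost = construct(order)
        if cost < best_cost:
            best, best_cost = solution, cost
        penalties = blame(solution)                 # penalty per element
        order.sort(key=lambda e: -penalties[e])     # squeakiest first
    return best, best_cost
```

Because the sort is stable, elements with equal blame keep their relative order, so the ordering changes only where the analysis demands it.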
Contents
Past
• A selection of early work
Present (Generating Heuristics)
• Current State of the Art
Future
• Potential Research Directions for the Future
Generating Heuristics
• Rather than supply a set of low level heuristics, generate the heuristics automatically
• Heuristics could be one-off (disposable) heuristics, or could be applicable to many problem instances

[Diagram: the same hyper-heuristic framework as before, with the set of low level heuristics (H1, H2, …, Hn) below the domain barrier now generated rather than supplied.]
Generating Heuristics

Burke E. K., Hyde M. and Kendall G. Evolving Bin Packing
Heuristics With Genetic Programming. In Proceedings of the 9th
International Conference on Parallel Problem Solving from Nature
(PPSN 2006), pp 860-869, LNCS 4193, Reykjavik, Iceland, 9-13
September 2006
Generating Heuristics
• Evolves a control program that decides whether to put a given piece into a given bin
• On benchmark instances, genetic programming evolved the first-fit heuristic without human input
For each piece, p, not yet packed
    For each bin, i
        output = evaluate(p, fullness of i, capacity of i)
        If (output > 0)
            place piece p in bin i
            break
        End If
    End For
End For
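The control loop above can be rendered as runnable code. The hand-written `evaluate` below stands in for the evolved GP tree: returning a non-negative score exactly when the piece fits reproduces the first-fit heuristic (the slide's strict `> 0` test is relaxed to `>= 0` so exact fits are allowed).

```python
def evaluate(piece, fullness, capacity):
    """Stand-in for the evolved function: non-negative iff the piece fits."""
    return capacity - fullness - piece

def first_fit(pieces, capacity):
    bins = []                               # fullness of each open bin
    for piece in pieces:                    # for each piece not yet packed
        for i, fullness in enumerate(bins): # for each bin
            if evaluate(piece, fullness, capacity) >= 0:
                bins[i] += piece            # place piece in bin i
                break
        else:                               # no bin had room: open a new one
            bins.append(piece)
    return bins
```

The point of the GP approach is that `evaluate` is evolved rather than written: different evolved trees yield different packing policies from the same outer loop.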
Contents
Past
• A selection of early work
Present
• Current State of the Art
Future
• Potential Research Directions for the Future
Results on Standard Datasets
• Many early papers investigated the JSSP. There is an opportunity to investigate whether the current state of the art is able to beat these results and set new benchmarks.
• Why not apply hyper-heuristics to more current benchmarks (TSP, VRP, QAP etc.)?
Benchmark Datasets
• We need to add to resources such as OR-LIB so that we are able to compare hyper-heuristic approaches.
• We need access to benchmarks that are understandable, perceived as fair, and not open to many interpretations.
Comparison Against Benchmarks
• Using the "good enough, soon enough, cheap enough" mantra, we don't claim to be competitive with bespoke solutions, but we are interested in whether we can beat best known solutions.
• Why are some hyper-heuristics better than others – and on what classes of problems?
• Robustness vs quality: how do we measure that trade-off?
Ant Algorithm Based Hyper-heuristics
• Ant algorithms draw their inspiration from the way ants forage for food.
• There are two major elements to an ant algorithm:
  • Pheromone values
  • Heuristic values
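The two elements combine in the standard ant-system transition rule, sketched below: an option is chosen with probability proportional to (pheromone ** alpha) * (visibility ** beta). In the hyper-heuristic setting the "options" would be low level heuristics; the exponents and this exact formulation are the textbook ant-system rule, assumed here rather than taken from the slides.

```python
import random

def transition_probs(pheromone, visibility, alpha=1.0, beta=2.0):
    """Probability of each option from its pheromone and visibility value."""
    weights = [(t ** alpha) * (v ** beta)
               for t, v in zip(pheromone, visibility)]
    total = sum(weights)
    return [w / total for w in weights]

def choose(pheromone, visibility):
    """Sample an option index according to the transition probabilities."""
    probs = transition_probs(pheromone, visibility)
    return random.choices(range(len(probs)), weights=probs)[0]
```

Raising beta biases ants toward the greedy (visibility) information; raising alpha biases them toward what previous ants reinforced.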
Ant Algorithm Based Hyper-heuristics

[Slide shows the ant-system transition rule, labelling the pheromone term as trail intensity and the heuristic term as visibility.]
Ant Algorithm Based Hyper-heuristics

[Slide shows the hyper-heuristic version of the transition rule, with the terms relabelled heuristic synergy and visibility.]
Different Evaluations
"Good enough, soon enough, cheap enough"
• What does this actually mean?
• Will the scientific community accept that this is a fair way to compare results?
Not Good Enough!
"Good enough, soon enough, cheap enough"
• How do we know if a solution is "good enough"?
  • User feedback?
  • Within a given value of the best known solution?
  • We get bored running the algorithm?
  • The cost of accepting the solution is acceptable?
  • Two evaluation mechanisms?
Soon Enough!
"Good enough, soon enough, cheap enough"
• How do we know if a solution is "soon enough"?
  • Meets a critical deadline?
  • Run as long as we can?
  • Can be embedded in a realtime system?
Cheap Enough!
"Good enough, soon enough, cheap enough"
• How do we know if a solution is "cheap enough"?
  • Can be embedded in "off-the-shelf" software?
  • Development costs are significantly lower than writing a bespoke system?
  • Can be run on a standard PC, rather than requiring specialised hardware?
Comparing Hyper-heuristics
• How can we compare different hyper-heuristics so that reviewers have a way of fairly judging new contributions?
• What do we mean by "one hyper-heuristic is better than another"?
Anti-heuristics
• There is, and has been, a significant amount of research investigating how we can "choose which heuristic to select at each decision point"
• There could also be some benefit in including low level heuristics that are obviously bad, and seeing if the hyper-heuristic is able to learn/adapt not to use them
Minimal Heuristics
• Many of the hyper-heuristic papers effectively say "choose a set of low level heuristics…"
• But can we define a minimal set of heuristics that operate well across different problems (e.g. add, delete and swap)?
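Such a minimal, domain-independent move set might look like the following sketch, which operates on a solution held as a plain list; the list representation and the candidate pool are assumptions made for illustration.

```python
import random

def add(solution, pool):
    """Insert a random unused element from the pool at a random position."""
    unused = [x for x in pool if x not in solution]
    if unused:
        solution.insert(random.randrange(len(solution) + 1),
                        random.choice(unused))
    return solution

def delete(solution):
    """Remove a randomly chosen element."""
    if solution:
        solution.pop(random.randrange(len(solution)))
    return solution

def swap(solution):
    """Exchange two randomly chosen positions."""
    if len(solution) >= 2:
        i, j = random.sample(range(len(solution)), 2)
        solution[i], solution[j] = solution[j], solution[i]
    return solution
```

None of the three moves inspects what the elements mean, which is exactly the property a cross-domain minimal set would need.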
Evolve Heuristics
• We can ignore "choose a set of low level heuristics…" if we can generate our own set of human competitive heuristics
• We have utilised genetic programming and adaptive constructive heuristics, but there remains plenty of scope for further investigation
Co-evolution
• Heuristics compete for survival
• Similarities with genetic algorithms etc., but there is a wide scope of possible research in this area

Arthur Samuel (1901 – 1990), an AI pioneer
Hybridisations
• Is there anything to be gained from hybridising various methodologies?
• There has been success with exact methods and meta-heuristics
• What about hybridising hyper-heuristics with meta-heuristics, exact approaches, user interaction etc.?
User Interaction
• How can users interact with hyper-heuristics?
  • Introduce/delete heuristics as the search progresses?
  • Prohibit some areas of the search space?
  • Provide a time/quality trade off?
Framework
• There is a steep learning curve and a high buy-in cost to developing a hyper-heuristic
• Tools such as GA-LIB help the community to carry out research
• But what should such a framework enable you to do? Choose heuristics, generate heuristics?
Stephen Hawking (1942 - )

A Unifying Theory
• What is the formal relationship between heuristics, meta-heuristics and hyper-heuristics (and even exact methods)?
A Unifying Theory
• Can we analyse the landscape of the different search methodologies?
• Can we move between different search spaces during the search?
A Unifying Theory
• Can we offer convergence guarantees?
• Can we offer guarantees of solution quality and/or robustness?
Questions/Discussion