An Information-Theoretic Approach for Clonal Selection Algorithms
Page 1: An Information-Theoretic Approach for Clonal Selection Algorithms


An Information-Theoretic Approach for Clonal Selection Algorithms

V. Cutello, G. Nicosia, M. Pavone, G. Stracquadanio

Department of Mathematics and Computer Science, University of Catania

Viale A. Doria 6, 95125 Catania, Italy
(cutello, nicosia, mpavone, stracquadanio)@dmi.unict.it

Seminar - DMI - Catania, 12 July 2010


Mario Pavone – [email protected] – http://www.dmi.unict.it/mpavone/

Page 2: An Information-Theoretic Approach for Clonal Selection Algorithms


Outline

1. Introduction: Global Optimization; Numerical Minimization Problem; Artificial Immune System

2. An Optimization CSA: i-CSA operators; pseudo-code of CSA

3. KLd, Rd and VNd Entropic Divergences: Learning Process; Kullback-Leibler divergence; Rényi generalized divergence; Von Neumann divergence; Comparison of the divergences

4. Results

5. Conclusions


Page 3: An Information-Theoretic Approach for Clonal Selection Algorithms


Global Optimization

Global Optimization (GO): finding the best set of parameters to optimize a given objective function.
GO problems are quite difficult to solve: there exist solutions that are locally optimal but not globally optimal.
GO requires finding a setting x = (x1, x2, . . . , xn) ∈ S, where S ⊆ Rn is a bounded set on Rn, such that a given n-dimensional objective function f : S → R is optimized.
GOAL: finding a point xmin ∈ S such that f(xmin) is a global minimum on S, i.e. ∀x ∈ S : f(xmin) ≤ f(x).

It is difficult to decide when a global (or local) optimum has been reached, and there could be very many local optima where the algorithm can be trapped; the difficulty increases proportionally with the problem dimension.


Page 4: An Information-Theoretic Approach for Clonal Selection Algorithms


Numerical Minimization Problem

Let x = (x1, x2, . . . , xn) be the variable vector in Rn; let Bl = (Bl1, Bl2, . . . , Bln) and Bu = (Bu1, Bu2, . . . , Bun) be the lower and the upper bounds of the variables, such that xi ∈ [Bli, Bui] (i = 1, . . . , n).

GOAL: minimizing f(x), the objective function:

    min f(x), with Bl ≤ x ≤ Bu

The first 13 functions taken from [Yao et al., IEEE TEC, 1999] were used as a benchmark to evaluate the performance and the convergence ability. These functions belong to two different categories: unimodal, and multimodal with many local optima.


Page 5: An Information-Theoretic Approach for Clonal Selection Algorithms


Artificial Immune System

Artificial Immune Systems - AIS

The Immune System (IS) is the main system responsible for protecting the organism against attacks from external microorganisms that might cause diseases.
The biological IS has to assure recognition of each potentially dangerous molecule or substance.
Artificial Immune Systems are a new paradigm of biologically-inspired computing.
Three immunological theories are exploited: immune networks, negative selection, and clonal selection.
AIS have been successfully employed in a wide variety of different applications

[Timmis et al.: J.Ap.Soft.Comp., BioSystems, Curr. Proteomics, 2008]


Page 6: An Information-Theoretic Approach for Clonal Selection Algorithms


Artificial Immune System

Clonal Selection Algorithms - CSA

CSA represents an effective mechanism for search and optimization

[Cutello et al.: IEEE TEC & J. Comb. Optimization, 2007]

Cloning, Hypermutation and Aging operators are the key features of CSA.
Cloning: triggers the growth of a new population of high-value B cells centered on a higher affinity value.
Hypermutation: can be seen as a local search procedure that leads to a faster maturation during the learning phase.
Aging: aims to generate diversity inside the population, and thus to avoid getting trapped in local optima.

Increasing or decreasing the time a B cell is allowed to stay in the population (δ) influences the convergence process.


Page 7: An Information-Theoretic Approach for Clonal Selection Algorithms


Artificial Immune System

Learning Capability of CSA

The same classes of functions were used to analyze the learning process.
Three relative entropies were used: Kullback-Leibler, Rényi generalized, and Von Neumann divergences.
The learning analysis was carried out by studying the gain both with respect to the initial distribution and with respect to the distribution obtained at the previous step.


Page 8: An Information-Theoretic Approach for Clonal Selection Algorithms


i-CSA operators

Cloning Operator

The cloning operator clones each B cell dup times, producing the population P(clo).

Thanks to this operator, CSA produces individuals with higher affinities (higher fitness function values).

The strategy used to set the age of a clone is crucial for the search inside the landscape.

What age to assign to each clone?

Each clone was assigned a random age chosen in the range [0, τB] (labelled IA) [Cutello et al.: SAC 2006 & ICANNGA 2007]

The same age as the parent, or age zero when the fitness of the mutated clone improves [Cutello et al.: IEEE TEC 2007 & CEC 2004]

The improved version proposed here: the age of each clone is chosen in the range [0, (2/3)τB] (labelled i-CSA).
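To make the cloning step concrete, a minimal Python sketch follows (the Bcell class and the function names are ours, not the paper's):

    import copy
    import random

    class Bcell:
        # one candidate solution with its fitness and age (in generations)
        def __init__(self, x, fitness=None, age=0):
            self.x = x
            self.fitness = fitness
            self.age = age

    def clone_population(pop, dup, tau_B):
        # clone each B cell dup times; each clone receives a random integer
        # age in [0, (2/3)*tau_B], as in the i-CSA variant described above
        clones = []
        for cell in pop:
            for _ in range(dup):
                c = copy.deepcopy(cell)
                c.age = random.randint(0, int((2.0 / 3.0) * tau_B))
                clones.append(c)
        return clones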


Page 9: An Information-Theoretic Approach for Clonal Selection Algorithms


i-CSA operators

Hypermutation Operator

It tries to mutate any B cell receptor: it is not based on an explicit usage of a mutation probability.
There exist several different kinds of hypermutation operators [Cutello et al.: ICARIS & CEC, 2004]

The Inversely Proportional Hypermutation operator was designed.
Mutation rate:

    α = e^(−ρ f̂(x)),

with f̂(x) the fitness function normalized in [0, 1].
Number of mutations:

    M = ⌊(α × ℓ) + 1⌋,

where ℓ is the length of the B cell.
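As a small illustration, the two formulas above translate directly into Python (the function names are ours):

    import math

    def mutation_rate(f_norm, rho):
        # alpha = exp(-rho * normalized fitness)
        return math.exp(-rho * f_norm)

    def num_mutations(f_norm, rho, length):
        # M = floor((alpha * length) + 1)
        return int(mutation_rate(f_norm, rho) * length) + 1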


Page 10: An Information-Theoretic Approach for Clonal Selection Algorithms


i-CSA operators

Number of mutations produced

[Figure: number of mutations M versus normalized fitness for the inversely proportional hypermutation operator, with curves for dim=30 (ρ=3.5), dim=50 (ρ=4.0), dim=100 (ρ=6.0), and dim=200 (ρ=7.0); M ranges up to 220, and an inset zooms in on normalized fitness 0.4 to 1, where M ranges from 1 to 10.]


Page 11: An Information-Theoretic Approach for Clonal Selection Algorithms


i-CSA operators

How the Hypermutation Operator works

Choose randomly a variable xi (i ∈ {1, . . . , ℓ = n}).
Choose randomly an index j (j ∈ {1, . . . , ℓ = n}, with j ≠ i).
Replace xi according to the following rule:

    x_i^(t+1) = ((1 − β) x_i^(t)) + (β x_j^(t))

Normalization of the fitness: the best current fitness value in the population is decreased by a user-defined threshold Θ.

This strategy is meant to keep CSA as blind as possible, since no additional information concerning the problem is known a priori [Cutello et al.: SAC 2006]
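Combining the replacement rule with the mutation rate defined earlier, one possible sketch of the whole operator (it reuses num_mutations from the previous snippet; how β is drawn and how the caller normalizes the fitness are our assumptions):

    import random

    def hypermutate(cell, f_norm, rho):
        # f_norm: fitness normalized into [0, 1] (the slides normalize against
        # the best current fitness decreased by the threshold Θ; here the
        # caller is assumed to supply it)
        n = len(cell.x)
        for _ in range(num_mutations(f_norm, rho, n)):
            i = random.randrange(n)
            j = random.choice([k for k in range(n) if k != i])
            beta = random.random()  # assumed: a fresh random value in [0, 1]
            cell.x[i] = (1.0 - beta) * cell.x[i] + beta * cell.x[j]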


Page 12: An Information-Theoretic Approach for Clonal Selection Algorithms


i-CSA operators

Aging Operator

It eliminates all old B cells in the populations P_d^(t) and P_Nc^(hyp).

It depends on the parameter τB, the maximum number of generations allowed: when a B cell is τB + 1 generations old, it is erased.

GOAL: produce high diversity in the current population, to avoid premature convergence.

Static aging operator: a B cell is erased when too old, independently of its fitness value quality.
Elitist static aging operator: the selection mechanism does not allow the elimination of the best B cell.
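A minimal sketch of both aging variants (minimization is assumed, so the best cell is the one with the lowest fitness):

    def aging(pop, tau_B, elitist=True):
        # static aging: erase every B cell older than tau_B generations
        survivors = [c for c in pop if c.age <= tau_B]
        if elitist:
            # elitist variant: the best B cell is never eliminated
            best = min(pop, key=lambda c: c.fitness)
            if best not in survivors:
                survivors.append(best)
        return survivors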


Page 13: An Information-Theoretic Approach for Clonal Selection Algorithms


pseudo-code of CSA

Clonal Selection Algorithm

CSA(d, dup, ρ, τB, Tmax)
  fen := 0;
  Nc := d × dup;
  P_d^(t=0) := init_pop(d);  // xi = Bli + β(Bui − Bli), β ∈ [0, 1] a real random value
  comp_fit(P_d^(t=0));
  fen := fen + d;
  while (fen < Tmax) do
      P_Nc^(clo) := Cloning(P_d^(t), dup);
      P_Nc^(hyp) := Hypermutation(P_Nc^(clo), ρ);
      comp_fit(P_Nc^(hyp));
      fen := fen + Nc;
      (Pa_d^(t), Pa_Nc^(hyp)) := aging(P_d^(t), P_Nc^(hyp), τB);
      P_d^(t+1) := (µ + λ)-selection(Pa_d^(t), Pa_Nc^(hyp));
      t := t + 1;
  end_while
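For concreteness, a runnable Python version of this loop on a simple sphere function, reusing the operator sketches from the previous slides; the fitness normalization, the test function, and the population refill policy are our assumptions, not the paper's exact implementation:

    import random

    def sphere(x):
        # stand-in objective (minimum 0 at the origin)
        return sum(v * v for v in x)

    def init_pop(d, n, lo, hi):
        # xi = Bli + beta * (Bui - Bli), beta random in [0, 1]
        return [Bcell([lo + random.random() * (hi - lo) for _ in range(n)])
                for _ in range(d)]

    def csa(d=10, dup=2, rho=3.5, tau_B=15, T_max=100000, n=30, lo=-100.0, hi=100.0):
        pop = init_pop(d, n, lo, hi)
        for c in pop:
            c.fitness = sphere(c.x)
        fen = d                                  # fitness evaluations so far
        while fen < T_max:
            clones = clone_population(pop, dup, tau_B)
            f_best = min(c.fitness for c in pop)
            f_worst = max(c.fitness for c in pop)
            for c in clones:
                # simplified min-max normalization (assumption; the slides use
                # the best current fitness decreased by a threshold Θ)
                f_norm = (f_worst - c.fitness) / (f_worst - f_best + 1e-12)
                hypermutate(c, f_norm, rho)
                c.fitness = sphere(c.x)
            fen += len(clones)
            for c in pop + clones:
                c.age += 1
            survivors = aging(pop + clones, tau_B)
            # (µ + λ)-selection: the d best survivors form the next population
            pop = sorted(survivors, key=lambda c: c.fitness)[:d]
        return min(pop, key=lambda c: c.fitness)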


Page 14: An Information-Theoretic Approach for Clonal Selection Algorithms


Learning Process

To analyze the learning process, three entropic divergence metrics were used:

1. Kullback-Leibler (KLd), or Information Gain
2. Rényi generalized (Rd)
3. Von Neumann (VNd)

The learning process of CSA was studied with respect both to the initial distribution and to the distribution at the previous step.

Distribution function:

    f_m^(t) = B_m^t / (Σ_{m=0}^{h} B_m^t) = B_m^t / d,

where B_m^t is the number of B cells, at time step t, with fitness function value m.


Page 15: An Information-Theoretic Approach for Clonal Selection Algorithms


Kullback-Leibler divergence

KLd or Information Gain

It is one of the most frequently used information-theoretic distance measures.
It is based on two probability distributions of a discrete random variable, called relative information, which has found many applications in setting important theorems in information theory and statistics.
It measures the quantity of information the system discovers during the learning phase [Cutello et al.: Journal of Combinatorial Optimization, 2007].

Formally defined as:

    KLd(f_m^(t), f_m^(t0)) = Σ_m f_m^(t) log( f_m^(t) / f_m^(t0) )
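Under this representation, KLd could be sketched as follows (rounding the fitness values into discrete levels m is our assumption):

    import math
    from collections import Counter

    def fitness_distribution(pop, decimals=3):
        # f_m^(t): fraction of the B cells whose rounded fitness equals m
        d = len(pop)
        counts = Counter(round(c.fitness, decimals) for c in pop)
        return {m: b / d for m, b in counts.items()}

    def kl_divergence(f_t, f_t0):
        # KLd = sum over m of f_m^(t) * log(f_m^(t) / f_m^(t0)),
        # restricted to levels present in both distributions
        return sum(p * math.log(p / f_t0[m])
                   for m, p in f_t.items() if m in f_t0)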


Page 16: An Information-Theoretic Approach for Clonal Selection Algorithms


Kullback-Leibler divergence

Maximum KLd Principle

The gain is the amount of information the system has already learned during its search process.
Once the learning process begins, the information gain increases monotonically until it reaches a final steady state.
Maximum Kullback-Leibler principle [Cutello et al., GECCO 2003]:

    dK/dt ≥ 0

The learning process ends when this quantity reaches zero.
The maximum KLd principle can be used as a termination criterion [Cutello et al., GECCO 2003 & JOCO 2007].


Page 17: An Information-Theoretic Approach for Clonal Selection Algorithms


Kullback-Leibler divergence

KLd on f5, f7, and f10

[Figure: information gain KLd on functions f5, f7, and f10 over generations 1 to 32 (log scale); values range from 0 to 25.]


Page 18: An Information-Theoretic Approach for Clonal Selection Algorithms


Kullback-Leibler divergence

KLd and standard deviation on f5

[Figure: "Clonal Selection Algorithm: i-CSA": information gain on f5 over generations 16 to 64 (values 0 to 25); a second panel shows the standard deviation (values 0 to 300) over the same generations.]


Page 19: An Information-Theoretic Approach for Clonal Selection Algorithms


Kullback-Leibler divergence

Average fitness vs. best fitness on f5

[Figure: "Clonal Selection Algorithm: i-CSA": average fitness vs. best fitness on f5 over the first 10 generations (fitness scale up to 4 × 10+9); an inset shows gain and entropy values (0 to 25) over generations 16 to 64.]


Page 20: An Information-Theoretic Approach for Clonal Selection Algorithms


Rényi generalized divergence

Rényi generalized divergence (Rd)

Formally defined as:

    Rd(f_m^(t), f_m^(t0), α) = (1 / (α − 1)) log( Σ_m (f_m^(t))^α / (f_m^(t0))^(α−1) ),

with α > 0 and α ≠ 1.

[Rényi, Proc. 4th Berkeley Symposium on Mathematics, Statistics and Probability, 1961]


Page 21: An Information-Theoretic Approach for Clonal Selection Algorithms


Von Neumann divergence

Von Neumann divergence (VNd)

Formally defined as:

    VNd(f_m^(t), f_m^(t0)) = −(1/n) Σ_m ( f_m^(t) log f_m^(t0) ) − (1/n) Σ_m ( f_m^(t) log f_m^(t) )

[Kopp et al., Annals of Physics, 2007]
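With the same binned-distribution representation used for KLd, the two divergences above could be sketched as (the restriction to levels present in both distributions is our assumption):

    import math

    def renyi_divergence(f_t, f_t0, alpha):
        # Rd = 1/(alpha - 1) * log( sum_m f_m^(t)^alpha / f_m^(t0)^(alpha - 1) )
        s = sum(p ** alpha / f_t0[m] ** (alpha - 1.0)
                for m, p in f_t.items() if m in f_t0)
        return math.log(s) / (alpha - 1.0)

    def von_neumann_divergence(f_t, f_t0, n):
        # VNd = -(1/n) sum_m f_m^(t) log f_m^(t0) - (1/n) sum_m f_m^(t) log f_m^(t)
        common = [m for m in f_t if m in f_t0]
        a = sum(f_t[m] * math.log(f_t0[m]) for m in common)
        b = sum(f_t[m] * math.log(f_t[m]) for m in common)
        return -a / n - b / n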


Page 22: An Information-Theoretic Approach for Clonal Selection Algorithms


Comparison of the divergences

KLd vs. Rd and VNd, with respect to t0

[Figure: learning with respect to the initial distribution (t0): Kullback-Leibler entropy, Rényi generalized divergence, and Von Neumann divergence (VNd) over generations 16 to 64; an inset rescales the VNd curve (values roughly 0 to 1.2).]


Page 23: An Information-Theoretic Approach for Clonal Selection Algorithms


Comparison of the divergences

KLd vs. Rd and VNd, with respect to the (t − 1) time step

[Figure: learning with respect to the (t − 1) time step: Kullback-Leibler entropy, Rényi generalized divergence, and Von Neumann divergence (VNd) over generations 16 to 64; an inset rescales the VNd curve (values roughly 0 to 0.9).]


Page 24: An Information-Theoretic Approach for Clonal Selection Algorithms


Results

In order to study the search capability and the ability to escape from local optima, the first 13 functions of the classical benchmark from [Yao et al., IEEE TEC, 1999] were taken into account.

There are many evolutionary methodologies able to tackle global numerical function optimization effectively. Among such algorithms, differential evolution (DE) has shown better performance than other evolutionary algorithms on complex and continuous search spaces [Price et al., Journal of Global Optimization 1997, & Springer-Verlag 2005].


Page 25: An Information-Theoretic Approach for Clonal Selection Algorithms


i-CSA vs. IA, and DE rand/1/bin [Thomsen, et al., CEC 2004]

30 variables; for each function, the first line reports the mean of the best fitness and the second line the standard deviation:

      IA             i-CSA         DE rand/1/bin
f1    0.0            0.0           0.0
      0.0            0.0           0.0
f2    0.0            0.0           0.0
      0.0            0.0           0.0
f3    0.0            0.0           2.02 × 10−9
      0.0            0.0           8.26 × 10−10
f4    0.0            0.0           3.85 × 10−8
      0.0            0.0           9.17 × 10−9
f5    12             0.0           0.0
      13.22          0.0           0.0
f6    0.0            0.0           0.0
      0.0            0.0           0.0
f7    1.521 × 10−5   7.48 × 10−6   4.939 × 10−3
      2.05 × 10−5    6.46 × 10−6   1.13 × 10−3


Page 26: An Information-Theoretic Approach for Clonal Selection Algorithms


i-CSA vs. IA, and DE rand/1/bin [Thomsen, et al., CEC 2004]

30 variables (continued); mean of the best fitness on the first line, standard deviation on the second:

      IA                 i-CSA            DE rand/1/bin
f8    −1.256041 × 10+4   −9.05 × 10+3     −1.256948 × 10+4
      25.912             1.91 × 10+4      2.3 × 10−4
f9    0.0                0.0              0.0
      0.0                0.0              0.0
f10   0.0                0.0              −1.19 × 10−15
      0.0                0.0              7.03 × 10−16
f11   0.0                0.0              0.0
      0.0                0.0              0.0
f12   0.0                0.0              0.0
      0.0                0.0              0.0
f13   0.0                0.0              −1.142824
      0.0                0.0              4.45 × 10−8


Page 27: An Information-Theoretic Approach for Clonal Selection Algorithms


i-CSA vs. IA, and DE rand/1/bin [Thomsen, et al., CEC 2004]

100 variables; mean of the best fitness on the first line, standard deviation on the second:

      IA             i-CSA          DE rand/1/bin
f1    0.0            0.0            0.0
      0.0            0.0            0.0
f2    0.0            0.0            0.0
      0.0            0.0            0.0
f3    0.0            0.0            5.87 × 10−10
      0.0            0.0            1.83 × 10−10
f4    6.447 × 10−7   0.0            1.128 × 10−9
      3.338 × 10−6   0.0            1.42 × 10−10
f5    74.99          22.116         0.0
      38.99          39.799         0.0
f6    0.0            0.0            0.0
      0.0            0.0            0.0
f7    1.59 × 10−5    1.2 × 10−6     7.664 × 10−3
      3.61 × 10−5    1.53 × 10−6    6.58 × 10−4


Page 28: An Information-Theoretic Approach for Clonal Selection Algorithms


i-CSA vs. IA, and DE rand/1/bin [Thomsen, et al., CEC 2004]

100 variables (continued); mean of the best fitness on the first line, standard deviation on the second:

      IA             i-CSA           DE rand/1/bin
f8    −4.16 × 10+4   −2.727 × 10+4   −4.1898 × 10+4
      2.06 × 10+2    7.63 × 10−4     1.06 × 10−3
f9    0.0            0.0             0.0
      0.0            0.0             0.0
f10   0.0            0.0             8.023 × 10−15
      0.0            0.0             1.74 × 10−15
f11   0.0            0.0             5.42 × 10−20
      0.0            0.0             5.42 × 10−20
f12   0.0            0.0             0.0
      0.0            0.0             0.0
f13   0.0            0.0             −1.142824
      0.0            0.0             2.74 × 10−8


Page 29: An Information-Theoretic Approach for Clonal Selection Algorithms


i-CSA and IA vs. several DE variants [Coello et al., GECCO 2006]

Unimodal functions:

                          f1        f2      f3          f4        f6          f7
i-CSA                     0.0       0.0     0.0         0.0       0.0         2.79 × 10−5
IA                        0.0       0.0     0.0         0.0       0.0         4.89 × 10−5
DE rand/1/bin             0.0       0.0     0.02        1.9521    0.0         0.0
DE rand/1/exp             0.0       0.0     0.0         3.7584    0.84        0.0
DE best/1/bin             0.0       0.0     0.0         0.0017    0.0         0.0
DE best/1/exp             407.972   3.291   10.6078     1.701872  2737.8458   0.070545
DE current-to-best/1      0.54148   4.842   0.471730    4.2337    1.394       0.0
DE current-to-rand/1      0.69966   3.503   0.903563    3.298563  1.767       0.0
DE current-to-rand/1/bin  0.0       0.0     0.000232    0.149514  0.0         0.0
DE rand/2/dir             0.0       0.0     30.112881   0.044199  0.0         0.0


Page 30: An Information-Theoretic Approach for Clonal Selection Algorithms


i-CSA and IA vs. several DE variants [Coello et al., GECCO 2006]

Multimodal functions:

                          f5         f9         f10       f11       f12        f13
i-CSA                     16.2       0.0        0.0       0.0       0.0        0.0
IA                        11.69      0.0        0.0       0.0       0.0        0.0
DE rand/1/bin             19.578     0.0        0.0       0.001117  0.0        0.0
DE rand/1/exp             6.696      97.753938  0.080037  0.000075  0.0        0.0
DE best/1/bin             30.39087   0.0        0.0       0.000722  0.0        0.000226
DE best/1/exp             132621.5   40.003971  9.3961    5.9278    1293.0262  2584.85
DE current-to-best/1      30.984666  98.205432  0.270788  0.219391  0.891301   0.038622
DE current-to-rand/1      31.702063  92.263070  0.164786  0.184920  0.464829   5.169196
DE current-to-rand/1/bin  24.260535  0.0        0.0       0.0       0.001007   0.000114
DE rand/2/dir             30.654916  0.0        0.0       0.0       0.0        0.0


Page 31: An Information-Theoretic Approach for Clonal Selection Algorithms


Immune systems have been related to swarm systems, since many immune algorithms operate in a very similar manner.
i-CSA was compared with some swarm intelligence algorithms [Karaboga et al., Journal of Global Optimization, 2007]:

1. particle swarm optimization (PSO)
2. particle swarm inspired evolutionary algorithm (PS-EA)
3. artificial bee colony (ABC)

Experimental protocol: n = {10, 20, 30}, with 500, 750, and 1000 generations respectively as termination criterion.
Five functions were taken into account: f5, f9, f10, f11, and the new function:

    H(x) = (418.9829 × n) + Σ_{i=1}^{n} −xi sin(√|xi|)
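H transcribes directly into Python (this is the form commonly known as Schwefel's function, shifted so that its minimum is close to zero):

    import math

    def H(x):
        # H(x) = 418.9829 * n + sum_i ( -x_i * sin(sqrt(|x_i|)) )
        n = len(x)
        return 418.9829 * n + sum(-v * math.sin(math.sqrt(abs(v))) for v in x)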


Page 32: An Information-Theoretic Approach for Clonal Selection Algorithms


i-CSA vs. three Swarm Intelligence algorithms [Karaboga et al., Journal of Global Optimization, 2007]

10 variables; for each algorithm, the first line reports the mean and the second line the standard deviation:

        f11        f9        f5        f10             H
GA      0.050228   1.3928    46.3184   0.59267         1.9519
        0.029523   0.76319   33.8217   0.22482         1.3044
PSO     0.079393   2.6559    4.3713    9.8499 × 10−13  161.87
        0.033451   1.3896    2.3811    9.6202 × 10−13  144.16
PS-EA   0.222366   0.43404   25.303    0.19209         0.32037
        0.0781     0.2551    29.7964   0.1951          1.6185
i-CSA   0.0        0.0       0.0       0.0             1.27 × 10−4
        0.0        0.0       0.0       0.0             1.268 × 10−14
ABC1    0.00087    0.0       0.034072  7.8 × 10−11     1.27 × 10−9
        0.002535   0.0       0.045553  1.16 × 10−9     4 × 10−12
ABC2    0.000329   0.0       0.012522  4.6 × 10−11     1.27 × 10−9
        0.00185    0.0       0.01263   5.4 × 10−11     4 × 10−12


Page 33: An Information-Theoretic Approach for Clonal Selection Algorithms


i-CSA vs. three Swarm Intelligence algorithms [Karaboga et al., Journal of Global Optimization, 2007]

20 variables; mean on the first line, standard deviation on the second:

        f11          f9           f5        f10            H
GA      1.0139       6.0309       103.93    0.92413        7.285
        0.026966     1.4537       29.505    0.22599        2.9971
PSO     0.030565     12.059       77.382    1.1778 × 10−6  543.07
        0.025419     3.3216       94.901    1.5842 × 10−6  360.22
PS-EA   0.59036      1.8135       72.452    0.32321        1.4984
        0.2030       0.2551       27.3441   0.097353       0.84612
i-CSA   0.0          0.0          0.0       0.0            237.5652
        0.0          0.0          0.0       0.0            710.4036
ABC1    2.01 × 10−8  1.45 × 10−8  0.13614   1.6 × 10−11    19.83971
        6.76 × 10−8  5.06 × 10−8  0.132013  1.9 × 10−11    45.12342
ABC2    0.0          0.0          0.014458  0.0            0.000255
        0.0          0.0          0.010933  1 × 10−12      0


Page 34: An Information-Theoretic Approach for Clonal Selection Algorithms


i-CSA vs. three Swarm Intelligence algorithms [Karaboga et al., Journal of Global Optimization, 2007]

30 variables; mean on the first line, standard deviation on the second:

        f11           f9         f5        f10            H
GA      1.2342        10.4388    166.283   1.0989         13.5346
        0.11045       2.6386     59.5102   0.24956        4.9534
PSO     0.011151      32.476     402.54    1.4917 × 10−6  990.77
        0.014209      6.9521     633.65    1.8612 × 10−6  581.14
PS-EA   0.8211        3.0527     98.407    0.3771         3.272
        0.1394        0.9985     35.5791   0.098762       1.6185
i-CSA   0.0           0.0        0.0       0.0            2766.804
        0.0           0.0        0.0       0.0            2176.288
ABC1    2.87 × 10−9   0.033874   0.219626  3 × 10−12      146.8568
        8.45 × 10−10  0.181557   0.152742  5 × 10−12      82.3144
ABC2    0.0           0.0        0.020121  0.0            0.000382
        0.0           0.0        0.021846  0.0            1 × 10−12


Page 35: An Information-Theoretic Approach for Clonal Selection Algorithms


Conclusion 1/3

Global numerical optimization has been taken into account to prove the effectiveness of a derivative-free clonal selection algorithm, i-CSA.

The main features of i-CSA can be summarized as:

1. a cloning operator, which explores the neighbourhood of a given solution;

2. an inversely proportional hypermutation operator, which perturbs each candidate solution as a function of its fitness value (inversely proportionally);

3. an aging operator, which eliminates the oldest candidate solutions from the current population in order to introduce diversity and thus avoid local minima during the search process.


Page 36: An Information-Theoretic Approach for Clonal Selection Algorithms


Conclusion 2/3

The decision of which age to assign to any B cell receptor is crucial for the quality of the search inside the space of solutions.

We have presented a simple variant of CSA that effectively improves its performance by allowing a longer maturation for each B cell.

A large set of classical numerical functions was taken into account.

i-CSA was compared with several variants of the DE algorithm, since DE has been shown to be effective on many optimization problems.

i-CSA was also compared with state-of-the-art swarm algorithms.


Page 37: An Information-Theoretic Approach for Clonal Selection Algorithms


Conclusion 3/3

The analysis of the results shows that i-CSA is comparable to, and often outperforms, all the nature-inspired algorithms considered, both in accuracy and in effectiveness in solving large-scale instances.

By analyzing one of the most difficult functions of the benchmark (f5), we characterized the learning capability of i-CSA.

The obtained gain has been analyzed both with respect to the initial distribution and with respect to the one obtained at the previous step. For this study, three different entropic metrics were used:

1. Kullback-Leibler
2. Rényi
3. von Neumann

From the corresponding curves, it is possible to observe a strong correlation between the discovery of optima (peaks in the search landscape) and high relative entropy values.


