MEMPSODE: Comparing Particle Swarm Optimization and Differential Evolution on a Hybrid Memetic Global Optimization Framework

C. Voglis, Computer Science Department, University of Ioannina, Ioannina, Greece, voglis@cs.uoi.gr
D. G. Papageorgiou, Department of Materials Science and Engineering, University of Ioannina, dpapageo@cc.uoi.gr
G. S. Piperagkas, Computer Science Department, University of Ioannina, gpiperag@cs.uoi.gr
I. E. Lagaris, Computer Science Department, University of Ioannina, lagaris@cs.uoi.gr
K. E. Parsopoulos, Computer Science Department, University of Ioannina, kostasp@cs.uoi.gr

ABSTRACT
In this paper we present an experimental comparison between two well-known population-based schemes, namely Particle Swarm Optimization (PSO) and Differential Evolution (DE), incorporated in a memetic global optimization framework. We use the recently published MEMPSODE software [], which implements the memetic global optimization scheme first described in [] and incorporates the Merlin optimization environment []. Since the original description of the algorithm in [] involved only a PSO variant for the exploration phase, we use the MEMPSODE software to attempt an empirical assessment of the DE alternative. The results, based on the noiseless testbed, indicate that the use of DE may lead to superior performance.

Categories and Subject Descriptors
G.1.6 [Numerical Analysis]: Optimization - global optimization, memetic algorithms, particle swarm optimization, differential evolution, local search; G.4 [Mathematical Software]

Keywords
Memetic global optimization, Particle swarm optimization, Differential evolution, Benchmarking, Black-box optimization, Merlin optimization environment

1. INTRODUCTION
Evolutionary Algorithms (EAs) and Swarm Intelligence (SI) approaches have been established as powerful tools for solving optimization problems [,,]. They are based on models that draw their inspiration from physical systems. Building on natural selection and evolution schemes, these algorithms exhibit a remarkable capability of locating global solutions of optimization problems. The rise of EA and SI algorithms has sparked the development of a closely related category, namely the Memetic Algorithms (MAs). MAs constitute a class of hybrid metaheuristics that combine population-based optimization algorithms with local search procedures [9]. The rationale behind their development was the need for powerful algorithms in which the global exploration capability of EAs and SI approaches is complemented with the efficiency and accuracy of classical local optimization techniques.

In this work we examine the performance of a memetic optimization software that incorporates Particle Swarm Optimization (PSO) or Differential Evolution (DE) schemes for exploration, in addition to powerful local optimization algorithms for exploitation. Our primary target is to determine the impact of the choice between UPSO and DE in the algorithmic framework presented in []. By no means are we attempting an extensive benchmark of all MEMPSODE user-defined parameters.
Instead, we use a default set of parameters for both PSO and DE, and the same local search procedure.

2. ALGORITHM PRESENTATION
The tested software follows closely the PSO-based memetic approaches reported in [] and extends them also to the DE

framework. More specifically, the Unified PSO (UPSO) approach [], which harnesses the strengths of the standard local and global PSO variants, is implemented. In both cases, direct calls to LS procedures are facilitated via the established Merlin optimization environment [], providing the ability to develop a variety of MAs. In order to present the algorithm implemented by MEMPSODE, we provide a summary of all key algorithms incorporated. A pseudo-code summarizing the methodology is presented in Algorithm 1.

2.1 Unified Particle Swarm Optimization
PSO was introduced by Eberhart and Kennedy []. The main concept of the method includes a population, also called a swarm, of search points, also called particles, searching for optimal solutions within the search space simultaneously. The particles move in the search space by assuming an adaptable position shift, called velocity, at each iteration. Moreover, each particle retains in memory the best position it has ever visited, i.e., the position with the lowest function value. To each particle a neighborhood is assigned, which determines the indices of the mates that will share its experience. Obviously, the neighborhood scheme affects the flow of information among the particles. Two well-known neighborhood schemes have been used extensively: the local (lbest) scheme, where each particle is assumed to communicate only with the mates of adjacent indices, and the global (gbest) scheme, where any new information (best position) is immediately communicated to every particle in each iteration.

Putting our description in a mathematical framework, let us assume the n-dimensional continuous optimization problem:

    \min_{x \in X \subset R^n} f(x),                                                        (1)

where the search space X is an orthogonal hyperbox in R^n, X \equiv [l_1, r_1] \times [l_2, r_2] \times \cdots \times [l_n, r_n]. A swarm of N particles is a set of search points, S = {x_1, x_2, ..., x_N}, where the i-th particle is defined as x_i = (x_{i1}, x_{i2}, ..., x_{in}) \in X, i = 1, 2, ..., N. The velocity (position shift) of x_i is denoted as v_i = (v_{i1}, v_{i2}, ..., v_{in}), i = 1, 2, ..., N, and its best position as p_i = (p_{i1}, p_{i2}, ..., p_{in}) \in X, i = 1, 2, ..., N. Let g_i = \arg\min_{j \in N_i} f(p_j), where N_i denotes the neighborhood of the i-th particle, and let t denote the algorithm's iteration counter.

The classical PSO model can be generalized in the UPSO scheme []. Following this scheme, if G_i^{(t+1)} and L_i^{(t+1)} denote the velocity updates of x_i in the gbest and lbest PSO models, respectively:

    G_{ij}^{(t+1)} = \chi \left[ v_{ij}^{(t)} + c_1 r_1 \left( p_{ij}^{(t)} - x_{ij}^{(t)} \right) + c_2 r_2 \left( p_{gj}^{(t)} - x_{ij}^{(t)} \right) \right],          (2)

    L_{ij}^{(t+1)} = \chi \left[ v_{ij}^{(t)} + c_1 r_1 \left( p_{ij}^{(t)} - x_{ij}^{(t)} \right) + c_2 r_2 \left( p_{g_i j}^{(t)} - x_{ij}^{(t)} \right) \right],          (3)

where g is the index of the overall best particle, i.e., g = \arg\min_{j = 1, ..., N} f(p_j), then the particle is updated as follows []:

    U_{ij}^{(t+1)} = u \, G_{ij}^{(t+1)} + (1 - u) \, L_{ij}^{(t+1)},                        (4)

    x_{ij}^{(t+1)} = x_{ij}^{(t)} + U_{ij}^{(t+1)},                                          (5)

i = 1, 2, ..., N, j = 1, 2, ..., n. The parameter u \in [0, 1] is called the unification factor and it balances the influence (trade-off) of the global and local velocity updates. Obviously, the lbest PSO model is retrieved for u = 0, while for u = 1 the gbest PSO model is obtained. All intermediate values produce combinations with diverse convergence properties.
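To make Eqs. (2)-(5) concrete, the following sketch performs one UPSO swarm update in NumPy. It is an illustration written for this text, not code from MEMPSODE: the ring-topology neighborhood, the constriction defaults chi = 0.729 and c1 = c2 = 2.05, and all function and variable names are our own assumptions.

```python
import numpy as np

def upso_update(x, v, p, f_p, u=0.5, chi=0.729, c1=2.05, c2=2.05, rng=None):
    """One Unified PSO update (Eqs. (2)-(5)) for a whole swarm.

    x, v, p : (N, n) arrays of positions, velocities and best positions.
    f_p     : (N,) array of best-position function values.
    u       : unification factor in [0, 1] (u = 0 -> lbest, u = 1 -> gbest).
    A ring topology of radius 1 defines the lbest neighborhoods.
    """
    rng = np.random.default_rng() if rng is None else rng
    N, n = x.shape
    g = np.argmin(f_p)                                 # overall best index
    idx = np.arange(N)
    neigh = np.stack([(idx - 1) % N, idx, (idx + 1) % N], axis=1)
    g_i = neigh[idx, np.argmin(f_p[neigh], axis=1)]    # lbest index per particle

    r1, r2, r3, r4 = (rng.random((N, n)) for _ in range(4))
    G = chi * (v + c1 * r1 * (p - x) + c2 * r2 * (p[g] - x))    # gbest term, Eq. (2)
    L = chi * (v + c1 * r3 * (p - x) + c2 * r4 * (p[g_i] - x))  # lbest term, Eq. (3)
    v_new = u * G + (1.0 - u) * L                               # Eq. (4)
    return x + v_new, v_new                                     # Eq. (5)
```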
2.2 Differential Evolution
The Differential Evolution (DE) algorithm was introduced by Storn and Price [] as a population-based stochastic algorithm for numerical optimization problems. DE is formulated similarly to PSO. A population, P = {x_1, x_2, ..., x_N}, of N individuals is utilized to probe the search space, X \subset R^n. The population is randomly initialized, usually following a uniform distribution within the search space. Each individual is an n-dimensional vector, x_i = (x_{i1}, x_{i2}, ..., x_{in}) \in X, i = 1, 2, ..., N, serving as a candidate solution of the problem at hand. The population is iteratively evolved by applying two operators, mutation and recombination, on each individual to produce new candidate solutions. Then, the new and the old individuals are merged and selection takes place to construct the new population, consisting of the N best individuals. The procedure continues in the same manner until a termination criterion is satisfied.

The mutation operator produces a new vector, v_i, for each individual, x_i, i = 1, 2, ..., N, by combining some of the remaining individuals of the population. The most common (but not the only) operator proposed to accomplish this task is:

    OP1:  v_i^{(t+1)} = x_g^{(t)} + F \left( x_{r_1}^{(t)} - x_{r_2}^{(t)} \right),          (6)

where t denotes the iteration counter; F \in (0, 1] is a fixed user-defined parameter; g denotes the index of the best individual in the population, i.e., the one with the lowest function value; and r_1, r_2 \in {1, 2, ..., N} are mutually different randomly selected indices that also differ from the index i. Thus, in order to be able to apply the mutation operators, the population size N must exceed the number of distinct indices they require.

After the mutation, a recombination operator is applied, producing a trial vector, u_i = (u_{i1}, u_{i2}, ..., u_{in}), for each individual. This vector is defined as follows:

    u_{ij}^{(t+1)} = \begin{cases} v_{ij}^{(t+1)}, & \text{if } R_j \le CR \text{ or } j = RI(i), \\ x_{ij}^{(t)}, & \text{if } R_j > CR \text{ and } j \ne RI(i), \end{cases}          (7)

where j = 1, 2, ..., n; R_j is a random variable uniformly distributed in the range [0, 1]; CR \in [0, 1] is a user-defined crossover constant; and RI(i) \in {1, 2, ..., n} is a randomly selected index. Finally, each trial vector is compared against the corresponding individual, and the best between them becomes the new individual in the next generation, i.e.:

    x_i^{(t+1)} = \begin{cases} u_i^{(t+1)}, & \text{if } f(u_i^{(t+1)}) < f(x_i^{(t)}), \\ x_i^{(t)}, & \text{otherwise.} \end{cases}          (8)
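As an illustration of the mutation operator OP1 (Eq. (6)) together with the recombination of Eq. (7) and the selection of Eq. (8), here is a small NumPy sketch of one DE generation. It is our own generic DE/best/1/bin illustration, not MEMPSODE code; the defaults F = 0.5 and CR = 0.9 are common textbook settings and only assumptions here.

```python
import numpy as np

def de_generation(f, x, f_x, F=0.5, CR=0.9, rng=None):
    """One DE generation: mutation (Eq. (6)), binomial crossover (Eq. (7))
    and greedy selection (Eq. (8)).  x is (N, n); f_x holds f(x_i)."""
    rng = np.random.default_rng() if rng is None else rng
    N, n = x.shape
    g = np.argmin(f_x)                            # index of the best individual
    x_new, f_new = x.copy(), f_x.copy()
    for i in range(N):
        # two mutually different indices r1, r2, both different from i
        r1, r2 = rng.choice([j for j in range(N) if j != i], size=2, replace=False)
        v = x[g] + F * (x[r1] - x[r2])            # mutant vector, Eq. (6)
        # binomial crossover: take v_j when R_j <= CR or j is the forced index RI(i)
        mask = rng.random(n) <= CR
        mask[rng.integers(n)] = True
        u = np.where(mask, v, x[i])               # trial vector, Eq. (7)
        fu = f(u)
        if fu < f_x[i]:                           # greedy selection, Eq. (8)
            x_new[i], f_new[i] = u, fu
    return x_new, f_new
```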

2.3 Local Search
A local solution to an optimization problem can be obtained by applying local optimization methods. The hybrid schemes implemented in MEMPSODE require deterministic local search (LS) procedures that take a starting point, x_0, and generate a sequence of points, {x_k}, in order to determine a minimizer within a prescribed accuracy. The generation of a new point, x_{k+1}, in the sequence is based on information collected at the current iterate, x_k. Typically, this information includes the function value at x_k, as well as the first and possibly second order derivatives of f(x) at x_k. In all cases, the aim is to find a new iterate with a lower function value than the current one.

The Merlin optimization environment [] is an efficient and robust general-purpose optimization package, designed to solve multi-dimensional optimization problems. Merlin offers a variety of well-established gradient-based and gradient-free optimization algorithms. The gradient-based algorithms include three methods from the conjugate gradient family, the Levenberg-Marquardt method, the DFP method and several variations of the BFGS algorithm. The gradient-free algorithms include a pattern search and the nonlinear Simplex method.

2.4 Memetic Algorithm
The design of MPSO in [] was based on three fundamental schemes, henceforth called the memetic strategies:

Scheme 1: LS is applied only on the overall best position, p_g, of the swarm.
Scheme 2: LS is applied on each locally best position, p_i, i = 1, 2, ..., N, with a prescribed fixed probability, ρ ∈ (0, 1].
Scheme 3: LS is applied both on the best position, p_g, as well as on some randomly selected locally best positions, p_i, i ∈ {1, 2, ..., N}.

These schemes can be applied either at each iteration or whenever a specific number of consecutive iterations has been completed. Of course, many other memetic strategies can be considered. For instance, a simple one would be the application of LS on every particle. However, such an approach would be costly in terms of function evaluations. In practice, only a small number of particles are considered as starting points for LS, as pointed out in []. The memetic strategies proposed in [] were also adopted in MEMPSODE for both the PSO-based and the DE-based MAs.
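To make the memetic step concrete, the sketch below applies a budget-limited local search to the locally best positions with a fixed probability ρ, in the spirit of Scheme 2. It uses SciPy's BFGS routine as a stand-in for the Merlin local searches, which are not shown here; the function names, the probability and the iteration cap are illustrative assumptions, not MEMPSODE's actual interface or defaults.

```python
import numpy as np
from scipy.optimize import minimize

def memetic_local_search(f, p, f_p, rho=0.1, max_iter=100, rng=None):
    """Scheme 2: refine each best position p_i with probability rho.

    f    : objective callable, f(x) -> float
    p    : (N, n) array of best positions (updated in place)
    f_p  : (N,) array of their function values (updated in place)
    Returns the number of extra function evaluations spent by the local searches.
    """
    rng = np.random.default_rng() if rng is None else rng
    spent = 0
    for i in range(len(p)):
        if rng.random() <= rho:
            # BFGS with finite-difference gradients; the iteration cap stands in
            # for MEMPSODE's limit on local-search function evaluations
            res = minimize(f, p[i], method="BFGS", options={"maxiter": max_iter})
            spent += res.nfev
            if res.fun < f_p[i]:          # keep the refined point only if it improves
                p[i], f_p[i] = res.x, res.fun
    return spent
```

In Scheme 3 the same refinement would additionally always be applied to the overall best position p_g.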
3. EXPERIMENTAL PROCEDURE
We used the default restart mechanism provided by the testbed, with a maximum budget of function evaluations proportional to the dimension n. The third memetic scheme was used, with the probability of local search set to a fixed value ρ. Both UPSO and DE used the same swarm size, N. In UPSO the unification factor u was kept fixed, and the initial velocity vector was restrained by a constant factor. For the DE experiments we applied OP1 with the default values of F and CR. Each local search had an upper limit on the number of function evaluations. Whenever derivatives were needed (e.g., for BFGS), we applied an O(h) finite-differences formula, where h is an adaptable step size (see []). For the local search we applied the BFGS method implemented in Merlin. The experiments were conducted on an Intel Core i7 processor.

4. RESULTS
Results from experiments according to [] on the benchmark functions given in [, ] are presented in Figures 1, 2 and 3 and in Table 1. The expected running time (ERT), used in the figures and the table, depends on a given target function value, f_t = f_opt + Δf. It is computed over all relevant trials as the number of function evaluations executed during each trial while the best function value did not reach f_t, summed over all trials and divided by the number of trials that actually reached f_t []. Equivalently, ERT(f_t) equals the total number of function evaluations spent before f_t was reached, over all trials, divided by the number of successful trials.

A direct comparison between the UPSO and DE variants of the MEMPSODE memetic algorithm can be deduced by observing the scatter plots in Figure 2 and the starred records in Table 1. In the separable case the DE variant outperforms UPSO, especially as the dimensionality increases. For the moderate category, DE seems to outperform only on function f7 (step ellipsoid) and scores marginally better in all other cases. The same behaviour is repeated for the ill-conditioned cases, where the DE variant is slightly better. Finally, the DE variant outperforms the PSO variant on all multimodal functions, but the PSO variant seems to behave better for the weakly structured cases. The superiority of the DE variant is also obvious by inspecting Figure 3. In almost all cases (except the weakly structured functions) the ECDF of the DE variant lies higher than the corresponding ECDF of the PSO variant, and this pattern is repeated at all levels of accuracy. It is also worth mentioning that the DE variant scored the best recorded ERT for some accuracy levels in the case of the ill-conditioned functions (f10-f14). From the corresponding lines of Table 1 we can see that, for relatively low levels of accuracy, the achieved ERT scores are quite competitive. As a general remark, MEMPSODE seems a very promising and competitive new algorithm that incorporates state-of-the-art schemes of swarm intelligence algorithms with the robust and versatile Merlin optimization environment.

[Figure 1: 24 per-function panels (f1 Sphere through f24 Lunacek bi-Rastrigin) comparing pso-bfgs and de-bfgs; the plotted data cannot be recovered from this transcription.]

Figure 1: Expected running time (ERT in number of f-evaluations) divided by dimension for target function value 10^-8 as log10 values versus dimension. Different symbols correspond to different algorithms given in the legend of f1 and f24. Light symbols give the maximum number of function evaluations from the longest trial divided by dimension. Horizontal lines give linear scaling, slanted dotted lines give quadratic scaling. Black stars indicate a statistically better result compared to all other algorithms, with p < 0.01 and Bonferroni correction by the number of dimensions (six). Legend: pso-bfgs, de-bfgs.

[Figure 2: 24 scatter panels, one per noiseless function f1-f24; the plotted data cannot be recovered from this transcription.]

Figure 2: Expected running time (ERT in log10 of number of function evaluations) of pso-bfgs (x-axis) versus de-bfgs (y-axis) for target values Δf in [10^-8, 10] in each dimension on functions f1-f24. Markers on the upper or right edge indicate that the target value was never reached. Markers represent dimensions 2, 3, 5, 10, 20 and 40.

[Figure 3: ECDF panels for all functions and for the separable, moderate, ill-conditioned, multi-modal and weakly structured sub-groups, in 5-D and 20-D; the plotted data cannot be recovered from this transcription.]

Figure 3: Empirical cumulative distributions (ECDF) of run lengths and speed-up ratios in 5-D (left) and 20-D (right). Left sub-columns: ECDF of the number of function evaluations divided by dimension D (FEvals/D) to reach a target value f_opt + Δf with Δf = 10^k, where k in {1, -1, -4, -8} is given by the first value in the legend, for pso-bfgs and de-bfgs. Light beige lines show the ECDF of FEvals for target value Δf = 10^-8 of all algorithms benchmarked during BBOB-2009. Right sub-columns: ECDF of FEval ratios of pso-bfgs divided by de-bfgs, over all trial pairs for each function. Pairs where both trials failed are disregarded; pairs where one trial failed are visible in the limits being > 0 or < 1. The legends indicate the number of functions that were solved in at least one trial (pso-bfgs first).

[Table 1: the numerical entries (5-D and 20-D results for f1-f24) cannot be recovered from this transcription; only the caption is reproduced.]

Table 1: ERT in number of function evaluations divided by the best ERT measured during BBOB-2009 (given in the respective first row), with the half inter-80%ile in brackets, for different Δf values. #succ is the number of trials that reached the final target f_opt + 10^-8. 1:pso is pso-bfgs and 2:de is de-bfgs. Bold entries are statistically significantly better compared to the other algorithm, with p = 0.05 or p = 10^-k where k is the number following the star symbol, with Bonferroni correction. A down arrow indicates the same tested against the best algorithm of BBOB-2009.

REFERENCES
[1] R. C. Eberhart and J. Kennedy. A new optimizer using particle swarm theory. In Proceedings of the Sixth International Symposium on Micro Machine and Human Science, Piscataway, NJ, 1995. IEEE Service Center.
[2] A. P. Engelbrecht. Fundamentals of Computational Swarm Intelligence. Wiley, 2005.
[3] S. Finck, N. Hansen, R. Ros, and A. Auger. Real-parameter black-box optimization benchmarking 2009: Presentation of the noiseless functions. Technical report, Research Center PPE, 2009. Updated February 2010.
[4] N. Hansen, A. Auger, S. Finck, and R. Ros. Real-parameter black-box optimization benchmarking 2012: Experimental setup. Technical report, INRIA, 2012.
[5] N. Hansen, S. Finck, R. Ros, and A. Auger. Real-parameter black-box optimization benchmarking 2009: Noiseless functions definitions. Technical Report RR-6829, INRIA, 2009. Updated February 2010.
[6] W. E. Hart. Adaptive Global Optimization with Local Search. PhD thesis, University of California, San Diego, USA, 1994.
[7] J. Kennedy and R. C. Eberhart. Particle swarm optimization. In Proc. IEEE Int. Conf. Neural Networks, volume IV, pages 1942-1948, Piscataway, NJ, 1995. IEEE Service Center.
[8] J. Kennedy and R. C. Eberhart. Swarm Intelligence. Morgan Kaufmann Publishers, 2001.
[9] P. Moscato. Memetic algorithms: A short introduction. In D. Corne, M. Dorigo, and F. Glover, editors, New Ideas in Optimization, McGraw-Hill, London, 1999.
[10] D. Papageorgiou, I. Demetropoulos, and I. Lagaris. MERLIN-3.1.1: A new version of the Merlin optimization environment. Computer Physics Communications, 2004.
[11] K. E. Parsopoulos and M. N. Vrahatis. UPSO: A unified particle swarm optimization scheme. In Lecture Series on Computer and Computational Sciences, Vol. 1, Proceedings of the International Conference of Computational Methods in Sciences and Engineering (ICCMSE 2004), VSP International Science Publishers, Zeist, The Netherlands, 2004.
[12] K. E. Parsopoulos and M. N. Vrahatis. Particle Swarm Optimization and Intelligence: Advances and Applications. Information Science Publishing (IGI Global), 2010.
[13] Y. G. Petalas, K. E. Parsopoulos, and M. N. Vrahatis. Memetic particle swarm optimization. Annals of Operations Research, 156(1):99-127, 2007.
[14] R. Storn and K. Price. Differential evolution - a simple and efficient heuristic for global optimization over continuous spaces. Journal of Global Optimization, 11:341-359, 1997.
[15] C. Voglis, P. Hadjidoukas, I. Lagaris, and D. Papageorgiou. A numerical differentiation library exploiting parallel architectures. Computer Physics Communications, 2009.
[16] C. Voglis, K. Parsopoulos, D. Papageorgiou, I. Lagaris, and M. Vrahatis. MEMPSODE: A global optimization software based on hybridization of population-based algorithms and local searches. Computer Physics Communications, 2012.

Algorithm 1: Pseudocode of the implemented memetic algorithm
Input: Objective function f: S ⊂ R^n → R; algorithm (PSO/DE): algo; swarm size: N; memetic strategy: memetic; maximum function evaluations: maxfev; unification factor: UF; use mutation: mut; probability for local search: ρ
Output: Best detected solution: x*, f(x*)

// Initialization
for i = 1, 2, ..., N do
    Initialize position x_i and velocity u_i
    Set p_i ← x_i                                    // initialize best position
    f_i ← f(x_i)                                     // evaluate particle
    f_{p_i} ← f_i                                    // best position value
    local_i ← false                                  // "best position is a local minimum" flag
end
Calculate global best index g and local best indices g_i    // update best indices
// Main iteration loop
Set t ← 0
while the termination criterion is not met do
    // Update swarm
    if algo = pso then
        for i = 1, 2, ..., N do
            Calculate the local best velocity update u^l_i for particle i using g_i
            Calculate the global best velocity update u^g_i for particle i using g
            if mut = true then                       // Unified PSO with mutation
                R ← N(µ, σ)
                if rand() ≤ 0.5 then
                    u_i ← R · UF · u^l_i + (1 - UF) · u^g_i      // mutate the local term
                else
                    u_i ← UF · u^l_i + R · (1 - UF) · u^g_i      // mutate the global term
                end
            else
                u_i ← UF · u^l_i + (1 - UF) · u^g_i              // Unified PSO
            end
            x_i ← x_i + u_i                          // update the particle's position
        end
    else if algo = de then
        for i = 1, 2, ..., N do
            x_i ← p_i                                // replicate the best positions p to the swarm array x
        end
        for i = 1, 2, ..., N do
            Calculate u_i using a mutation strategy such as Eq. (6)
            if rand() ≤ CR then x_i ← u_i else x_i ← p_i end
        end
    end
    // Evaluate swarm
    for i = 1, 2, ..., N do
        f_i ← f(x_i)                                 // evaluate particle
    end
    // Update best positions
    for i = 1, 2, ..., N do
        if f_i < f(p_i) then
            p_i ← x_i;  f_{p_i} ← f_i
        end
    end
    Calculate global best index g and local best indices g_i
    Apply one of the memetic Schemes 1-3 using ρ
    Calculate global best index g and local best indices g_i
    // If all best positions are local minima, restart
    if Σ_{i=1}^{N} local_i = N then
        Keep the global best particle and reinitialize the swarm
    end
end
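The mutation branch of Algorithm 1 scales either the local or the global velocity term by a Gaussian factor before the two are combined. The helper below sketches just that branch; treating R as a scalar, and the values of µ and σ, are our assumptions, since the transcription does not preserve MEMPSODE's defaults.

```python
import numpy as np

def unified_velocity(u_local, u_global, UF=0.5, mutate=False, mu=1.0, sigma=0.1, rng=None):
    """Combine the lbest/gbest velocity terms as in the PSO branch of Algorithm 1.

    With mutation enabled, a Gaussian factor R ~ N(mu, sigma) multiplies either
    the local or the global term, each chosen with probability 1/2."""
    rng = np.random.default_rng() if rng is None else rng
    if not mutate:
        return UF * u_local + (1.0 - UF) * u_global          # plain Unified PSO
    R = rng.normal(mu, sigma)
    if rng.random() <= 0.5:
        return R * UF * u_local + (1.0 - UF) * u_global      # mutate the local term
    return UF * u_local + R * (1.0 - UF) * u_global          # mutate the global term
```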


Speculative Evaluation in Particle Swarm Optimization Speculative Evaluation in Particle Swarm Optimization Matthew Gardner, Andrew McNabb, and Kevin Seppi Department of Computer Science, Brigham Young University Abstract. Particle swarm optimization (PSO)

More information

SIMULTANEOUS COMPUTATION OF MODEL ORDER AND PARAMETER ESTIMATION FOR ARX MODEL BASED ON MULTI- SWARM PARTICLE SWARM OPTIMIZATION

SIMULTANEOUS COMPUTATION OF MODEL ORDER AND PARAMETER ESTIMATION FOR ARX MODEL BASED ON MULTI- SWARM PARTICLE SWARM OPTIMIZATION SIMULTANEOUS COMPUTATION OF MODEL ORDER AND PARAMETER ESTIMATION FOR ARX MODEL BASED ON MULTI- SWARM PARTICLE SWARM OPTIMIZATION Kamil Zakwan Mohd Azmi, Zuwairie Ibrahim and Dwi Pebrianti Faculty of Electrical

More information

Numerical Experiments with a Population Shrinking Strategy within a Electromagnetism-like Algorithm

Numerical Experiments with a Population Shrinking Strategy within a Electromagnetism-like Algorithm Numerical Experiments with a Population Shrinking Strategy within a Electromagnetism-like Algorithm Ana Maria A. C. Rocha and Edite M. G. P. Fernandes Abstract This paper extends our previous work done

More information

Genetic Algorithms Coding Primer

Genetic Algorithms Coding Primer Genetic Algorithms Coding Primer A.I. Khan Lecturer in Computing and Inormation Technology Heriot-Watt University, Riccarton, Edinburgh, EH14 4AS, United Kingdom Introduction In this article the concept

More information

RM-MEDA: A Regularity Model Based Multiobjective Estimation of Distribution Algorithm

RM-MEDA: A Regularity Model Based Multiobjective Estimation of Distribution Algorithm RM-MEDA: A Regularity Model Based Multiobjective Estimation o Distribution Algorithm Qingu Zhang, Aimin Zhou and Yaochu Jin Abstract Under mild conditions, it can be induced rom the Karush-Kuhn-Tucker

More information

Offspring Generation Method using Delaunay Triangulation for Real-Coded Genetic Algorithms

Offspring Generation Method using Delaunay Triangulation for Real-Coded Genetic Algorithms Offspring Generation Method using Delaunay Triangulation for Real-Coded Genetic Algorithms Hisashi Shimosaka 1, Tomoyuki Hiroyasu 2, and Mitsunori Miki 2 1 Graduate School of Engineering, Doshisha University,

More information

Small World Network Based Dynamic Topology for Particle Swarm Optimization

Small World Network Based Dynamic Topology for Particle Swarm Optimization Small World Network Based Dynamic Topology for Particle Swarm Optimization Qingxue Liu 1,2, Barend Jacobus van Wyk 1 1 Department of Electrical Engineering Tshwane University of Technology Pretoria, South

More information

QUANTUM BASED PSO TECHNIQUE FOR IMAGE SEGMENTATION

QUANTUM BASED PSO TECHNIQUE FOR IMAGE SEGMENTATION International Journal of Computer Engineering and Applications, Volume VIII, Issue I, Part I, October 14 QUANTUM BASED PSO TECHNIQUE FOR IMAGE SEGMENTATION Shradha Chawla 1, Vivek Panwar 2 1 Department

More information

CT79 SOFT COMPUTING ALCCS-FEB 2014

CT79 SOFT COMPUTING ALCCS-FEB 2014 Q.1 a. Define Union, Intersection and complement operations of Fuzzy sets. For fuzzy sets A and B Figure Fuzzy sets A & B The union of two fuzzy sets A and B is a fuzzy set C, written as C=AUB or C=A OR

More information

DIFFERENTIAL EVOLUTION PARTICLE SWARM OPTIMIZATION Nuria Gómez Blas, Alberto Arteta, Luis F. de Mingo

DIFFERENTIAL EVOLUTION PARTICLE SWARM OPTIMIZATION Nuria Gómez Blas, Alberto Arteta, Luis F. de Mingo International Journal "Information Technologies & Knowledge" Vol.5, Number 1, 2011 77 DIFFERENTIAL EVOLUTION PARTICLE SWARM OPTIMIZATION Nuria Gómez Blas, Alberto Arteta, Luis F. de Mingo Abstract: This

More information