An Empirical Comparison of Several Recent Multi-Objective Evolutionary Algorithms
Thomas White 1 and Shan He 1,2

1 School of Computer Science
2 Center for Systems Biology, School of Biological Sciences,
The University of Birmingham, Birmingham, B15 2TT, UK

Abstract. Many real-world problems can be formulated as multi-objective optimisation problems, in which several potentially conflicting objectives must be optimised simultaneously. Multi-objective optimisation algorithms based on Evolutionary Algorithms (EAs), such as Genetic Algorithms (GAs), have proven superior to traditional techniques such as goal programming. In the past years, several novel Multi-Objective Evolutionary Algorithms (MOEAs) have been proposed. Rather than building on traditional GAs, these algorithms extend other EAs, including novel EAs such as Scatter Search and Particle Swarm Optimisation, to handle multi-objective problems. However, to the best of our knowledge, there is no fair and systematic comparison of these novel MOEAs. This paper, for the first time, presents the results of an exhaustive performance comparison of an assortment of 5 new and popular algorithms on the DTLZ benchmark functions, using a set of well-known performance measures. We also propose a novel performance measure called unique hypervolume, which measures the volume of objective space dominated by only one solution, with respect to a set of solutions. Based on our results, we make some important observations on how to choose an appropriate MOEA according to the preferences of the user.

Keywords: Multi-objective optimisation, Evolutionary Algorithms, Comparison, Genetic Algorithms

1 Introduction

Many real-world optimisation problems involve multiple objectives, which are generally incommensurable and often conflicting. These so-called multi-objective optimisation problems are notoriously difficult to solve.
In recent years, Multi-Objective Evolutionary Algorithms (MOEAs) have attracted increasing attention due to their superior performance over traditional multi-objective optimisation algorithms in terms of effectiveness and robustness. This general trend is reflected in an exponential increase in MOEA applications to real-world problems. The most popular MOEAs are algorithms based on Genetic Algorithms (GAs), one notable example being NSGA-II [1]. Recently, several novel MOEAs have been proposed by extending traditional EAs

Correspondence and requests for materials should be addressed to Dr. S. He (s.he@cs.bham.ac.uk)
such as Simulated Annealing (SA), or novel EAs such as Scatter Search (SS) and Particle Swarm Optimisation (PSO), to handle multi-objective problems. These novel MOEAs were tested on different sets of benchmark functions with varying numbers of function evaluations. Some comparison studies exist for these algorithms, such as [2], in which a set of six representative state-of-the-art multi-objective PSO algorithms were compared. However, to the best of our knowledge, there exists no fair and systematic comparison of the novel MOEAs this work concerns.

In this paper, we tested several representative novel MOEAs, chosen for their recency and reported success. They include AMOSA [3], a multi-objective SA algorithm; OMOPSO [4] and SMPSO [6], multi-objective PSO algorithms; and AbYSS [5], a multi-objective SS algorithm. For brevity, we only outline each algorithm in Section 2.1 and refer the reader to the original papers for more in-depth descriptions. We also select NSGA-II [1] as our baseline algorithm for comparison.

Apart from a fair and comprehensive comparison of these four novel MOEAs, the other main contribution of this paper is the proposal of a novel performance measure, unique hypervolume. This measures the volume of objective space dominated by only one solution, with respect to a set of solutions. We demonstrate the interesting insights this statistic can provide in the interpretation of algorithm performance results.

The remainder of this paper is organised as follows. Section 2 describes the MOEAs, the performance measures and the proposed unique hypervolume. In Section 3, we describe our experiments and discuss the results. Section 4 concludes the paper.

2 Methods

2.1 Novel MOEAs

AMOSA [3] is a multi-objective adaptation of the original SA algorithm.
In the single-objective case, simulated annealing performs iterative perturbations upon a solution, accepting a change if it is beneficial and accepting detrimental changes only with a certain probability, which is exponentially reduced as the algorithm progresses. Many significant changes are necessary to adapt this approach to multi-objective problems. The most important change made by this algorithm is the addition of an archive of non-dominated solutions; the various acceptance and rejection probabilities then depend on the domination status of the new solution with respect to the archive.

OMOPSO [4] and SMPSO [6] are both multi-objective extensions of PSO, an algorithm inspired by the collective behaviour of social animals, such as the flocking of birds. The PSO algorithm maintains a population, or so-called swarm, of particles, each a potential solution of the given problem. These particles move in the search space according to some simple rules, relative to each other, to converge upon a better solution. The multi-objective PSO algorithms OMOPSO and SMPSO both select leaders from among the particles that are non-dominated with respect to the swarm. In OMOPSO, crowding distance [1] is used to filter the set of leader solutions, and two mutation operators are proposed to accelerate the convergence of the algorithm. In SMPSO, a constant σ is assigned to each particle of the swarm and to each member of an external archive; each particle then selects as its leader the archive member with the closest σ value.
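The particle update underlying these algorithms can be sketched for the single-objective case. This is the canonical PSO rule, not the exact OMOPSO or SMPSO variants (which add mutation, constriction and archive-based leader selection); the weights w, c1 and c2 are illustrative defaults, not settings from the papers:

```python
import random

def pso_step(positions, velocities, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One canonical PSO iteration: each particle's new velocity blends its
    previous velocity (inertia w), attraction to its own best position so far
    (cognitive term c1) and attraction to the swarm's best (social term c2)."""
    for i in range(len(positions)):
        for d in range(len(positions[i])):
            r1, r2 = random.random(), random.random()
            velocities[i][d] = (w * velocities[i][d]
                                + c1 * r1 * (pbest[i][d] - positions[i][d])
                                + c2 * r2 * (gbest[d] - positions[i][d]))
            positions[i][d] += velocities[i][d]
    return positions, velocities
```

In the multi-objective variants, the single gbest is replaced by a leader drawn from the external archive of non-dominated solutions, selected by crowding distance (OMOPSO) or the σ scheme (SMPSO).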
AbYSS [5] is a MOEA based on the well-known SS algorithm. The SS algorithm differs from other EAs by using a small population, the so-called reference set, in which individuals are combined systematically to construct new solutions. AbYSS extends the original single-objective SS algorithm by incorporating concepts from the multi-objective field, such as Pareto dominance, density estimation, and an external archive to store the non-dominated solutions.

2.2 Performance measures

Measuring and comparing the performance of multi-objective algorithms remains an unsolved problem in the field. The prevailing opinion in the literature is that performance should be measured not only by how closely the resulting set of solutions converges on the globally optimal Pareto front, but also by how well the solutions are spread across the breadth of the front. An excellent survey of performance measures can be found in [7]. In this paper, we adopt a large set of popular performance measures to evaluate the four novel MOEAs:

- Epsilon, from [8], calculates the minimum-size translation in objective values required for one set of solutions to cover another reference set. In our experiments, the reference set used was the known globally optimal Pareto front.
- The hypervolume indicator, originally described in [9], measures the size of the multi-dimensional region of objective space that a set of solutions covers, without counting the same area twice.
- Spread, as defined in [1], calculates how uniformly the solutions within a set are spread across the objective space.
- The cover function, from [10], calculates the proportion of one solution set that is dominated by another set. Thus, if C(X, Y) = 0.9, 90% of the solutions in Y are dominated by solutions in X. Note that the converse C(Y, X) must also be considered separately: while most of X could be dominated by a few members of Y, most of Y could also be dominated by a few members of X.
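As a concrete illustration, the cover function, together with the Pareto-dominance test it relies on, can be written as follows; this is a minimal sketch for minimisation problems, following the C(X, Y) definition above:

```python
def dominates(a, b):
    """Pareto dominance (minimisation): a dominates b if it is no worse in
    every objective and strictly better in at least one."""
    return (all(ai <= bi for ai, bi in zip(a, b))
            and any(ai < bi for ai, bi in zip(a, b)))

def coverage(X, Y):
    """Cover function C(X, Y): the proportion of solutions in Y that are
    dominated by at least one solution in X. Note C is not symmetric."""
    return sum(1 for y in Y if any(dominates(x, y) for x in X)) / len(Y)
```

For example, coverage([(0, 0)], [(1, 1), (0, 2), (2, -1)]) returns 2/3: (1, 1) and (0, 2) are dominated by (0, 0), while (2, -1) is not.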
- The Purity function, described in [11], unifies the solutions returned by a set of algorithms on a particular problem and returns the proportion of each algorithm's solutions that remain non-dominated in the unified set.
- The unique hypervolume measure, described in detail below.

Unique hypervolume (UH). UH is a property that can be calculated for one solution, with respect to another solution or a set of solutions. Essentially, it quantifies the volume of objective space dominated only by that solution, and not by any other. This idea is represented visually in Fig. 1. UH has the following advantageous properties:

- It rewards domination in proportion to the amount by which one solution dominates another.
- It implicitly punishes clustering of solutions.
- It rewards distinctly original and innovative solutions.
- It rewards diversity only when this diversity is objectively beneficial, in the context of all other solutions found.
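Under the sorting-based scheme the paper gives later in Fig. 4 (assuming all objectives are minimised and the worst attainable value of each objective is known), UH can be sketched in Python as below. Note the scheme operates on a mutually non-dominated set, as MOEA result archives are; dominated members would need filtering out first:

```python
def unique_hypervolume(solutions, worst):
    """UH of each solution in `solutions` (sequences of objective values,
    all minimised). For each objective, solutions are sorted and the gap to
    the next-worse solution (or to worst[o] for the last-ranked one) is
    recorded; the product of a solution's gaps is the hyperrectangle of
    objective space dominated by that solution alone."""
    n, m = len(solutions), len(worst)
    side = [[0.0] * m for _ in range(n)]
    for o in range(m):
        order = sorted(range(n), key=lambda i: solutions[i][o])
        for rank, i in enumerate(order):
            if rank == n - 1:  # worst in the set for this objective
                side[i][o] = worst[o] - solutions[i][o]
            else:              # gap to the next-worse solution
                nxt = order[rank + 1]
                side[i][o] = solutions[nxt][o] - solutions[i][o]
    uh = [1.0] * n
    for i in range(n):
        for o in range(m):
            uh[i] *= side[i][o]
    return uh
```

For two mutually non-dominated points (1, 3) and (3, 1) with worst values (4, 4), each receives a UH of 2.0: the rectangle [1, 3] x [3, 4] is dominated only by the first point, and [3, 4] x [1, 3] only by the second.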
Fig. 1. A diagram showing the UH of 4 points on a two-dimensional minimisation problem. Points a and b belong to one set of solutions, denoted by a square and red lines, whereas c and d belong to a second set, denoted by a cross and blue lines. In this diagram, each point dominates the area directly above and to the right of it. The area (or, in higher-dimensional problems, hypervolume) it alone dominates is its UH. Note that point b dominates point d; therefore point d has no UH, but its presence restricts the UH of point b.

Fig. 2. In this diagram, we demonstrate one of the many situations where the results of UH contradict those of the original hypervolume. Here we see three sets of results, colour-coded red, blue and purple. The red and blue sets undoubtedly contain more hypervolume than the purple set. However, the lone purple solution is noteworthy in that it covers a more original combination of objective space, making it a rarer and thus more valuable trade-off point.

UH bears conceptual similarities to the D metric [12], a measure we have not actually observed in use. The key differences between the two lie in their behaviour when one solution dominates another, and when solutions of the same set are neighbours. In Fig. 3 we demonstrate the differences by illustrating the area calculated by each measure on two sets of Pareto fronts. It can be seen from this diagram that UH is a more difficult metric to score well on. Both reward useful and diverse solutions; however, UH punishes solutions within a set that are not diverse in comparison to each other. We can therefore learn more from this new metric, because it also reflects the spread of solutions along the Pareto front.
Fig. 3. A demonstration of the difference between UH (left) and the D metric (right), on two identical sets of Pareto fronts.

The UH measure concerns the same property of objective space as the original hypervolume measure, which is notoriously difficult to calculate efficiently. However, our UH measure is comparatively simple to calculate, and does not require a calculation of hypervolume a priori. A simple and efficient method for calculating the UH of a solution is given in Fig. 4.

    Definitions:
        let S denote the set of solutions
        let O denote the set of objectives
        let side[S][O] be a two-dimensional array
        rank(s, o)  : rank of solution s according to objective o
        next(s, o)  : next-ranked solution according to objective o
        score(s, o) : objective value of solution s in objective o
        max(o)      : maximum objective value of objective o

    for each objective o in O
        sort the set S according to o
        for each solution s in S
            if rank(s, o) == |S|
                side[s][o] = max(o) - score(s, o)
            else
                side[s][o] = score(next(s, o), o) - score(s, o)

    unique_hypervolume(s) = product(side[s])

Fig. 4. Pseudocode for the calculation of UH.

In prose: we simply sort the set of solutions according to each objective, and record the distance from each solution to the next-worst solution in that objective. If the solution is the worst in the set in some objective, we record the distance to the worst possible objective score. The product of these values gives the volume of a hyperrectangle in objective space that only that solution dominates. It should be noted that this method
requires knowledge of the maximum (worst) value of each objective, assuming each objective is to be minimised. If this data is not available, an alternative is to substitute the worst value found in each objective.

This measure can be used in many ways, such as comparing sets of solutions found by two MOEAs, or finding the most novel solution within a set. In this paper, we accumulate all the solutions found by every algorithm on a problem into one set, and total the UH of the solutions found by each algorithm with respect to that set.

3 Experiments and results

The experiments were performed using a customised version of the jMetal framework [13]. This software package already included the algorithms NSGA-II, OMOPSO, SMPSO and AbYSS, to which we added our implementation of AMOSA. jMetal also has a built-in set of quality indicators that score the results of a single run of an algorithm on a problem and aggregate those scores into an average for each algorithm. This is very useful; however, it does not facilitate comparison measures that require access to the resulting solutions themselves. For the purposes of this study, and to enable an implementation of UH, the software was modified accordingly.

Each algorithm was executed on each of the 7 problems from the Deb, Thiele, Laumanns and Zitzler (DTLZ) problem family [14]. We used jMetal's default configuration for the number of variables and objectives of each problem, along with the default function evaluation budget of 25,000. The algorithms are evaluated according to the final set of solutions returned. To ensure reliable and statistically significant results, each such experiment was repeated 30 times.

We provide the empirical results of our experiments in the following tables. Note that the best performance in each measure on each problem is shaded a dark grey, and the second best a lighter grey.

Table 1. Epsilon.
Mean and standard deviation. [Numeric entries not recoverable from the transcription.]

Table 2. Hypervolume. Mean and standard deviation. [Numeric entries not recoverable from the transcription.]

Table 3. Spread. Mean and standard deviation. [Numeric entries not recoverable from the transcription.]

Table 4. Coverage. Relating this table to the conventional C(X, Y) notation, the algorithm named in the row heading represents X, and the algorithm named in the column heading represents Y. For example, C(AMOSA, NSGA2) = … on DTLZ1. [Numeric entries not recoverable from the transcription.]

Table 5. Purity. We also report how many solutions each algorithm returned in total, and how many of those remained non-dominated with respect to the solutions of the other algorithms (thus X/Y means the algorithm returned Y solutions, of which X were not dominated by any other). [Numeric entries not recoverable from the transcription.]

Table 6. Total UH. [Numeric entries not recoverable from the transcription.]

From the results, it is difficult to single out one method that achieved the best performance across all 6 performance measures. However, as a baseline algorithm proposed in 2002, NSGA-II performed moderately well on the 7 problems in terms of all the measures. Of the 5 MOEAs, SMPSO appeared to be the best algorithm in terms of distance to the Pareto-optimal front as measured by Epsilon: it yielded 5 of the best and 1 second-best values on the 7 problems. SMPSO also obtained 3 of the best and 2 second-best values in the Hypervolume measure, and 1 of the best and 5 second-best values in the Spread measure.

The AbYSS algorithm scored the highest purity rating on 5 of the 7 test problems. The coverage results in these cases confirm that it is rare for the solutions of another algorithm to cover more than 2% of its solutions. Curiously, on the remaining 2 problems, all of its solutions were fully dominated by others. Despite having so many non-dominated solutions, AbYSS did not accumulate relatively high amounts of UH; this information is open to interpretation. Our opinion is that its frequently poor Spread results suggest tightly clustered solutions on the Pareto front, which UH strongly punishes. In support of this, on problem 4, where AbYSS has the best Spread and second-best Hypervolume, it also has the best UH.

Interestingly, OMOPSO performed the poorest across all 7 problems in terms of Epsilon, despite regularly scoring the best according to the Spread metric.
The purity results for OMOPSO reveal that it consistently yields fewer solutions than the other algorithms. Our observation of the inferior performance of OMOPSO compared with the other MOEAs is consistent with the results in [2], where the authors found that OMOPSO outperformed the other multi-objective PSO algorithms they tested, second only to SMPSO.

The performance of AMOSA is also not satisfactory. In terms of Epsilon and Hypervolume, it obtained only 2 second-best values. It is consistently outperformed by the other MOEAs in terms of both Spread and UH.

Problem 2 produced an example of the interesting information UH can provide. The best hypervolume score was awarded to AbYSS with 3.82e−01, whereas NSGA-II was a very close second with 3.74e−01. However, NSGA-II found nearly double the UH of AbYSS, a surprising difference given the initial figures. This result also coincides with NSGA-II performing better than AbYSS on problem 2 in both Spread and Epsilon.
Perhaps the most remarkable set of results emerged on problem 3. Some solutions of SMPSO dominated the solutions found by every other algorithm. The purity measure shows that 1156 of its 2467 total solutions were preserved on the non-dominated front, showing that this was a consistently superior performance, not a single fluke result. The incredibly low total UH of SMPSO on problem 3 indicates that those 1156 solutions were closely clustered together.

Choosing the best performer depends upon the preferences of the user. On the one hand, an indicator that reflects poorly on SMPSO is purity; here it is often outperformed by AbYSS, NSGA-II and AMOSA. Looking at the coverage results, we can see that it is common for AbYSS or NSGA-II to dominate over 50% of its solutions. Given that it frequently ranked best or second best in terms of Epsilon, Hypervolume and Spread, we must assume that the few remaining non-dominated solutions are responsible for these significant scores. Thus, if the priority is to find a smaller set of solutions closest to the globally optimal Pareto front, SMPSO would be the correct algorithm to select. However, if finding a broader set of trade-off solutions is necessary, AbYSS may be preferred due to its impressive performance in purity.

4 Conclusion

In this paper, we conducted a fair and systematic comparison of four representative novel MOEAs proposed in recent years, adopting the most popular MOEA, NSGA-II, as the baseline algorithm for comparison. We employed a range of well-known performance measures to evaluate the performance of the 5 MOEAs on the DTLZ problem family. We also proposed a novel performance metric, called unique hypervolume (UH), which can effectively quantify the volume of objective space dominated uniquely by the solutions of one algorithm against those of other algorithms.
Based on our results, we observed that for finding a smaller set of good solutions, SMPSO would be the best choice. If we are more interested in discovering the trade-off region of a multi-objective problem, AbYSS is preferred.

In terms of future work, given the interesting insight into the results gained through the measurement of total unique hypervolume, especially its ability to quantify the novelty of a solution within a set, we intend to explore this idea further. More specifically, we will use the measure to observe the performance of multi-objective optimisation algorithms on design problems, where innovation is particularly valued.

Acknowledgment

Mr Thomas White and Dr Shan He are supported by EPSRC (EP/J01446X/1).

References

1. Deb, K., Pratap, A., Agarwal, S., Meyarivan, T.: A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation 6(2) (2002)
2. Durillo, J.J., García-Nieto, J., Nebro, A.J., Coello Coello, C.A., Luna, F., Alba, E.: Multi-objective particle swarm optimizers: An experimental comparison. In: Proceedings of the 5th International Conference on Evolutionary Multi-Criterion Optimization (EMO 2009), Springer-Verlag, Berlin, Heidelberg (2009)
3. Bandyopadhyay, S., Saha, S., Maulik, U., Deb, K.: A simulated annealing-based multiobjective optimization algorithm: AMOSA. IEEE Transactions on Evolutionary Computation 12(3) (2008)
4. Reyes, M., Coello Coello, C.: Improving PSO-based multi-objective optimization using crowding, mutation and ɛ-dominance. In: Third International Conference on Evolutionary Multi-Criterion Optimization (EMO 2005), LNCS 3410, Springer (2005)
5. Nebro, A.J., Luna, F., Alba, E., Dorronsoro, B., Durillo, J.J., Beham, A.: AbYSS: Adapting Scatter Search to Multiobjective Optimization. IEEE Transactions on Evolutionary Computation 12(4) (August 2008)
6. Nebro, A., Durillo, J., García-Nieto, J., Coello Coello, C., Luna, F., Alba, E.: SMPSO: A new PSO-based metaheuristic for multi-objective optimization. In: 2009 IEEE Symposium on Computational Intelligence in Multi-Criteria Decision-Making (MCDM 2009), IEEE Press (2009)
7. Okabe, T., Jin, Y., Sendhoff, B.: A critical survey of performance indices for multi-objective optimisation. In: The 2003 Congress on Evolutionary Computation (CEC 2003), Vol. 2, IEEE (2003)
8. Fonseca, C., Knowles, J., Thiele, L., Zitzler, E.: A tutorial on the performance assessment of stochastic multiobjective optimizers. In: Third International Conference on Evolutionary Multi-Criterion Optimization (EMO 2005), Vol. 216 (2005)
9. Zitzler, E., Thiele, L.: Multiobjective evolutionary algorithms: A comparative case study and the strength Pareto approach. IEEE Transactions on Evolutionary Computation 3(4) (1999)
10. Zitzler, E., Deb, K., Thiele, L.: Comparison of multiobjective evolutionary algorithms: Empirical results. Evolutionary Computation 8(2) (2000)
11. Bandyopadhyay, S., Pal, S., Aruna, B.: Multiobjective GAs, quantitative indices, and pattern classification. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics 34(5) (2004)
12. Zitzler, E.: Evolutionary algorithms for multiobjective optimization: Methods and applications. Shaker (1999)
13. Durillo, J.J., Nebro, A.J.: jMetal: A Java framework for multi-objective optimization. Advances in Engineering Software 42(10) (2011)
14. Deb, K., Thiele, L., Laumanns, M., Zitzler, E.: Scalable multi-objective optimization test problems. In: Proceedings of the Congress on Evolutionary Computation (CEC 2002), Honolulu, USA (2002)
More informationEvolutionary Algorithms: Lecture 4. Department of Cybernetics, CTU Prague.
Evolutionary Algorithms: Lecture 4 Jiří Kubaĺık Department of Cybernetics, CTU Prague http://labe.felk.cvut.cz/~posik/xe33scp/ pmulti-objective Optimization :: Many real-world problems involve multiple
More informationGECCO 2007 Tutorial / Evolutionary Multiobjective Optimization. Eckart Zitzler ETH Zürich. weight = 750g profit = 5.
Tutorial / Evolutionary Multiobjective Optimization Tutorial on Evolutionary Multiobjective Optimization Introductory Example: The Knapsack Problem weight = 75g profit = 5 weight = 5g profit = 8 weight
More informationComparison of Evolutionary Multiobjective Optimization with Reference Solution-Based Single-Objective Approach
Comparison of Evolutionary Multiobjective Optimization with Reference Solution-Based Single-Objective Approach Hisao Ishibuchi Graduate School of Engineering Osaka Prefecture University Sakai, Osaka 599-853,
More informationGeneralized Multiobjective Evolutionary Algorithm Guided by Descent Directions
DOI.7/s852-4-9255-y Generalized Multiobjective Evolutionary Algorithm Guided by Descent Directions Roman Denysiuk Lino Costa Isabel Espírito Santo Received: August 23 / Accepted: 9 March 24 Springer Science+Business
More informationA Fast Approximation-Guided Evolutionary Multi-Objective Algorithm
A Fast Approximation-Guided Evolutionary Multi-Objective Algorithm Markus Wagner and Frank Neumann Evolutionary Computation Group School of Computer Science The University of Adelaide Adelaide, SA 5005,
More informationA Similarity-Based Mating Scheme for Evolutionary Multiobjective Optimization
A Similarity-Based Mating Scheme for Evolutionary Multiobjective Optimization Hisao Ishibuchi and Youhei Shibata Department of Industrial Engineering, Osaka Prefecture University, - Gakuen-cho, Sakai,
More informationFinding Knees in Multi-objective Optimization
Finding Knees in Multi-objective Optimization Jürgen Branke 1, Kalyanmoy Deb 2, Henning Dierolf 1, and Matthias Osswald 1 1 Institute AIFB, University of Karlsruhe, Germany branke@aifb.uni-karlsruhe.de
More informationEffectiveness and efficiency of non-dominated sorting for evolutionary multi- and many-objective optimization
Complex Intell. Syst. (217) 3:247 263 DOI 1.17/s4747-17-57-5 ORIGINAL ARTICLE Effectiveness and efficiency of non-dominated sorting for evolutionary multi- and many-objective optimization Ye Tian 1 Handing
More informationMechanical Component Design for Multiple Objectives Using Elitist Non-Dominated Sorting GA
Mechanical Component Design for Multiple Objectives Using Elitist Non-Dominated Sorting GA Kalyanmoy Deb, Amrit Pratap, and Subrajyoti Moitra Kanpur Genetic Algorithms Laboratory (KanGAL) Indian Institute
More informationPaDe: A Parallel Algorithm Based on the MOEA/D Framework and the Island Model
PaDe: A Parallel Algorithm Based on the MOEA/D Framework and the Island Model Andrea Mambrini 1, and Dario Izzo 2 1 University of Birmingham, Birmingham, UK 2 European Space Agency, Noordwijk, The Netherlands
More informationIncrementally Maximising Hypervolume for Selection in Multi-objective Evolutionary Algorithms
Incrementally Maximising Hypervolume for Selection in Multi-objective Evolutionary Algorithms Lucas Bradstreet, Student Member, IEEE, Lyndon While, Senior Member, IEEE, and Luigi Barone, Member, IEEE Abstract
More informationFinding a preferred diverse set of Pareto-optimal solutions for a limited number of function calls
Finding a preferred diverse set of Pareto-optimal solutions for a limited number of function calls Florian Siegmund, Amos H.C. Ng Virtual Systems Research Center University of Skövde P.O. 408, 541 48 Skövde,
More informationEvolutionary Computation
Evolutionary Computation Lecture 9 Mul+- Objec+ve Evolu+onary Algorithms 1 Multi-objective optimization problem: minimize F(X) = ( f 1 (x),..., f m (x)) The objective functions may be conflicting or incommensurable.
More informationInvestigating the Effect of Parallelism in Decomposition Based Evolutionary Many-Objective Optimization Algorithms
Investigating the Effect of Parallelism in Decomposition Based Evolutionary Many-Objective Optimization Algorithms Lei Chen 1,2, Kalyanmoy Deb 2, and Hai-Lin Liu 1 1 Guangdong University of Technology,
More informationAn Evolutionary Multi-Objective Crowding Algorithm (EMOCA): Benchmark Test Function Results
Syracuse University SURFACE Electrical Engineering and Computer Science College of Engineering and Computer Science -0-005 An Evolutionary Multi-Objective Crowding Algorithm (EMOCA): Benchmark Test Function
More informationAdjusting Parallel Coordinates for Investigating Multi-Objective Search
Adjusting Parallel Coordinates for Investigating Multi-Objective Search Liangli Zhen,, Miqing Li, Ran Cheng, Dezhong Peng and Xin Yao 3, Machine Intelligence Laboratory, College of Computer Science, Sichuan
More informationEvolving SQL Queries for Data Mining
Evolving SQL Queries for Data Mining Majid Salim and Xin Yao School of Computer Science, The University of Birmingham Edgbaston, Birmingham B15 2TT, UK {msc30mms,x.yao}@cs.bham.ac.uk Abstract. This paper
More informationTrade-off Between Computational Complexity and Accuracy in Evolutionary Image Feature Extraction
Trade-off Between Computational Complexity and Accuracy in Evolutionary Image Feature Extraction Wissam A. Albukhanajer, Yaochu Jin and Johann A. Briffa Wissam A. Albukhanajer (student) E: w.albukhanajer@surrey.ac.uk
More informationHybrid Genetic Algorithms for Multi-objective Optimisation of Water Distribution Networks
Hybrid Genetic Algorithms for Multi-objective Optimisation of Water Distribution Networks Edward Keedwell and Soon-Thiam Khu Centre for Water Systems, School of Engineering and Computer Science and Mathematics,
More informationMultiobjective Prototype Optimization with Evolved Improvement Steps
Multiobjective Prototype Optimization with Evolved Improvement Steps Jiri Kubalik 1, Richard Mordinyi 2, and Stefan Biffl 3 1 Department of Cybernetics Czech Technical University in Prague Technicka 2,
More informationHigh-Dimensional Multi-objective Optimization Using Co-operative Vector-Evaluated Particle Swarm Optimization With Random Variable Grouping
2015 IEEE Symposium Series on Computational Intelligence High-Dimensional Multi-objective Optimization Using Co-operative Vector-Evaluated Particle Swarm Optimization With Random Variable Grouping Justin
More informationStochastic decision optimisation based on deterministic approximations of processes described as closed-form arithmetic simulation
Journal of Decision Systems ISSN: 1246-0125 (Print) 2116-7052 (Online) Journal homepage: http://www.tandfonline.com/loi/tjds20 Stochastic decision optimisation based on deterministic approximations of
More informationKursawe Function Optimisation using Hybrid Micro Genetic Algorithm (HMGA)
Kursawe Function Optimisation using Hybrid Micro Genetic Algorithm (HMGA) Lim Wei Jer 1, Asral Bahari Jambek 1, and Neoh Siew Chin 2 1 School of Microelectronic Engineering, Universiti Malaysia Perlis,
More informationEffects of Discrete Design-variable Precision on Real-Coded Genetic Algorithm
Effects of Discrete Design-variable Precision on Real-Coded Genetic Algorithm Toshiki Kondoh, Tomoaki Tatsukawa, Akira Oyama, Takeshi Watanabe and Kozo Fujii Graduate School of Engineering, Tokyo University
More informationAvailable online at ScienceDirect. Procedia Computer Science 60 (2015 )
Available online at www.sciencedirect.com ScienceDirect Procedia Computer Science 60 (2015 ) 178 187 19th International Conference on Knowledge Based and Intelligent Information and Engineering Systems
More informationMulti-Objective Pipe Smoothing Genetic Algorithm For Water Distribution Network Design
City University of New York (CUNY) CUNY Academic Works International Conference on Hydroinformatics 8-1-2014 Multi-Objective Pipe Smoothing Genetic Algorithm For Water Distribution Network Design Matthew
More informationTHE NEW HYBRID COAW METHOD FOR SOLVING MULTI-OBJECTIVE PROBLEMS
THE NEW HYBRID COAW METHOD FOR SOLVING MULTI-OBJECTIVE PROBLEMS Zeinab Borhanifar and Elham Shadkam * Department of Industrial Engineering, Faculty of Eng.; Khayyam University, Mashhad, Iran ABSTRACT In
More informationIEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, VOL. 12, NO. 4, AUGUST
IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, VOL. 12, NO. 4, AUGUST 2008 439 AbYSS: Adapting Scatter Search to Multiobjective Optimization Antonio J. Nebro, Francisco Luna, Student Member, IEEE, Enrique
More informationApproximation Model Guided Selection for Evolutionary Multiobjective Optimization
Approximation Model Guided Selection for Evolutionary Multiobjective Optimization Aimin Zhou 1, Qingfu Zhang 2, and Guixu Zhang 1 1 Each China Normal University, Shanghai, China 2 University of Essex,
More informationAdaptive Reference Vector Generation for Inverse Model Based Evolutionary Multiobjective Optimization with Degenerate and Disconnected Pareto Fronts
Adaptive Reference Vector Generation for Inverse Model Based Evolutionary Multiobjective Optimization with Degenerate and Disconnected Pareto Fronts Ran Cheng, Yaochu Jin,3, and Kaname Narukawa 2 Department
More informationDerivative-Free Optimization: Lifting Single-Objective to Multi-Objective Algorithm
Derivative-Free Optimization: Lifting Single-Objective to Multi-Objective Algorithm Cyrille Dejemeppe, Pierre Schaus, and Yves Deville ICTEAM, Université Catholique de Louvain (UCLouvain), Belgium, {cyrille.dejemeppe,
More informationMultiobjective Optimization Using Adaptive Pareto Archived Evolution Strategy
Multiobjective Optimization Using Adaptive Pareto Archived Evolution Strategy Mihai Oltean Babeş-Bolyai University Department of Computer Science Kogalniceanu 1, Cluj-Napoca, 3400, Romania moltean@cs.ubbcluj.ro
More informationIEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, VOL., NO., MONTH YEAR 1
IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, VOL., NO., MONTH YEAR 1 An Efficient Approach to Non-dominated Sorting for Evolutionary Multi-objective Optimization Xingyi Zhang, Ye Tian, Ran Cheng, and
More informationIndicator-Based Selection in Multiobjective Search
Indicator-Based Selection in Multiobjective Search Eckart Zitzler and Simon Künzli Swiss Federal Institute of Technology Zurich Computer Engineering and Networks Laboratory (TIK) Gloriastrasse 35, CH 8092
More informationMULTI-OBJECTIVE GENETIC LOCAL SEARCH ALGORITHM FOR SUPPLY CHAIN SIMULATION OPTIMISATION
MULTI-OBJECTIVE GENETIC LOCAL SEARCH ALGORITHM FOR SUPPLY CHAIN SIMULATION OPTIMISATION Galina Merkuryeva (a), Liana Napalkova (b) (a) (b) Department of Modelling and Simulation, Riga Technical University,
More informationIncorporating Decision-Maker Preferences into the PADDS Multi- Objective Optimization Algorithm for the Design of Water Distribution Systems
Incorporating Decision-Maker Preferences into the PADDS Multi- Objective Optimization Algorithm for the Design of Water Distribution Systems Bryan A. Tolson 1, Mohammadamin Jahanpour 2 1,2 Department of
More informationCHAPTER 6 ORTHOGONAL PARTICLE SWARM OPTIMIZATION
131 CHAPTER 6 ORTHOGONAL PARTICLE SWARM OPTIMIZATION 6.1 INTRODUCTION The Orthogonal arrays are helpful in guiding the heuristic algorithms to obtain a good solution when applied to NP-hard problems. This
More informationminimizing minimizing
The Pareto Envelope-based Selection Algorithm for Multiobjective Optimization David W. Corne, Joshua D. Knowles, Martin J. Oates School of Computer Science, Cybernetics and Electronic Engineering University
More informationCritical Comparison of Multi-objective Optimization Methods: Genetic Algorithms versus Swarm Intelligence
RADIOENGINEERING, VOL. 9, NO., SEPTEMBER 9 Critical Comparison of Multi-objective Optimization Methods: Genetic Algorithms versus Swarm Intelligence Vladimír ŠEDĚNKA, Zbyněk RAIDA Dept. of Radio Electronics,
More informationA Clustering Multi-objective Evolutionary Algorithm Based on Orthogonal and Uniform Design
A Clustering Multi-objective Evolutionary Algorithm Based on Orthogonal and Uniform Design Yuping Wang, Chuangyin Dang, Hecheng Li, Lixia Han and Jingxuan Wei Abstract Designing efficient algorithms for
More informationDiscovering and Navigating a Collection of Process Models using Multiple Quality Dimensions
Discovering and Navigating a Collection of Process Models using Multiple Quality Dimensions J.C.A.M. Buijs, B.F. van Dongen, and W.M.P. van der Aalst Eindhoven University of Technology, The Netherlands
More informationVisualization of Pareto-Sets in Evolutionary Multi-Objective Optimization
Visualization of Pareto-Sets in Evolutionary Multi-Objective Optimization Mario Köppen, Kaori Yoshida Kyushu Institute of Technology Dept. Artificial Intelligence 680-04 Kawazu, Iizuka, Fukuoka 820-8502,
More informationFuzzy-Pareto-Dominance and its Application in Evolutionary Multi-Objective Optimization
Fuzzy-Pareto-Dominance and its Application in Evolutionary Multi-Objective Optimization Mario Köppen, Raul Vicente-Garcia, and Bertram Nickolay Fraunhofer IPK, Pascalstr. 8-9, 10587 Berlin, Germany {mario.koeppen
More informationAn External Archive Guided Multiobjective Evolutionary Approach Based on Decomposition for Continuous Optimization
IEEE Congress on Evolutionary Computation (CEC) July -,, Beijing, China An External Archive Guided Multiobjective Evolutionary Approach Based on Decomposition for Continuous Optimization Yexing Li School
More informationOn Asynchronous Non-Dominated Sorting for Steady-State Multiobjective Evolutionary Algorithms
On Asynchronous Non-Dominated Sorting for Steady-State Multiobjective Evolutionary Algorithms arxiv:1804.05208v1 [cs.ds] 14 Apr 2018 Ilya Yakupov July 5, 2018 Abstract Maxim Buzdalov In parallel and distributed
More informationEvolving Human Competitive Research Spectra-Based Note Fault Localisation Techniques
UCL DEPARTMENT OF COMPUTER SCIENCE Research Note RN/12/03 Evolving Human Competitive Research Spectra-Based Note Fault Localisation Techniques RN/17/07 Deep Parameter Optimisation for Face Detection Using
More informationArtificial Bee Colony (ABC) Optimization Algorithm for Solving Constrained Optimization Problems
Artificial Bee Colony (ABC) Optimization Algorithm for Solving Constrained Optimization Problems Dervis Karaboga and Bahriye Basturk Erciyes University, Engineering Faculty, The Department of Computer
More informationVery Fast Non-Dominated Sorting
Decision Making in Manufacturing and Services Vol. 8 2014 No. 1 2 pp. 13 23 Very Fast Non-Dominated Sorting Czesław Smutnicki, Jarosław Rudy, Dominik Żelazny Abstract. A new and very efficient parallel
More informationA gradient-based multiobjective optimization technique using an adaptive weighting method
10 th World Congress on Structural and Multidisciplinary Optimization May 19-24, 2013, Orlando, Florida, USA A gradient-based multiobjective optimization technique using an adaptive weighting method Kazuhiro
More informationCommunication Strategies in Distributed Evolutionary Algorithms for Multi-objective Optimization
CONTI 2006 The 7 th INTERNATIONAL CONFERENCE ON TECHNICAL INFORMATICS, 8-9 June 2006, TIMISOARA, ROMANIA Communication Strategies in Distributed Evolutionary Algorithms for Multi-objective Optimization
More informationEFFECTIVE CONCURRENT ENGINEERING WITH THE USAGE OF GENETIC ALGORITHMS FOR SOFTWARE DEVELOPMENT
EFFECTIVE CONCURRENT ENGINEERING WITH THE USAGE OF GENETIC ALGORITHMS FOR SOFTWARE DEVELOPMENT D.Sundar 1, Dr.K.Alagarsamy 2 1 Assistant Professor, Thiagarajar School of Management, Madurai, India 2 Associate
More informationImpact of Voltage Levels Number for Energy-aware Bi-objective DAG Scheduling for Multi-processors Systems
Impact of Voltage Levels Number for Energy-aware Bi-objective DAG Scheduling for Multi-processors Systems Mateusz Guzek 1, Cesar O. Diaz 2, Johnatan E. Pecero 2, Pascal Bouvry 2, Albert Y. Zomaya 3 1 Interdisciplinary
More informationHandling Multi Objectives of with Multi Objective Dynamic Particle Swarm Optimization
Handling Multi Objectives of with Multi Objective Dynamic Particle Swarm Optimization Richa Agnihotri #1, Dr. Shikha Agrawal #1, Dr. Rajeev Pandey #1 # Department of Computer Science Engineering, UIT,
More informationA Parallel Multi-Objective Cooperative Coevolutionary Algorithm for Optimising Small-World Properties in VANETs
A Parallel Multi-Objective Cooperative Coevolutionary Algorithm for Optimising Small-World Properties in VANETs Grégoire Danoy, Julien Schleich, Pascal Bouvry Computer Science and Communications Research
More informationA novel Ranking-based Optimal Guides Selection Strategy in MOPSO
Available online at www.sciencedirect.com ScienceDirect Procedia Computer Science 9 ( ) Information Technology and Quantitative Management (ITQM ) A novel Ranking-based Optimal Guides Selection Strategy
More informationA Multi-Tier Adaptive Grid Algorithm for the Evolutionary Multi-Objective Optimisation of Complex Problems
Soft Computing manuscript No. (will be inserted by the editor) A Multi-Tier Adaptive Grid Algorithm for the Evolutionary Multi-Objective Optimisation of Complex Problems Shahin Rostami Alex Shenfield Received:
More informationAsoftware development process typically consists of four
IEEE TRANSACTIONS ON RELIABILITY, VOL 59, NO 3, SEPTEMBER 2010 563 Multi-Objective Approaches to Optimal Testing Resource Allocation in Modular Software Systems Zai Wang, Student Member, IEEE, Ke Tang,
More informationMetaheuristics for the Bi-objective Ring Star Problem
Metaheuristics for the Bi-objective Ring Star Problem Arnaud Liefooghe 1, Laetitia Jourdan 1, Matthieu Basseur 2, El-Ghazali Talbi 1, and Edmund K. Burke 2 1 INRIA-Futurs, LIFL, CNRS Université des Sciences
More informationMOCell: A Cellular Genetic Algorithm for Multiobjective Optimization
MOCell: A Cellular Genetic Algorithm for Multiobjective Optimization Antonio J. Nebro, Juan J. Durillo, Francisco Luna, Bernabé Dorronsoro, Enrique Alba Departamento de Lenguajes y Ciencias de la Computación,
More informationEliteNSGA-III: An Improved Evolutionary Many- Objective Optimization Algorithm
EliteNSGA-III: An Improved Evolutionary Many- Objective Optimization Algorithm Amin Ibrahim, IEEE Member Faculty of Electrical, Computer, and Software Engineering University of Ontario Institute of Technology
More informationDevelopment of Evolutionary Multi-Objective Optimization
A. Mießen Page 1 of 13 Development of Evolutionary Multi-Objective Optimization Andreas Mießen RWTH Aachen University AVT - Aachener Verfahrenstechnik Process Systems Engineering Turmstrasse 46 D - 52056
More informationDeconstructing Multi-objective Evolutionary Algorithms: An Iterative Analysis on the Permutation Flow-Shop Problem
Deconstructing Multi-objective Evolutionary Algorithms: An Iterative Analysis on the Permutation Flow-Shop Problem Leonardo C. T. Bezerra, Manuel López-Ibáñez, and Thomas Stützle IRIDIA, Université Libre
More informationGPU-Based Parallel Multi-objective Particle Swarm Optimization
International Journal of Artificial Intelligence, ISSN 974-635; Int. J. Artif. Intell. Autumn (October) 2, Volume 7, Number A Copyright 2 by IJAI (CESER Publications) GPU-Based Parallel Multi-objective
More informationIndicator-Based Multi-Objective Local Search
ndicator-based Multi-Objective Local Search M. Basseur and E. K. Burke Abstract This paper presents a simple and generic indicator-based multi-objective local search. This algorithm is a direct extension
More informationLate Parallelization and Feedback Approaches for Distributed Computation of Evolutionary Multiobjective Optimization Algorithms
Late arallelization and Feedback Approaches for Distributed Computation of Evolutionary Multiobjective Optimization Algorithms O. Tolga Altinoz Department of Electrical and Electronics Engineering Ankara
More informationUnsupervised Feature Selection Using Multi-Objective Genetic Algorithms for Handwritten Word Recognition
Unsupervised Feature Selection Using Multi-Objective Genetic Algorithms for Handwritten Word Recognition M. Morita,2, R. Sabourin 3, F. Bortolozzi 3 and C. Y. Suen 2 École de Technologie Supérieure, Montreal,
More informationEfficient Non-domination Level Update Approach for Steady-State Evolutionary Multiobjective Optimization
Efficient Non-domination Level Update Approach for Steady-State Evolutionary Multiobjective Optimization Ke Li 1, Kalyanmoy Deb 1, Qingfu Zhang 2, and Sam Kwong 2 1 Department of Electrical and Computer
More informationA Multi-Objective Approach for QoS-aware Service Composition
A Multi-Objective Approach for QoS-aware Service Composition Marcel Cremene, Mihai Suciu, Florin-Claudiu Pop, Denis Pallez and D. Dumitrescu Technical University of Cluj-Napoca, Romania Babes-Bolyai University
More informationNCGA : Neighborhood Cultivation Genetic Algorithm for Multi-Objective Optimization Problems
: Neighborhood Cultivation Genetic Algorithm for Multi-Objective Optimization Problems Shinya Watanabe Graduate School of Engineering, Doshisha University 1-3 Tatara Miyakodani,Kyo-tanabe, Kyoto, 10-031,
More information