Recombination of Similar Parents in EMO Algorithms

Similar documents
A Similarity-Based Mating Scheme for Evolutionary Multiobjective Optimization

Lamarckian Repair and Darwinian Repair in EMO Algorithms for Multiobjective 0/1 Knapsack Problems

Comparison of Evolutionary Multiobjective Optimization with Reference Solution-Based Single-Objective Approach

Incorporation of Scalarizing Fitness Functions into Evolutionary Multiobjective Optimization Algorithms

Multiobjective Formulations of Fuzzy Rule-Based Classification System Design

Multi-objective Optimization

NCGA : Neighborhood Cultivation Genetic Algorithm for Multi-Objective Optimization Problems

Evolutionary Algorithms: Lecture 4. Department of Cybernetics, CTU Prague.

Improved S-CDAS using Crossover Controlling the Number of Crossed Genes for Many-objective Optimization

Effects of Three-Objective Genetic Rule Selection on the Generalization Ability of Fuzzy Rule-Based Systems

Finding Sets of Non-Dominated Solutions with High Spread and Well-Balanced Distribution using Generalized Strength Pareto Evolutionary Algorithm

GECCO 2007 Tutorial / Evolutionary Multiobjective Optimization. Eckart Zitzler, ETH Zürich.

An Evolutionary Multi-Objective Crowding Algorithm (EMOCA): Benchmark Test Function Results

R2-IBEA: R2 Indicator Based Evolutionary Algorithm for Multiobjective Optimization

SPEA2+: Improving the Performance of the Strength Pareto Evolutionary Algorithm 2

Fuzzy Data Mining by Heuristic Rule Extraction and Multiobjective Genetic Rule Selection

Indicator-Based Selection in Multiobjective Search

SPEA2: Improving the strength pareto evolutionary algorithm

Finding a preferred diverse set of Pareto-optimal solutions for a limited number of function calls

Using ɛ-dominance for Hidden and Degenerated Pareto-Fronts

A Distance Metric for Evolutionary Many-Objective Optimization Algorithms Using User-Preferences

Communication Strategies in Distributed Evolutionary Algorithms for Multi-objective Optimization

Parallel Multi-objective Optimization using Master-Slave Model on Heterogeneous Resources

Evolutionary multi-objective algorithm design issues

Multiobjective Association Rule Mining

Double Archive Pareto Local Search

Finding Knees in Multi-objective Optimization

Investigating the Effect of Parallelism in Decomposition Based Evolutionary Many-Objective Optimization Algorithms

Mechanical Component Design for Multiple Objectives Using Elitist Non-Dominated Sorting GA

Multi-Objective Optimization using Evolutionary Algorithms

Multiobjective Prototype Optimization with Evolved Improvement Steps

An Empirical Comparison of Several Recent Multi-Objective Evolutionary Algorithms

Multi-objective Optimization Algorithm based on Magnetotactic Bacterium

Approximation-Guided Evolutionary Multi-Objective Optimization

Solving Multi-objective Optimisation Problems Using the Potential Pareto Regions Evolutionary Algorithm

An Evolutionary Algorithm for the Multi-objective Shortest Path Problem

Multi-objective pattern and feature selection by a genetic algorithm

Improved Pruning of Non-Dominated Solutions Based on Crowding Distance for Bi-Objective Optimization Problems

DCMOGADES: Distributed Cooperation model of Multi-Objective Genetic Algorithm with Distributed Scheme

A Predictive Pareto Dominance Based Algorithm for Many-Objective Problems

Optimizing Delivery Time in Multi-Objective Vehicle Routing Problems with Time Windows

Late Parallelization and Feedback Approaches for Distributed Computation of Evolutionary Multiobjective Optimization Algorithms

Graphical User Interface For Multi-Objective Decision Support

Multiobjective hboa, Clustering, and Scalability. Martin Pelikan Kumara Sastry David E. Goldberg. IlliGAL Report No February 2005

Exploration of Pareto Frontier Using a Fuzzy Controlled Hybrid Line Search

Design of Curves and Surfaces Using Multi-Objective Optimization

Performance Assessment of DMOEA-DD with CEC 2009 MOEA Competition Test Instances

Comparing Algorithms, Representations and Operators for the Multi-Objective Knapsack Problem

Parallelizing Multi-Objective Evolutionary Algorithms: Cone Separation

GT HEURISTIC FOR SOLVING MULTI OBJECTIVE JOB SHOP SCHEDULING PROBLEMS

Multi-Objective Evolutionary Algorithms

A Model-Based Evolutionary Algorithm for Bi-objective Optimization

Efficient Hybrid Multi-Objective Evolutionary Algorithm

DEMO: Differential Evolution for Multiobjective Optimization

Reference Point-Based Particle Swarm Optimization Using a Steady-State Approach

Development of Evolutionary Multi-Objective Optimization

On The Effects of Archiving, Elitism, And Density Based Selection in Evolutionary Multi-Objective Optimization

Performance Evaluation of Vector Evaluated Gravitational Search Algorithm II

Using Different Many-Objective Techniques in Particle Swarm Optimization for Many Objective Problems: An Empirical Study

Generalized Multiobjective Evolutionary Algorithm Guided by Descent Directions

Decomposition of Multi-Objective Evolutionary Algorithm based on Estimation of Distribution

Deconstructing Multi-objective Evolutionary Algorithms: An Iterative Analysis on the Permutation Flow-Shop Problem

Improving interpretability in approximative fuzzy models via multi-objective evolutionary algorithms.

Combining Convergence and Diversity in Evolutionary Multi-Objective Optimization

An External Archive Guided Multiobjective Evolutionary Approach Based on Decomposition for Continuous Optimization

Solving a hybrid flowshop scheduling problem with a decomposition technique and a fuzzy logic based method

Multi-Objective Pipe Smoothing Genetic Algorithm For Water Distribution Network Design

Effects of Discrete Design-variable Precision on Real-Coded Genetic Algorithm

Multi-objective Ranking based Non-Dominant Module Clustering

Bayesian Optimization Algorithms for Multi-Objective Optimization

Preferences in Evolutionary Multi-Objective Optimisation with Noisy Fitness Functions: Hardware in the Loop Study

Balancing Survival of Feasible and Infeasible Solutions in Evolutionary Optimization Algorithms

A gradient-based multiobjective optimization technique using an adaptive weighting method

An Evolutionary Algorithm with Advanced Goal and Priority Specification for Multi-objective Optimization

Evolutionary Multitasking for Multiobjective Continuous Optimization: Benchmark Problems, Performance Metrics and Baseline Results

Comparison of Multiobjective Evolutionary Algorithms: Empirical Results

Metaheuristics for the Bi-objective Ring Star Problem

ROBUST MULTI-OBJECTIVE OPTIMIZATION OF WATER DISTRIBUTION NETWORKS

CHAPTER 30 USING DIVERSITY TO GUIDE THE SEARCH IN MULTI-OBJECTIVE OPTIMIZATION

Fuzzy-Pareto-Dominance and its Application in Evolutionary Multi-Objective Optimization

Dynamic Uniform Scaling for Multiobjective Genetic Algorithms

Improved Crowding Distance for NSGA-II

A Multi-Objective Approach for QoS-aware Service Composition

Multiobjective Optimization Using Adaptive Pareto Archived Evolution Strategy

The Multi-Objective Genetic Algorithm Based Techniques for Intrusion Detection

Visualization of Pareto-Sets in Evolutionary Multi-Objective Optimization

A Parallel Implementation of Multiobjective Particle Swarm Optimization Algorithm Based on Decomposition

A MULTIOBJECTIVE GENETIC ALGORITHM APPLIED TO MULTIVARIABLE CONTROL OPTIMIZATION

Transcription:

H. Ishibuchi and K. Narukawa, Recombination of similar parents in EMO algorithms, Lecture Notes in Computer Science 3410: Evolutionary Multi-Criterion Optimization, pp. 265-279, Springer, Berlin, March 2005. (Proc. of 3rd International Conference on Evolutionary Multi-Criterion Optimization, pp. 265-279, Guanajuato, Mexico, March 9-11, 2005)

Recombination of Similar Parents in EMO Algorithms

Hisao Ishibuchi and Kaname Narukawa
Department of Industrial Engineering, Osaka Prefecture University
1-1 Gakuen-cho, Sakai, Osaka, 599-8531, Japan
{hisaoi, kaname}@ie.osakafu-u.ac.jp

Abstract. This paper examines the effect of crossover operations on the performance of EMO algorithms through computational experiments on knapsack problems and flowshop scheduling problems using the NSGA-II algorithm. We focus on the relation between the performance of the algorithm and the similarity of recombined parent solutions. First we show the necessity of crossover operations through computational experiments with various specifications of crossover and mutation probabilities. Next we examine the relation between the performance of the algorithm and the similarity of recombined parent solutions. It is shown that the quality of obtained solution sets is improved by recombining similar parents. Then we examine the effect of increasing the selection pressure (i.e., increasing the tournament size) on the similarity of recombined parent solutions. An interesting observation is that the increase in the tournament size leads to the recombination of dissimilar parents, improves the diversity of solutions, and degrades the convergence performance of the algorithm.

1 Introduction

Since Schaffer's study [21], various evolutionary multiobjective optimization (EMO) algorithms have been proposed to find well-distributed Pareto-optimal or near Pareto-optimal solutions of multiobjective optimization problems (Coello et al. [2] and Deb [4]). Recent EMO algorithms usually share some common ideas such as elitism, fitness sharing and Pareto ranking.
While mating restriction has often been discussed in the literature, it has not been used in many EMO algorithms, as pointed out in some reviews on EMO algorithms [7, 23, 26]. In this paper, we examine the effect of recombining similar parents on the performance of EMO algorithms to find well-distributed Pareto-optimal or near Pareto-optimal solutions. Mating restriction was suggested by Goldberg [8] for single-objective genetic algorithms. Hajela & Lin [9] and Fonseca & Fleming [6] used it in their EMO algorithms. The basic idea of mating restriction is to ban the recombination of dissimilar parents from which good offspring are not likely to be generated. In the implementation of mating restriction, a user-definable parameter σ_mating called the mating radius is usually used for banning the recombination of two parents whose distance is larger than σ_mating. The distance between two parents is measured in the decision space or the objective space. The necessity of mating restriction in EMO

algorithms was also stressed by Jaszkiewicz [17] and Kim et al. [18]. In the parallelization of EMO algorithms, mating restriction is implicitly realized since similar individuals are likely to be assigned to the same processor or the same island (e.g., see Branke et al. [1]). On the other hand, Zitzler & Thiele [25] reported that no improvement was achieved by mating restriction in their computational experiments. Van Veldhuizen & Lamont [23] mentioned that the empirical evidence presented in the literature could be interpreted as an argument either for or against the use of mating restriction. Moreover, there was also an argument for the selection of dissimilar parents. Horn et al. [10] argued that information from very different types of tradeoffs could be combined to yield other kinds of good tradeoffs. Schaffer [21] examined the selection of dissimilar parents but observed no improvement. A similarity-based mating scheme was proposed in Ishibuchi & Shibata [13] to examine positive and negative effects of mating restriction on the performance of EMO algorithms. In their mating scheme, one parent (say Parent A) was chosen by the standard fitness-based binary tournament scheme while its mate (say Parent B) was chosen among a pre-specified number of candidates (say β candidates) based on their similarity or dissimilarity to Parent A. To find the β candidates, the standard fitness-based binary tournament selection was iterated β times. Almost the same idea was independently proposed in Huang [11], where Parent B was chosen from two candidates (i.e., the value of β was fixed as β = 2). Ishibuchi & Shibata [14] extended their similarity-based mating scheme as shown in Fig. 1. That is, first a pre-specified number of candidates (say α candidates) were selected by iterating the standard fitness-based binary tournament selection α times. Next the average vector of those candidates was calculated in the objective space. The candidate most dissimilar to the average vector was chosen as Parent A.
On the other hand, the most similar one to Parent A among β candidates was chosen as Parent B. Furthermore, it was demonstrated in [15] that the diversity-convergence balance can be dynamically adjusted by controlling the values of the two parameters α and β.

[Figure 1 illustrates the mating scheme: Parent A is chosen as the most extreme of α binary-tournament winners, and Parent B as the most similar, to Parent A, of β binary-tournament winners, after which the two parents are recombined by crossover.]
Fig. 1. Mating scheme in Ishibuchi & Shibata [14].

In this paper, we examine the effect of crossover operations on the performance of EMO algorithms through computational experiments on knapsack problems and flowshop scheduling problems using the NSGA-II algorithm of Deb et al. [5]. We focus on the relation between the performance of the algorithm and the similarity of recombined parents. First we show the necessity of crossover operations through computational experiments with various specifications of crossover and mutation probabilities. It is shown that crossover operations play an important role in the NSGA-II algorithm, while its performance is not very sensitive to the crossover probability compared with the mutation probability. Next we examine the relation between the performance of the algorithm and the similarity of recombined parents. We use the similarity-based mating scheme in Fig. 1 to choose dissimilar parents as well as similar parents. That is, the most similar or dissimilar solution to Parent A among β candidates is chosen as Parent B in our computational experiments, where the value of α is fixed as α = 1 (i.e., Parent A is selected by the standard fitness-based binary tournament selection). The similarity is measured in the decision space and the objective space. It is shown that the performance of the algorithm can be improved by recombining similar parents. Then we examine the effect of increasing the selection pressure (i.e., increasing the tournament size) on the similarity of recombined parents. An interesting observation is that the increase in the tournament size leads to the recombination of dissimilar parents, improves the diversity of solutions, and degrades the convergence performance of the algorithm on some test problems.

2 Test Problems

Multiobjective 0/1 knapsack problems with k knapsacks (i.e., k objectives and k constraints) and n items in Zitzler & Thiele [26] can be written as follows:

  Maximize f(x) = (f1(x), f2(x), ..., fk(x)),   (1)

  subject to Σ(j=1,...,n) wij xj ≤ ci, i = 1, 2, ..., k,   (2)

  where fi(x) = Σ(j=1,...,n) pij xj, i = 1, 2, ..., k.   (3)
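As an illustration, the evaluation of a candidate solution under Eqs. (1)-(3) can be sketched as follows. This is a minimal sketch, not the authors' implementation; the function name and the tiny 2-knapsack, 4-item profit/weight/capacity data are hypothetical toy values, not the Zitzler & Thiele instances.

```python
# Sketch of evaluating a 0/1 knapsack solution under Eqs. (1)-(3).
def evaluate(x, profits, weights, capacities):
    """Return the objective vector f(x) if x satisfies constraint (2), else None."""
    k, n = len(capacities), len(x)
    # Constraint (2): total weight in each knapsack must not exceed its capacity.
    for i in range(k):
        if sum(weights[i][j] * x[j] for j in range(n)) > capacities[i]:
            return None
    # Objectives (1) and (3): total profit per knapsack, to be maximized.
    return [sum(profits[i][j] * x[j] for j in range(n)) for i in range(k)]

# Hypothetical toy instance with k = 2 knapsacks and n = 4 items.
profits    = [[5, 3, 8, 2], [4, 6, 1, 7]]
weights    = [[2, 3, 4, 1], [3, 2, 2, 4]]
capacities = [7, 8]
print(evaluate([1, 0, 1, 0], profits, weights, capacities))  # [13, 5]
```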
In this formulation, x is an n-dimensional binary vector (i.e., (x1, x2, ..., xn) ∈ {0, 1}^n), pij is the profit of item j according to knapsack i, wij is the weight of item j according to knapsack i, and ci is the capacity of knapsack i. Each solution x is handled as a binary string of length n in EMO algorithms. The k-objective n-item knapsack problem is referred to as a k-n knapsack problem in this paper. Zitzler & Thiele [26] examined the performance of several EMO algorithms using nine test problems with two, three, and four objectives and 250, 500, and 750 items. In this paper, we use the three 500-item test problems (i.e., the 2-500, 3-500, and 4-500 knapsack problems), while we can only report a part of our experimental results due to the

page limitation. We also generate two-objective and three-objective 20-machine 80-job flowshop scheduling problems in the same manner as Ishibuchi et al. [12, 16]. These test problems are denoted as the 2-20-80 and 3-20-80 scheduling problems, respectively. The makespan and the maximum tardiness are considered as the two objectives to be minimized in the 2-20-80 scheduling problem. In addition to these two objectives, the minimization of the total flowtime is considered in the 3-20-80 scheduling problem. Each solution of these two test problems is represented as a permutation of 80 jobs.

In the case of the knapsack problems, the distance between two solutions (i.e., two binary strings) is measured by the Hamming distance in the decision space. On the other hand, the distance between two solutions (i.e., two permutations of 80 jobs) is calculated as the sum of the distances between the positions of each job. The calculation of the distance between two permutation-type strings is illustrated in Fig. 2. The distance between the positions of Job 1 (denoted by J1 in Fig. 2) is 4 since it is placed in the first position of String 1 and the fifth position of String 2. The distance between the positions of the other jobs is calculated in the same manner (i.e., 1 for Job 2, 0 for Job 3 and Job 4, and 3 for Job 5). Thus the distance between the two strings in Fig. 2 is calculated as 8 (i.e., 4 + 1 + 0 + 0 + 3).

  String 1: J1 J2 J3 J4 J5
  String 2: J2 J5 J3 J4 J1
Fig. 2. Distance between two strings for five-job flowshop scheduling problems.

The distance between two solutions in the objective space is calculated by the Euclidean distance in both the knapsack problems and the scheduling problems. That is, the distance between two solutions x and y is calculated in the objective space as

  ||f(x) - f(y)|| = sqrt( (f1(x) - f1(y))^2 + ... + (fk(x) - fk(y))^2 ),   (4)

where f(x) = (f1(x), ..., fk(x)) is the k-dimensional objective vector corresponding to the solution x.
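The two distance definitions above, the position-based distance for permutations and the Euclidean distance of Eq. (4), can be sketched as follows. This is a minimal sketch with function names of our own choosing; the five-job strings reproduce the example of Fig. 2.

```python
import math

def permutation_distance(s1, s2):
    """Decision-space distance: sum over jobs of |position in s1 - position in s2|."""
    return sum(abs(s1.index(job) - s2.index(job)) for job in s1)

def objective_distance(fx, fy):
    """Objective-space distance: Euclidean distance of Eq. (4)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(fx, fy)))

# The example of Fig. 2: 4 (J1) + 1 (J2) + 0 (J3) + 0 (J4) + 3 (J5) = 8.
s1 = ["J1", "J2", "J3", "J4", "J5"]
s2 = ["J2", "J5", "J3", "J4", "J1"]
print(permutation_distance(s1, s2))  # 8
```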
3 Performance Measures

Various performance measures have been proposed in the literature to evaluate a non-dominated solution set. As explained in Knowles & Corne [19], Okabe et al. [20], and Zitzler et al. [27], no single performance measure can simultaneously evaluate various aspects of a non-dominated solution set (e.g., convergence and diversity). Moreover, some performance measures are not designed to simultaneously compare many solution sets but to compare only two solution sets with each other. For various

performance measures, see [2, 4, 19, 20, 27]. In this paper, we use the following performance measures to simultaneously compare a number of solution sets:

1. Generational distance (GD)
2. D1R measure (D1R)
3. Spread measure (Spread)
4. Hypervolume measure (Hypervolume)
5. Ratio of non-dominated solutions (Ratio)

Let S and S* be a set of non-dominated solutions and the set of all Pareto-optimal solutions, respectively. The generational distance [22] is the average distance from each solution in S to its nearest Pareto-optimal solution in S*. The distance is measured in the objective space using the Euclidean distance. On the other hand, the D1R measure is the average distance from each Pareto-optimal solution in S* to its nearest solution in S. This measure was used in Czyzak & Jaszkiewicz [3]. The generational distance and the D1R measure need all Pareto-optimal solutions of each test problem. Since the true Pareto-optimal solutions are not available for each test problem except for the 2-500 knapsack problem, we use as S* a set of near Pareto-optimal solutions obtained for each test problem using much longer CPU time and much larger memory storage than the computational experiments reported in this paper. The spread measure is calculated for the solution set S as follows:

  Spread = Σ(i=1,...,k) [ max{fi(x): x ∈ S} - min{fi(x): x ∈ S} ].   (5)

This measure is similar to the maximum spread of Zitzler [24]. The hypervolume measure [25] calculates the volume of the region dominated by the solution set S in the objective space. The ratio of non-dominated solutions is calculated for a solution set with respect to other solution sets. We used this measure in our former studies to compare multiple non-dominated rule sets [12, 16]. This measure is similar to the coverage measure of Zitzler & Thiele [25], which was proposed to compare two non-dominated rule sets. Let us assume that we have m solution sets S1, S2, ..., Sm. By merging these solution sets, we construct another solution set S as S = S1 ∪ S2 ∪ ... ∪ Sm.
Let S_ND be the set of non-dominated solutions in S. The ratio of non-dominated solutions is calculated for each solution set Si as |Si ∩ S_ND| / |Si|, where |Si| denotes the cardinality of Si (i.e., the number of solutions in Si).

4 Conditions of Computational Experiments

Our computational experiments on the knapsack problems are performed using the NSGA-II algorithm under the following parameter specifications:

Crossover probability (one-point crossover): 0.8,
Mutation probability (bit-flip mutation): 1/500 (per bit),

Population size: 200 (2-500 problem), 250 (3-500 problem), 300 (4-500 problem),
Stopping condition: 500 generations.

The average value of each performance measure is calculated over 50 runs from different initial populations for each knapsack problem. On the other hand, we use the following parameter specifications for the scheduling problems:

Crossover probability (two-point order crossover): 0.8,
Mutation probability (shift mutation): 0.5 (per string),
Population size: 200,
Stopping condition: 500 generations.

The average value of each performance measure is calculated over 20 runs with different initial populations for each scheduling problem. It should be noted that the above-mentioned parameter values are the basic settings in our computational experiments. Various specifications of the crossover and mutation probabilities are examined in the next section.

5 Effects of Crossover Operations

For examining the necessity of crossover operations in the NSGA-II algorithm, we apply it to each knapsack problem using the following specifications of the crossover and mutation probabilities:

Crossover probability (PC): 0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0,
Mutation probability (PM): 0.0001, 0.0002, 0.0005, 0.001, 0.002, ..., 0.1 (per bit).

In the application of the NSGA-II algorithm to each scheduling problem, the following parameter specifications are examined:

Crossover probability (PC): 0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0,
Mutation probability (PM): 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0 (per string).

Experimental results on the 3-500 knapsack problem are shown in Fig. 3. In these experimental results, we cannot observe the necessity of the crossover operation. That is, good results are obtained even in the case of no crossover (i.e., PC = 0.0). The performance of the NSGA-II algorithm in Fig. 3 is sensitive to the mutation probability (PM) and insensitive to the crossover probability (PC). While the crossover operation seems to be unnecessary in Fig. 3, its necessity is clearly shown by the average ratio of non-dominated solutions in Fig.
4 (a), where the 10 × 11 combinations of the crossover and mutation probabilities are compared. That is, the 110 solution sets obtained from these combinations are compared to calculate the ratio of non-dominated solutions. From Fig. 4 (a), we can see that all solutions obtained from the NSGA-II algorithm with no crossover (i.e., PC = 0.0) are dominated by other solutions obtained from that with crossover (i.e., PC > 0.0). The same observation is obtained from computational experiments on the 3-20-80 scheduling problem in Fig.

4 (b). The performance of the NSGA-II algorithm for the 3-20-80 scheduling problem is not sensitive to the crossover probability, as shown in Fig. 5. Similar results are obtained for all the five test problems in this paper.

[Figure 3 plots the generational distance, the D1R measure, the spread measure, and the hypervolume over the examined combinations of crossover and mutation probabilities.]
Fig. 3. Experimental results on the 3-500 knapsack problem.

[Figure 4 plots the average ratio of non-dominated solutions over the examined combinations of crossover and mutation probabilities: (a) 3-500 knapsack problem, (b) 3-20-80 scheduling problem.]
Fig. 4. Average ratio of non-dominated solutions.
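The ratio-of-non-dominated-solutions measure compared in Fig. 4 can be sketched as follows. This is a minimal sketch under our own naming, assuming maximization (as in the knapsack problems) and representing each solution by its objective vector.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (maximization)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def ratio_of_nondominated(solution_sets):
    """For each set S_i, return |S_i ∩ S_ND| / |S_i|, where S_ND is the set of
    non-dominated members of the merged set S = S_1 ∪ ... ∪ S_m."""
    merged = [f for s in solution_sets for f in s]
    nd = [f for f in merged if not any(dominates(g, f) for g in merged)]
    return [sum(1 for f in s if f in nd) / len(s) for s in solution_sets]

# Toy bi-objective example: every member of the second set is dominated.
print(ratio_of_nondominated([[(5, 1), (1, 5)], [(1, 1), (2, 0)]]))  # [1.0, 0.0]
```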

[Figure 5 plots the generational distance, the D1R measure, the spread measure, and the hypervolume over the examined combinations of crossover and mutation probabilities.]
Fig. 5. Experimental results on the 3-20-80 scheduling problem.

From careful observations of Fig. 3 and Fig. 5, we can see that the use of crossover operations improved the convergence of solutions to the Pareto front and degraded the diversity of solutions. Such effects of crossover operations are visually demonstrated in Fig. 6 for the 2-500 knapsack problem. Fig. 6 shows the non-dominated solutions at several generations of a single run of the NSGA-II algorithm with crossover (Fig. 6 (a)) and without crossover (Fig. 6 (b)). The mutation probability was specified as 1/500 (i.e., its basic setting) in Fig. 6.

[Figure 6 plots the total profit for knapsack 1 against the total profit for knapsack 2 at several generations of a single run, together with the Pareto front: (a) with crossover, (b) without crossover.]
Fig. 6. Non-dominated solutions at each generation for the 2-500 knapsack problem.
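The variation operators used in the knapsack experiments above, one-point crossover and bit-flip mutation on binary strings, can be sketched as follows. This is a minimal sketch with our own function names; the fixed random seed is only for reproducibility of the illustration.

```python
import random

def one_point_crossover(p1, p2, rng):
    """Cut both parents at one random point and exchange the tails."""
    cut = rng.randrange(1, len(p1))
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

def bit_flip_mutation(x, pm, rng):
    """Flip each bit independently with probability pm (e.g., 1/n per bit)."""
    return [1 - b if rng.random() < pm else b for b in x]

rng = random.Random(0)
c1, c2 = one_point_crossover([0, 0, 0, 0], [1, 1, 1, 1], rng)
print(c1, c2)  # two complementary children of the all-0 and all-1 parents
```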

6 Effects of Recombination of Similar Parents

We examine the effect of recombining similar parents using the mating scheme in Fig. 1. When a pair of parents is to be chosen, one parent (say Parent A) is selected by the standard fitness-based binary tournament selection in the same manner as the NSGA-II algorithm. That is, the value of α in Fig. 1 is fixed as α = 1 in order to focus on the effect of recombining similar parents. Next we iterate the standard fitness-based binary tournament selection β times to find β candidates for the selection of the other parent (say Parent B). The candidate most similar to Parent A is chosen as Parent B (i.e., the mate of Parent A). For comparison, we also examine the choice of the parent most dissimilar to Parent A. This mating scheme is exactly the same as the standard fitness-based binary tournament selection in the NSGA-II algorithm when the values of α and β are specified as α = 1 and β = 1. A large value of β means a strong bias toward the choice of similar (or dissimilar) parents.

We apply the NSGA-II algorithm with the mating scheme in Fig. 1 to each test problem using various values of β. Experimental results on the 3-500 knapsack problem are summarized in Fig. 7. We can see from Fig. 7 that the performance of the NSGA-II algorithm is improved by recombining similar parents. We can also see that similar results are obtained independent of the choice between the objective space and the decision space where the similarity of solutions is measured. The average distance between recombined parents is shown in Fig. 8. The improvement in the performance of the NSGA-II algorithm is also observed in computational experiments on the 3-20-80 scheduling problem in Fig. 9, where we show average results over 50 runs (Fig. 4 (b) and Fig. 5 were average results over 20 runs on the 3-20-80 scheduling problem).
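The mating scheme with α = 1, as examined in this section, can be sketched as follows. This is a minimal sketch under our own naming: the fitness comparison is abstracted into a user-supplied `tournament` function (a stand-in for NSGA-II's rank-and-crowding binary tournament), and `distance` may be the Hamming distance or the Euclidean distance of Eq. (4); the toy scalar population is purely illustrative.

```python
import random

def select_parents(population, tournament, distance, beta, rng):
    """Parent A: one binary-tournament winner (alpha = 1).
    Parent B: the most similar, to Parent A, of beta binary-tournament winners.
    For dissimilar mating, replace min by max."""
    parent_a = tournament(population, rng)
    candidates = [tournament(population, rng) for _ in range(beta)]
    parent_b = min(candidates, key=lambda c: distance(parent_a, c))
    return parent_a, parent_b

# Toy usage: scalar "solutions" where a higher value wins the binary tournament.
def tournament(pop, rng):
    a, b = rng.choice(pop), rng.choice(pop)
    return max(a, b)

rng = random.Random(1)
pop = [1, 2, 3, 4, 5]
pa, pb = select_parents(pop, tournament, lambda x, y: abs(x - y), beta=5, rng=rng)
print(pa, pb)
```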
[Figure 7 plots the generational distance (GD), the D1R measure, the spread measure, and the hypervolume (HV), with β varied from 1 to 10 for both the similar and dissimilar mating schemes.]
Fig. 7. Effects of recombining similar parents for the 3-500 knapsack problem.

[Figure 8 plots the average Euclidean distance in the objective space (a) and the average Hamming distance in the decision space (b), with β varied from 1 to 10 for both the similar and dissimilar mating schemes.]
Fig. 8. Average distance between recombined parents for the 3-500 knapsack problem.

[Figure 9 plots the generational distance (GD), the D1R measure, the spread measure, and the hypervolume (HV), with β varied from 1 to 10 for both the similar and dissimilar mating schemes.]
Fig. 9. Effects of recombining similar parents for the 3-20-80 scheduling problem.

We visually show the effect of choosing similar parents on the performance of the NSGA-II algorithm for the 2-500 knapsack problem in Fig. 10 (a), where β was specified as β = 5. From the comparison between Fig. 10 (a) and Fig. 6 (a) by the original NSGA-II algorithm, we can see that the recombination of similar parents improved the performance of the algorithm (especially with respect to the diversity of solutions). Such improvement was observed in Fig. 7 and Fig. 9. The effect of our similarity-based mating scheme in Fig. 1 becomes much clearer if we choose extreme and similar parents using the two parameters α and β. We visually show the effect of choosing extreme and similar parents in Fig. 10 (b), where the values of the two parameters α and β were specified as α = 5 and β = 5. We can clearly see that the diversity of solutions was improved in Fig. 10 (b) from Fig. 6 (a).

[Figure 10 plots the total profit for knapsack 1 against the total profit for knapsack 2 at several generations, together with the Pareto front: (a) with α = 1 and β = 5, (b) with α = 5 and β = 5.]
Fig. 10. Effects of our mating scheme on the performance of the NSGA-II algorithm.

7 Effects of High Selection Pressure

We examine the performance of the NSGA-II algorithm on each test problem using various specifications of the tournament size. Experimental results on the 3-500 knapsack problem are shown in Fig. 11. Each closed circle in Fig. 11 shows the average result over 50 runs for each specification of the tournament size. The standard deviation is also shown in Fig. 11 as the radius of each interval (i.e., the distance between the closed circle and each edge of the interval). It should be noted in Fig. 11 that the tournament size 1 means random selection from the current population. Even in this case, the NSGA-II algorithm still has a somewhat strong selection pressure since the best individuals are chosen from the parent population and the offspring population in the generation update phase. We can see from Fig. 11 that the increase in the tournament size improves the diversity of solutions (i.e., the spread and hypervolume measures) and degrades the convergence of solutions to the Pareto front (i.e., the generational distance and the D1R measure). This observation may be counterintuitive for some readers because the strong selection pressure does not lead to the improvement of the convergence but to the improvement of the diversity in Fig. 11. During the computational experiments in Fig. 11, we also measure the distance between recombined parents. The average distance between recombined parents is shown in Fig. 12.
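The tournament selection with variable size examined in this section can be sketched as follows. This is a minimal sketch with our own names, using a scalar fitness as a stand-in for NSGA-II's rank-and-crowding comparison; size 1 reduces to random selection, and larger sizes raise the selection pressure.

```python
import random

def tournament_select(population, fitness, size, rng):
    """Pick `size` individuals uniformly at random and return the fittest.
    size = 1 is random selection; larger sizes increase selection pressure."""
    contestants = [rng.choice(population) for _ in range(size)]
    return max(contestants, key=fitness)

rng = random.Random(42)
pop = list(range(10))
# Larger tournaments favor high-fitness individuals more strongly.
winners_t2 = [tournament_select(pop, lambda x: x, 2, rng) for _ in range(1000)]
winners_t8 = [tournament_select(pop, lambda x: x, 8, rng) for _ in range(1000)]
print(sum(winners_t2) / 1000, sum(winners_t8) / 1000)
```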
From this figure, we can see that the increase in the tournament size leads to the recombination of dissimilar parents. Similar results are also obtained for the 3-20-80 scheduling problem in Fig. 13 and Fig. 14. Average results over 50 runs are shown in these figures. In Fig. 15, we visually show the effect of increasing the tournament size on the performance of the NSGA-II algorithm for the 2-500 knapsack problem. From the comparison of Fig. 15 with Fig. 6 (a) by the binary tournament selection, we can see that the increase in the tournament size (i.e., the use of a higher selection pressure) improved the diversity of solutions while it degraded the convergence of solutions to the Pareto front. Such observations were also obtained from Fig. 11 and Fig. 14.

[Figure 11 plots the generational distance (GD), the D1R measure, the spread measure, and the hypervolume (HV) for tournament sizes 1, 2, 4, 6, 8, 10, 12, 14, 16, 18, 20, 30, 40, and 50.]
Fig. 11. Effects of the tournament size for the 3-500 knapsack problem.

[Figure 12 plots the average Euclidean distance in the objective space (a) and the average Hamming distance in the decision space (b) over the examined tournament sizes.]
Fig. 12. Average distance between recombined parents for the 3-500 knapsack problem.

[Figure 13 plots the average Euclidean distance in the objective space (a) and the average distance between permutation strings in the decision space (b) over the examined tournament sizes.]
Fig. 13. Average distance between recombined parents for the 3-20-80 scheduling problem.

[Figure: four panels plotting the generational distance (GD), the D1R measure, the spread, and the hypervolume (HV) against the tournament size.]
Fig. 14. Effects of the tournament size for the 3-20-80 scheduling problem.
[Figure: total profit (knapsack 1) vs. total profit (knapsack 2) relative to the Pareto front at several generations. (a) The tournament size is 1. (b) The tournament size is 50.]
Fig. 15. Effects of the tournament size on the performance of the algorithm.

8 Concluding Remarks

Through computational experiments on multiobjective 0/1 knapsack problems and multiobjective flowshop scheduling problems, we examined the effect of crossover operations on the performance of the algorithm. First we showed that crossover operations played an important role, while mutation operations seemed to have a larger effect on the performance of the algorithm. Next we empirically demonstrated that the recombination of similar parents improved the

performance of the algorithm. Then we examined the relation between the size of the tournament selection and the similarity of recombined parents. Experimental results showed that the use of a higher selection pressure (i.e., an increase in the tournament size) decreased the similarity of recombined solutions, improved the diversity of solutions, and degraded the convergence of solutions to the Pareto front. These effects of a higher selection pressure should be further examined since they are somewhat counterintuitive. That is, one may think that the use of a higher selection pressure would lead to an improvement in the convergence of solutions to the Pareto front and a deterioration in the diversity of solutions.

The authors would like to thank the Japan Society for the Promotion of Science (JSPS) for financial support through a Grant-in-Aid for Scientific Research (B): KAKENHI (1438194).

References

1. Branke, J., Schmeck, H., Deb, K., and Reddy, S. M.: Parallelizing Multi-Objective Evolutionary Algorithms: Cone Separation. Proc. of 2004 Congress on Evolutionary Computation (2004) 1952-1957.
2. Coello Coello, C. A., Van Veldhuizen, D. A., and Lamont, G. B.: Evolutionary Algorithms for Solving Multi-Objective Problems. Kluwer Academic Publishers, Boston (2002).
3. Czyzak, P. and Jaszkiewicz, A.: Pareto Simulated Annealing - A Metaheuristic Technique for Multi-Objective Combinatorial Optimization. Journal of Multi-Criteria Decision Analysis 7 (1998) 34-47.
4. Deb, K.: Multi-Objective Optimization Using Evolutionary Algorithms. John Wiley & Sons, Chichester (2001).
5. Deb, K., Pratap, A., Agarwal, S., and Meyarivan, T.: A Fast and Elitist Multiobjective Genetic Algorithm: NSGA-II. IEEE Trans. on Evolutionary Computation 6 (2002) 182-197.
6. Fonseca, C. M. and Fleming, P. J.: Genetic Algorithms for Multiobjective Optimization: Formulation, Discussion and Generalization. Proc. of 5th International Conference on Genetic Algorithms (1993) 416-423.
7. Fonseca, C. M. and Fleming, P. J.: An Overview of Evolutionary Algorithms in Multiobjective Optimization. Evolutionary Computation 3 (1995) 1-16.
8. Goldberg, D. E.: Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley, Reading (1989).
9. Hajela, P. and Lin, C. Y.: Genetic Search Strategies in Multicriterion Optimal Design. Structural Optimization 4 (1992) 99-107.
10. Horn, J., Nafpliotis, N., and Goldberg, D. E.: A Niched Pareto Genetic Algorithm for Multi-Objective Optimization. Proc. of 1st IEEE International Conference on Evolutionary Computation (1994) 82-87.
11. Huang, C. F.: Using an Immune System Model to Explore Mate Selection in Genetic Algorithms. Lecture Notes in Computer Science, Vol. 2723 (Proc. of GECCO 2003), Springer, Berlin (2003) 141-152.
12. Ishibuchi, H. and Murata, T.: A Multi-Objective Genetic Local Search Algorithm and Its Application to Flowshop Scheduling. IEEE Trans. on Systems, Man, and Cybernetics - Part C: Applications and Reviews 28 (1998) 392-403.
13. Ishibuchi, H. and Shibata, Y.: An Empirical Study on the Effect of Mating Restriction on the Search Ability of EMO Algorithms. Lecture Notes in Computer Science, Vol. 2632 (Proc. of EMO 2003), Springer, Berlin (2003) 433-447.
14. Ishibuchi, H. and Shibata, Y.: A Similarity-Based Mating Scheme for Evolutionary Multiobjective Optimization. Lecture Notes in Computer Science, Vol. 2723 (Proc. of GECCO 2003), Springer, Berlin (2003) 165-176.
15. Ishibuchi, H. and Shibata, Y.: Mating Scheme for Controlling the Diversity-Convergence Balance for Multiobjective Optimization. Lecture Notes in Computer Science, Vol. 3102 (Proc. of GECCO 2004), Springer, Berlin (2004) 1259-1271.
16. Ishibuchi, H., Yoshida, T., and Murata, T.: Balance between Genetic Search and Local Search in Memetic Algorithms for Multiobjective Permutation Flowshop Scheduling. IEEE Trans. on Evolutionary Computation 7 (2003) 204-223.
17. Jaszkiewicz, A.: On the Performance of Multiple-Objective Genetic Local Search on the 0/1 Knapsack Problem - A Comparative Experiment. IEEE Trans. on Evolutionary Computation 6 (2002) 402-412.
18. Kim, M., Hiroyasu, T., Miki, M., and Watanabe, S.: SPEA2+: Improving the Performance of the Strength Pareto Evolutionary Algorithm 2. Lecture Notes in Computer Science, Vol. 3242 (Proc. of PPSN VIII), Springer, Berlin (2004) 742-751.
19. Knowles, J. D. and Corne, D. W.: On Metrics for Comparing Non-dominated Sets. Proc. of 2002 Congress on Evolutionary Computation (2002) 711-716.
20. Okabe, T., Jin, Y., and Sendhoff, B.: A Critical Survey of Performance Indices for Multi-Objective Optimization. Proc. of 2003 Congress on Evolutionary Computation (2003) 878-885.
21. Schaffer, J. D.: Multiple Objective Optimization with Vector Evaluated Genetic Algorithms. Proc. of 1st International Conference on Genetic Algorithms and Their Applications (1985) 93-100.
22. Van Veldhuizen, D. A.: Multiobjective Evolutionary Algorithms: Classifications, Analyses, and New Innovations. Ph.D. dissertation, Air Force Institute of Technology (1999).
23. Van Veldhuizen, D. A. and Lamont, G. B.: Multiobjective Evolutionary Algorithms: Analyzing the State-of-the-Art. Evolutionary Computation 8 (2000) 125-147.
24. Zitzler, E.: Evolutionary Algorithms for Multiobjective Optimization: Methods and Applications. Ph.D. dissertation, Shaker Verlag, Aachen (1999).
25. Zitzler, E. and Thiele, L.: Multiobjective Optimization Using Evolutionary Algorithms - A Comparative Case Study. Proc. of 5th International Conference on Parallel Problem Solving from Nature (1998) 292-301.
26. Zitzler, E. and Thiele, L.: Multiobjective Evolutionary Algorithms: A Comparative Case Study and the Strength Pareto Approach. IEEE Trans. on Evolutionary Computation 3 (1999) 257-271.
27. Zitzler, E., Thiele, L., Laumanns, M., Fonseca, C. M., and da Fonseca, V. G.: Performance Assessment of Multiobjective Optimizers: An Analysis and Review. IEEE Trans. on Evolutionary Computation 7 (2003) 117-132.