An Angle based Constrained Many-objective Evolutionary Algorithm
APPLIED INTELLIGENCE manuscript No. (will be inserted by the editor)

An Angle based Constrained Many-objective Evolutionary Algorithm

Yi Xiang · Jing Peng · Yuren Zhou · Zefeng Chen · Miqing Li

Received: date / Accepted: date

Abstract Having successfully handled many-objective optimization problems with box constraints only by using VaEA, a vector-angle-based many-objective evolutionary algorithm from our precursor study, this paper extends VaEA to solve generic constrained many-objective optimization problems. The proposed algorithm (denoted CVaEA) differs from the original mainly in the mating selection and the environmental selection, which are adapted to the presence of infeasible solutions. Furthermore, we suggest a set of new constrained many-objective test problems in which the objectives have different ranges of function values. Compared with normalized problems, this set of scaled problems is better suited to testing an algorithm's performance, because practical problems are usually far from normalized. The proposed CVaEA was compared with two recent constrained many-objective optimization methods on the proposed test problems with up to fifteen objectives, and on a constrained engineering problem from practice. The simulation results show that CVaEA finds a set of well-converged and properly distributed solutions and, compared with its competitors, obtains a better balance between convergence and diversity. This, together with the original VaEA paper, demonstrates the usefulness and efficiency of vector-angle-based algorithms for handling both constrained and unconstrained many-objective optimization problems.

Corresponding author: Y. Zhou

This paper is supported by the National Natural Science Foundation of China (Grant nos and ), and the Scientific Research Special Plan of Guangzhou Science and Technology Programme (Grant no.
674)

Yi Xiang, Jing Peng, Yuren Zhou and Zefeng Chen
School of Data and Computer Science & Collaborative Innovation Center of High Performance Computing, Sun Yat-sen University, Guangzhou 6, P. R. China. gzhuxiang yi@63.com (Y. Xiang), zhouyuren@mail.sysu.edu.cn (Y. Zhou)

Miqing Li
Centre of Excellence for Research in Computational Intelligence and Applications (CERCIA), School of Computer Science, University of Birmingham, Birmingham B TT, U. K.
Keywords Many-objective optimization; constraint handling; evolutionary algorithms; VaEA

1 Introduction

Recently, many-objective optimization problems (MaOPs) have attracted much attention from the evolutionary computation community. Such problems are multi-objective optimization problems (MOPs) with more than three objectives. MaOPs cause great difficulties for traditional multi-objective evolutionary algorithms (MOEAs) that work quite well for problems with two or three objectives. The performance degradation of Pareto-based MOEAs, such as NSGAII [8], SPEA [37] and others [, 7], is mainly due to the insufficient selection pressure toward the Pareto front as the number of objectives increases []. MaOPs are widely seen in water distribution system design [], automotive engine calibration [], airfoil design [9], and elsewhere [8, ]. Therefore, it is worth making great efforts to solve them efficiently.

Generally, MaOPs can be classified into two groups: constrained and unconstrained (with box constraints only) problems. In this paper, we consider the following constrained many-objective optimization problem with J inequality and K equality constraints:

    Minimize F(x) = (f_1(x), f_2(x), ..., f_M(x))^T,
    s.t. g_j(x) ≥ 0, j = 1, 2, ..., J,
         h_k(x) = 0, k = 1, 2, ..., K,
         x ∈ Ω,                                            (1)

where M is the number of objectives (M ≥ 4), and x = (x_1, x_2, ..., x_n)^T is the decision vector, with n the number of decision variables. In (1), Ω = ∏_{i=1}^{n} [x_i^(L), x_i^(U)] ⊆ R^n is called the decision space, where x_i^(L) and x_i^(U) are the lower and upper bounds of the decision variable x_i, respectively. If we omit the inequality and equality constraints in MaOP (1), then we get the unconstrained (box-constrained) problem, which can be stated as:

    Minimize F(x) = (f_1(x), f_2(x), ..., f_M(x))^T,
    s.t. x ∈ Ω.                                            (2)
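For concreteness, formulation (1) can be sketched in code. The following minimal Python class (all names are ours, not from the paper) evaluates a candidate solution and checks feasibility, using the convention above that an inequality constraint is satisfied when g_j(x) ≥ 0:

```python
class ConstrainedProblem:
    """Minimal sketch of the constrained MaOP of Eq. (1); illustrative only."""

    def __init__(self, objectives, inequalities, equalities, lower, upper):
        self.objectives = objectives      # list of f_i(x), all to be minimized
        self.inequalities = inequalities  # list of g_j(x); satisfied when >= 0
        self.equalities = equalities      # list of h_k(x); satisfied when == 0
        self.lower, self.upper = lower, upper  # box bounds defining Omega

    def evaluate(self, x):
        # F(x) = (f_1(x), ..., f_M(x))
        return [f(x) for f in self.objectives]

    def is_feasible(self, x, eps=1e-9):
        in_box = all(l <= xi <= u for xi, l, u in zip(x, self.lower, self.upper))
        ineq_ok = all(g(x) >= 0 for g in self.inequalities)
        eq_ok = all(abs(h(x)) <= eps for h in self.equalities)
        return in_box and ineq_ok and eq_ok
```

With M = 2 objectives and one inequality constraint, `is_feasible` distinguishes the two classes of solutions that the mating and environmental selections described later treat differently.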
For MaOP (2), some specific many-objective evolutionary algorithms (MaOEAs) that address these challenges to some extent have been proposed in recent years. Yang et al. [3] proposed a grid-based evolutionary algorithm (GrEA) for many-objective optimization. In GrEA, grid dominance and grid difference are used to increase the selection pressure toward the optimal front. In addition, three grid-based criteria, i.e., grid ranking, grid crowding distance and grid coordinate point distance, are incorporated to help maintain an extensive and uniform distribution among solutions. Li et al. [3] suggested a shift-based density estimation (SDE) strategy to measure the density of the population so as
to make Pareto-based algorithms suitable for MaOPs. In SDE, the density of a solution's neighborhood is measured by shifting the positions of other solutions according to their convergence relative to the current solution [8]. Thus, SDE covers both the distribution and convergence information of individuals. SDE was applied to three popular Pareto-based algorithms (i.e., NSGA-II, SPEA and PESA-II), and the experimental results demonstrated its usefulness in handling many-objective problems. Praditwong and Yao [4] and Wang et al. [7] proposed the two-archive algorithm (Two Arch) and its improved version. In these algorithms, two archives are used, focusing on convergence and diversity, respectively. In the improved Two Arch, different selection principles (i.e., indicator-based and Pareto-based) are assigned to the two archives, and a new L_p-norm-based (p < 1) diversity maintenance scheme is designed. The experimental results showed that it performed well on a set of MaOPs. Recently, Li et al. [] proposed Bi-Goal Evolution (BiGE) to optimize problems with many objectives. In BiGE, a given MaOP is converted into a bi-goal (objective) optimization problem regarding proximity and diversity. Then, in this bi-goal domain, the Pareto dominance relation can be applied to handle the new problem well. BiGE has been found to be very competitive against other state-of-the-art algorithms, and it suggests a completely new way of addressing many-objective problems.

With the help of a set of supplied reference points or weight vectors, reference-set-based and decomposition-based evolutionary algorithms have also been suggested for MaOPs. Deb and Jain [6] proposed a reference-point-based many-objective evolutionary algorithm (called NSGAIII) following the NSGAII framework.
In NSGA-III, more emphasis is put on non-dominated population members, and the diversity within the population is maintained by using a number of well-distributed reference points. NSGAIII is able to produce satisfactory results on a set of test problems with three to fifteen objectives. Li et al. [] suggested a unified paradigm named MOEA/DD that combines dominance- and decomposition-based approaches for dealing with MaOPs. In MOEA/DD, a set of weight vectors is used to specify subregions in the objective space, aiming at maintaining a good distribution among solutions. In addition, an efficient procedure [6] is employed to update the non-domination level structure of the population after an offspring solution is introduced. The empirical results demonstrate the effectiveness of MOEA/DD in finding well-converged and well-distributed solutions for the DTLZ and WFG test suites. Other evolutionary algorithms concentrating on MaOP (2) can be found in [,, 4, 6, 33, 3].

From the literature review, it seems that attention has largely been restricted to unconstrained MaOPs, and there is not enough literature on dealing with constraints in an MaOP. Representative methods for the constrained MaOP (1) can be reviewed as follows. Jan and Zhang [4] proposed a constrained MOEA/D-DE by modifying the replacement and update schemes of the original approach [9]. In the constrained algorithm, a penalty function that requires two control parameters is adopted to penalize infeasible solutions. In addition, four other parameters are required to make the approach applicable. Having realized the difficulty of this parameter configuration, Jain and Deb proposed the C-MOEA/D algorithm by making two major modifications to the original MOEA/D-DE approach [3]: (1) instead of replacing a member based only on a performance metric (PBI or Tchebycheff), the constraint violation (CV), if any, of a child solution y and its randomly picked neighbor x is checked, and adequate emphasis is put on feasible and small-CV solutions; (2) when creating the offspring population, the DE operator is replaced by the SBX and polynomial mutation operators. The experimental results show that C-MOEA/D is a competitive alternative for MaOPs with constraints. Also in [3], the authors proposed another constrained MaOEA based on the NSGAIII framework, namely CNSGAIII. The differences between CNSGAIII and the original unconstrained algorithm can be summarized as follows. First, in the presence of constraints, the constraint-dominance principle adopted in NSGAII [8] is used to divide the population into different non-dominated levels. Second, instead of randomly selecting pairs of members to create offspring, CNSGAIII uses a modified tournament selection operation for choosing parents, which emphasizes feasible solutions over infeasible ones, and smaller-CV solutions over larger-CV ones. Li et al. [] introduced a constraint-handling method into MOEA/DD and proposed CMOEA/DD to solve constrained MaOPs. In this algorithm, the survival of an infeasible solution depends on both its CV and niching scenarios: infeasible solutions associated with an isolated subregion are given a second chance to survive. In addition, a binary tournament selection procedure similar to that in CNSGAIII is used to choose mating parents.

In the precursor study [3], we suggested a many-objective evolutionary algorithm based on vector angles (denoted VaEA). Its general framework is the same as in NSGAII or NSGAIII; however, VaEA uses the maximum-vector-angle-first principle to keep a good distribution among solutions.
Besides, another principle, named worse-elimination, is adopted to conditionally replace solutions that are worse in terms of convergence. Based on these two principles, VaEA achieves a good balance between convergence and diversity. In this paper, we extend VaEA to solve constrained MaOPs by making the following main modifications to the original VaEA: (1) a modified tournament selection operation, as in CNSGAIII, is used to choose mating parents; (2) a portion of infeasible solutions is added into the new population before the inclusion of feasible ones. Infeasible solutions with smaller constraint violations are preferred, and this step aims at fully utilizing the information provided by infeasible solutions. In particular, when infeasible solutions are located in an isolated region, including them helps promote diversity; (3) in the presence of infeasible solutions, the modified worse-elimination principle emphasizes feasible solutions over infeasible ones, and solutions with a smaller fitness value (i.e., the sum of all normalized objectives) over those with larger values. We denote the new algorithm by CVaEA hereafter.

The main contributions of this work can be summarized as follows.

- An efficient and effective many-objective EA for constrained optimization problems is suggested. The proposed CVaEA inherits some good properties from the original VaEA; for example, it is free from a set of supplied reference points or weight vectors, and has time complexity max{O(N log^(M-2) N), O(MN^2)} [3], where M is the number of objectives and N is the population size. The O(N log^(M-2) N) term is the time for the fast non-dominated sorting [], while O(MN^2) is the time required for the association and niching operations [3].

- A set of constrained scaled test problems is proposed. In this new test suite, each problem has a different range of values for each objective. These problems are therefore better suited to testing an algorithm's performance, because practical problems are far from normalized (i.e., they rarely have an identical range for each objective). The proposed test problems can be scaled to any number of objectives, and so can the number of decision variables.

The rest of this paper is organized as follows. We first describe our proposed CVaEA in detail in Section 2. Then, in Section 3, we verify the performance of CVaEA through an experimental study, including descriptions of the scaled test problems and a performance comparison against CNSGAIII and CMOEA/DD. Next, CVaEA is applied to a practical problem in Section 4. Finally, Section 5 concludes the paper.

2 Proposed VaEA with Constraint-Handling Approach

This section first gives a brief review of VaEA, followed by a description of the general framework of CVaEA. Finally, we present details of the modifications to the mating and environmental selections in the presence of constraints.

2.1 A Brief Review of VaEA

Before describing the procedure for handling constrained optimization problems, we first give a brief review of the recently proposed VaEA [3]. VaEA uses the same general framework as the NSGAII [8] and NSGAIII [6] procedures. By applying mating selection and genetic (crossover and mutation) operators, the current population P is used to create an offspring population Q. The union population S = P ∪ Q is then adaptively normalized. Using the non-dominated sorting procedure, the solutions in S are divided into different layers F_1, F_2, .... All members in layer 1 to layer l are first included in a temporary set S_t.
If |S_t| = N (N is the population size), then the next generation starts with P = S_t. If |S_t| > N, then all members up to the (l-1)-th layer are included in the next generation, i.e., P = ∪_{i=1}^{l-1} F_i, and the remaining K = N - |P| solutions are selected one by one from the layer F_l (called the critical layer). We start by defining the angle from a member x_j ∈ F_l to the set P as the vector angle between x_j and its target solution, i.e., the member of P to which x_j has the minimum vector angle. Then, the maximum-vector-angle-first principle is used to select candidates from F_l one by one to fill the population P. Specifically, priority is given to the member of F_l that has the maximum vector angle to P. After a member is added, the target solutions of the remaining solutions in F_l may need to be updated. According to the above procedure, VaEA selects solutions dynamically and is expected
to keep a well-distributed population. Another principle, named worse-elimination, allows solutions that are worse in terms of convergence to be conditionally replaced by other individuals, so as to keep a balance between convergence and diversity. In VaEA, the convergence of a solution is measured by the sum of all normalized objectives. When a solution in P is replaced by a member of F_l, the target solutions of the remaining members are also updated if necessary. The above procedure repeats until the population is full. The worst-case time complexity of VaEA is max{O(N log^(M-2) N), O(MN^2)}, which is equivalent to that of NSGAIII. However, VaEA has the following good properties: (1) it is free from a set of supplied reference points or weight vectors; (2) it introduces no additional algorithmic parameters; (3) it was found to be efficient and effective in solving problems with a large number of objectives.

2.2 General Framework of CVaEA

The pseudo code of the proposed constrained approach is shown in Algorithm 1. CVaEA shares a common framework employed by many evolutionary algorithms. First, a population of N solutions is randomly initialized in the decision space Ω. Then, promising solutions are selected into the mating pool P' according to the fitness value of each individual. Next, a set of offspring solutions Q is generated by applying crossover and mutation operations to P'. Finally, by applying the environmental selection, N solutions in the union of P and Q survive into the next generation. These steps repeat until the number of generations G reaches its maximum value G_max.
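The generational loop just described can be sketched as follows; the four operator functions are trivial stand-ins of our own (random sampling, uniform crossover, truncation), not the actual CVaEA procedures, which are given in Sections 2.3 and 2.4:

```python
import random

# A compact sketch of the CVaEA generational loop (cf. Algorithm 1).
# The operators below are placeholders, not the paper's procedures.

def initialize(n_pop, n_var):
    return [[random.random() for _ in range(n_var)] for _ in range(n_pop)]

def mating_selection(population):
    # placeholder: sample parents uniformly with replacement
    return [random.choice(population) for _ in population]

def variation(parents):
    # placeholder: uniform crossover between consecutive parents
    offspring = []
    for a, b in zip(parents, parents[1:] + parents[:1]):
        offspring.append([x if random.random() < 0.5 else y for x, y in zip(a, b)])
    return offspring

def environmental_selection(union, n_pop):
    # placeholder: truncate; CVaEA instead uses CV values and vector angles
    return union[:n_pop]

def cvaea_loop(n_pop=10, n_var=3, g_max=5, seed=0):
    random.seed(seed)
    population = initialize(n_pop, n_var)
    for _ in range(g_max):
        parents = mating_selection(population)
        offspring = variation(parents)
        population = environmental_selection(population + offspring, n_pop)
    return population
```

Only the two selection steps differ between VaEA and CVaEA; the loop structure itself is unchanged.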
Algorithm 1 Framework of the proposed CVaEA
1: Initialization(P)
2: G = 0
3: while G < G_max do
4:   P' = Mating_selection(P)
5:   Q = Variation(P')
6:   S = P ∪ Q
7:   P = Environmental_selection(S)
8:   G++
9: end while

The main differences between CVaEA and the original VaEA lie in Mating_selection (line 4 in Algorithm 1) and Environmental_selection (line 7). We describe these modifications in the following subsections.

2.3 Modifications in the Mating Selection

In CVaEA, the mating pool P' is constructed as follows (line 4 in Algorithm 1): (1) select two members p_1 and p_2 from the current population P at random; (2) apply the modified binary tournament selection operation [3] to p_1 and p_2, and select the better
one; (3) repeat the above steps until N parents are selected. Note that some better solutions may be selected more than once. Before applying the binary tournament selection in step (2), we use the CV value introduced in [3] to assess the quality of infeasible solutions. To calculate the CV value of an infeasible solution x [denoted CV(x)], we first normalize all constraints by the method suggested in [3]. The normalized inequality and equality constraints are denoted by ḡ_j(x) and h̄_k(x), respectively. Then CV(x) is given by

    CV(x) = Σ_{j=1}^{J} ⟨ḡ_j(x)⟩ + Σ_{k=1}^{K} |h̄_k(x)|,    (3)

where the bracket operator ⟨β⟩ returns the negative of β if β < 0, and returns 0 otherwise [3]. The smaller the CV value, the better the solution. In step (2), the conditions for choosing a solution between p_1 and p_2 by binary tournament selection are as follows:

- If p_1 is feasible and p_2 is infeasible, select p_1;
- If p_2 is feasible and p_1 is infeasible, select p_2;
- If both p_1 and p_2 are infeasible and p_1 has the smaller CV, select p_1;
- If both p_1 and p_2 are infeasible and p_2 has the smaller CV, select p_2;
- If both p_1 and p_2 are feasible, select p_1 or p_2 at random.

After the mating pool is formed, the normal crossover and mutation operators are used to generate the offspring population Q (line 5 in Algorithm 1). Next, the population for the next generation is created by applying the environmental selection to the union of P and Q (lines 6-7 in Algorithm 1). Details of the environmental selection are given in the next section.

2.4 Modifications in the Environmental Selection

In the presence of constraints, the environmental selection procedure differs from that in VaEA. First, we divide the union set S, of size 2N, into two sets: the feasible solutions (set F) and the infeasible solutions (set I).
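The CV computation of Eq. (3) and the tournament rules above can be sketched as follows (the function names and the (solution, CV) pairing are ours; constraint normalization is omitted for brevity):

```python
def cv(ineq, eq):
    """Constraint violation as in Eq. (3): ineq holds normalized g_j(x) values
    (satisfied when >= 0), eq holds normalized h_k(x) values (satisfied when 0)."""
    bracket = lambda b: -b if b < 0 else 0.0   # the <beta> operator
    return sum(bracket(g) for g in ineq) + sum(abs(h) for h in eq)

def tournament(p1, p2, rng):
    """p1, p2 are (solution, cv_value) pairs; cv_value == 0 means feasible."""
    (s1, c1), (s2, c2) = p1, p2
    if c1 == 0 and c2 > 0:           # feasible beats infeasible
        return s1
    if c2 == 0 and c1 > 0:
        return s2
    if c1 > 0 and c2 > 0:            # both infeasible: smaller CV wins
        return s1 if c1 < c2 else s2
    return rng.choice([s1, s2])      # both feasible: random
```

Note that only the two deterministic cases are decided by feasibility and CV; ties between feasible solutions are broken at random, exactly as in the rules listed above.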
If the number of feasible solutions is no larger than N, i.e., |F| ≤ N, then we add all feasible solutions into the new population P, and the remaining members are selected from the set I. This is realized simply by sorting the infeasible solutions in ascending order of their CV values; the first N - |F| solutions in I are then selected to fill the population P. However, if |F| > N, meaning that there are more feasible solutions than required, we first normalize the set S as in the original NSGAIII paper, using only feasible solutions. Then, some infeasible solutions, if any, are first selected and added into the new population P. The number of selected infeasible solutions N_ifs is controlled by a parameter α, i.e., N_ifs = αN. The parameter α takes a small value, whose effect on the algorithm's performance is investigated later in Section 3. If |I| ≤ N_ifs, then we add all infeasible solutions; otherwise, the N_ifs infeasible solutions with the smallest CV values are preferred. Next, the solutions in F are divided
into different layers and the critical layer F_l is identified. Note that when applying the non-dominated sorting procedure, the number of already included infeasible solutions should be taken into account. Finally, N - |P| solutions are selected from the front F_l based on two principles from the original VaEA: the maximum-vector-angle-first principle and the worse-elimination principle. As stated in Section 2.1, the target solution of each member of F_l is defined as the closest solution in P in terms of the vector angle. These target solutions are pre-calculated before applying the two principles, and may change during the selection process. The maximum-vector-angle-first principle is the same as in VaEA: each time, the solution in F_l that has the maximum angle to the population P is chosen and included. However, there are some differences in the worse-elimination principle in the presence of infeasible solutions. The pseudo-code of the modified worse-elimination principle is shown in Algorithm 2. Suppose x_j ∈ F_l has the minimum vector angle to P, and assume that x_j is associated with y_r ∈ P. If the angle between them is smaller than the threshold π/(N+1) [3], then we exchange x_j and y_r if one of the following conditions is true: (1) y_r is infeasible; or (2) both are feasible, but x_j has a smaller fitness value, defined as the sum of the normalized objective values [3] (lines 3-4 in Algorithm 2). After exchanging x_j and y_r (y_r is now in F_l and may be reconsidered in the next generation), two further steps are needed: (1) finding the target solution of y_r (line 5 in Algorithm 2). This is achieved by a normal routine: first, calculate the vector angles between y_r and each member of P; second, find the minimum among these angles; the corresponding solution is the target we are looking for; and (2) updating the target solutions of the remaining members in F_l (line 6 in Algorithm 2).
To this end, the same procedure as in VaEA is employed: we calculate the vector angle between each remaining member of F_l and the newly added x_j; if this angle is smaller than the original one, the target solution is updated accordingly.

Algorithm 2 The modified worse-elimination principle
1: Find x_j ∈ F_l that has the minimum angle to P, and assume that x_j is associated with y_r
2: if the angle between x_j and y_r is smaller than π/(N+1) then
3:   if {feasible(y_r) = FALSE} or {feasible(y_r) = TRUE and Fitness(x_j) < Fitness(y_r)} then
4:     Exchange x_j and y_r
5:     Determine the target solution of y_r
6:     Update target solutions of the remaining members in F_l
7:   end if
8: end if

According to Algorithm 2, some of the added infeasible solutions may be replaced by feasible ones while others may not. For example, infeasible solutions located in an isolated region [] will not be substituted, and preserving them helps the algorithm search around this poorly explored region. Hence, this may be good for diversity promotion [].
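Both selection principles rest on vector angles between objective vectors. The following is a simplified, self-contained sketch of that machinery (our own code; the real algorithm works on adaptively normalized objectives and updates target solutions incrementally rather than recomputing them):

```python
import math

def vector_angle(u, v):
    # angle between two objective vectors (assumed normalized and non-zero)
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return math.acos(max(-1.0, min(1.0, dot / (nu * nv))))

def angle_to_set(x, pop):
    # angle from x to its target solution: the member of pop with minimum angle
    return min(vector_angle(x, y) for y in pop)

def max_angle_first(front, pop, k):
    """Select k members of `front` one by one, always taking the one with the
    maximum vector angle to the (growing) population pop."""
    pop, front = list(pop), list(front)
    for _ in range(k):
        best = max(front, key=lambda x: angle_to_set(x, pop))
        front.remove(best)
        pop.append(best)
    return pop
```

In this sketch, a candidate in a poorly covered direction of the objective space has a large angle to every current member and is therefore picked first, which is the diversity-promoting effect the maximum-vector-angle-first principle aims at.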
3 Experimental Study

This section verifies the performance of CVaEA through an experimental study. We compare our proposed algorithm with CNSGAIII and CMOEA/DD on a set of constrained scaled test problems that originate from the C-DTLZ benchmarks first introduced in [3].

3.1 Test Problems

In [3], three types of constrained problems were developed on the basis of the DTLZ test suite [9]. In Type-1 problems, the original DTLZ problems were modified by adding a constraint which forms an infeasible barrier on the way to the Pareto-optimal front. In this type of problem, the original Pareto-optimal fronts are still optimal; however, an algorithm may find these problems hard to solve because it is not easy to cross the infeasible regions in the objective space. By applying this principle, the authors obtained C1-DTLZ1 and C1-DTLZ3, which share the same objective functions as the original DTLZ1 and DTLZ3 problems. In C1-DTLZ1, a linear constraint is added, due to which feasible solutions appear only in a region of the objective space close to the true front. Meanwhile, a non-linear constraint is added in C1-DTLZ3, which creates a band of infeasible space adjacent to the Pareto-optimal front. Mathematical formulations of the constraints can be found elsewhere [3]. In Type-2 constrained problems, some parts of the Pareto-optimal front are made infeasible by adding a non-linear constraint. This kind of problem tests an algorithm's ability to deal with discontinuous Pareto-optimal fronts. C2-DTLZ2 and the convex C2-DTLZ2 were designed by applying this principle to the original DTLZ2 [9] and convex DTLZ2 problems [6]. Unlike Type-1 and Type-2 problems, which have only one constraint, Type-3 problems involve multiple constraints, and the entire Pareto-optimal front of the unconstrained problem is no longer optimal; instead, the front is made up of parts of the added constraint surfaces.
For this purpose, DTLZ1 and DTLZ4 were modified by adding M different constraints, giving two new problems, C3-DTLZ1 and C3-DTLZ4. For details of all types of constrained problems, please refer to [3].

The objectives of the constrained DTLZ family (C-DTLZ) share the same property: all of them have the same range. For example, in C1-DTLZ3, C2-DTLZ2 and the convex C2-DTLZ2, all the objectives f_i, i = 1, 2, ..., M, lie in the region [0, 1]. Since these problems have an identical range of values for each objective, they are called normalized test problems [6]. However, as stated in [6], practical problems are far from normalized, and the objectives are usually scaled differently. There is therefore a great need to test algorithms on problems with differently scaled objectives. Hence, following the practice in [6], this paper suggests a set of constrained scaled test problems obtained by multiplying objective f_i in the C-DTLZ problems by a factor r^i, where r is the base of the factor. The scaled objectives are denoted by f'_i, i = 1, 2, ..., M. We thus obtain the new constrained test problems C1-SDTLZ1, C1-SDTLZ3, C2-SDTLZ2, C2-SDTLZ2Convex, C3-SDTLZ1 and C3-SDTLZ4. The settings of r for C1-SDTLZ1 and C3-SDTLZ1 are listed on the left side of Table 1, while those for the remaining problems are presented on the right side of that table.

Table 1 The settings of r for the constrained scaled test problems (left: C1-SDTLZ1, C3-SDTLZ1; right: C1-SDTLZ3, C2-SDTLZ2, C2-SDTLZ2Convex, C3-SDTLZ4; columns: M, r).

Since f'_i = f_i · r^i, we have f_i = f'_i / r^i. By substituting this into the expressions for the original constraints of the C-DTLZ problems, the new constraints can be worked out. For example, in C1-DTLZ1 the original constraint is given by [3]

    c(x) = 1 - f_M(x)/0.6 - Σ_{i=1}^{M-1} f_i(x)/0.5 ≥ 0.    (4)

In C1-SDTLZ1, the constraint is then evaluated as

    c'(x) = 1 - f'_M(x)/(0.6 r^M) - Σ_{i=1}^{M-1} f'_i(x)/(0.5 r^i) ≥ 0.    (5)

Constraints in the other test problems can be calculated analogously. Fig. 1 shows the difference between C1-DTLZ1 and C1-SDTLZ1 in a two-objective case. In C1-DTLZ1, the feasible region lies between the lines f_1 + f_2 = 0.5 and f_1/0.5 + f_2/0.6 = 1 [Fig. 1(a)], while that of C1-SDTLZ1 [Fig. 1(b)] is bounded by the lines f'_1/r + f'_2/r^2 = 0.5 and f'_1/(0.5 r) + f'_2/(0.6 r^2) = 1, respectively.

Fig. 1 Two-objective versions of the (a) C1-DTLZ1 and (b) C1-SDTLZ1 problems, showing the feasible regions and Pareto fronts.
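As an illustration of Eq. (5), the following sketch (our own function name, assuming the scaling f'_i = f_i · r^i described above) evaluates the C1-SDTLZ1 constraint for a vector of scaled objective values; the solution is feasible when the returned value is non-negative:

```python
def c1_sdtlz1_constraint(f_scaled, r):
    """Evaluate the C1-SDTLZ1 constraint of Eq. (5) for scaled objectives
    f_scaled = (f'_1, ..., f'_M); feasible when the returned value is >= 0."""
    M = len(f_scaled)
    total = f_scaled[-1] / (0.6 * r**M)                              # f'_M term
    total += sum(f_scaled[i] / (0.5 * r**(i + 1)) for i in range(M - 1))
    return 1.0 - total
```

For r = 10 and M = 2, the unscaled Pareto-optimal point (0.25, 0.25) maps to the scaled point (2.5, 25.0), and the constraint value is unchanged by the scaling, as the substitution f_i = f'_i / r^i implies.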
3.2 Performance Metric

The well-known IGD metric is used to assess the performance of all the algorithms. This metric jointly measures both the convergence and the diversity of the obtained set. Let P be an approximation set and P* be a set of non-dominated points uniformly distributed along the true Pareto front. Then the IGD metric is defined as [3, 34]

    IGD(P) = (1/|P*|) Σ_{z ∈ P*} dist(z, P),    (6)

where dist(z, P) is the Euclidean distance between z and its nearest neighbor in P, and |P*| is the cardinality of P*. If P* is large enough to cover the true Pareto front well, then both the convergence and the diversity of the approximation set P are measured by IGD(P) [3]. For an EMO algorithm, a small IGD value is desirable. Table 2 lists the number of points used for the calculation of IGD for different numbers of objectives on each test problem. These points are generated according to the method in [6]. As the number of objectives increases, more points are needed to cover the true Pareto-optimal fronts as well as possible.

Table 2 The number of points in P* with respect to different numbers of objectives M on each test problem (C1-SDTLZ1, C2-SDTLZ2, C3-SDTLZ1, C1-SDTLZ3, C2-SDTLZ2Convex and C3-SDTLZ4).

3.3 General Experimental Settings

Unless otherwise mentioned, the parameter settings for the experiments are as follows.

Population size: According to [7], the population size in CNSGAIII is set to the smallest multiple of four larger than the number of reference points (H) produced by the two-layer reference point (or weight vector) generation method.
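Returning to the IGD metric of Eq. (6), a minimal sketch of its computation (our own implementation, not tied to any particular library):

```python
import math

def igd(approx, reference):
    """Inverted Generational Distance, Eq. (6): mean Euclidean distance from
    each reference point z in P* to its nearest neighbor in the approximation P."""
    def dist(z, P):
        return min(math.dist(z, p) for p in P)
    return sum(dist(z, approx) for z in reference) / len(reference)
```

Because every reference point contributes a term, an approximation set that converges well but misses part of the front is penalized just as one that covers the front loosely, which is why IGD captures both convergence and diversity.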
For CMOEA/DD, we use the population size N = H, as recommended by its developers []. CVaEA uses the same population size as CNSGAIII. The population sizes N for different numbers of objectives are summarized in Table 3.

Number of independent runs and termination condition: All algorithms are run independently on each test instance and terminated when a predefined maximum number of function evaluations (MFE) is reached. The settings of MFE for different numbers of objectives are listed in Table 4.

Parameter settings for operators: In all algorithms, simulated binary crossover (SBX) and polynomial mutation are used to generate offspring solutions. The crossover probability p_c and mutation probability p_m are set to 1.0 and 1/n, respectively. The distribution index of the SBX operator is η_c = 30, and that of the mutation operator is η_m = 20 [7, ].

Parameter settings for algorithms: Following the practice in [], the penalty-based boundary intersection (PBI) approach is used in CMOEA/DD with penalty parameter θ = 5. The neighborhood size T is set to 20, and the probability of selecting within the neighborhood is δ = 0.9. In CVaEA, α is set to a small value, and the effect of this parameter is investigated later in Section 3.

Table 3 The population size for different numbers of objectives (columns: M, H, CNSGAIII & CVaEA, CMOEA/DD).

Table 4 The settings of MFE for different numbers of objectives. MFE = population size × G_max, where G_max denotes the maximum number of generations used in CVaEA and CNSGAIII, while MFE is used in CMOEAD/D.

3.4 Simulation Results and Analyses

Table 5 gives the median and interquartile range (IQR) results on the three-objective test problems in terms of the IGD metric. The significance of the difference between
CVaEA and the peer algorithms is determined using the well-known Wilcoxon rank-sum test []. As shown, CVaEA performs significantly better than CNSGAIII and CMOEAD/D on both C1-SDTLZ1 and C3-SDTLZ1. For C1-SDTLZ3, all three algorithms obtain similar performance, meaning that no significant differences are detected by the Wilcoxon rank-sum test. CVaEA shows a significant improvement over CMOEAD/D on the C2-SDTLZ2 problem, but its performance deteriorates when compared against CNSGAIII. For the C2-SDTLZ2Convex and C3-SDTLZ4 problems, the proposed algorithm is comparable to CNSGAIII but shows a clear superiority over CMOEAD/D on these two problems.

Table 5 Median and IQR (in brackets) of the IGD metric for the three-objective test problems. The best and second-best results are shown with dark and light gray backgrounds, respectively. (IGD values for CVaEA, CNSGAIII and CMOEAD/D on each problem.) Markers indicate that the peer algorithm is significantly worse (or better) than CVaEA according to the Wilcoxon rank-sum test; the same applies in Tables 6-9.

Table 6 shows the results on the five-objective test problems. CVaEA performs best, presenting a clear advantage over the other two algorithms on the majority of the test problems: it significantly outperforms CNSGAIII and CMOEAD/D on almost all of them. The only exception is the C2-SDTLZ2Convex problem, on which the proposed algorithm is defeated by CNSGAIII.

Table 6 Median and IQR (in brackets) of the IGD metric for the five-objective test problems. The best and second-best results are shown with dark and light gray backgrounds, respectively.
Problem | CVaEA | CNSGAIII | CMOEA/DD
C-SDTLZ | .349E + (.E + ) | .7E + (7.8E + ) | .64E + 4 (4.E + 4)
C-SDTLZ3 | .4E + 3 (4.3E + ) | .6E + 3 (8.4E + ) | 3.E + 3 (7.4E + )
C-SDTLZ | .E + (.E + ) | .77E + (.4E + ) | 3.34E + 3 (.E 3)
C-SDTLZConvex | .67E + (.4E + ) | .849E + (3.7E + ) | .98E + 3 (3.E )
C3-SDTLZ | .73E + (3.6E + ) | 3.3E + (4.7E + ) | .E + 3 (.7E )
C3-SDTLZ4 | .8E + (4.E + ) | 4.73E + (3.E + ) | 4.894E + 3 (6.3E + )

Results on the eight- and ten-objective problems are presented in Tables 7 and 8, respectively. CVaEA significantly outperforms CMOEA/DD on all the test problems for both M = 8 and M = 10. From the Wilcoxon test results, we can see that the proposed algorithm shows an obvious improvement over CNSGAIII on the majority of the test instances, except for the eight- and ten-objective C-SDTLZConvex and the ten-objective C-SDTLZ3 problems. For C-SDTLZ3 with eight objectives, the difference between CVaEA and CNSGAIII is negligible. Table 9 lists the IGD results on the fifteen-objective test problems. It can be found that CVaEA performs relatively worse than CNSGAIII and CMOEA/DD on the C-SDTLZ3 and C3-SDTLZ problems. For the C-SDTLZConvex problem, CVaEA and CNSGAIII achieve similar performance. For all the other pairwise comparisons
between CVaEA and CNSGAIII (or CMOEA/DD), our proposed algorithm presents significantly better results than its competitors.

Table 7 Median and IQR (in brackets) of the IGD metric for the eight-objective test problems. The best and the second best results are shown with dark and light gray backgrounds, respectively.

Problem | CVaEA | CNSGAIII | CMOEA/DD
C-SDTLZ | 3.378E + (.E + ) | .737E + (3.8E + ) | .9E + 4 (.E + 4)
C-SDTLZ3 | 4.74E + (.E + ) | 3.897E + (.6E + ) | .89E + (.7E + )
C-SDTLZ | .63E + (6.E + ) | 8.63E + (.E + ) | .7E + (.E + )
C-SDTLZConvex | .6E + (8.3E + ) | 6.67E + (7.7E + ) | 3.9E + (7.8E )
C3-SDTLZ | 4.96E + (7.E + ) | 7.796E + (3.4E + ) | 3.37E + (4.8E )
C3-SDTLZ4 | .E + (.3E + ) | .698E + (.E + ) | 8.397E + (.7E + )

Table 8 Median and IQR (in brackets) of the IGD metric for the ten-objective test problems. The best and the second best results are shown with dark and light gray backgrounds, respectively.

Problem | CVaEA | CNSGAIII | CMOEA/DD
C-SDTLZ | 8.33E + (.7E + ) | .48E + (4.3E ) | 8.39E + 3 (3.E + 3)
C-SDTLZ3 | .68E + 3 (7.4E + ) | .96E + 3 (3.E + ) | 4.63E + 3 (6.3E + )
C-SDTLZ | 3.63E + (4.9E + ) | 4.9E + (6.6E + ) | 4.38E + 3 (6.6E )
C-SDTLZConvex | 6.66E + (3.8E + ) | 4.78E + (.E + ) | .996E + 3 (4.6E )
C3-SDTLZ | .446E + (.8E + ) | .7E + (6.E + ) | 7.34E + (4.E )
C3-SDTLZ4 | 6.8E + (.4E + ) | .387E + 3 (.6E + ) | 6.33E + 3 (.8E + )

Table 9 Median and IQR (in brackets) of the IGD metric for the fifteen-objective test problems. The best and the second best results are shown with dark and light gray backgrounds, respectively.
Problem | CVaEA | CNSGAIII | CMOEA/DD
C-SDTLZ | 7.78E (6.3E ) | 9.7E (3.4E 3) | 4.46E + (.3E + )
C-SDTLZ3 | E + 4 (4.E + 4) | .43E + 3 (4.8E + ) | 3.49E + 3 (.E + )
C-SDTLZ | 7.E + (.E + ) | 7.8E + (.9E + ) | .84E + 3 (.6E + )
C-SDTLZConvex | 8.94E + (7.3E + ) | 6.66E + (.E + ) | .E + 3 (.E )
C3-SDTLZ | .4E + (.E + ) | .64E + (.6E ) | .93E + (.7E )
C3-SDTLZ4 | 9.33E + (.E + ) | .873E + 3 (.6E + ) | 4.636E + 3 (4.E )

We summarize the results of the pairwise comparisons in Table 10, where nb, ne and nw denote the number of test instances on which CVaEA shows better, equal and worse performance than the peer algorithm, respectively. Specifically, the proportion of test instances on which CVaEA performs better than CNSGAIII and CMOEA/DD is 8/3 and 7/3, respectively. Conversely, the proportion on which CVaEA is defeated by the peer algorithms is 7/3 and /3, respectively.

Table 10 Summary of the pairwise comparisons (CVaEA vs. the peer algorithm)

 | CNSGAIII | CMOEA/DD
nb | 8/3 | 7/3
ne | /3 | /3
nw | 7/3 | /3

To present the distribution of solutions visually, Fig. 2 plots the final solutions of one run on the three-objective problems; this is the particular run whose IGD value is closest to the median. For C-SDTLZ, as shown in Fig. 2 (a), (b) and (c), the solutions obtained by CVaEA and CNSGAIII cover the whole Pareto front well, while those found by CMOEA/DD
seem to concentrate in only a small part of the optimal front. Hence, for CMOEA/DD, the IGD value naturally increases. It can be seen from Fig. 2 (d), (e) and (f) that all three algorithms have difficulty in solving the C-SDTLZ3 problem. The quality of the approximation sets is not satisfactory in terms of either convergence or diversity. Relatively speaking, the front found by CMOEA/DD has better convergence than those obtained by CVaEA and CNSGAIII; the solutions of CVaEA and CNSGAIII are distributed similarly. C-SDTLZ3 is a very hard problem for many evolutionary algorithms [3], because it introduces a band of infeasible regions around the Pareto-optimal front that is difficult for an algorithm to overcome.

Fig. 2 The final solution set of the three algorithms on the three-objective problems (C-SDTLZ and C-SDTLZ3): (a)-(c) CVaEA, CNSGAIII and CMOEA/DD on C-SDTLZ; (d)-(f) the same algorithms on C-SDTLZ3.

Fig. 3 plots, by parallel coordinates, the final solutions for the eight-objective C-SDTLZ and the fifteen-objective C3-SDTLZ4. In these figures, the objective values are divided by the scaling factor r_i to give identical ranges for each objective, which helps present the distribution of solutions more clearly. For the eight-objective C-SDTLZ, as shown in Fig. 3 (c), the solutions obtained by CMOEA/DD have the worst convergence, while those found by CVaEA and CNSGAIII converge similarly but do differ in terms of distribution [see Fig. 3 (a) and (b)]. The solutions of CNSGAIII appear to be distributed more uniformly than those of CVaEA; however, CNSGAIII finds repeated values for each objective, which causes information redundancy [8].
Similar observations are found on the fifteen-objective C3-SDTLZ4 problem. It can be seen that our proposed algorithm is able to obtain a set of well-converged and appropriately distributed solutions [Fig. 3 (d)], while CNSGAIII still struggles to find enough distinct values for each objective to cover the optimal front well [Fig. 3 (e)]. For CMOEA/DD, some objectives, e.g., the 9th to th objectives, are not well covered by the solutions [Fig. 3 (f)].
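The IGD values reported in Tables 5-9 can be computed directly from a reference set and an approximation set. A minimal sketch using plain Euclidean distance (an illustrative implementation, not the code used in the paper):

```python
def igd(reference_front, approximation):
    """Inverted Generational Distance: the average Euclidean distance
    from each reference point to its nearest solution in the
    approximation set. Lower values indicate that the approximation
    covers the reference front well in both convergence and spread."""
    def dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
    return sum(min(dist(z, y) for y in approximation)
               for z in reference_front) / len(reference_front)

# a set lying exactly on the reference points has IGD 0
front = [[0.0, 1.0], [0.5, 0.5], [1.0, 0.0]]
print(igd(front, front))  # 0.0
```

Because each reference point is matched to its nearest obtained solution, a set that converges well but covers only part of the front still gets a large IGD, which is why the metric is used here to judge convergence and diversity together.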
Fig. 3 The final solution set of the three algorithms on the eight-objective C-SDTLZ and the fifteen-objective C3-SDTLZ4, shown by parallel coordinates: (a)-(c) CVaEA, CNSGAIII and CMOEA/DD on C-SDTLZ; (d)-(f) the same algorithms on C3-SDTLZ4.

3.5 The Effect of the Parameter α

In CVaEA, a parameter, the infeasible portion α, is used to control the ratio of infeasible solutions that are preferentially added to the population during environmental selection. This section investigates the effect of α. Here we show the results for the ten- and fifteen-objective problems; similar results can be obtained for test problems with other numbers of objectives.

Fig. 4 The curve of the average rankings (IGD) for M = 10 and M = 15 when α varies from . to . with a step size of ..

To study the sensitivity of our algorithm to α, we repeat the experiments conducted in the previous section for α ∈ [., .] with a step size of .. All the other
parameters are kept the same as described in the previous section. For each value of α, the average ranking obtained by applying the Friedman test [] is used to measure the performance of the algorithm.

Fig. 5 The curve of the average rankings (IGD) for M = 10 and M = 15 when α varies from . to . with a step size of ..

Fig. 4 shows the average rankings for the ten- and fifteen-objective problems. It is clear from the figure that the average rankings, in general, tend to increase when α is larger than ., meaning that CVaEA prefers smaller values of α. Compared with α = ., the algorithm achieves better average rankings when α is set to ., indicating that including a certain small portion of infeasible solutions really contributes to the performance of the algorithm. It seems that better settings of α can be found in the interval [., .]. To fine-tune the parameter, the above experiment is repeated, changing the value of α from . to . with a step size of .. The curve of the average rankings is shown in Fig. 5, where the best values of α are . and . for the ten- and fifteen-objective problems, respectively. Considering the overall performance, a value in the interval [., .] is suggested for an unknown optimization problem.

4 Practical Application of CVaEA

Having shown the ability of CVaEA to solve various kinds of constrained test problems, we now apply it to a constrained engineering optimization problem, the Water problem, which has five objectives and seven constraints.

4.1 A Brief Introduction to the Problem

This problem is taken from [], [3]. It has three decision variables, x1, x2 and x3. All the variables have a lower boundary of ., and the upper boundary
is .4 for x1 and . for x2 and x3. The five objective functions are given as follows; all objectives are to be minimized:

f1(x) = (x2 + x3)
f2(x) = 3x1
f3(x) = x2/(.6 · 89)^.6
f4(x) = 89 e^(39.7x2 + 9.9x1)
f5(x) = (.39/(x1 x2) + 494x3 - 8)    (7)

The seven constraints are formulated as below:

g1(x) = .39/(x1 x2) + x3 - .8
g2(x) = .36/(x1 x2) + .8x3
g3(x) = .37/(x1 x2) + x3
g4(x) = .98/(x1 x2) + x3
g5(x) = .38/(x1 x2) + x3
g6(x) = .47(x1 x2) + 7.6x3
g7(x) = .64/(x1 x2) + x3    (8)

4.2 Results on the Water Problem

To measure the distribution of the obtained solutions, we introduce a metric named the generalized spread (denoted SPREAD), which is defined as follows [36]:

SPREAD(P) = ( Σ_{i=1..M} dist(e_i, P) + Σ_{z ∈ P*} |dist(z, P) - d̄| ) / ( Σ_{i=1..M} dist(e_i, P) + |P*| · d̄ ),    (9)

where P is the obtained set of solutions, P* is the set of Pareto-optimal (reference) solutions, e_1, e_2, ..., e_M are the M extreme solutions in P*, and

dist(z, P) = min_{y ∈ P, y ≠ z} ||F(z) - F(y)||,    (10)

d̄ = (1/|P*|) Σ_{x ∈ P*} dist(x, P).    (11)

Another metric, the generational distance (GD), is introduced to measure how far the points in the approximation set are from those on the optimal Pareto front []; it is a measure of the convergence of the obtained solutions. Experimental results for all the metrics (SPREAD, GD and IGD) are tabulated in Table 11. Note that calculating the above metrics requires a set of reference points approximating the true Pareto front. For the Water problem, the reference points are generated by combining all non-dominated solutions found by all the algorithms over all the runs.
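The reference set just described, all non-dominated solutions pooled over the algorithms and runs, can be built with a plain pairwise dominance filter. A minimal O(n²) sketch for minimization problems (the function name and the toy data are illustrative, not taken from the paper's code):

```python
def nondominated(points):
    """Keep only the points not dominated by any other point
    (minimization): p is dominated if some q is no worse in every
    objective and strictly better in at least one."""
    def dominates(q, p):
        return (all(a <= b for a, b in zip(q, p))
                and any(a < b for a, b in zip(q, p)))
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# pool the final populations of all algorithms/runs, then filter once
pooled = [[1.0, 2.0], [2.0, 1.0], [2.0, 2.0]]
print(nondominated(pooled))  # [[1.0, 2.0], [2.0, 1.0]]
```

Here [2.0, 2.0] is removed because [1.0, 2.0] is no worse in both objectives and strictly better in the first; the two surviving points are mutually non-dominated and form the reference set.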
Fig. 6 Scatter plot showing the results of CVaEA and CNSGAIII (top-right plots) and the results of the optimal front (bottom-left plots).

Table 11 Results (median and IQR) of CVaEA, CNSGAIII and CMOEA/DD on the Water problem.

Metric | CVaEA | CNSGAIII | CMOEA/DD
SPREAD | 3.4E (.8E ) | 6.49E (.E ) | 4.889E (6.E )
GD | 4.94E 3 (6.3E 4) | 3.746E 3 (6.7E 4) | 4.894E (8.E 3)
IGD | 3.6E + 4 (7.6E + 3) | 3.4E + 4 (6.E + 3) | 7.73E + 4 (6.9E + 3)

As shown in Table 11, CVaEA performs best in terms of the SPREAD metric, followed by CMOEA/DD. By contrast, CNSGAIII gives relatively poor performance with respect to the distribution of the solutions. However, as demonstrated by the GD indicator, CNSGAIII can converge to the optimal Pareto front very well, greatly outperforming the CMOEA/DD algorithm. Compared with CNSGAIII, our proposed CVaEA obtains similar convergence. Finally, as shown by the IGD results, CVaEA is the most competitive algorithm, demonstrating its ability to keep a good balance between convergence and diversity.

Figs. 6 and 7 show the solutions of all three algorithms in scatter matrix plots. In these figures, the lower-left plots present results for the optimal Pareto front, while the upper-right plots are for the constrained algorithms. For convenience of comparison, the (i, j)-th (i > j) plot should be compared with that in position (j, i). It is shown in Fig. 6 that the solutions obtained by CVaEA are widely distributed
on the entire Pareto-optimal front, achieving a better distribution than those found by CNSGAIII. In some plots [e.g., the (,4)-th plot], it is observed that the solutions of CNSGAIII do not necessarily cover the entire optimal front.

Fig. 7 Scatter plot showing the results of CVaEA and CMOEA/DD (top-right plots) and the results of the optimal front (bottom-left plots).

Fig. 7 presents the comparison of the solutions obtained by CVaEA and CMOEA/DD. Clearly, the solutions of CVaEA converge towards the Pareto-optimal front significantly better than those of CMOEA/DD. In some plots, e.g., the (,4)-th, (,)-th and (4,)-th plots, some extremely poorly converged solutions are found by CMOEA/DD. This phenomenon may be attributed to the mechanism adopted in CMOEA/DD whereby isolated solutions are always kept to promote diversity; however, these solutions may be far away from the optimal front. Hence, CMOEA/DD improves the distribution of the solutions at the risk of harming their convergence.

To compare the running speed of the algorithms, we record the actual running time of each run of each algorithm, in milliseconds, on the same platform (Intel(R) Core(TM) i-U, GHz, with 8. GB RAM). The median runtime over the runs is 6.3E+4, 7.397E+4 and .88E+4 for CVaEA, CNSGAIII and CMOEA/DD, respectively. Fig. 8 shows an intuitive comparison of the runtime, where the time of each algorithm is normalized by dividing it by the time of CMOEA/DD. Obviously, CMOEA/DD is the fastest algorithm, followed by CVaEA, and then CNSGAIII. Precisely, the runtime of CVaEA is only 8.%
of that of CNSGAIII. CMOEA/DD is faster than CVaEA and CNSGAIII, which may be due to the fact that CMOEA/DD does not normalize the population and uses an efficient method to update the non-domination level structure []. Although CMOEA/DD is efficient, it is not effective in terms of the quality of the solutions it produces, especially their convergence.

Fig. 8 The comparison of runtime on the Water problem.

Finally, we conclude that our proposed CVaEA would be the best choice when handling engineering problems such as the Water problem, mainly because of its ability to efficiently find a set of well-converged and properly distributed solutions.

5 Conclusion

In our previous study, using the concept of vector angles, VaEA was proposed for dealing with unconstrained many-objective optimization problems. In this paper, we extend VaEA to CVaEA by modifying the mating and environmental selection processes so as to handle constraints. In CVaEA, the information provided by infeasible solutions is sufficiently utilized, and the algorithm puts more emphasis on feasible solutions over infeasible ones, and on smaller-CV solutions over larger-CV ones. The balance between convergence and diversity is achieved by two principles: the maximum-vector-angle-first principle and the modified worse-elimination principle.

To test the performance of our proposed CVaEA, a set of new constrained many-objective test problems is designed by multiplying each objective of the constrained DTLZ problems by a different factor. Thus, the ranges of values differ across objectives, which reflects real situations more closely, because the objectives of a practical engineering problem are usually distributed in different ranges.
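The scaled-problem construction described above amounts to a one-line transformation of a normalized objective vector. A minimal sketch, where the base r of the per-objective factor r**i is a free parameter chosen here purely for illustration (it is not claimed to be the paper's exact setting):

```python
def scale_objectives(objs, r=3.0):
    """Multiply the i-th objective (i = 0, 1, ...) by r**i, so that
    each objective takes values in a different range while the shape
    of the Pareto front is preserved up to axis scaling."""
    return [f * (r ** i) for i, f in enumerate(objs)]

# a normalized point (0.5, 0.5, 0.5) becomes range-scaled
print(scale_objectives([0.5, 0.5, 0.5]))  # [0.5, 1.5, 4.5]
```

Undoing this scaling (dividing objective i by r**i) is exactly the normalization used for the parallel-coordinate plots in Fig. 3, which is why identical axis ranges can be recovered there.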
The simulation results of CVaEA, together with those of CNSGAIII and CMOEA/DD, on six test problems with up to fifteen objectives have shown the superiority of our proposed method in finding a set of well-converged solutions while maintaining an extensive distribution among them.
More informationSPEA2+: Improving the Performance of the Strength Pareto Evolutionary Algorithm 2
SPEA2+: Improving the Performance of the Strength Pareto Evolutionary Algorithm 2 Mifa Kim 1, Tomoyuki Hiroyasu 2, Mitsunori Miki 2, and Shinya Watanabe 3 1 Graduate School, Department of Knowledge Engineering
More informationEVOLUTIONARY algorithms (EAs) are a class of
An Investigation on Evolutionary Gradient Search for Multi-objective Optimization C. K. Goh, Y. S. Ong and K. C. Tan Abstract Evolutionary gradient search is a hybrid algorithm that exploits the complementary
More informationCHAPTER 2 MULTI-OBJECTIVE REACTIVE POWER OPTIMIZATION
19 CHAPTER 2 MULTI-OBJECTIE REACTIE POWER OPTIMIZATION 2.1 INTRODUCTION In this chapter, a fundamental knowledge of the Multi-Objective Optimization (MOO) problem and the methods to solve are presented.
More informationEscaping Local Optima: Genetic Algorithm
Artificial Intelligence Escaping Local Optima: Genetic Algorithm Dae-Won Kim School of Computer Science & Engineering Chung-Ang University We re trying to escape local optima To achieve this, we have learned
More informationAn Evolutionary Algorithm Approach to Generate Distinct Sets of Non-Dominated Solutions for Wicked Problems
An Evolutionary Algorithm Approach to Generate Distinct Sets of Non-Dominated Solutions for Wicked Problems Marcio H. Giacomoni Assistant Professor Civil and Environmental Engineering February 6 th 7 Zechman,
More informationAssessing the Convergence Properties of NSGA-II for Direct Crashworthiness Optimization
10 th International LS-DYNA Users Conference Opitmization (1) Assessing the Convergence Properties of NSGA-II for Direct Crashworthiness Optimization Guangye Li 1, Tushar Goel 2, Nielen Stander 2 1 IBM
More informationFinding a preferred diverse set of Pareto-optimal solutions for a limited number of function calls
Finding a preferred diverse set of Pareto-optimal solutions for a limited number of function calls Florian Siegmund, Amos H.C. Ng Virtual Systems Research Center University of Skövde P.O. 408, 541 48 Skövde,
More informationAn Optimality Theory Based Proximity Measure for Set Based Multi-Objective Optimization
An Optimality Theory Based Proximity Measure for Set Based Multi-Objective Optimization Kalyanmoy Deb, Fellow, IEEE and Mohamed Abouhawwash Department of Electrical and Computer Engineering Computational
More informationGenerating Uniformly Distributed Pareto Optimal Points for Constrained and Unconstrained Multicriteria Optimization
Generating Uniformly Distributed Pareto Optimal Points for Constrained and Unconstrained Multicriteria Optimization Crina Grosan Department of Computer Science Babes-Bolyai University Cluj-Napoca, Romania
More informationA Hybrid Genetic Algorithm for the Distributed Permutation Flowshop Scheduling Problem Yan Li 1, a*, Zhigang Chen 2, b
International Conference on Information Technology and Management Innovation (ICITMI 2015) A Hybrid Genetic Algorithm for the Distributed Permutation Flowshop Scheduling Problem Yan Li 1, a*, Zhigang Chen
More informationTowards Understanding Evolutionary Bilevel Multi-Objective Optimization Algorithm
Towards Understanding Evolutionary Bilevel Multi-Objective Optimization Algorithm Ankur Sinha and Kalyanmoy Deb Helsinki School of Economics, PO Box, FIN-, Helsinki, Finland (e-mail: ankur.sinha@hse.fi,
More informationAuthor s Accepted Manuscript
Author s Accepted Manuscript On The Use of Two Reference Points in Decomposition Based Multiobjective Evolutionary Algorithms Zhenkun Wang, Qingfu Zhang, Hui Li, Hisao Ishibuchi, Licheng Jiao www.elsevier.com/locate/swevo
More informationEliteNSGA-III: An Improved Evolutionary Many- Objective Optimization Algorithm
EliteNSGA-III: An Improved Evolutionary Many- Objective Optimization Algorithm Amin Ibrahim, IEEE Member Faculty of Electrical, Computer, and Software Engineering University of Ontario Institute of Technology
More informationHeuristic Optimisation
Heuristic Optimisation Part 10: Genetic Algorithm Basics Sándor Zoltán Németh http://web.mat.bham.ac.uk/s.z.nemeth s.nemeth@bham.ac.uk University of Birmingham S Z Németh (s.nemeth@bham.ac.uk) Heuristic
More informationSolving Multi-objective Optimisation Problems Using the Potential Pareto Regions Evolutionary Algorithm
Solving Multi-objective Optimisation Problems Using the Potential Pareto Regions Evolutionary Algorithm Nasreddine Hallam, Graham Kendall, and Peter Blanchfield School of Computer Science and IT, The Univeristy
More informationIncorporating Decision-Maker Preferences into the PADDS Multi- Objective Optimization Algorithm for the Design of Water Distribution Systems
Incorporating Decision-Maker Preferences into the PADDS Multi- Objective Optimization Algorithm for the Design of Water Distribution Systems Bryan A. Tolson 1, Mohammadamin Jahanpour 2 1,2 Department of
More informationExploration of Pareto Frontier Using a Fuzzy Controlled Hybrid Line Search
Seventh International Conference on Hybrid Intelligent Systems Exploration of Pareto Frontier Using a Fuzzy Controlled Hybrid Line Search Crina Grosan and Ajith Abraham Faculty of Information Technology,
More informationMulti-Objective Memetic Algorithm using Pattern Search Filter Methods
Multi-Objective Memetic Algorithm using Pattern Search Filter Methods F. Mendes V. Sousa M.F.P. Costa A. Gaspar-Cunha IPC/I3N - Institute of Polymers and Composites, University of Minho Guimarães, Portugal
More informationAdaptive Operator Selection With Bandits for a Multiobjective Evolutionary Algorithm Based on Decomposition
114 IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, VOL. 18, NO. 1, FEBRUARY 2014 Adaptive Operator Selection With Bandits for a Multiobjective Evolutionary Algorithm Based on Decomposition Ke Li, Student
More informationCHAPTER 6 MODIFIED FUZZY TECHNIQUES BASED IMAGE SEGMENTATION
CHAPTER 6 MODIFIED FUZZY TECHNIQUES BASED IMAGE SEGMENTATION 6.1 INTRODUCTION Fuzzy logic based computational techniques are becoming increasingly important in the medical image analysis arena. The significant
More informationTHE CAPACITATED arc routing problem (CARP) [1] is a
IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, VOL. 15, NO. 2, APRIL 2011 151 Decomposition-Based Memetic Algorithm for Multiobjective Capacitated Arc Routing Problem Yi Mei, Student Member, IEEE, Ke Tang,
More informationTowards an Estimation of Nadir Objective Vector Using Hybrid Evolutionary and Local Search Approaches
Towards an Estimation of Nadir Objective Vector Using Hybrid Evolutionary and Local Search Approaches Kalyanmoy Deb, Kaisa Miettinen, and Shamik Chaudhuri KanGAL Report Number 279 Abstract Nadir objective
More informationMetaheuristic Optimization with Evolver, Genocop and OptQuest
Metaheuristic Optimization with Evolver, Genocop and OptQuest MANUEL LAGUNA Graduate School of Business Administration University of Colorado, Boulder, CO 80309-0419 Manuel.Laguna@Colorado.EDU Last revision:
More informationminimizing minimizing
The Pareto Envelope-based Selection Algorithm for Multiobjective Optimization David W. Corne, Joshua D. Knowles, Martin J. Oates School of Computer Science, Cybernetics and Electronic Engineering University
More informationAn Efficient Solution Strategy for Bilevel Multiobjective Optimization Problems Using Multiobjective Evolutionary Algorithms
An Efficient Solution Strategy for Bilevel Multiobjective Optimization Problems Using Multiobjective Evolutionary Algorithms Hong Li a,, Li Zhang a, Qingfu Zhang b, Qin Chen c a School of Mathematics and
More informationAsoftware development process typically consists of four
IEEE TRANSACTIONS ON RELIABILITY, VOL 59, NO 3, SEPTEMBER 2010 563 Multi-Objective Approaches to Optimal Testing Resource Allocation in Modular Software Systems Zai Wang, Student Member, IEEE, Ke Tang,
More informationEvolutionary Algorithms
Evolutionary Algorithms Proposal for a programming project for INF431, Spring 2014 version 14-02-19+23:09 Benjamin Doerr, LIX, Ecole Polytechnique Difficulty * *** 1 Synopsis This project deals with the
More informationCS5401 FS2015 Exam 1 Key
CS5401 FS2015 Exam 1 Key This is a closed-book, closed-notes exam. The only items you are allowed to use are writing implements. Mark each sheet of paper you use with your name and the string cs5401fs2015
More informationTelecommunication and Informatics University of North Carolina, Technical University of Gdansk Charlotte, NC 28223, USA
A Decoder-based Evolutionary Algorithm for Constrained Parameter Optimization Problems S lawomir Kozie l 1 and Zbigniew Michalewicz 2 1 Department of Electronics, 2 Department of Computer Science, Telecommunication
More informationGenetic Algorithm Performance with Different Selection Methods in Solving Multi-Objective Network Design Problem
etic Algorithm Performance with Different Selection Methods in Solving Multi-Objective Network Design Problem R. O. Oladele Department of Computer Science University of Ilorin P.M.B. 1515, Ilorin, NIGERIA
More informationAdjusting Parallel Coordinates for Investigating Multi-Objective Search
Adjusting Parallel Coordinates for Investigating Multi-Objective Search Liangli Zhen,, Miqing Li, Ran Cheng, Dezhong Peng and Xin Yao 3, Machine Intelligence Laboratory, College of Computer Science, Sichuan
More informationInternational Conference on Computer Applications in Shipbuilding (ICCAS-2009) Shanghai, China Vol.2, pp
AUTOMATIC DESIGN FOR PIPE ARRANGEMENT CONSIDERING VALVE OPERATIONALITY H Kimura, Kyushu University, Japan S Iehira, Kyushu University, Japan SUMMARY We propose a novel evaluation method of valve operationality
More informationNeural Network Weight Selection Using Genetic Algorithms
Neural Network Weight Selection Using Genetic Algorithms David Montana presented by: Carl Fink, Hongyi Chen, Jack Cheng, Xinglong Li, Bruce Lin, Chongjie Zhang April 12, 2005 1 Neural Networks Neural networks
More informationOptimizing Delivery Time in Multi-Objective Vehicle Routing Problems with Time Windows
Optimizing Delivery Time in Multi-Objective Vehicle Routing Problems with Time Windows Abel Garcia-Najera and John A. Bullinaria School of Computer Science, University of Birmingham Edgbaston, Birmingham
More informationEvolutionary Multi-objective Optimization of Business Process Designs with Pre-processing
Evolutionary Multi-objective Optimization of Business Process Designs with Pre-processing Kostas Georgoulakos Department of Applied Informatics University of Macedonia Thessaloniki, Greece mai16027@uom.edu.gr
More informationOvercompressing JPEG images with Evolution Algorithms
Author manuscript, published in "EvoIASP2007, Valencia : Spain (2007)" Overcompressing JPEG images with Evolution Algorithms Jacques Lévy Véhel 1, Franklin Mendivil 2 and Evelyne Lutton 1 1 Inria, Complex
More informationPseudo-code for typical EA
Extra Slides for lectures 1-3: Introduction to Evolutionary algorithms etc. The things in slides were more or less presented during the lectures, combined by TM from: A.E. Eiben and J.E. Smith, Introduction
More informationCHAPTER 2 CONVENTIONAL AND NON-CONVENTIONAL TECHNIQUES TO SOLVE ORPD PROBLEM
20 CHAPTER 2 CONVENTIONAL AND NON-CONVENTIONAL TECHNIQUES TO SOLVE ORPD PROBLEM 2.1 CLASSIFICATION OF CONVENTIONAL TECHNIQUES Classical optimization methods can be classified into two distinct groups:
More informationETEA: A Euclidean Minimum Spanning Tree-Based Evolutionary Algorithm for Multi-Objective Optimization
ETEA: A Euclidean Minimum Spanning Tree-Based Evolutionary Algorithm for Multi-Objective Optimization Miqing Li miqing.li@brunel.ac.uk Department of Information Systems and Computing, Brunel University,
More informationThe Binary Genetic Algorithm. Universidad de los Andes-CODENSA
The Binary Genetic Algorithm Universidad de los Andes-CODENSA 1. Genetic Algorithms: Natural Selection on a Computer Figure 1 shows the analogy between biological i l evolution and a binary GA. Both start
More informationIEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, VOL. 12, NO. 4, AUGUST
IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, VOL. 12, NO. 4, AUGUST 2008 439 AbYSS: Adapting Scatter Search to Multiobjective Optimization Antonio J. Nebro, Francisco Luna, Student Member, IEEE, Enrique
More information