
This article was downloaded by: [Yuan Ze University] on 22 October 2008. Publisher: Taylor & Francis (Informa Ltd, Registered office: Mortimer House, Mortimer Street, London W1T 3JH, UK). Journal: Engineering Optimization. To cite this article: Zahara, Erwie and Hu, Chia-Hsin (2008) 'Solving constrained optimization problems with hybrid particle swarm optimization', Engineering Optimization, 40:11. Authors' affiliation: Department of Industrial Engineering and Management, St. John's University, Tamsui, Taiwan, Republic of China. Online publication date: 01 November 2008.

Engineering Optimization, Vol. 40, No. 11, November 2008

Solving constrained optimization problems with hybrid particle swarm optimization

Erwie Zahara* and Chia-Hsin Hu
Department of Industrial Engineering and Management, St. John's University, Tamsui, Taiwan 251, Republic of China

(Received 17 August 2007; final version received 9 January 2008)

Constrained optimization problems (COPs) are very important in that they frequently appear in the real world. A COP, in which both the function and the constraints may be nonlinear, consists of the optimization of a function subject to constraints. Constraint handling is one of the major concerns when solving COPs with particle swarm optimization (PSO) combined with the Nelder–Mead simplex search method (NM-PSO). This article proposes embedded constraint handling methods, which include the gradient repair method and the constraint fitness priority-based ranking method, as a special operator in NM-PSO for dealing with constraints. Experiments using 13 benchmark problems are explained and the NM-PSO results are compared with the best known solutions reported in the literature. Comparison with three different metaheuristics demonstrates that NM-PSO with the embedded constraint operator is extremely effective and efficient at locating optimal solutions.

Keywords: constrained optimization; Nelder–Mead simplex search method; particle swarm optimization; constraint handling

1. Introduction

Constrained optimization is a mathematical programming technique that attempts to solve nonlinear objective functions, or deals with constraints that have a nonlinear relationship, or handles both at the same time. It has become an important branch of operations research and has a wide variety of applications in such areas as the military, economics, engineering optimization and management science (Nash and Sofer 1996).
Constrained optimization problems (COPs) with n variables and m constraints may be written in the following canonical form:

Minimize f(x)   (1)
subject to g_j(x) ≤ 0, j = 1, ..., q
           h_j(x) = 0, j = q + 1, ..., m
           l_i ≤ x_i ≤ u_i, i = 1, ..., n

*Corresponding author. Email: erwi@mail.sju.edu.tw

where x = (x_1, x_2, ..., x_n) is a vector of n decision variables, f(x) is the objective function, g_j(x) ≤ 0 are the q inequality constraints and h_j(x) = 0 are the m − q equality constraints. The values u_i and l_i are the upper and lower bounds of x_i, respectively.

Traditionally, most of the methods found in the literature search for approximate solutions to COPs and assume that the goal and constraints are differentiable (Fung et al. 2002). In the case that the goal and constraints are not differentiable, heuristic methods are employed. Optimization methods using heuristics, which have helped researchers formulate such new stochastic algorithms as genetic algorithms (Tang et al. 1998), evolutionary algorithms (Runarsson and Yao 2005) and particle swarm optimization (PSO) (Dong et al. 2005), are of little practical use in solving COPs due to slow convergence rates. Fan and Zahara (2007) have demonstrated the hybrid Nelder–Mead simplex method and particle swarm optimization method (NM-PSO) to be a promising and viable tool for solving unconstrained nonlinear optimization problems. This article demonstrates that this algorithm can also be adapted to solve constrained nonlinear optimization problems.

Until recently, the most commonly used constraint handling technique for stochastic optimization was the penalty method (Coello Coello 2002). The penalty method converts a COP into an unconstrained problem by adding the constraints to the objective function as a penalty term. When a solution is deemed infeasible, meaning that it satisfies the constraints only partially, the objective value is penalized by a penalty set beforehand. The difficulty of using this method lies in the selection of an appropriate penalty value for the problem at hand.
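For concreteness, the penalty conversion described above can be sketched as follows. This is a minimal illustration, not the article's method: the quadratic penalty form and the weight R are assumed, hand-picked choices, which is exactly the tuning difficulty the text points out.

```python
def penalized(f, ineq, eq, R=1e3):
    """Convert a COP (inequalities g_j(x) <= 0, equalities h_j(x) = 0) into an
    unconstrained objective by adding R times the squared constraint violations.
    R is the hand-tuned penalty weight the text warns about."""
    def F(x):
        viol = sum(max(0.0, g(x)) ** 2 for g in ineq)  # only violated inequalities count
        viol += sum(h(x) ** 2 for h in eq)             # any deviation from h(x) = 0 counts
        return f(x) + R * viol
    return F
```

A feasible point is charged nothing, while an infeasible point pays in proportion to how badly, and how heavily R weights, it violates the constraints.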
To overcome this drawback, some researchers have suggested replacing the constant penalty by a variable penalty sequence according to the degree of constraint violation (Farmani and Wright 2003). However, this type of scheme usually ends up with the difficulty of selecting another set of parameters (Coello Coello 2002). In contrast, a constraint fitness priority-based ranking method (Dong et al. 2005) attempts to deal with the difficulty of selecting an appropriate penalty value by constructing a constraint fitness function. A constraint fitness priority-based ranking method (Dong et al. 2005) and a repair method using the gradient information derived from the constraint set (Chootinan and Chen 2006) are embedded in NM-PSO to evaluate the particles during the search when solving COPs. The superior performance of the proposed NM-PSO will be demonstrated by solving 13 benchmark COPs.

The remainder of this article is organized as follows. The gradient repair method and constraint fitness priority-based ranking method for handling constraints are briefly discussed in Section 2. The framework of the hybrid Nelder–Mead simplex search and PSO is detailed in Section 3. In Section 4, the results obtained in this study are compared against the optimal or best known solutions of the 13 benchmark problems reported in the literature. The application of the proposed method to a ceramic grinding optimization problem is discussed in Section 5; finally, conclusions are drawn in Section 6 based on the results.

2. Constraint handling methods

This section introduces two constraint handling methods, the gradient repair method and the constraint fitness priority-based ranking method, which will be embedded in NM-PSO.

2.1. The gradient repair method

The gradient repair method was proposed by Chootinan and Chen (2006). It utilizes gradient information derived from the constraint set to systematically repair infeasible solutions by directing them toward the feasible region.
The procedure is summarized as follows.

(1) For any solution, determine the degree of constraint violation ΔV by Equation (2):

ΔV = [Δg; Δh] = [min{0, g(x)}; h(x)]   (2)

where ΔV consists of the violation vectors of the inequality constraints (g) and the equality constraints (h) of the problem; with the inequality constraints written here in the form g(x) ≥ 0, min{0, g(x)} is nonzero exactly when a constraint is violated.

(2) Compute ∇x V, the m × n matrix of derivatives of these constraints with respect to the solution vector (the n decision variables), by forward differences with a small perturbation scalar e:

∇x V = [∇x g; ∇x h] ≈ (1/e) [g(x | x_i = x_i + e) − g(x); h(x | x_i = x_i + e) − h(x)], i = 1, 2, ..., n   (3)

The relationship between the change of constraint violation ΔV and the change of the solution vector Δx is then expressed by

ΔV = ∇x V Δx  ⟹  Δx = ∇x V⁻¹ ΔV   (4)

(3) Since ∇x V is in general not invertible, compute the Moore–Penrose inverse or pseudoinverse ∇x V⁺, which is the approximate inverse of ∇x V, to be used in its place in Equation (4).

(4) Update the solution vector by x_{t+1} = x_t + Δx = x_t + ∇x V⁻¹ ΔV ≈ x_t + ∇x V⁺ ΔV.

(5) Loop through steps 1 to 4 at most five times.

The above procedure is demonstrated using an example with four inequality constraints, among them g3: x_1 ≥ 0 and g4: x_2 ≥ 0, and one equality constraint h1: x_1 + x_2 − 5 = 0, given by Equation (5). (1) Suppose the first iteration produces [x_1, x_2] = [6, 0], which clearly violates g1, g2 and h1; calculate the degree of constraint violation ΔV. (2) Compute ∇x V and the Moore–Penrose inverse ∇x V⁺ for the violated constraints. (3) Update the solution vector by x_{t+1} = x_t + Δx ≈ x_t + ∇x V⁺ ΔV.
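The repair loop of steps (1) to (5) can be sketched as follows. This is a minimal sketch under stated assumptions: inequality constraints are supplied in the g(x) ≥ 0 form used above, and the update is written x − ∇V⁺·v with v the signed violation vector, which is equivalent to adding the required change toward feasibility.

```python
import numpy as np

def violation(x, ineq, eq):
    # Equation (2): min{0, g(x)} for inequalities in g(x) >= 0 form, h(x) for equalities
    g = np.array([c(x) for c in ineq])
    h = np.array([c(x) for c in eq])
    return np.concatenate([np.minimum(0.0, g), h])

def gradient_repair(x, ineq, eq, e=1e-6, max_iter=5):
    """Move an infeasible point toward the feasible region (at most max_iter passes)."""
    x = np.asarray(x, dtype=float).copy()
    for _ in range(max_iter):
        v = violation(x, ineq, eq)
        if np.allclose(v, 0.0):          # already feasible: nothing to repair
            break
        # Equation (3): forward-difference Jacobian of the violation vector
        J = np.zeros((v.size, x.size))
        for i in range(x.size):
            xp = x.copy()
            xp[i] += e
            J[:, i] = (violation(xp, ineq, eq) - v) / e
        # Equation (4) with the Moore-Penrose pseudoinverse; subtracting J+ v
        # drives the signed violations toward zero
        x = x - np.linalg.pinv(J) @ v
    return x
```

On a small instance with x_1 ≥ 0, x_2 ≥ 0 and x_1 + x_2 − 5 = 0, starting from [6, 0], a few passes of this loop land the point on the feasible set.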

(4) The same process is repeated at most five times until the solution is finally moved to [x_1, x_2] = [3, 2], which is in the feasible region.

2.2. Constraint fitness priority-based ranking method

This method, proposed by Dong et al. (2005), resolves the difficulty of selecting an appropriate penalty value that arises in the penalty method. The method introduces a constraint fitness function Cf(x), computed from both the inequality and equality constraints as follows. For inequality constraints g_j(x) ≤ 0,

C_j(x) = 1, if g_j(x) ≤ 0
C_j(x) = 1 − g_j(x)/g_max(x), if g_j(x) > 0   (6)

where g_max(x) = max{g_j(x), j = 1, 2, ..., q} and C_j(x) is the fitness level of point x for constraint j. For equality constraints h_j(x) = 0,

C_j(x) = 1, if h_j(x) = 0
C_j(x) = 1 − |h_j(x)|/h_max(x), if h_j(x) ≠ 0   (7)

where h_max(x) = max{|h_j(x)|, j = q + 1, q + 2, ..., m}. The constraint fitness function evaluated at point x is equal to the weighted sum of the C_j, as defined below:

Cf(x) = Σ_{j=1}^{m} w_j C_j(x),  Σ_{j=1}^{m} w_j = 1,  0 ≤ w_j ≤ 1/m, ∀j   (8)

where w_j is a randomly generated weight for constraint j. The sum signifies the fitness level of point x as related to the feasible domain Q; that is, Cf(x) = 1 indicates that x ∈ Q, and on the contrary, if 0 < Cf(x) < 1, the smaller Cf(x) is, the less likely it is that x is in the feasible domain Q, or the further away x is from Q.

3. Hybrid Nelder–Mead and particle swarm optimization

The goal of integrating the Nelder–Mead simplex method (NM) and particle swarm optimization (PSO) is to combine their advantages and avoid their disadvantages.
For example, NM is a very efficient local search procedure, but its convergence is extremely sensitive to the selected starting point; PSO belongs to the class of global search procedures but requires much computational effort (Fan and Zahara 2007).

3.1. The Nelder–Mead simplex search method

The NM method, proposed by Nelder and Mead (1965), is a local search method designed for unconstrained optimization without using gradient information. The operations of this

method rescale the simplex based on the local behaviour of the function by using four basic procedures: reflection, expansion, contraction and shrinkage. Through these procedures, the simplex can successively improve itself and zero in on the optimum. The steps of NM, which are modified to handle a minimization COP, are as follows.

(1) Initialization. An initial set of N + 1 vertex points is randomly generated in the respective search range or space. Evaluate the objective function fitness and the constraint fitness at each vertex point of the simplex. For the maximization case, it is convenient to transform the problem into the minimization case by pre-multiplying the objective function by −1.

(2) Reflection. Determine x_high and x_low, the vertices with the highest and the lowest function values, respectively. Let f_high, f_low and Cf_high, Cf_low represent the corresponding objective function values and constraint fitness values, respectively. Find x_cent, the centre of the simplex without x_high in the minimization case. Generate a new vertex x_refl by reflecting the worst point according to

x_refl = (1 + α) x_cent − α x_high (α > 0)   (9)

Nelder and Mead suggested that α = 1. If Cf_refl < 1 and Cf_refl > Cf_low, or if Cf_refl = 1 and f_refl < f_low, then proceed with expansion; otherwise proceed with contraction.

(3) Expansion. After reflection, two possible expansion cases need to be considered.
(3.1) If Cf_refl < 1 and Cf_refl > Cf_low, the simplex is expanded in order to extend the search space in the same direction, and the expansion point is calculated by Equation (10):

x_exp = γ x_refl + (1 − γ) x_cent (γ > 1)   (10)

Nelder and Mead suggested γ = 2. If Cf_exp > Cf_low or Cf_exp = 1, the expansion is accepted by replacing x_high with x_exp; otherwise, x_refl replaces x_high.
(3.2) If Cf_refl = 1 and f_refl < f_low, the expansion point is calculated by Equation (10).
If Cf_exp = 1 and f_exp < f_low, the expansion is accepted by replacing x_high with x_exp; otherwise, x_refl replaces x_high.
(3.3) Exit the algorithm if the stopping criteria are satisfied; otherwise go to step (2).

(4) Contraction. After reflection, there are two possible contraction cases to consider.
(4.1) If Cf_refl < 1 and Cf_refl ≤ Cf_low, the contraction vertex is calculated by Equation (11):

x_cont = β x_high + (1 − β) x_cent (0 < β < 1)   (11)

Nelder and Mead suggested β = 0.5. If Cf_cont > Cf_low or Cf_cont = 1, the contraction is accepted by replacing x_high with x_cont; otherwise do shrinking in step (5.1).
(4.2) If Cf_refl = 1 and f_refl ≥ f_low: if f_refl ≤ f_high, then x_refl replaces x_high and contraction by Equation (11) is carried out; if f_refl > f_high, then direct contraction without the replacement of x_high by x_refl is performed. If Cf_cont = 1 and f_cont < f_low, the contraction is accepted by replacing x_high with x_cont; otherwise do shrinking in step (5.2).
(4.3) Exit the algorithm if the stopping criteria are satisfied; otherwise go to step (2).

(5) Shrinkage.
(5.1) Following step (4.1), in which Cf_cont ≤ Cf_low and contraction has failed, try shrinkage attempts for all points except x_low by Equation (12):

x_i ← δ x_i + (1 − δ) x_low (0 < δ < 1)   (12)

Nelder and Mead suggested δ = 0.5.
(5.2) Contraction has failed in step (4.2), where Cf_cont < 1, or Cf_cont = 1 but f_cont ≥ f_low. Shrink the entire simplex except x_low by Equation (12).
(5.3) Exit the algorithm if the stopping criteria are satisfied; otherwise go to step (2).
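The quantities driving the steps above can be sketched in code. This is an illustrative sketch, not the article's implementation: the constraint fitness of Equations (6)-(8) is computed with uniform weights w_j = 1/m (one admissible choice under Equation (8)), for inequality constraints supplied in the g_j(x) ≤ 0 form, alongside the reflection, expansion and contraction operations of Equations (9)-(11) for a single coordinate.

```python
def constraint_fitness(x, ineq, eq):
    """Equations (6)-(8): Cf(x) = 1 iff x is feasible; smaller values mean
    the point is further from the feasible domain Q."""
    g = [gj(x) for gj in ineq]            # inequality constraints, g_j(x) <= 0 form
    h = [abs(hj(x)) for hj in eq]         # |h_j(x)| for equality constraints
    g_max = max(g, default=0.0)           # Equation (6) normalizer
    h_max = max(h, default=0.0)           # Equation (7) normalizer
    C = [1.0 if gj <= 0 else 1.0 - gj / g_max for gj in g]
    C += [1.0 if hj == 0 else 1.0 - hj / h_max for hj in h]
    # uniform weights w_j = 1/m, an admissible choice under Equation (8)
    return sum(C) / len(C) if C else 1.0

def reflect(x_cent, x_high, alpha=1.0):
    return (1 + alpha) * x_cent - alpha * x_high   # Equation (9)

def expand(x_refl, x_cent, gamma=2.0):
    return gamma * x_refl + (1 - gamma) * x_cent   # Equation (10)

def contract(x_high, x_cent, beta=0.5):
    return beta * x_high + (1 - beta) * x_cent     # Equation (11)
```

With two inequality constraints x ≤ 1 and x ≤ 3, the point x = 5 violates both and gets Cf = 0.25, while any feasible point gets Cf = 1.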

3.2. Particle swarm optimization

Particle swarm optimization (PSO) is one of the latest evolutionary optimization techniques, developed by Eberhart and Kennedy (1995). The PSO concept is based on a metaphor of social interaction such as bird flocking or fish schooling. Similar to genetic algorithms, PSO is also population-based and evolutionary in nature, with one major difference from genetic algorithms: it does not implement filtering, i.e. all members of the population survive through the entire search process. PSO simulates a commonly observed social behaviour, where members of a group tend to follow the lead of the best among the group. The procedure of PSO is reviewed below.

(1) Initialization. Randomly generate a swarm of potential solutions called particles and assign a random velocity to each.

(2) Velocity update. The particles are then flown through hyperspace by updating their own velocity. The velocity update of a particle is dynamically adjusted, subject to its own past flight and those of its companions. The particle's velocity and position are updated using

V_id^new(t + 1) = c_0 V_id^old(t) + c_1 rand() (p_id(t) − x_id^old(t)) + c_2 rand() (p_gd(t) − x_id^old(t))   (13)

x_id^new(t + 1) = x_id^old(t) + V_id^new(t + 1)   (14)

where c_1 and c_2 are two positive constants, c_0 is an inertia weight and rand() is a random value inside (0, 1). Eberhart and Shi (2001) and Hu and Eberhart (2001) suggested c_1 = c_2 = 2 and c_0 = [0.5 + (rand()/2.0)]. Equation (13) illustrates the calculation of a new velocity for each individual. The velocity of each particle is updated according to its previous velocity (V_id), the particle's previous best location (p_id) and the global best location (p_gd). Particle velocities on each dimension are clamped to a maximum velocity V_max, which is a fraction of the search range on each dimension.
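A minimal sketch of the velocity and position updates of Equations (13) and (14), including the V_max clamp described above; the function name and the default parameter values (other than c_1 = c_2 = 2) are illustrative assumptions, not taken from the article.

```python
import random

def pso_update(x, v, p_best, g_best, c0=0.7, c1=2.0, c2=2.0, v_max=1.0):
    """One PSO step per Equations (13)-(14): inertia + cognitive + social terms,
    with each velocity component clamped to [-v_max, v_max]."""
    v_new = []
    for xi, vi, pb, gb in zip(x, v, p_best, g_best):
        vn = (c0 * vi
              + c1 * random.random() * (pb - xi)   # pull toward own best p_id
              + c2 * random.random() * (gb - xi))  # pull toward global best p_gd
        v_new.append(max(-v_max, min(v_max, vn)))  # clamp to maximum velocity V_max
    x_new = [xi + vn for xi, vn in zip(x, v_new)]  # Equation (14)
    return x_new, v_new
```

A particle at 0 with both attractors at 1 is pulled upward and, with a tight clamp, moves by exactly v_max.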
Equation (14) shows how each particle's position is updated in the search space.

3.3. Hybrid NM-PSO method

The embedding of the constraint handling methods in NM-PSO is now presented. The population size of this hybrid NM-PSO approach is set at 21N + 1 when solving an N-dimensional problem. The initial population is randomly generated in the problem search space. For every particle that violates the constraints, the gradient repair method is used to direct the infeasible solution toward the feasible region. In most cases, the repair method does move the solution to the feasible region. However, there may be times when the repair method fails; in such cases, the infeasible solutions are left as they are and the process continues with the next step. The constraint fitness priority-based ranking method is then adopted to evaluate the constraint fitness value of each particle and rank the particles according to their constraint fitness values. The one with the greatest constraint fitness value computed by Equation (8) is placed at the top. If two particles have the same constraint fitness value, then their objective fitness values are compared in order to determine their relative position; the one with the better objective fitness value is positioned in front of the other. The top N + 1 particles are then fed into the NM method to improve the (N + 1)th particle. The PSO method adjusts the other 20N particles by taking into account the positions of the N + 1 best particles. This procedure for adjusting the 20N particles involves selection of the global best particle, selection of the neighbourhood best particles and, finally, velocity updates. The global best particle of the population is determined according to the sorted fitness values. Next, the 20N particles are evenly divided into 10N neighbourhoods with two particles per neighbourhood. The particle with the better fitness value in each neighbourhood is denoted as the neighbourhood

best particle. By Equations (13) and (14), a velocity update for each of the 20N particles is then carried out. The 21N + 1 particles are sorted in preparation for repeating the entire run. The process terminates when a certain convergence criterion is satisfied.

Figure 1. The hybrid NM-PSO algorithm embedded with constraint handling methods.

Figure 1 summarizes the algorithm of NM-PSO. For further details of the hybrid NM-PSO for unconstrained optimization problems, see Fan and Zahara (2007).

4. Experimental results

In this study, the proposed NM-PSO was applied to solve 13 benchmark problems studied by Becerra and Coello (2006). This test suite is a set of difficult nonlinear functions for constrained optimization and includes four maximization problems: G02, G03, G08 and G12. The characteristics of these problems, including the number of decision variables (n), the type of objective function, the size of the feasible region (k), the numbers of linear inequalities (LI), nonlinear equalities (NE) and nonlinear inequalities (NI), and the number of active constraints at the optimum (a), are summarized in Table 1. The size of the feasible region, empirically determined by simulation (Koziel and Michalewicz 1999), indicates the degree of difficulty of randomly generating a feasible solution: the smaller the size, the higher the degree of difficulty. The forms of the 13 problems are described completely in Appendix 1. The algorithm is coded in Matlab 7.0 and the simulations were run on a Pentium IV 2.4 GHz with 512 MB of memory.

Table 1. Characteristics of benchmark problems (columns n and type of f(x); the k (%), LI, NE, NI and a columns are described in the text).

Problem  n   Type of f(x)
G01      13  Quadratic
G02      20  Nonlinear
G03      10  Polynomial
G04      5   Quadratic
G05      4   Cubic
G06      2   Cubic
G07      10  Quadratic
G08      2   Nonlinear
G09      7   Polynomial
G10      8   Linear
G11      2   Quadratic
G12      3   Quadratic
G13      5   Exponential

During the simulation, it was found that the population size has a significant effect on the performance of NM-PSO. Table 2 shows the performance of NM-PSO with different population sizes for test problems G01, G04, G05 and G08, and it is clearly seen that for the population sizes 21N + 1 and 31N + 1 the same performance is achieved, with better solutions than for 11N + 1. Therefore, a population size of 21N + 1 was employed in the NM-PSO algorithm instead of 31N + 1 to decrease the number of function evaluations. The task of optimizing each of the test functions was executed 20 times by NM-PSO, and the search process was iterated up to 5000 times, until the reference or a better solution was found. Table 3 contains a summary of the execution results of NM-PSO, and includes the worst and the best solutions obtained, the means, the medians and the standard deviations of the solutions, the average number of function evaluations, the average number of iterations and the average computation time. For the sake of comparison, the table also gives the reference optimal values. In every problem, the best solutions are almost equal to the reference optimal solutions. For problems G05 and G10, the optimal solutions are better than the reference. For problems G01, G03, G04, G05, G06, G08, G09, G11 and G12, optimal solutions were found consistently in all 20 runs. For problems G02 and G13, the optimal solutions were not consistently found. Finally, for problems G07 and G10, quite satisfactory global optima were found. In terms of the efficiency of NM-PSO, Table 3 shows that more than half of the test problems converged in less than 300 iterations (or, equivalently, less than 50 s) and problems G09 and G13 converged in less than 2800 iterations (less than 240 s).
However, test problems G02, G07 and G10 never completely converged; they all stopped at the pre-set maximum number of iterations (5000) with a high standard deviation, signifying that the solutions found thus far are still quite far apart.

These 13 benchmark problems were solved by NM-PSO and the solutions are now compared with those obtained by applying the cultural differential evolution (CDE) algorithm (Becerra and Coello 2006), the filter simulated annealing (FSA) method (Hedar and Fukushima 2006), the genetic algorithm (GA) (Chootinan and Chen 2006) and the evolutionary algorithm (EA) (Runarsson and Yao 2005). Table 4 lists the results of solving these problems by the various methods using 20 runs, in terms of the best, the mean and the worst solutions, the standard deviation and the average number of function evaluations of each method. Solutions to problems G12 and G13 by GA are not readily available and therefore are not included in the table. Better values in each category are highlighted in bold type.

Table 2. The effect of population size on NM-PSO (best, mean, worst and standard deviation for G01, G04, G05 and G08 at population sizes 11N + 1, 21N + 1 and 31N + 1).

Table 3. Results of test problems using NM-PSO (reference optimal value; best, median, mean and worst objective values; standard deviation; average number of function evaluations; average number of iterations; average computation time in seconds).

Table 4. Comparison results of test problems with CDE, FSA, GA and EA (best, mean and worst objective values; standard deviation; average number of function evaluations for each algorithm and for NM-PSO).

Upon examining the results in Table 4, it is seen that NM-PSO consistently found the global optimal solutions in 10 test problems out of the 13, while the other methods found the optimum for at most 8 test problems. In terms of computational cost, NM-PSO outperformed the other four methods in 8 problems out of the 13. Considering both solution quality and computational effort, NM-PSO is superior to the CDE method in 10 test problems out of 13 and surpasses the other three methods in all 13 problems, making NM-PSO a clear winner. The superiority of NM-PSO over the other methods can be attributed to two factors. (1) NM-PSO can effectively reach a global optimum because of the mechanism of the gradient repair method, which ensures that no infeasible intermediate solution will ever enter consideration and corrupt the solution quality. The gradient repair mechanism also helps transform a constrained problem into an unconstrained one once all constraints have been satisfied. (2) The hybrid NM-PSO algorithm is a proven technique (Fan and Zahara 2007) for efficiently solving unconstrained optimization problems. Additionally, NM-PSO is easy to use because it does not call for the selection of an appropriate penalty value for every test problem. It can therefore be concluded that NM-PSO is a robust and suitable tool for solving COPs.

The convergence behaviour of NM-PSO falls into two general patterns: the repair method either moves an infeasible solution into the feasible region the first time the method is invoked, or it takes several iterations for the method to be effective. Solving test problems G01 and G06 provides more insight into the convergence behaviour of NM-PSO.
Figures 2 and 3 depict the relationship between the number of iterations and the values of the constraint fitness and objective fitness functions. Figure 2(a) shows the first pattern, in which the constraint fitness value reaches 1 after the repair procedure is called for the first time. Figure 2(b) shows that the objective fitness value keeps decreasing for minimization problem G01 until NM-PSO reaches the global optimum after 40 iterations. (In the case of a maximization problem, the objective fitness value would continue to increase.) Figure 3(a) shows the second pattern, in which the constraint fitness value does not reach 1 until the repair method has been called for the eighth iteration. Before this happens, the objective fitness value oscillates, as seen in Figure 3(b). This means that only once the constraint fitness function reaches 1 does the intermediate solution satisfy all the constraints and the objective value begin to decrease steadily. Figures 2 and 3 convey one important message: the repair method helps set NM-PSO on the right course of optimization, and once infeasible solutions have been repaired and all solutions satisfy the constraints, NM-PSO can proceed quickly to the global optimum.

Figure 2. (a) Relation between iteration and constraint fitness function for G01. (b) Relation between iteration and objective fitness function for G01.

It can be concluded that empowering the NM-PSO algorithm with constraint handling methods is a recommended technique for solving COPs.

5. Application to a ceramic grinding optimization problem

Many engineering applications involve the use of advanced structural ceramics such as silicon carbide (SiC), silicon nitride, alumina or partially stabilized zirconia. These materials feature high strength at elevated temperatures, resistance to chemical degradation, wear resistance and low density. The effective use of these structural ceramics in functional applications demands the grinding of ceramic components with a good surface finish and low surface damage. The main concern about the use of ceramics in industry is the complexity involved in machining because of

Figure 3. (a) Relation between iteration and constraint fitness function for G06. (b) Relation between iteration and objective fitness function for G06.

their high hardness and low fracture toughness. Therefore, optimization in ceramic processing and manufacturing technology is necessary for the commercialization of ceramic use (Lee et al. 2007). A model available in the literature (Gopal and Rao 2003) is adopted. The objective function of the optimization problem of the ceramic grinding process can be described as maximizing the material removal rate (MRR) subject to a set of constraints comprising surface roughness, number of flaws and bounds on the input variables. The problem is a COP and is formulated as the objective function:

Max MRR = f d_c
subject to R_a ≤ R_a(max), R_a = 0.145 d_c f M
           N_c ≤ N_c(max), N_c = 29.67 d_c f
           d_c ≤ 30 μm
           8.6 m/min ≤ f ≤ 13.4 m/min
           120 ≤ M ≤ 500   (15)
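A sketch of how the COP of Equation (15) could be set up for a solver follows. The exponent tuples p and q in the power-law roughness and flaw models are hypothetical placeholders introduced for illustration; they are not the fitted model of Gopal and Rao (2003), and the variable bounds would additionally be handled as box constraints by the solver.

```python
def make_grinding_cop(Ra_max, Nc_max, p=(1.0, 1.0, 1.0), q=(1.0, 1.0)):
    """Ceramic grinding COP sketch per Equation (15); x = (d_c, f, M).
    p and q are hypothetical exponents, not the fitted values."""
    def mrr(x):
        d_c, f, M = x
        return f * d_c                 # material removal rate, to be maximized
    def constraints(x):                # returned in g(x) <= 0 form
        d_c, f, M = x
        Ra = 0.145 * d_c ** p[0] * f ** p[1] * M ** p[2]  # surface roughness model
        Nc = 29.67 * d_c ** q[0] * f ** q[1]              # number-of-flaws model
        return [Ra - Ra_max, Nc - Nc_max]
    return mrr, constraints
```

A solver such as the NM-PSO of Section 3 would then maximize mrr over (d_c, f, M) within the stated bounds while keeping both constraint values non-positive.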

where f is the table feed rate (m/min), d_c is the depth of cut (μm), R_a(max) and N_c(max) are the maximum allowable values of the surface roughness and the number of flaws, respectively, and M is the grit size. Therefore, this is a three-dimensional problem whereby the values of d_c, f and M are determined by NM-PSO. The NM-PSO algorithm was run for 100 trials on the same problem with a total of 9 different combinations of constraints; the search process was iterated 100 times and the results are recorded in Table 5.

Table 5. Optimization results for the ceramic grinding optimization problem (constraints R_a(max) and N_c(max); values of the variables d_c, f and M; values of the constraints R_a and N_c; best MRR).

Table 6. Statistical comparison between GA, PSO and NM-PSO (best, mean and worst MRR; standard deviation; number of function evaluations).

In all 9 cases, no violation of any constraint was observed. This means

16 Engineering Optimization 1045 that NM-PSO is efficient and practical for this engineering problem. Thus, it will be more advantageous to grind SiC at a depth cut of d c = μm and feed rate f = m/min with a grit size M = mesh. For a higher range of the number of flaws, an increase of 30 to 60% in MRR can be observed by increasing the roughness value from 0.3 to 0.4 μm. This can be a good approach to achieve the specified MRR with constraints on surface roughness and surface damage. To prove the superiority and robustness of NM-PSO, the results are compared with those of GA and PSO (Gopal and Rao 2003) in Table 6. It can be observed that NM-PSO found better MRR solutions forn c(max) equal to 9 or 11. For N c(max) equal to 7, the solution is the same as that of PSO, but NM-PSO requires less function evaluations to reach the global optimum. The standard deviations associated with NM-PSO are quite small, signifying that NM-PSO converges with a high degree of accuracy. 6. Conclusions In this article, NM-PSO with embedded constraint handling methods is proposed for solving COPs. Experimental results clearly illustrate the attractiveness of this method by handling several types of constraints. It can produce competitive, if not better, solutions as compared to CDE, FSA and GA and appears to be a promising constraint handling technique. As an application, NM-PSO was employed to solve the optimization problem of silicon carbide grinding and it was proved that NM-PSO is superior in comparison with GA and PSO. In the future, NM-PSO will be applied to various real-world problems. Additionally, it would be desirable to explore the benefits of using a belief space in other types of problems. The authors are also interested in extending the approach to deal with multi-objective problems. References Becerra, R.L. and Coello, C.A., Cultured differential evolution for constrained optimization. Computer Methods in Applied Mechanics and Engineering, 195 (33 36), Chootinan, P. 
and Chen, A., Constraint handling in genetic algorithms using a gradient-based repair method. Computers and Operations Research, 33 (8).
Coello Coello, C.A., Theoretical and numerical constraint-handling techniques used with evolutionary algorithms: a survey of the state of the art. Computer Methods in Applied Mechanics and Engineering, 191 (11–12).
Dong, Y., Tang, J.F., Xu, B.D., and Wang, D., An application of swarm optimization to nonlinear programming. Computers and Mathematics with Applications, 49 (11–12).
Eberhart, R.C. and Kennedy, J., A new optimizer using particle swarm theory. Proceedings of the Sixth International Symposium on Micro Machine and Human Science, Nagoya, Japan, 4–6 October, 39–43, IEEE Service Center.
Eberhart, R.C. and Shi, Y., Tracking and optimizing dynamic systems with particle swarms. Proceedings of the Congress on Evolutionary Computation, Seoul, Korea, May, 94–97, IEEE Service Center.
Fan, S.K. and Zahara, E., A hybrid simplex search and particle swarm optimization for unconstrained optimization. European Journal of Operational Research, 181 (2).
Farmani, R. and Wright, J.A., Self-adaptive fitness formulation for constrained optimization. IEEE Transactions on Evolutionary Computation, 7 (5).
Fung, R.Y.K., Tang, J.F., and Wang, D., Extension of a hybrid genetic algorithm for nonlinear programming problems with equality and inequality constraints. Computers and Operations Research, 29 (3).
Gopal, A.V. and Rao, P.V., The optimization of the grinding of silicon carbide with diamond wheels using genetic algorithms. International Journal of Advanced Manufacturing Technology, 22 (7–8).
Hedar, A.D. and Fukushima, M., Derivative-free filter simulated annealing method for constrained continuous global optimization. Journal of Global Optimization, 35 (4).
Hu, X. and Eberhart, R.C., Tracking dynamic systems with PSO: where's the cheese?
Proceedings of the Workshop on Particle Swarm Optimization, Indianapolis, IN, USA, Purdue School of Engineering and Technology.
Koziel, S. and Michalewicz, Z., Evolutionary algorithms, homomorphous mappings, and constrained parameter optimization. Evolutionary Computation, 7 (1).

Lee, T.S., Ting, T.O., Lin, Y.J., and Htay, T., A particle swarm approach for grinding process optimization analysis. International Journal of Advanced Manufacturing Technology, 33 (11–12).
Nash, S.G. and Sofer, A., Linear and Nonlinear Programming. New York: McGraw-Hill.
Nelder, J.A. and Mead, R., A simplex method for function minimization. Computer Journal, 7.
Runarsson, T.P. and Yao, X., Search biases in constrained evolutionary optimization. IEEE Transactions on Systems, Man and Cybernetics, 35 (2).
Tang, J.F., Wang, D.W., Ip, A., and Fung, R.Y.K., A hybrid genetic algorithm for a type of non-linear programming problems. Computers and Mathematics with Applications, 36 (5).

Appendix 1: Benchmark problems

G01 Minimize f(x) = 5 Σ_{i=1..4} x_i − 5 Σ_{i=1..4} x_i² − Σ_{i=5..13} x_i
subject to
g_1(x) = 2x_1 + 2x_2 + x_10 + x_11 − 10 ≤ 0
g_2(x) = 2x_1 + 2x_3 + x_10 + x_12 − 10 ≤ 0
g_3(x) = 2x_2 + 2x_3 + x_11 + x_12 − 10 ≤ 0
g_4(x) = −8x_1 + x_10 ≤ 0
g_5(x) = −8x_2 + x_11 ≤ 0
g_6(x) = −8x_3 + x_12 ≤ 0
g_7(x) = −2x_4 − x_5 + x_10 ≤ 0
g_8(x) = −2x_6 − x_7 + x_11 ≤ 0
g_9(x) = −2x_8 − x_9 + x_12 ≤ 0
0 ≤ x_i ≤ 1, i = 1, 2, ..., 9; 0 ≤ x_i ≤ 100, i = 10, 11, 12; 0 ≤ x_13 ≤ 1

G02 Maximize f(x) = | (Σ_{i=1..n} cos⁴(x_i) − 2 Π_{i=1..n} cos²(x_i)) / √(Σ_{i=1..n} i·x_i²) |
subject to
g_1(x) = 0.75 − Π_{i=1..n} x_i ≤ 0
g_2(x) = Σ_{i=1..n} x_i − 7.5n ≤ 0
n = 20; 0 ≤ x_i ≤ 10, i = 1, ..., n

G03 Maximize f(x) = (√n)^n Π_{i=1..n} x_i
subject to
h_1(x) = Σ_{i=1..n} x_i² − 1 = 0
n = 10; 0 ≤ x_i ≤ 1, i = 1, ..., n
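Benchmark definitions like those in Appendix 1 translate directly into code. The Python sketch below (not part of the original article) evaluates G01 in its standard published form and checks feasibility at the well-known optimum x* = (1, 1, 1, 1, 1, 1, 1, 1, 1, 3, 3, 3, 1), where f* = −15; indexing is zero-based.

```python
import numpy as np

def g01(x):
    """Objective and inequality constraints of benchmark G01 (13 variables).

    Returns (f, g); x is feasible when every component of g is <= 0.
    """
    x = np.asarray(x, dtype=float)
    f = 5 * x[:4].sum() - 5 * (x[:4] ** 2).sum() - x[4:13].sum()
    g = np.array([
        2*x[0] + 2*x[1] + x[9] + x[10] - 10,   # g1
        2*x[0] + 2*x[2] + x[9] + x[11] - 10,   # g2
        2*x[1] + 2*x[2] + x[10] + x[11] - 10,  # g3
        -8*x[0] + x[9],                        # g4
        -8*x[1] + x[10],                       # g5
        -8*x[2] + x[11],                       # g6
        -2*x[3] - x[4] + x[9],                 # g7
        -2*x[5] - x[6] + x[10],                # g8
        -2*x[7] - x[8] + x[11],                # g9
    ])
    return f, g

# Known optimum: f* = -15 at x* = (1,...,1, 3, 3, 3, 1).
x_star = [1.0] * 9 + [3.0, 3.0, 3.0, 1.0]
f, g = g01(x_star)
print(f)                   # -15.0
print(np.all(g <= 1e-9))   # True: all constraints satisfied
```

The same pattern extends to the remaining benchmarks, with equality constraints usually relaxed to |h(x)| ≤ ε, where ε = 10⁻⁴ is a common choice in this literature.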

G04 Minimize f(x) = 5.3578547x_3² + 0.8356891x_1x_5 + 37.293239x_1 − 40792.141
subject to
g_1(x) = 85.334407 + 0.0056858x_2x_5 + 0.0006262x_1x_4 − 0.0022053x_3x_5 − 92 ≤ 0
g_2(x) = −85.334407 − 0.0056858x_2x_5 − 0.0006262x_1x_4 + 0.0022053x_3x_5 ≤ 0
g_3(x) = 80.51249 + 0.0071317x_2x_5 + 0.0029955x_1x_2 + 0.0021813x_3² − 110 ≤ 0
g_4(x) = −80.51249 − 0.0071317x_2x_5 − 0.0029955x_1x_2 − 0.0021813x_3² + 90 ≤ 0
g_5(x) = 9.300961 + 0.0047026x_3x_5 + 0.0012547x_1x_3 + 0.0019085x_3x_4 − 25 ≤ 0
g_6(x) = −9.300961 − 0.0047026x_3x_5 − 0.0012547x_1x_3 − 0.0019085x_3x_4 + 20 ≤ 0
78 ≤ x_1 ≤ 102; 33 ≤ x_2 ≤ 45; 27 ≤ x_i ≤ 45, i = 3, 4, 5

G05 Minimize f(x) = 3x_1 + 0.000001x_1³ + 2x_2 + (0.000002/3)x_2³
subject to
g_1(x) = −x_4 + x_3 − 0.55 ≤ 0
g_2(x) = −x_3 + x_4 − 0.55 ≤ 0
h_3(x) = 1000 sin(−x_3 − 0.25) + 1000 sin(−x_4 − 0.25) + 894.8 − x_1 = 0
h_4(x) = 1000 sin(x_3 − 0.25) + 1000 sin(x_3 − x_4 − 0.25) + 894.8 − x_2 = 0
h_5(x) = 1000 sin(x_4 − 0.25) + 1000 sin(x_4 − x_3 − 0.25) + 1294.8 = 0
0 ≤ x_1 ≤ 1200; 0 ≤ x_2 ≤ 1200; −0.55 ≤ x_i ≤ 0.55, i = 3, 4

G06 Minimize f(x) = (x_1 − 10)³ + (x_2 − 20)³
subject to
g_1(x) = −(x_1 − 5)² − (x_2 − 5)² + 100 ≤ 0
g_2(x) = (x_1 − 6)² + (x_2 − 5)² − 82.81 ≤ 0
13 ≤ x_1 ≤ 100; 0 ≤ x_2 ≤ 100

G07 Minimize f(x) = x_1² + x_2² + x_1x_2 − 14x_1 − 16x_2 + (x_3 − 10)² + 4(x_4 − 5)² + (x_5 − 3)² + 2(x_6 − 1)² + 5x_7² + 7(x_8 − 11)² + 2(x_9 − 10)² + (x_10 − 7)² + 45
subject to
g_1(x) = −105 + 4x_1 + 5x_2 − 3x_7 + 9x_8 ≤ 0
g_2(x) = 10x_1 − 8x_2 − 17x_7 + 2x_8 ≤ 0
g_3(x) = −8x_1 + 2x_2 + 5x_9 − 2x_10 − 12 ≤ 0
g_4(x) = 3(x_1 − 2)² + 4(x_2 − 3)² + 2x_3² − 7x_4 − 120 ≤ 0
g_5(x) = 5x_1² + 8x_2 + (x_3 − 6)² − 2x_4 − 40 ≤ 0
g_6(x) = x_1² + 2(x_2 − 2)² − 2x_1x_2 + 14x_5 − 6x_6 ≤ 0
g_7(x) = 0.5(x_1 − 8)² + 2(x_2 − 4)² + 3x_5² − x_6 − 30 ≤ 0
g_8(x) = −3x_1 + 6x_2 + 12(x_9 − 8)² − 7x_10 ≤ 0
−10 ≤ x_i ≤ 10, i = 1, ..., 10
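Many constraint-handling schemes, including ranking-based methods like the constraint fitness priority-based ranking embedded in NM-PSO, reduce to comparing candidates by feasibility first, then by total constraint violation, then by objective value. The sketch below is a generic illustration of such a rule applied to G06; it is not the article's exact formulation.

```python
def total_violation(g_values, h_values=(), tol=1e-4):
    """Sum of violations for g_i(x) <= 0 and h_j(x) = 0 (relaxed by tol)."""
    v = sum(max(0.0, g) for g in g_values)
    v += sum(max(0.0, abs(h) - tol) for h in h_values)
    return v

def better(a, b):
    """Lexicographic rule for minimization: each argument is (f, violation).

    A feasible point beats an infeasible one; among infeasible points the
    smaller violation wins; among feasible points the smaller f wins."""
    fa, va = a
    fb, vb = b
    if va == 0 and vb == 0:
        return fa < fb
    if va == 0 or vb == 0:
        return va == 0
    return va < vb

def g06(x1, x2):
    """Objective and total constraint violation of benchmark G06."""
    f = (x1 - 10)**3 + (x2 - 20)**3
    g = [-(x1 - 5)**2 - (x2 - 5)**2 + 100,
         (x1 - 6)**2 + (x2 - 5)**2 - 82.81]
    return f, total_violation(g)

# The reported optimum of G06 lies near x = (14.095, 0.84296);
# (10, 10) violates g1 badly, so the near-optimal point ranks higher.
print(better(g06(14.095, 0.84296), g06(10.0, 10.0)))  # True
```

Under such a rule a feasible particle always dominates an infeasible one, which is what drives the swarm into the feasible region before the objective takes over.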

G08 Maximize f(x) = sin³(2πx_1) sin(2πx_2) / (x_1³ (x_1 + x_2))
subject to
g_1(x) = x_1² − x_2 + 1 ≤ 0
g_2(x) = 1 − x_1 + (x_2 − 4)² ≤ 0
0 ≤ x_i ≤ 10, i = 1, 2

G09 Minimize f(x) = (x_1 − 10)² + 5(x_2 − 12)² + x_3⁴ + 3(x_4 − 11)² + 10x_5⁶ + 7x_6² + x_7⁴ − 4x_6x_7 − 10x_6 − 8x_7
subject to
g_1(x) = −127 + 2x_1² + 3x_2⁴ + x_3 + 4x_4² + 5x_5 ≤ 0
g_2(x) = −282 + 7x_1 + 3x_2 + 10x_3² + x_4 − x_5 ≤ 0
g_3(x) = −196 + 23x_1 + x_2² + 6x_6² − 8x_7 ≤ 0
g_4(x) = 4x_1² + x_2² − 3x_1x_2 + 2x_3² + 5x_6 − 11x_7 ≤ 0
−10 ≤ x_i ≤ 10, i = 1, ..., 7

G10 Minimize f(x) = x_1 + x_2 + x_3
subject to
g_1(x) = −1 + 0.0025(x_4 + x_6) ≤ 0
g_2(x) = −1 + 0.0025(x_5 + x_7 − x_4) ≤ 0
g_3(x) = −1 + 0.01(x_8 − x_5) ≤ 0
g_4(x) = −x_1x_6 + 833.33252x_4 + 100x_1 − 83333.333 ≤ 0
g_5(x) = −x_2x_7 + 1250x_5 + x_2x_4 − 1250x_4 ≤ 0
g_6(x) = −x_3x_8 + 1250000 + x_3x_5 − 2500x_5 ≤ 0
100 ≤ x_1 ≤ 10000; 1000 ≤ x_i ≤ 10000, i = 2, 3; 10 ≤ x_i ≤ 1000, i = 4, ..., 8

G11 Minimize f(x) = x_1² + (x_2 − 1)²
subject to
h(x) = x_2 − x_1² = 0
−1 ≤ x_i ≤ 1, i = 1, 2

G12 Maximize f(x) = (100 − (x_1 − 5)² − (x_2 − 5)² − (x_3 − 5)²)/100
subject to
g(x) = (x_1 − p)² + (x_2 − q)² + (x_3 − r)² − 0.0625 ≤ 0
0 ≤ x_i ≤ 10, i = 1, 2, 3; p, q, r = 1, 2, ..., 9
(a point is feasible if the constraint holds for at least one combination of p, q and r)

G13 Minimize f(x) = e^(x_1 x_2 x_3 x_4 x_5)
subject to
h_1(x) = x_1² + x_2² + x_3² + x_4² + x_5² − 10 = 0
h_2(x) = x_2x_3 − 5x_4x_5 = 0
h_3(x) = x_1³ + x_2³ + 1 = 0
−2.3 ≤ x_i ≤ 2.3, i = 1, 2; −3.2 ≤ x_i ≤ 3.2, i = 3, 4, 5
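The other embedded operator, the gradient repair method of Chootinan and Chen, uses gradient information from the violated constraints and a Moore-Penrose pseudo-inverse to step an infeasible point toward feasibility. The loop below is a minimal single-constraint sketch of that idea (not the article's implementation), applied to the equality constraint of G11, h(x) = x_2 − x_1² = 0.

```python
import numpy as np

def repair_g11(x, tol=1e-8, max_iter=50):
    """Gradient-based repair sketch for the equality constraint of G11.

    Repeatedly applies the minimum-norm correction dx = -pinv(J) @ h,
    where J is the 1x2 Jacobian of h(x) = x2 - x1**2, until |h| <= tol."""
    x = np.asarray(x, dtype=float).copy()
    for _ in range(max_iter):
        h = x[1] - x[0] ** 2
        if abs(h) <= tol:
            break
        J = np.array([[-2 * x[0], 1.0]])      # dh/dx1, dh/dx2
        x += -np.linalg.pinv(J) @ np.array([h])
        x = np.clip(x, -1.0, 1.0)             # stay inside the G11 bounds
    return x

x = repair_g11([0.8, 0.2])
print(abs(x[1] - x[0] ** 2) < 1e-6)  # True: repaired point satisfies h ~ 0
```

With several violated constraints, J becomes their stacked Jacobian and pinv(J) still yields the minimum-norm correction, which is what makes the method attractive inside a swarm where repairs happen every iteration.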


More information

Data Mining Chapter 8: Search and Optimization Methods Fall 2011 Ming Li Department of Computer Science and Technology Nanjing University

Data Mining Chapter 8: Search and Optimization Methods Fall 2011 Ming Li Department of Computer Science and Technology Nanjing University Data Mining Chapter 8: Search and Optimization Methods Fall 2011 Ming Li Department of Computer Science and Technology Nanjing University Search & Optimization Search and Optimization method deals with

More information

HYBRID GENETIC ALGORITHM WITH GREAT DELUGE TO SOLVE CONSTRAINED OPTIMIZATION PROBLEMS

HYBRID GENETIC ALGORITHM WITH GREAT DELUGE TO SOLVE CONSTRAINED OPTIMIZATION PROBLEMS HYBRID GENETIC ALGORITHM WITH GREAT DELUGE TO SOLVE CONSTRAINED OPTIMIZATION PROBLEMS NABEEL AL-MILLI Financial and Business Administration and Computer Science Department Zarqa University College Al-Balqa'

More information

An Evolutionary Algorithm for the Multi-objective Shortest Path Problem

An Evolutionary Algorithm for the Multi-objective Shortest Path Problem An Evolutionary Algorithm for the Multi-objective Shortest Path Problem Fangguo He Huan Qi Qiong Fan Institute of Systems Engineering, Huazhong University of Science & Technology, Wuhan 430074, P. R. China

More information

DETERMINING MAXIMUM/MINIMUM VALUES FOR TWO- DIMENTIONAL MATHMATICLE FUNCTIONS USING RANDOM CREOSSOVER TECHNIQUES

DETERMINING MAXIMUM/MINIMUM VALUES FOR TWO- DIMENTIONAL MATHMATICLE FUNCTIONS USING RANDOM CREOSSOVER TECHNIQUES DETERMINING MAXIMUM/MINIMUM VALUES FOR TWO- DIMENTIONAL MATHMATICLE FUNCTIONS USING RANDOM CREOSSOVER TECHNIQUES SHIHADEH ALQRAINY. Department of Software Engineering, Albalqa Applied University. E-mail:

More information

Unidimensional Search for solving continuous high-dimensional optimization problems

Unidimensional Search for solving continuous high-dimensional optimization problems 2009 Ninth International Conference on Intelligent Systems Design and Applications Unidimensional Search for solving continuous high-dimensional optimization problems Vincent Gardeux, Rachid Chelouah,

More information

PARALLELIZATION OF THE NELDER-MEAD SIMPLEX ALGORITHM

PARALLELIZATION OF THE NELDER-MEAD SIMPLEX ALGORITHM PARALLELIZATION OF THE NELDER-MEAD SIMPLEX ALGORITHM Scott Wu Montgomery Blair High School Silver Spring, Maryland Paul Kienzle Center for Neutron Research, National Institute of Standards and Technology

More information

COMPARISON OF ALGORITHMS FOR NONLINEAR REGRESSION ESTIMATES

COMPARISON OF ALGORITHMS FOR NONLINEAR REGRESSION ESTIMATES COMPSTAT 2004 Symposium c Physica-Verlag/Springer 2004 COMPARISON OF ALGORITHMS FOR NONLINEAR REGRESSION ESTIMATES Tvrdík J. and Křivý I. Key words: Global optimization, evolutionary algorithms, heuristics,

More information

Particle Swarm Optimization Based Approach for Location Area Planning in Cellular Networks

Particle Swarm Optimization Based Approach for Location Area Planning in Cellular Networks International Journal of Intelligent Systems and Applications in Engineering Advanced Technology and Science ISSN:2147-67992147-6799 www.atscience.org/ijisae Original Research Paper Particle Swarm Optimization

More information

International Conference on Modeling and SimulationCoimbatore, August 2007

International Conference on Modeling and SimulationCoimbatore, August 2007 International Conference on Modeling and SimulationCoimbatore, 27-29 August 2007 OPTIMIZATION OF FLOWSHOP SCHEDULING WITH FUZZY DUE DATES USING A HYBRID EVOLUTIONARY ALGORITHM M.S.N.Kiran Kumara, B.B.Biswalb,

More information

Parameter Selection of a Support Vector Machine, Based on a Chaotic Particle Swarm Optimization Algorithm

Parameter Selection of a Support Vector Machine, Based on a Chaotic Particle Swarm Optimization Algorithm BULGARIAN ACADEMY OF SCIENCES CYBERNETICS AND INFORMATION TECHNOLOGIES Volume 5 No 3 Sofia 205 Print ISSN: 3-9702; Online ISSN: 34-408 DOI: 0.55/cait-205-0047 Parameter Selection of a Support Vector Machine

More information

SwarmOps for C# Numeric & Heuristic Optimization Source-Code Library for C# The Manual Revision 3.0

SwarmOps for C# Numeric & Heuristic Optimization Source-Code Library for C# The Manual Revision 3.0 Numeric & Heuristic Optimization Source-Code Library for C# The Manual Revision 3.0 By Magnus Erik Hvass Pedersen January 2011 Copyright 2009-2011, all rights reserved by the author. Please see page 4

More information

CHAPTER 1 INTRODUCTION

CHAPTER 1 INTRODUCTION 1 CHAPTER 1 INTRODUCTION 1.1 OPTIMIZATION OF MACHINING PROCESS AND MACHINING ECONOMICS In a manufacturing industry, machining process is to shape the metal parts by removing unwanted material. During the

More information

Chapter II. Linear Programming

Chapter II. Linear Programming 1 Chapter II Linear Programming 1. Introduction 2. Simplex Method 3. Duality Theory 4. Optimality Conditions 5. Applications (QP & SLP) 6. Sensitivity Analysis 7. Interior Point Methods 1 INTRODUCTION

More information

Cooperative Coevolution using The Brain Storm Optimization Algorithm

Cooperative Coevolution using The Brain Storm Optimization Algorithm Cooperative Coevolution using The Brain Storm Optimization Algorithm Mohammed El-Abd Electrical and Computer Engineering Department American University of Kuwait Email: melabd@auk.edu.kw Abstract The Brain

More information

OPTIMIZATION FOR SURFACE ROUGHNESS, MRR, POWER CONSUMPTION IN TURNING OF EN24 ALLOY STEEL USING GENETIC ALGORITHM

OPTIMIZATION FOR SURFACE ROUGHNESS, MRR, POWER CONSUMPTION IN TURNING OF EN24 ALLOY STEEL USING GENETIC ALGORITHM Int. J. Mech. Eng. & Rob. Res. 2014 M Adinarayana et al., 2014 Research Paper ISSN 2278 0149 www.ijmerr.com Vol. 3, No. 1, January 2014 2014 IJMERR. All Rights Reserved OPTIMIZATION FOR SURFACE ROUGHNESS,

More information

A Hybrid Genetic Pattern Search Augmented Lagrangian Method for Constrained Global Optimization

A Hybrid Genetic Pattern Search Augmented Lagrangian Method for Constrained Global Optimization A Hybrid Genetic Pattern Search Augmented Lagrangian Method for Constrained Global Optimization Lino Costa a, Isabel A.C.P. Espírito Santo a, Edite M.G.P. Fernandes b a Department of Production and Systems

More information

A Genetic Algorithm Based Hybrid Approach for Reliability- Redundancy Optimization Problem of a Series System with Multiple- Choice

A Genetic Algorithm Based Hybrid Approach for Reliability- Redundancy Optimization Problem of a Series System with Multiple- Choice https://dx.doi.org/10.33889/ijmems.017..3-016 A Genetic Algorithm Based Hybrid Approach for Reliability- Redundancy Optimization Problem of a Series System with Multiple- Choice Asoke Kumar Bhunia Department

More information

Modern Methods of Data Analysis - WS 07/08

Modern Methods of Data Analysis - WS 07/08 Modern Methods of Data Analysis Lecture XV (04.02.08) Contents: Function Minimization (see E. Lohrmann & V. Blobel) Optimization Problem Set of n independent variables Sometimes in addition some constraints

More information

A Derivative-Free Filter Driven Multistart Technique for Global Optimization

A Derivative-Free Filter Driven Multistart Technique for Global Optimization A Derivative-Free Filter Driven Multistart Technique for Global Optimization Florbela P. Fernandes 1,3, M. Fernanda P. Costa 2,3, and Edite M. G. P. Fernandes 4 1 Polytechnic Institute of Bragança, ESTiG,

More information

EE 553 Term Project Report Particle Swarm Optimization (PSO) and PSO with Cross-over

EE 553 Term Project Report Particle Swarm Optimization (PSO) and PSO with Cross-over EE Term Project Report Particle Swarm Optimization (PSO) and PSO with Cross-over Emre Uğur February, 00 Abstract In this work, Particle Swarm Optimization (PSO) method is implemented and applied to various

More information

The movement of the dimmer firefly i towards the brighter firefly j in terms of the dimmer one s updated location is determined by the following equat

The movement of the dimmer firefly i towards the brighter firefly j in terms of the dimmer one s updated location is determined by the following equat An Improved Firefly Algorithm for Optimization Problems Amarita Ritthipakdee 1, Arit Thammano, Nol Premasathian 3, and Bunyarit Uyyanonvara 4 Abstract Optimization problem is one of the most difficult

More information

SwarmOps for Matlab. Numeric & Heuristic Optimization Source-Code Library for Matlab The Manual Revision 1.0

SwarmOps for Matlab. Numeric & Heuristic Optimization Source-Code Library for Matlab The Manual Revision 1.0 Numeric & Heuristic Optimization Source-Code Library for Matlab The Manual Revision 1.0 By Magnus Erik Hvass Pedersen November 2010 Copyright 2009-2010, all rights reserved by the author. Please see page

More information

Simple meta-heuristics using the simplex algorithm for non-linear programming

Simple meta-heuristics using the simplex algorithm for non-linear programming Simple meta-heuristics using the simplex algorithm for non-linear programming João Pedro Pedroso Technical Report Series: DCC-2007-6 http://www.dcc.fc.up.pt/pubs/ Departamento de Ciência de Computadores

More information

Solving Economic Load Dispatch Problems in Power Systems using Genetic Algorithm and Particle Swarm Optimization

Solving Economic Load Dispatch Problems in Power Systems using Genetic Algorithm and Particle Swarm Optimization Solving Economic Load Dispatch Problems in Power Systems using Genetic Algorithm and Particle Swarm Optimization Loveleen Kaur 1, Aashish Ranjan 2, S.Chatterji 3, and Amod Kumar 4 1 Asst. Professor, PEC

More information

Parameter Estimation of DC Motor using Adaptive Transfer Function based on Nelder-Mead Optimisation

Parameter Estimation of DC Motor using Adaptive Transfer Function based on Nelder-Mead Optimisation Indonesian Journal of Electrical Engineering and Computer Science Vol. 9, No. 3, March 2018, pp. 696~702 ISSN: 2502-4752, DOI: 10.11591/ijeecs.v9.i3.pp696-702 696 Parameter Estimation of DC Motor using

More information

Benchmark Functions for the CEC 2008 Special Session and Competition on Large Scale Global Optimization

Benchmark Functions for the CEC 2008 Special Session and Competition on Large Scale Global Optimization Benchmark Functions for the CEC 2008 Special Session and Competition on Large Scale Global Optimization K. Tang 1, X. Yao 1, 2, P. N. Suganthan 3, C. MacNish 4, Y. P. Chen 5, C. M. Chen 5, Z. Yang 1 1

More information