CHAPTER 6 ORTHOGONAL PARTICLE SWARM OPTIMIZATION


6.1 INTRODUCTION

Orthogonal arrays are helpful in guiding heuristic algorithms toward good solutions for NP-hard problems. This chapter presents a new variant of PSO named Orthogonal PSO (OPSO) for solving the multiprocessor scheduling problem. The objective of applying the orthogonal concept within the basic PSO algorithm is to enhance its performance on the scheduling problem. The orthogonal concept is used in PSO to generate an initial population of points scattered uniformly over the feasible solution space, so that the algorithm can evenly scan that space to locate good points for further exploration in subsequent iterations.

6.2 ORTHOGONAL DESIGN

An orthogonal array is a fractional factorial array that assures a balanced comparison of levels of any factor or interaction of factors. In the context of experimental design, orthogonal means statistically independent. The array is called orthogonal because all columns can be evaluated independently of one another, and the main effect of one factor does not disturb the estimation of the main effect of another factor. Orthogonal design is applicable to discrete variables but not directly to continuous variables (Yiu-Wing Leung and Yuping Wang 2001).

Before solving an optimization problem, no information about the location of the global optima is known. It is therefore desirable that an algorithm start by exploring points that are scattered evenly over the feasible solution space. In this manner, the algorithm can evenly scan the feasible solution space once to locate the best points for further exploration in subsequent iterations. As the algorithm iterates and improves the quality of the solution, some points may move closer to the global optima. Hence the orthogonal design technique is used to generate the initial population.

Yiu-Wing Leung and Yuping Wang (2001) designed a genetic algorithm called the orthogonal genetic algorithm with quantization for global numerical optimization with continuous variables. A quantization technique is proposed to complement the experimental design method called orthogonal design. The resulting methodology is applied to generate an initial population of points that are uniformly scattered over the feasible solution space. In addition, the quantization technique and the orthogonal design are applied to tailor a new crossover operator that generates a small but representative sample of points as potential offspring. The algorithm was tested on 15 benchmark problems with 30 or 100 dimensions, and the results show that it finds optimal or near-optimal solutions.

Shinn-Ying Ho et al (2004) proposed two intelligent evolutionary algorithms, the Intelligent Evolutionary Algorithm (IEA) and the Intelligent Multiobjective Evolutionary Algorithm (IMOEA), using a novel Intelligent Gene Collector (IGC) to solve single- and multi-objective large parameter optimization problems. IGC is the main phase in the intelligent recombination operator of IEA and IMOEA. Based on orthogonal experimental design, IGC uses a divide-and-conquer approach.
IMOEA utilizes a novel generalized Pareto-based scale-independent fitness function for efficiently finding a set of Pareto-optimal solutions to a multiobjective optimization problem. The IEA and IMOEA algorithms show high performance in solving benchmark functions comprising many parameters, compared with existing evolutionary algorithms.

Li-Sun Shu et al (2004) proposed a novel Orthogonal Simulated Annealing algorithm (OSA) for the optimization of electromagnetic problems. The algorithm employs an intelligent generation mechanism based on orthogonal experimental design (OED). The OED-based intelligent generation mechanism can efficiently generate a good candidate solution for the next step by using a systematic reasoning method instead of the conventional method of random perturbation. The authors claim that the OSA is more efficient in solving parametric optimization problems and in designing optimal electromagnetic devices than some existing optimization methods based on simulated annealing and genetic algorithms.

Jenn-Long Liu and Chao-Chun Chang (2008) proposed an orthogonal momentum-type particle swarm optimization that finds good solutions to global optimization problems, using a delta momentum rule to update the flying velocity of particles and incorporating a Fractional Factorial Design (FFD) via several factorial experiments to determine the best position of particles. This combination is termed the momentum-type PSO with FFD. The momentum-type PSO modifies the velocity-updating equation of the original Kennedy and Eberhart PSO, and the FFD incorporates classical orthogonal arrays into the velocity-updating equation to analyze the best factor associated with the cognitive and social learning terms. Twelve widely used large parameter optimization problems are used to compare the proposed method with the original PSO, the momentum-type PSO, and the original PSO with FFD. Experimental results reveal that the proposed momentum-type PSO

with an FFD algorithm efficiently solves large parameter optimization problems.

Shinn-Ying Ho et al (2008) proposed a novel variant of particle swarm optimization named Orthogonal Particle Swarm Optimization (OPSO) for solving intractable large parameter optimization problems. The standard version of PSO lacks a mechanism for handling high-dimensional vector spaces. The high performance of OPSO arises mainly from a novel move behaviour using an Intelligent Move Mechanism (IMM), which applies orthogonal experimental design to adjust the velocity of each particle using a systematic reasoning method instead of the conventional generate-and-go method. The IMM uses a divide-and-conquer approach to cope with the curse of dimensionality in determining the next move of particles. The OPSO with IMM is more specialized than the PSO and performs well on large-scale parameter optimization problems with few interactions between variables. Further, the OPSO with IMM technique was also tested on the task assignment problem with up to 300 nodes, and the results show that it performs well compared with the normal PSO and GA methods.

6.2.1 Construction of an Orthogonal Array

Different orthogonal arrays are needed for different optimization problems, as mentioned in the literature. In general, when there are N factors and Q levels per factor, there are Q^N combinations. When N and Q are large, it may not be possible to run all Q^N experiments. Therefore, it is desirable to sample a small but representative set of combinations for experimentation. Orthogonal design provides a series of orthogonal arrays for different N and Q. Let L_M(Q^N) be an orthogonal array for N factors and Q levels, where L denotes a Latin square and M is the number of combinations of levels. It has M rows, where every row represents a combination of levels. For convenience, denote L_M(Q^N) = [a_i,j]_MxN, where the j-th factor in the i-th combination has level a_i,j and a_i,j is in {1, 2, ..., Q}. A special case of orthogonal arrays L_M(Q^N) is used, where Q is odd and M = Q^J, with J a positive integer fulfilling equation (6.1):

    N <= (Q^J - 1) / (Q - 1)                                    (6.1)

A simple permutation method is used to construct orthogonal arrays of this class (Yiu-Wing Leung and Yuping Wang 2001). The j-th column of the orthogonal array [a_i,j]_MxN is denoted a_j. The columns a_j for j = 1, 2, (Q^2 - 1)/(Q - 1) + 1, (Q^3 - 1)/(Q - 1) + 1, ..., (Q^(J-1) - 1)/(Q - 1) + 1 are called the basic columns and the others are called the non-basic columns. The basic columns are constructed first, then the non-basic columns. The details are as follows.

Step 1: Construct the basic columns:
    for k = 1 to J do
    begin
        j = (Q^(k-1) - 1)/(Q - 1) + 1;
        for i = 1 to Q^J do
            a_i,j = floor((i - 1)/Q^(J-k)) mod Q;
    end

Step 2: Construct the non-basic columns:
    for k = 2 to J do
    begin
        j = (Q^(k-1) - 1)/(Q - 1) + 1;
        for s = 1 to j - 1 do
            for t = 1 to Q - 1 do
                a_(j + (s-1)(Q-1) + t) = (a_s x t + a_j) mod Q;
    end

Step 3: Increment a_i,j by one for all 1 <= i <= M and 1 <= j <= N. Concatenating all the columns a_j gives the orthogonal array.

In general, the orthogonal array L_M(Q^N) has the following properties:

1) For the factor in any column, every level occurs M/Q times.
2) For the two factors in any two columns, every combination of two levels occurs M/Q^2 times.
3) For the two factors in any two columns, the M combinations contain the following combinations of levels: (1, 1), (1, 2), ..., (1, Q), (2, 1), (2, 2), ..., (2, Q), ..., (Q, 1), (Q, 2), ..., (Q, Q).
4) If any two columns of an orthogonal array are swapped, the resulting array is still an orthogonal array.
5) If some columns are taken away from an orthogonal array, the resulting array is still an orthogonal array with a smaller number of factors.

Consequently, the selected combinations are scattered uniformly over the space of all possible combinations. Orthogonal design is proven to be optimal for additive and quadratic models (Yiu-Wing Leung and Yuping Wang 2001).
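The permutation construction above can be sketched in Python. This is a minimal illustration rather than the thesis's Java implementation; the function and variable names are ours, and Q is assumed to be an odd prime.

```python
def orthogonal_array(Q, J):
    """Build L_M(Q^N) with M = Q**J rows and N = (Q**J - 1)//(Q - 1) columns
    by the permutation method of Leung and Wang (2001). Levels are 1..Q.
    Q is assumed to be an odd prime."""
    M = Q ** J
    N = (Q ** J - 1) // (Q - 1)
    a = [[0] * (N + 1) for _ in range(M + 1)]   # 1-based indices, levels 0..Q-1
    # Step 1: basic columns at j = (Q^(k-1) - 1)/(Q - 1) + 1
    for k in range(1, J + 1):
        j = (Q ** (k - 1) - 1) // (Q - 1) + 1
        for i in range(1, M + 1):
            a[i][j] = ((i - 1) // Q ** (J - k)) % Q
    # Step 2: non-basic columns as modular combinations of earlier columns
    for k in range(2, J + 1):
        j = (Q ** (k - 1) - 1) // (Q - 1) + 1
        for s in range(1, j):
            for t in range(1, Q):
                col = j + (s - 1) * (Q - 1) + t
                for i in range(1, M + 1):
                    a[i][col] = (a[i][s] * t + a[i][j]) % Q
    # Step 3: shift levels from {0,...,Q-1} to {1,...,Q}
    return [[a[i][j] + 1 for j in range(1, N + 1)] for i in range(1, M + 1)]
```

For example, Q = 3 and J = 2 give the familiar L9(3^4) array: 9 rows and 4 columns, with every level appearing 3 times in each column and every pair of levels exactly once in each pair of columns, matching properties 1) and 2) above.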

The selected combinations are thus good representatives of all the possible combinations.

6.3 PROPOSED OPSO ALGORITHM

This chapter proposes, as a first method for task scheduling, the Orthogonal Particle Swarm Optimization technique. The procedure for Orthogonal PSO is as follows:

1. Generate the initial swarm randomly.
2. Construct the orthogonal array for the initial swarm as described in the construction procedure above.
3. Initialize the personal best of each particle and the global best of the entire swarm.
4. Evaluate the initial swarm using the fitness function.
5. Select the personal best and global best of the swarm.
6. Update the velocity and position of each particle using equations (2.1) and (2.2).
7. Obtain the optimal solution of the initial stage.
8. Repeat steps 2 to 7 until the maximum number of iterations is reached.
9. Obtain the optimal solution at the end of the specified iterations.

6.4 PROPOSED PARALLEL OPSO ALGORITHM

The second method proposed is a solution for multiprocessor scheduling using the Asynchronous Orthogonal Particle Swarm Optimization

technique. The Asynchronous PSO performs better than the Synchronous PSO, and the results are justified in Chapter 5. The POPSO is implemented using a master-slave approach. Initially, the orthogonal array particles are generated by the master as described in the construction procedure above. The master processor holds the queue of feasible particles to be sent to the slave processors. The master performs all decision-making processes such as velocity updates, position updates and convergence checks. The slaves perform the function evaluations for the particles sent to them. The tasks performed by the master and slave processors are as follows.

Master processor
1. Initializes all optimization parameters, particle positions and velocities.
2. Holds a queue of orthogonal array particles for the slave processors to evaluate.
3. Updates the particle positions and velocities based on the currently available information.
4. Sends the next particle in the queue to an available slave processor.
5. Receives cost function values from the slave processors.
6. Checks convergence.

Slave processor
1. Receives a particle from the master processor.
2. Evaluates the objective function of the particle sent to it.
3. Sends the cost function value to the master processor.
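The master-slave interaction above can be sketched as follows. This is a minimal illustration, not the thesis's Java implementation: equations (2.1) and (2.2) are assumed to be the standard inertia-weight PSO updates, the fitness function is a stand-in for the scheduling cost, worker threads play the role of slave processors, and all names and parameter values are ours.

```python
import random
from concurrent.futures import ThreadPoolExecutor, as_completed

def fitness(position):
    # Stand-in cost evaluated by a "slave" (e.g. a schedule's cost in the thesis).
    return sum(x * x for x in position)

def async_master_slave_pso(dim=4, swarm=8, iters=20, workers=3, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pbest_cost = [float("inf")] * swarm
    gbest, gbest_cost = pos[0][:], float("inf")
    w, c1, c2 = 0.7, 1.5, 1.5          # assumed inertia-weight PSO constants
    with ThreadPoolExecutor(max_workers=workers) as pool:   # the "slaves"
        for _ in range(iters):
            # Master dispatches the queue of particles for evaluation.
            futures = {pool.submit(fitness, pos[i][:]): i for i in range(swarm)}
            # Asynchronous flavour: the master updates each particle as soon
            # as its cost arrives, using the currently available best values.
            for f in as_completed(futures):
                i, cost = futures[f], f.result()
                if cost < pbest_cost[i]:
                    pbest_cost[i], pbest[i] = cost, pos[i][:]
                if cost < gbest_cost:
                    gbest_cost, gbest = cost, pos[i][:]
                for d in range(dim):    # standard velocity/position updates
                    r1, r2 = rng.random(), rng.random()
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * r1 * (pbest[i][d] - pos[i][d])
                                 + c2 * r2 * (gbest[d] - pos[i][d]))
                    pos[i][d] += vel[i][d]
    return gbest_cost
```

The key design point mirrored here is that the master never waits for a full generation: whichever evaluation finishes first is folded into the swarm state immediately, which is what reduces slave idle time in the POPSO.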

The proposed OPSO and Parallel OPSO techniques are tested on the multiprocessor scheduling problem. Static task scheduling (independent and dependent tasks) and dynamic task scheduling (with and without load balancing) problems are simulated in a Java environment. Benchmark datasets for independent and dynamic task scheduling are taken from Eric Taillard's site. The data for dependent task scheduling are taken from the Standard Task Graph dataset. Two datasets are used for simulation: dataset 1 involves 50 tasks and 20 processors, and dataset 2 involves 100 tasks and 20 processors. The tasks are non-preemptive in nature. The number of iterations and the population size are each taken as twice the number of tasks to be scheduled (Ayed Salman et al 2002). In a heuristic approach, every independent run of a program generates a different solution; thus 20 independent runs are executed, and the average, best and worst solutions are taken for comparison. The topology adopted is the global best topology, in which every particle is connected to every other particle in the search space. The following sections deal with the various types of task scheduling.

6.5 SCHEDULING STATIC INDEPENDENT TASKS

Illustration 1 deals with the scheduling of static tasks which are independent in nature. In this method, the tasks are independent of one another and any task can be executed in any order. The objective function is the same as specified in equation 2.7.
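Equation 2.7 is not reproduced here; as a hedged illustration, one common cost for an independent-task schedule is the completion time of the most heavily loaded processor. The function name and example numbers below are ours.

```python
def schedule_cost(assignment, exec_time, n_proc):
    """Cost of an independent-task schedule: task i runs on processor
    assignment[i]; the cost is the largest total load on any processor."""
    load = [0.0] * n_proc
    for task, proc in enumerate(assignment):
        load[proc] += exec_time[task]
    return max(load)

# Three tasks on two processors: loads are 3 + 2 = 5 and 5, so the cost is 5.
print(schedule_cost([0, 1, 0], [3.0, 5.0, 2.0], 2))  # → 5.0
```

In the PSO formulations of this thesis, a particle's position encodes such an assignment, and this is the quantity the fitness evaluation returns to the (master) optimizer.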

Table 6.1 Convergence time, Best, Worst and Average costs of the OPSO algorithms for the Independent Task Schedule (methods: PSO-VI, OPSO, POPSO; metrics: Best, Worst and Average cost, convergence time in seconds)

Table 6.1 shows that the POPSO outperforms all the other methods tested for multiprocessor scheduling. The Best cost obtained for dataset 1 is 1148 for the OPSO and POPSO methods, whereas it is 1612 for the PSO-VI method. For dataset 2, the Best cost is 2846 for the OPSO and POPSO methods, whereas it is 3928 for the PSO-VI method. The average cost is also improved in the case of the OPSO and POPSO methods. The convergence time of the OPSO is 1.1 times slower than that of the PSO-VI method, but the convergence time of the POPSO method is 2.4 times faster than that of the PSO-VI method because of the parallel asynchronous nature of the algorithm.

Figure 6.1 Best cost for the Independent Task Schedule for 50 tasks and 20 processors for the OPSO and POPSO methods

Figure 6.2 Best cost for the Independent Task Schedule for 100 tasks and 20 processors for the OPSO and POPSO methods

Figures 6.1 and 6.2 illustrate the Best cost obtained for dataset 1 and dataset 2 respectively. There is a significant improvement in the result because of the asynchronous implementation of the Parallel Orthogonal Particle Swarm Optimization algorithm.

Table 6.2 Efficiency Calculation for Independent Task Scheduling

                 (1 - OPSO/PSO-VI) x 100    (1 - POPSO/PSO-VI) x 100
    Data set I          27.04%                      29.42%
    Data set II         27.21%                      29%

In terms of efficiency, the POPSO performs better than all the other methods tested, as illustrated in Table 6.2. When the PSO-VI and OPSO methods are compared, the OPSO method is 27.04% more efficient than the PSO-VI method for 50 tasks and 20 processors, and 27.21% more efficient for 100 tasks and 20 processors.
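The efficiency figures in Table 6.2 follow the formula (1 - cost_new/cost_PSO-VI) x 100. A minimal sketch (the helper name is ours; the illustration uses the Best costs quoted in the text, whereas the tabulated percentages are computed from the average costs, which are not reproduced here):

```python
def efficiency_gain(new_cost, baseline_cost):
    """Percentage improvement of new_cost over baseline_cost,
    as used in the efficiency tables: (1 - new/baseline) x 100."""
    return (1.0 - new_cost / baseline_cost) * 100.0

# Dataset 1 Best costs from the text: OPSO 1148 vs PSO-VI 1612.
print(round(efficiency_gain(1148, 1612), 2))  # → 28.78
```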

When the PSO-VI and POPSO methods are compared, the POPSO method performs better than the PSO-VI method. The POPSO method is 29.42% more efficient than the PSO-VI method for 50 tasks and 20 processors, and 29% more efficient when 100 tasks and 20 processors are involved. Overall, the results show that the POPSO performs better than the OPSO method when applied to the task assignment problem with independent tasks.

6.6 SCHEDULING STATIC DEPENDENT TASKS

Illustration 2 deals with the scheduling of tasks which are dependent in nature. In this method, there is a dependency among the tasks to be scheduled. The dependent tasks should be scheduled sequentially so that the order of dependency is satisfied. The objective of the methodology is to minimize the makespan of the entire schedule.

Table 6.3 Convergence time, Best, Worst and Average costs of the Orthogonal PSO algorithms for the dependent task schedule (methods: PSO-VI, OPSO, POPSO; metrics: Best, Worst and Average cost, convergence time in seconds)

Table 6.3 shows that the performance of the POPSO method is better than that of the OPSO and PSO-VI methods. The Best cost obtained for dataset 1 and dataset 2 is the same for the POPSO and OPSO methods, at 921 and 3542 respectively. But the two methods differ in the convergence

time. The POPSO converges faster (2.4 times) than the PSO-VI method, but the OPSO method converges slower (1.1 times) than the PSO-VI method. The speed-up comes from the asynchronous nature of the parallel algorithm, while the OPSO is slower than the PSO-VI method because of the refinement of the initial population under the orthogonal principle.

Figure 6.3 Best cost for the Dependent Task Schedule for 50 tasks and 20 processors for the OPSO and POPSO methods

Figure 6.4 Best cost for the Dependent Task Schedule for 100 tasks and 20 processors for the OPSO and POPSO methods

Figures 6.3 and 6.4 represent the Best cost achieved for dataset 1 and dataset 2 respectively. The POPSO method is better because of the orthogonal principle combined with the parallel asynchronous concept. In terms of efficiency, the POPSO method outperforms all the other methods tested, as illustrated in Table 6.4. When the PSO-VI and OPSO methods are compared, the OPSO method is 27.16% more efficient than the PSO-VI method for 50 tasks and 20 processors, and 27% more efficient for 100 tasks and 20 processors.

Table 6.4 Efficiency Calculation for Dependent Task Scheduling

                 (1 - OPSO/PSO-VI) x 100    (1 - POPSO/PSO-VI) x 100
    Data set I          27.16%                      29.23%
    Data set II         27%                         29.06%

When the PSO-VI and POPSO methods are compared, the POPSO method performs better than the PSO-VI method. The POPSO method is 29.23% more efficient than the PSO-VI method for 50 tasks and 20 processors, and 29.06% more efficient when 100 tasks and 20 processors are involved. Overall, the results show that the POPSO performs better than the OPSO method when applied to the task assignment problem with dependent tasks.
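The makespan minimized in this section must respect precedence: a task can start only after all its predecessors finish and its processor is free. A minimal sketch of evaluating such a schedule (all names are ours; inter-processor communication delays are ignored for simplicity):

```python
def dependent_makespan(order, assignment, exec_time, preds, n_proc):
    """Makespan of a dependent-task schedule.
    order: tasks listed in a precedence-respecting (topological) order
    assignment[t]: processor of task t; preds[t]: predecessors of task t
    Inter-processor communication delays are ignored in this sketch."""
    free_at = [0.0] * n_proc          # when each processor becomes idle
    finish = {}                       # finish time of each scheduled task
    for t in order:
        # A task starts once its processor is free and all predecessors done.
        start = max([free_at[assignment[t]]] + [finish[p] for p in preds[t]])
        finish[t] = start + exec_time[t]
        free_at[assignment[t]] = finish[t]
    return max(finish.values())

# Task 0 precedes tasks 1 and 2; tasks 0 and 1 share processor 0.
print(dependent_makespan([0, 1, 2], [0, 0, 1], [2.0, 3.0, 1.0],
                         {0: [], 1: [0], 2: [0]}, 2))  # → 5.0
```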

6.7 DYNAMIC TASK SCHEDULING

Illustration 3 deals with tasks which are dynamic in nature. To achieve the minimum cost for the Task Assignment Problem with dynamic task scheduling, the objective function is formulated as represented in equations 2.9 and 2.10. The objective function calculates the total execution time of the set of tasks allocated to each processor.

Table 6.5 Convergence time, Best, Worst and Average costs of the Orthogonal PSO algorithms for the dynamic task schedule (methods: PSO-VI, OPSO, POPSO; metrics: Best, Worst and Average cost, convergence time in seconds)

From Table 6.5, it can be inferred that the Best cost is the same for the OPSO and POPSO methods for dataset 1 and dataset 2, but the POPSO method converges faster than the OPSO method. The convergence time of the POPSO method is 2.4 times faster than that of the PSO-VI method, whereas the OPSO method is 1.1 times slower than the POPSO method. The average cost is improved in both the OPSO and POPSO methods when compared to the PSO-VI method.

Figure 6.5 Best cost for the Dynamic Task Schedule for 50 tasks and 20 processors for the OPSO and POPSO methods

From Figures 6.5 and 6.6 and Table 6.5, it can be inferred that the POPSO outperforms the OPSO and the normal PSO with variable inertia methods. This is because of the combination of the orthogonal and parallel principles in the proposed POPSO algorithm.

Figure 6.6 Best cost for the Dynamic Task Schedule for 100 tasks and 20 processors for the OPSO and POPSO methods

In terms of efficiency, the POPSO method performs better than all the other methods tested. When the PSO-VI and OPSO methods are compared, the OPSO method is 27.22% more efficient than the PSO-VI method for 50 tasks and 20 processors, and 27.14% more efficient for 100 tasks and 20 processors.

Table 6.6 Efficiency Calculation for Dynamic Task Scheduling

                 (1 - OPSO/PSO-VI) x 100    (1 - POPSO/PSO-VI) x 100
    Data set I          27.22%                      29.11%
    Data set II         27.14%                      29.32%

When the PSO-VI and POPSO methods are compared, the POPSO method performs better than the PSO-VI method. The POPSO method is 29.11% more efficient than the PSO-VI method for 50 tasks and 20 processors, and 29.32% more efficient when 100 tasks and 20 processors are involved. Overall, the results show that the POPSO performs better than the OPSO method when applied to the task assignment problem with dynamic tasks.

6.8 DYNAMIC TASK SCHEDULING WITH LOAD BALANCING

Illustration 4 deals with dynamic task scheduling with load balancing. Effective processor utilization is needed to support load balancing. The objective function is the same as represented in equations 2.12, 2.13 and 2.14.

Table 6.7 Convergence time, Best, Worst and Average costs of the Orthogonal PSO algorithms for the dynamic task schedule with load balancing (methods: PSO-VI, OPSO, POPSO; metrics: Best, Worst and Average cost, convergence time in seconds)

Figure 6.7 Best cost for the Dynamic Task Schedule with Load Balancing for 50 tasks and 20 processors for the OPSO and POPSO methods

The Best cost achieved for dataset 1 and dataset 2 is the same for both proposed methods, the OPSO and POPSO, as illustrated in Table 6.7. But the convergence time is faster (2.4 times) for the POPSO method and slower (1.1 times) for the OPSO method when compared to the PSO-VI method.

Figure 6.8 Best cost for the Dynamic Task Schedule with Load Balancing for 100 tasks and 20 processors for the OPSO and POPSO methods

From Figures 6.7 and 6.8, it can be inferred that the Parallel Orthogonal PSO outperforms the Orthogonal PSO and the PSO with varying inertia.

Table 6.8 Efficiency Calculation for Dynamic Task Scheduling with Load Balancing

                 (1 - OPSO/PSO-VI) x 100    (1 - POPSO/PSO-VI) x 100
    Data set I          27.41%                      29%
    Data set II         27.06%                      29.23%

In terms of efficiency, the POPSO outperforms all the other methods tested. When the PSO-VI and OPSO methods are compared, the OPSO method is 27.41% more efficient than the PSO-VI method for 50 tasks and 20 processors, and 27.06% more efficient for 100 tasks and 20 processors, as represented in Table 6.8.

When the PSO-VI and POPSO methods are compared, the POPSO method performs better than the PSO-VI method. The POPSO method is 29% more efficient than the PSO-VI method for 50 tasks and 20 processors, and 29.23% more efficient when 100 tasks and 20 processors are involved. Overall, the results show that the POPSO performs better than the OPSO method when applied to the task assignment problem involving dynamic tasks with load balancing.

6.9 SUMMARY

This chapter has dealt with the application of the Orthogonal and Parallel Asynchronous Orthogonal PSO techniques to different types of task scheduling, namely static independent task scheduling, static dependent task scheduling, dynamic task scheduling and dynamic task scheduling with load balancing. The results show that the Parallel Asynchronous Orthogonal PSO performs better than the Orthogonal PSO and the PSO with varying inertia approaches. In terms of the Best cost, the OPSO and POPSO methods perform the same. When the average cost is considered, the OPSO method is 27% more efficient than the PSO-VI method and the POPSO method is 29% more efficient. The two methods differ considerably in convergence time: the POPSO method converges 2.4 times faster than the PSO-VI method because of the asynchronous parallel version of the orthogonal PSO algorithm, whereas the OPSO method is slower (1.1 times) than the PSO-VI method because of the time taken to refine the initial population using the orthogonal principle. Thus the POPSO method performs better than the other methods tested when applied to the multiprocessor scheduling problem.

CHAPTER 7 CONCLUSION

7.1 CONCLUSION

This thesis involves the application of various PSO techniques to solve the multiprocessor scheduling problem. In this work, the PSO and its variants, namely PSO with dynamically varying inertia, Elite PSO with mutation, Hybrid PSO, Parallel PSO, Orthogonal PSO and Parallel Orthogonal PSO, are investigated for solving the multiprocessor scheduling problem. The PSO technique is also compared with the GA approach. Four types of task scheduling are dealt with, namely static independent task scheduling, static dependent task scheduling, dynamic task scheduling and dynamic task scheduling with load balancing.

The introduction of the inertia factor in the basic PSO equation has yielded a significant improvement in the results when applied to the multiprocessor scheduling problem. The value of the inertia factor plays a major role in reaching the optimal solution. Both fixed and dynamically varying inertia are applied to solve the task assignment problem, and PSO with variable inertia yields better performance than fixed inertia on the multiprocessor scheduling problem. The proposed PSO-VI method yields improved performance compared with the GA method: on average, the PSO-VI method is 1.7 times faster than the GA method and 14% more efficient in terms of cost.

The proposed PSO-VI method's performance is enhanced by modifying the basic working of the PSO method. The PSO with varying inertia is combined with another proposed technique, elitism, to improve the result on the task assignment problem. Elitism is also combined with mutation to prevent the algorithm from getting stuck at a local optimum. The EPSO-M algorithm improves the individual quality of the swarm and accelerates convergence, while the mutation operation guarantees the diversity of the swarm. The proposed EPSO-M algorithm is on average 7% better than the variable inertia PSO when applied to the task scheduling problem, and its convergence time is on average 1.12 times faster than that of the PSO-VI method. Thus both the cost and the convergence time are improved in the EPSO-M method compared to the PSO-VI method.

Further, hybridization of the PSO algorithm with the Simulated Annealing (SA) algorithm is done to enhance performance on the multiprocessor scheduling problem. Simulated Annealing is chosen because it is good at escaping local optima. Performance improvement is achieved when the PSO-SA hybrid is applied to multiprocessor scheduling. This hybrid is also compared with another hybridized PSO, the combination of PSO and Hill Climbing. On average, the proposed PSO-SA method is 13% more efficient than the PSO-VI method, and the PSO-HC method is 4% more efficient than the PSO-VI method. But the PSO-SA method shows an increase in convergence time (1.5 times) compared to the PSO-VI method because of the annealing schedule involved in combining the simulated annealing and PSO algorithms.

Parallelization of the PSO algorithm is also proposed to speed up execution and to provide concurrency. Two versions of parallelization are

done, namely the Synchronous Parallel PSO and the Asynchronous Parallel PSO. The results show that the asynchronous version performs better than the synchronous version. The Synchronous and Asynchronous Parallel PSO yield the same Best cost as the Hybrid PSO (PSO-SA) approach, but the convergence of the Parallel PSO is faster than that of the PSO-SA method. The PAPSO converges faster than the PSPSO because the idle time of the processors is considerably reduced. The convergence time of the PAPSO method is 2.2 times faster than that of the PSO-VI method, while the PSPSO method is 1.3 times slower than the PAPSO method. When the average cost is considered, the PSPSO method is around 14% more efficient and the PAPSO method 18% more efficient than the PSO-VI method across the different types of task scheduling.

Further, the Orthogonal PSO (OPSO) is proposed, which is used to refine the initial population. The parallelization of the OPSO algorithm (POPSO) is also proposed to further refine the results. In terms of the Best cost, the OPSO and POPSO methods perform the same. When the average cost is considered, the OPSO method is 27% more efficient than the PSO-VI method and the POPSO method is 29% more efficient. The two methods differ considerably in convergence time: the POPSO method converges 2.4 times faster than the PSO-VI method because of its asynchronous parallel implementation, whereas the OPSO method is slower (1.1 times) than the PSO-VI method because of the time taken to refine the initial population using the orthogonal principle. Thus the POPSO method performs better than the other methods tested on the multiprocessor scheduling problem. The POPSO outperforms all the other

methods proposed in this thesis for solving the multiprocessor scheduling problem.

7.2 FUTURE SCOPE OF THE WORK

The multiprocessor scheduling problem, or Task Assignment Problem, is NP-hard. In this thesis, the simulations are conducted in a Java environment with benchmark datasets. Even though the simulations are carried out with benchmark datasets, the multiprocessor scheduling problem using the PSO approaches needs testing in experiments and industrial practice. The tasks considered in this thesis are non-preemptive; the work could be extended to preemptive tasks. For the PSO-VI method, further study could be carried out to find the optimal values of the parameters of the PSO equations to suit particular applications. Further work could also seek a mathematical justification for the social and cognitive factors in the basic PSO equation. The PSO with the global topology is used for solving the multiprocessor scheduling problem; other topologies mentioned in the literature could be tried for the Task Assignment Problem. For the Parallel PSO method, the number of processors was chosen on a trial basis, so further work could determine the optimal number of processors needed to solve a particular problem in a parallel environment in a real-time scenario.


More information

Argha Roy* Dept. of CSE Netaji Subhash Engg. College West Bengal, India.

Argha Roy* Dept. of CSE Netaji Subhash Engg. College West Bengal, India. Volume 3, Issue 3, March 2013 ISSN: 2277 128X International Journal of Advanced Research in Computer Science and Software Engineering Research Paper Available online at: www.ijarcsse.com Training Artificial

More information

Experiments on Cryptanalysing Block Ciphers via Evolutionary Computation Paradigms

Experiments on Cryptanalysing Block Ciphers via Evolutionary Computation Paradigms Experiments on Cryptanalysing Block Ciphers via Evolutionary Computation Paradigms Nalini N Department of Computer Science and Engineering, Siddaganga Institute of Technology, Tumkur 572103, Karnataka,

More information

CHAPTER 5 STRUCTURAL OPTIMIZATION OF SWITCHED RELUCTANCE MACHINE

CHAPTER 5 STRUCTURAL OPTIMIZATION OF SWITCHED RELUCTANCE MACHINE 89 CHAPTER 5 STRUCTURAL OPTIMIZATION OF SWITCHED RELUCTANCE MACHINE 5.1 INTRODUCTION Nowadays a great attention has been devoted in the literature towards the main components of electric and hybrid electric

More information

Non-deterministic Search techniques. Emma Hart

Non-deterministic Search techniques. Emma Hart Non-deterministic Search techniques Emma Hart Why do local search? Many real problems are too hard to solve with exact (deterministic) techniques Modern, non-deterministic techniques offer ways of getting

More information

Extending MATLAB and GA to Solve Job Shop Manufacturing Scheduling Problems

Extending MATLAB and GA to Solve Job Shop Manufacturing Scheduling Problems Extending MATLAB and GA to Solve Job Shop Manufacturing Scheduling Problems Hamidullah Khan Niazi 1, Sun Hou-Fang 2, Zhang Fa-Ping 3, Riaz Ahmed 4 ( 1, 4 National University of Sciences and Technology

More information

Orthogonal Particle Swarm Optimization Algorithm and Its Application in Circuit Design

Orthogonal Particle Swarm Optimization Algorithm and Its Application in Circuit Design TELKOMNIKA, Vol. 11, No. 6, June 2013, pp. 2926 ~ 2932 e-issn: 2087-278X 2926 Orthogonal Particle Swarm Optimization Algorithm and Its Application in Circuit Design Xuesong Yan* 1, Qinghua Wu 2,3, Hammin

More information

Novel Initialisation and Updating Mechanisms in PSO for Feature Selection in Classification

Novel Initialisation and Updating Mechanisms in PSO for Feature Selection in Classification Novel Initialisation and Updating Mechanisms in PSO for Feature Selection in Classification Bing Xue, Mengjie Zhang, and Will N. Browne School of Engineering and Computer Science Victoria University of

More information

Hybrid Differential Evolution Algorithm for Traveling Salesman Problem

Hybrid Differential Evolution Algorithm for Traveling Salesman Problem Available online at www.sciencedirect.com Procedia Engineering 15 (2011) 2716 2720 Advanced in Control Engineeringand Information Science Hybrid Differential Evolution Algorithm for Traveling Salesman

More information

Artificial Intelligence

Artificial Intelligence Artificial Intelligence Informed Search and Exploration Chapter 4 (4.3 4.6) Searching: So Far We ve discussed how to build goal-based and utility-based agents that search to solve problems We ve also presented

More information

Sci.Int.(Lahore),28(1), ,2016 ISSN ; CODEN: SINTE 8 201

Sci.Int.(Lahore),28(1), ,2016 ISSN ; CODEN: SINTE 8 201 Sci.Int.(Lahore),28(1),201-209,2016 ISSN 1013-5316; CODEN: SINTE 8 201 A NOVEL PLANT PROPAGATION ALGORITHM: MODIFICATIONS AND IMPLEMENTATION Muhammad Sulaiman 1, Abdel Salhi 2, Eric S Fraga 3, Wali Khan

More information

LECTURE 16: SWARM INTELLIGENCE 2 / PARTICLE SWARM OPTIMIZATION 2

LECTURE 16: SWARM INTELLIGENCE 2 / PARTICLE SWARM OPTIMIZATION 2 15-382 COLLECTIVE INTELLIGENCE - S18 LECTURE 16: SWARM INTELLIGENCE 2 / PARTICLE SWARM OPTIMIZATION 2 INSTRUCTOR: GIANNI A. DI CARO BACKGROUND: REYNOLDS BOIDS Reynolds created a model of coordinated animal

More information

Bi-Objective Optimization for Scheduling in Heterogeneous Computing Systems

Bi-Objective Optimization for Scheduling in Heterogeneous Computing Systems Bi-Objective Optimization for Scheduling in Heterogeneous Computing Systems Tony Maciejewski, Kyle Tarplee, Ryan Friese, and Howard Jay Siegel Department of Electrical and Computer Engineering Colorado

More information