A Hybrid Genetic Algorithm for the Minimum Linear Arrangement Problem


The Pennsylvania State University
The Graduate School, Capital College

A Hybrid Genetic Algorithm for the Minimum Linear Arrangement Problem

A Master's Paper in Computer Science
by Paul Eppley
© 2001 Paul Eppley

Submitted in Partial Fulfillment of the Requirements for the Degree of Master of Science

November 2001

Abstract

This paper presents a hybrid genetic algorithm for the minimum linear arrangement problem. The algorithm was tested on a collection of 22 sparse graphs that has been used to test twelve other algorithms for the minimum linear arrangement problem. The graphs have up to 10,240 vertices and up to 49,820 edges. On about two thirds of the graphs, our algorithm found solutions superior to those of all other algorithms except simulated annealing. For one graph, moreover, our algorithm found a solution better than that of any of the other algorithms, including simulated annealing. While our algorithm was generally slower than the other algorithms, for extremely large graphs it was faster than simulated annealing.

Table of Contents

Abstract
Acknowledgement
List of Figures
List of Tables
1 Introduction
2 Genetic Algorithms
3 The Minimum Linear Arrangement Genetic Algorithm
  3.1 Problem Encoding
  3.2 Fitness
  3.3 Initial Population
  3.4 Parent Selection
  3.5 Crossover
  3.6 Mutation
  3.7 Local Optimization
  3.8 Replacement
  3.9 Stopping Condition
4 Experimental Results
  4.1 Test Graphs
  4.2 Other Algorithms Tested
  4.3 Comparison of MINLAGA to Other Algorithms
5 Conclusion
References

Acknowledgement

I wish to thank Dr. Thang N. Bui for his guidance on this paper. His advice and constructive criticism aided in the success of this research and in the clarity of its presentation here. I also wish to thank committee members Dr. S. Kingan, Dr. P. Naumov, and Dr. L. Null for reviewing this paper.

List of Figures

1 An Undirected Graph with 8 Vertices and 11 Edges
2 A Minimum Linear Arrangement with Total Edge Length of 20
3 A Linear Arrangement with Total Edge Length of 37
4 Outline of the Minimum Linear Arrangement Genetic Algorithm
5 2-Point Crossover

List of Tables

1 Theoretical Lower Bounds Comparison
2 Parameter Values
3 Test Graphs
4 MINLAGA vs. Simulated Annealing
5 Results from Other Algorithms
6 Results from Other Algorithms

1 Introduction

Let G = (V, E) be an undirected graph. Informally, the minimum linear arrangement problem seeks to arrange the vertices of G at equidistant points along a line in a way that minimizes the sum of the lengths of the edges. More formally, the minimum linear arrangement problem (MINLA) is the problem of finding a one-to-one function π : V → {1, ..., |V|} that minimizes the sum

    Γ(π) = Σ_{(u,v) ∈ E} |π(u) − π(v)|.

See Figures 1 through 3 for examples of good and bad linear arrangements.

MINLA has several real-world applications. In VLSI design, solutions to MINLA can be used to optimize the layout of components on a board [2][6]. In graph drawing, approximate solutions to MINLA can be used to produce bipartite drawings with a small number of edge crossings for a large class of bipartite graphs [11]. MINLA also has applications in job scheduling and in modeling neural activity in the brain [1][10][14].

MINLA is NP-hard, so it is unlikely that a polynomial-time algorithm will ever be discovered that produces exact solutions for general graphs. However, polynomial-time algorithms do exist that produce exact solutions for special classes of graphs such as trees, hypercubes, order-4 DeBruijn graphs, square and rectangular meshes, and d-dimensional k-ary cliques [13]. There also exist efficient algorithms that can produce approximate solutions to within a constant factor for dense graphs [5]. However, no efficient (i.e., polynomial-time) algorithm is known that approximates the best solution to within a constant factor for general graphs [13]. It is therefore desirable in practice to have heuristic algorithms that produce high-quality solutions as quickly as possible.

For comparison purposes, Petit summarized solution quality and running time for twelve such heuristic algorithms in [13]. He also summarized various theoretical methods of calculating lower bounds for MINLA in order to evaluate the solution quality produced by the various algorithms. Unfortunately, the theoretical methods prove relatively useless for evaluation purposes since they

Figure 1: An Undirected Graph with 8 Vertices and 11 Edges
Figure 2: A Minimum Linear Arrangement with Total Edge Length of 20
Figure 3: A Linear Arrangement with Total Edge Length of 37
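The objective Γ(π) is straightforward to compute directly from the edge list. A minimal sketch in C++ (the function name and the 0-based vertex labels are ours; the paper labels vertices 1 through |V|, which does not change the sum):

```cpp
#include <cstdlib>
#include <utility>
#include <vector>

// Total edge length of an arrangement:
// Gamma(pi) = sum over edges (u,v) of |pi(u) - pi(v)|,
// where pi[v] is the position of vertex v on the line.
long long totalEdgeLength(const std::vector<std::pair<int, int>>& edges,
                          const std::vector<int>& pi) {
    long long sum = 0;
    for (const auto& e : edges)
        sum += std::abs(pi[e.first] - pi[e.second]);
    return sum;
}
```

On the path graph 0-1-2-3, the identity arrangement gives total edge length 3, while placing the middle vertices out of order lengthens two of the edges.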

nearly always produce lower bounds that are far below any known optimal solutions or the best results obtained by any known algorithm. For specifics, see Table 1. In the table, E-M and D-M refer to the Edge Method and the Degree Method, respectively, both due to Petit. P-M and J-M refer to the Path Method and the Juvan-Mohar Method, respectively, both due to Juvan and Mohar [8]. G-H refers to the Gomory-Hu Tree Method due to Adolphson and Hu [2]. M-M refers to the Mesh Method, which is also due to Petit. The final column of the table lists the best known solution for each graph. Clearly, none of the theoretical methods is of much use in evaluating the best known solutions, since the bounds they produce fall so far below those solutions.

This paper presents a hybrid genetic algorithm for the minimum linear arrangement problem (MINLAGA). We tested our algorithm on the same collection of sparse graphs used by Petit. In most cases, MINLAGA found solutions superior to those of all other algorithms except simulated annealing. On one graph, however, MINLAGA consistently found solutions superior to those of all tested algorithms, including simulated annealing. The running time of MINLAGA was comparable to that of simulated annealing; both generally took much longer than all other algorithms. The extra running time was justified, though, by superior solutions for most graphs. In several cases, especially for extremely large graphs, MINLAGA ran in significantly less time than simulated annealing.

The rest of the paper is organized as follows. Section 2 discusses genetic algorithms in general. Section 3 presents the design of our genetic algorithm for MINLA. Section 4 summarizes the experimental results. Concluding remarks make up Section 5.

2 Genetic Algorithms

A genetic algorithm (GA) is a heuristic algorithm that models the process of evolution in nature [7]. The idea of natural selection, "survival of the fittest", is at the heart of genetic algorithms. In a GA, simulated organisms represent potential solutions to a problem. In each generation some of these organisms

Table 1: Theoretical Lower Bounds Comparison*

Columns: Graph Name, E-M, D-M, P-M, J-M, G-H, M-M, Best
Rows: one per test graph (see Table 3 for the list of graphs)

*From [13]. For some graphs M-M could not be applied; for some graphs the best known solution is optimal.

are selected to reproduce or possibly die based on their relative fitness. The hope is that, just as in nature, after many generations superior organisms representing superior problem solutions will emerge. One difference between nature and GAs, however, is that while in nature time is of little concern, a GA that takes millions of years to run is of little practical use. To speed up the evolutionary process, then, a GA might alter some of the parameters corresponding to phenomena occurring in nature or even add extra steps to enhance natural processes.

The simulated organisms in a GA's population are typically referred to as chromosomes, since they are usually collections of simple structures called genes. The way in which the chromosomes and their genes represent potential problem solutions is called an encoding. In graph-theoretic problems, typical encoding schemes include 0-1 vectors and permutations of the vertex set. A 0-1 vector of length |V| would be a natural encoding for problems whose solutions are subsets of the vertex set, such as the Maximum Clique problem. In this encoding scheme, a 1 could represent the inclusion of a particular vertex in the solution, and a 0 could represent exclusion [3]. A permutation of the vertex set would be a natural encoding for problems, like the Traveling Salesman problem, whose solutions are orderings of the vertices. More complicated encoding schemes, though, can sometimes yield superior results [4].

The usual progression for a GA is as follows. First, chromosomes are created at random to fill a population. Then a number of generations pass, during which the population evolves members that are more and more fit. Eventually the population converges, the algorithm ends, and the solution represented by the fittest member of the population is returned. As previously mentioned, convergence needs to occur in a reasonable amount of time.
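The progression just described can be illustrated on a toy problem. The sketch below runs a tiny generational GA on the "one-max" problem (fitness is the number of 1-bits) under a 0-1 vector encoding, with elitism, random parent selection, single-point crossover, and a small mutation rate. All names and parameter values here are illustrative only, not the paper's:

```cpp
#include <algorithm>
#include <random>
#include <vector>

using Chromosome = std::vector<int>;

// Toy fitness: count the 1-bits (higher is better for this example).
int oneMaxFitness(const Chromosome& c) {
    return static_cast<int>(std::count(c.begin(), c.end(), 1));
}

// Return the fittest member of a population.
Chromosome fittest(const std::vector<Chromosome>& pop) {
    return *std::max_element(pop.begin(), pop.end(),
        [](const Chromosome& a, const Chromosome& b) {
            return oneMaxFitness(a) < oneMaxFitness(b);
        });
}

Chromosome runToyGA(int bits, int popSize, int generations, unsigned seed) {
    std::mt19937 rng(seed);
    std::uniform_int_distribution<int> bit(0, 1);
    std::uniform_int_distribution<int> pick(0, popSize - 1);
    std::uniform_int_distribution<int> cut(1, bits - 1);
    std::uniform_real_distribution<double> coin(0.0, 1.0);

    // Random initial population.
    std::vector<Chromosome> pop(popSize, Chromosome(bits));
    for (auto& c : pop)
        for (auto& g : c) g = bit(rng);

    for (int gen = 0; gen < generations; ++gen) {
        std::vector<Chromosome> next;
        next.push_back(fittest(pop));   // elitism: keep the current best
        while (static_cast<int>(next.size()) < popSize) {
            const Chromosome& p1 = pop[pick(rng)];
            const Chromosome& p2 = pop[pick(rng)];
            Chromosome child(p1);       // single-point crossover
            int c = cut(rng);
            std::copy(p2.begin() + c, p2.end(), child.begin() + c);
            for (auto& g : child)       // mutation with small probability
                if (coin(rng) < 0.01) g = 1 - g;
            next.push_back(std::move(child));
        }
        pop = std::move(next);
    }
    return fittest(pop);
}
```

With elitism, the best fitness in the population never decreases from one generation to the next, which is the property the convergence discussion below relies on.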
However, if a population converges too quickly, the returned solution could be a local optimum that is far from the global optimum. Population convergence can be defined in several ways. One way is to measure the similarity of the members of the population. When the members all resemble each other to within a certain degree, the algorithm ends. Another way to define convergence is to count the number of generations in

which the population shows no improvement in the fitness of its members. When the population remains stagnant for a certain number of generations, the algorithm ends.

During each generation, members of the population are selected for reproduction. Typically, fitter members of the population have a higher probability of being selected as parents. Offspring are then produced by performing crossover on the parent chromosomes. This involves taking a section of genes from each parent and copying them directly into the offspring, which closely resembles the way chromosomes of parents in nature are joined to produce the chromosomes of offspring. The specific crossover scheme used by a GA, however, depends heavily on the encoding scheme. For example, if the encoding scheme is a permutation, joining consecutive sections of genes from each parent might not result in a valid chromosome. In that case, some variation of the basic crossover scheme is implemented.

Crossover is a fundamental part of any genetic algorithm and is a good example of how varying parameters can affect the balance between exploration and exploitation. Exploration refers to sampling as much of the search space as possible. Exploitation refers to making use of good solutions currently in the population to create better solutions. By chopping chromosomes into many small pieces and mixing them with pieces from other parents during crossover, a genetic algorithm would tend to sample more of the search space. More often, though, a genetic algorithm will mix and match a few long sections of chromosomes from each parent, in the hope that good offspring will result from modifying the parents less drastically.

Once the crossover phase is complete, offspring undergo mutation. This involves randomly altering the genes of a chromosome with some given probability.
While in nature the mutation probability is very small, a GA might use an unnaturally high mutation probability to escape from local optima and to explore the solution space of the problem more fully.

After crossover and mutation, a pure GA would consider the production of offspring complete. Some GAs, however, include a step in which offspring are artificially improved. The purpose of this local optimization is to accelerate

the convergence of the population toward better solutions. Since this step has no analogy in nature, a GA having a local optimization step is referred to as a hybrid GA.

When the production of offspring is finally considered complete, the process of incorporating them into the population begins. Generational GAs replace the entire population at each generation, while steady-state GAs replace only a few members of the population. In a steady-state GA a typical scheme is for an offspring to replace one of its parents if it is fitter than the parent. Since the offspring resembles the parent to some extent, replacing the parent maintains the population's diversity and avoids premature convergence. When an offspring replaces a parent, the parent is removed from the population. If the offspring is not fitter than either parent, it might try to replace the least fit member of the population. Though this might tend to decrease population diversity, discarding a superior offspring is generally undesirable. If every member of the population is fitter than the new offspring, the offspring does not survive and is discarded.

Clearly, the fitness function, which measures the quality of the solution represented by each chromosome, is an important component of any GA. It influences parent selection, replacement, and population convergence. Depending on the problem the GA is trying to solve, measuring the fitness of a chromosome can sometimes be a time-consuming task. To save running time, one technique is to have the fitness function approximate the quality of the solutions represented by the chromosomes.

3 The Minimum Linear Arrangement Genetic Algorithm

In this section, we present the specific details of the Minimum Linear Arrangement Genetic Algorithm (MINLAGA). Our algorithm has several interesting features not normally found in GAs. For example, we include a local optimization step to help the reproduction phase produce superior offspring,

which is why our algorithm is classified as a hybrid GA. Also, in order to increase exploration of the search space in early generations of the algorithm and to speed convergence to a good solution in later generations, the fitness function, mutation rate, and local optimization are all adaptive.

1. Randomly generate an initial population.
2. repeat
3.   repeat
4.     Select two parents from population.
5.     Create offspring from parents.
6.     Perform crossover.
7.     Mutate offspring.
8.     Perform local optimization.
9.     Replace population members with offspring, if appropriate.
10.  until population stagnates.
11.  Update adaptive parameters.
12. until all parameters finish updating.
13. Perform local optimization on best member of population.
14. Return locally optimized best member.

Figure 4: Outline of the Minimum Linear Arrangement Genetic Algorithm

The subsections that follow discuss the details of our algorithm's problem encoding, fitness measure, generation of the initial population, generational cycle, stopping condition, and final output. See Figure 4 for an outline of the algorithm. Note that the adaptive parameters are updated every time the population sees no improvement in fitness for s generations, where s is called the stagnancy. All parameters are summarized in Table 2.

3.1 Problem Encoding

The minimum linear arrangement problem calls for permutations of the vertex set, so each chromosome will represent such a permutation in the most

natural way. Assuming that the vertices are labeled from 1 to |V|, each chromosome will be a linear vector containing a permutation of the integers from 1 to |V|, with each gene representing a vertex.

Table 2: Parameter Values

Parameter | Value | Description
n   | 150  | Population size
s   | 150  | Stagnancy
c   | 2    | Number of crossover points
p_0 | 41%  | Initial mutation probability
Δp  | 10%  | Change in mutation probability
p_T | 0%   | Final mutation probability
L_0 | 2    | Initial local optimization strength
ΔL  | 3    | Change in local optimization strength
L_T | 13   | Final local optimization strength
f_0 | 40%  | Initial fitness percent
Δf  | 12%  | Change in fitness percent
f_T | 100% | Final fitness percent

3.2 Fitness

Let π_I be the permutation represented by chromosome I. In MINLAGA, the fitness of chromosome I is just the total edge length Γ(π_I):

    fitness(I) = Γ(π_I) = Σ_{(u,v) ∈ E} |π_I(u) − π_I(v)|.

This can be a rather expensive calculation, though, since every edge needs to be examined to determine its contribution to the sum. Therefore, to save running time, we initially approximate these fitness values by using only 40% of the edges in the calculation above. This percentage increases as the algorithm proceeds. By the time the algorithm is in its final generations, the percentage is 100% and the fitness function is computed exactly as in the formula above. The specific edges for the sample in the fitness calculation are not chosen randomly. Instead, our algorithm examines the vertices in decreasing order

of degree, using all adjacent edges until the desired percentage is reached. In this way we give more weight to the vertices that tend to have more influence over the quality of the final solution.

3.3 Initial Population

The initial population is generated by creating n = 150 random chromosomes. Our experiments showed that smaller populations tended to reach stagnancy sooner, yielding faster running times but poorer solution quality. Larger populations had the opposite effect. A population size of 150 was found to yield the best balance of solution quality to running time. Once the initial population is created, the fitness of each member is calculated.

3.4 Parent Selection

At each generation, two parents are selected from the population using a standard roulette wheel method. In this method, each member occupies space on the roulette wheel proportional to its scaled fitness and thus has a corresponding probability of being selected. We scale the fitness values for parent selection in order to prevent exceptionally good members of the population from having an unduly high probability of being selected, which would lead to a lack of population diversity and premature convergence. A linear transformation is used to scale the fitness of each member as follows. Let the points (x_1, y_1) and (x_2, y_2) define a line whose x-values represent unscaled chromosome fitness and whose corresponding y-values represent scaled fitness. We set x_1 and y_1 equal to the average fitness in the population; thus the average member of the population has its scaled fitness equal to its unscaled fitness. We then set x_2 equal to the unscaled fitness of the fittest member of the population, and its corresponding scaled fitness y_2 equal to 2y_1. Thus the most fit member of the population never has more than twice the probability of being selected compared to an average member of the population. Depending on the range of fitness values in the population, this could assign a negative scaled fitness to the worst member of the population. If that is the case, we reassign

(x_2, y_2) to give the worst member of the population a scaled fitness of zero. Since this reduces the slope of the line, the most fit member is still no more than twice as likely to be selected as an average member of the population.

3.5 Crossover

Multipoint crossover is used throughout the algorithm. Experimental results showed that two crossover points tended to produce the best solutions. Crossover proceeds as follows. Let permutations π_1 and π_2 be represented by the two selected parents Π_1 and Π_2, each having genes numbered from 1 through |V|. Let Π_o be an offspring. Each generation, our algorithm chooses at random two crossover points c_1 and c_2 on the chromosomes. From the first parent, genes Π_1[1] through Π_1[c_1] are copied directly to Π_o[1] through Π_o[c_1], and genes Π_1[c_2] through Π_1[|V|] are copied directly to Π_o[c_2] through Π_o[|V|]. This leaves the genes in the middle section of the offspring, between the crossover points, yet to be filled. Since the offspring must be a valid permutation of the vertex set, the necessary remaining vertices are entered in the order in which they appear in the other parent, Π_2. Three other offspring are created in a similar manner by alternating the parents and the copied sections. See Figure 5 for a graphical representation. The patterned sections of the offspring indicate direct copies from the corresponding parent; the solid sections indicate where genes are taken from the remaining parent in order to complete the permutation.

3.6 Mutation

After the offspring are created, the genes of all four offspring undergo mutation based on the current mutation probability. Mutation consists of swapping the value of a gene with the value of some other randomly selected gene along the chromosome. At the beginning of the mutation process, approximately p% of the genes are selected at random. Each is then swapped with a random gene.
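Because each mutation is a swap, the chromosome remains a valid permutation. A sketch of this swap mutation (the function name and the real-valued probability parameter are ours):

```cpp
#include <algorithm>
#include <random>
#include <vector>

// Swap-based mutation for a permutation-encoded chromosome: each gene is,
// with probability p, swapped with another uniformly chosen gene.  Since
// only swaps are performed, the result is always a valid permutation.
void swapMutate(std::vector<int>& perm, double p, std::mt19937& rng) {
    std::uniform_real_distribution<double> coin(0.0, 1.0);
    std::uniform_int_distribution<size_t> pos(0, perm.size() - 1);
    for (size_t i = 0; i < perm.size(); ++i)
        if (coin(rng) < p)
            std::swap(perm[i], perm[pos(rng)]);
}
```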
Best results were obtained by using an unusually high initial mutation probability of p_0 = 41%. This high value was a result of

unsatisfactory solution quality in early experiments using more typical mutation rates. The poor solution quality led to the suspicion that the search space wasn't being explored sufficiently and that the population was converging prematurely. Higher mutation probabilities allow more exploration of the search space, yielding significantly better results. Such a high mutation probability, however, tends to negate the theoretical benefit of a genetic algorithm's imitation of nature. To restore this property, each time the population stagnates and the parameters are updated, the mutation probability is reduced by 10%. After the fourth stagnation, the mutation probability is down to a more typical 1%. Finally, after one more stagnation, the mutation probability drops to 0%, leaving further exploration of the search space to other parts of the algorithm in the final generations.

Figure 5: 2-Point Crossover

3.7 Local Optimization

After all offspring undergo mutation, the most fit is selected for local optimization. Recall that the purpose of local optimization is to artificially speed up the evolutionary process and to help the population converge toward optimal solutions. Our local optimization consists of a number of linear sweeps

through the chromosome, during which pairs of genes are swapped whenever the swap yields an improvement in the chromosome's fitness. On the first sweep, all pairs of genes separated by a distance of 1/10 of the length of the chromosome are tested. In subsequent sweeps the gap between the genes being tested grows, until on the last sweep the gap is 85% of the chromosome's length. The strength of this local optimization scheme can be adjusted by altering the number of sweeps each chromosome undergoes. Naturally, there is a trade-off between the strength of the local optimization and the running time. While stronger local optimization does produce significantly better solutions, it can also result in unacceptably high running times. Let the number of sweeps be L. We found that a good balance of solution quality to running time is obtained with the local optimization making L_0 = 2 sweeps in the initial generations, increasing by ΔL = 3 sweeps each time the population stagnates. The algorithm is not allowed to stop until the local optimization strength reaches its final value of L_T = 13 sweeps.

3.8 Replacement

After the best of the four offspring undergoes local optimization, two offspring are selected for possible inclusion in the population. Since we are using a steady-state genetic algorithm, the population size remains constant throughout. Therefore, if offspring are accepted into the population, they must replace current population members. Recall that two of the offspring have sections copied directly from each parent. The less fit of each such pair is eliminated from further consideration. Each offspring that remains replaces its corresponding parent in the population if it is more fit than that parent. If not, it replaces the least fit member of the population if it is more fit than that member. Otherwise, it is discarded. Once replacement is complete, the algorithm moves on to its next generation.
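A single sweep of the gap-based local optimization described above might look like the following sketch, which tries every pair of genes a fixed gap apart and keeps a swap only if it lowers the total edge length. For simplicity the sketch recomputes the objective from scratch after each trial swap; an incremental update would be used in practice. The names and the 0-based labels are ours:

```cpp
#include <cstdlib>
#include <utility>
#include <vector>

using Edges = std::vector<std::pair<int, int>>;

// Total edge length of the arrangement arr, where arr[position] = vertex.
long long edgeLength(const Edges& edges, const std::vector<int>& arr) {
    std::vector<int> pos(arr.size());
    for (size_t i = 0; i < arr.size(); ++i) pos[arr[i]] = static_cast<int>(i);
    long long s = 0;
    for (const auto& e : edges)
        s += std::abs(pos[e.first] - pos[e.second]);
    return s;
}

// One sweep at a fixed gap: swap the genes at positions i and i+gap,
// keeping the swap only if it strictly improves the total edge length.
void sweep(const Edges& edges, std::vector<int>& arr, size_t gap) {
    for (size_t i = 0; i + gap < arr.size(); ++i) {
        long long before = edgeLength(edges, arr);
        std::swap(arr[i], arr[i + gap]);
        if (edgeLength(edges, arr) >= before)
            std::swap(arr[i], arr[i + gap]);   // revert: no improvement
    }
}
```

On the path graph 0-1-2-3 starting from the arrangement 0, 2, 1, 3 (total edge length 5), a single sweep at gap 1 restores the optimal ordering with total edge length 3.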

3.9 Stopping Condition

As previously mentioned, when the population sees no change in total fitness for s generations, we say the population has reached stagnancy. Again, we found a time/solution-quality trade-off with respect to the stagnancy. Higher values of s yielded better-quality solutions along with higher running times. For our experiments we used a stagnancy of s = 150, since it produced a good balance of solution quality to running time. Recall that the adaptive parameters are updated every time the population reaches stagnancy. At that time, the stagnancy counter is also reset. Once the parameters reach their final values and the population reaches stagnancy one last time, the algorithm ends, and the best member of the population, after it undergoes one final burst of very strong local optimization, is returned as the final solution.

4 Experimental Results

We implemented our algorithm in C++ and compiled the code using gcc. The program was then tested on an AMD K-series processor with 256 MB of RAM under Linux. Most of these conditions are identical to those used by Petit for his experiments. He compiled his C++ code with version 2.95 of GCC. Also, since he used a 450 MHz processor, we have adjusted our running times by 10%. We ran our algorithm eight times on each graph, saving the best of the eight solutions produced.

4.1 Test Graphs

We tested our algorithm on all 22 graphs used by Petit. The graphs can be grouped into five categories: random graphs, graphs from VLSI design, graphs from graph-drawing competitions, graphs from finite element discretizations, and graphs with known optima. Most of the graphs are sparse, since more efficient algorithms exist for dense graphs. The graphs ranged in size from 62 to 10,240 vertices, with most having 1000 or more. See Table 3

for detailed statistics on each graph.

Table 3: Test Graphs*

Columns: Graph Name, Vertices, Edges, Avg. Degree, Category
Random: randoma (four graphs), randomg
VLSI Design: c1y, c2y, c3y, c4y, c5y
Graph Drawing: gd95c, gd96a, gd96b, gd96c, gd96d
Finite Element Discretizations: elt, airfoil, crack, whitaker
Graphs Having Known Optimal Solutions: hc10, mesh33x33, bintree10

*From [13]

4.2 Other Algorithms Tested

On the 22 graphs from Table 3, Petit ran twelve heuristic algorithms falling into five categories. Detailed statistical results are available for each algorithm at jpetit/minla/experiments. Also, see Table 5 and Table 6 for the best solutions produced by these algorithms.

The first category consists of the random and normal algorithms. The

random algorithm consists of generating a random permutation of the vertex set, and the normal algorithm consists of simply arranging the vertices in the order in which they are labeled. As one would expect, these algorithms are fast but generally produce poor results.

The second category consists of greedy heuristic algorithms. In these algorithms, a single vertex is first placed in the middle of the arrangement. Then each remaining vertex is examined in turn and placed either immediately to the left or immediately to the right of all previously placed vertices, depending on which placement is better at the time. The particular algorithm being used determines the order in which the vertices are considered for placement. Random, breadth-first search, and depth-first search orderings are among those tested. In general, these algorithms are also very fast but do not yield particularly good results.

The third category consists of hill-climbing heuristic algorithms. These algorithms start with a randomly generated permutation of the vertex set. Depending on the specific algorithm, two or three vertices are swapped. If the swap results in a better permutation, it is retained; otherwise, the previous permutation is restored. This is similar to the local optimization in MINLAGA. These algorithms proceed for a given number of iterations, and because of the number of iterations they were slower than the algorithms discussed so far. On the random graphs, hill-climbing usually produced solutions superior to those produced by the algorithms previously discussed, but results were mixed on the other classes of graphs.

The final two categories each consist of a single algorithm: spectral sequencing [8] and simulated annealing [9]. Spectral sequencing orders the vertices based on a specific eigenvector related to the graph. Simulated annealing is the well-known approximation heuristic similar to hill-climbing.
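The acceptance rule that distinguishes simulated annealing from hill-climbing can be sketched with a standard Metropolis-style test; this is a generic illustration, not Petit's exact annealing schedule:

```cpp
#include <cmath>
#include <random>

// Metropolis-style acceptance: an improving move (delta <= 0 for a
// minimization problem) is always kept, while a worsening move is kept
// with probability exp(-delta / T), which shrinks as the temperature T
// decreases over the course of the run.
bool accept(long long delta, double T, std::mt19937& rng) {
    if (delta <= 0) return true;   // improvement: always keep
    std::uniform_real_distribution<double> u(0.0, 1.0);
    return u(rng) < std::exp(-static_cast<double>(delta) / T);
}
```

At high temperature nearly all moves are accepted; as T approaches zero, the rule degenerates into pure hill-climbing.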
The difference is that, with a probability that decreases as the algorithm proceeds, a solution may be kept at each iteration even if it does not represent an improvement over the previous iteration. On all but four of the graphs, simulated annealing produced the best result. For three of the four graphs on which simulated annealing did not produce the best result, spectral sequencing came out on top. However, if simulated annealing were excluded from consideration, spectral sequencing would produce the best solution for only eight of the 22 graphs. The price to pay for simulated annealing's superior performance is inferior running time. In every case, simulated annealing took much longer than all other algorithms tested, while spectral sequencing's running time was well within the range of the others.

4.3 Comparison of MINLAGA to Other Algorithms

In this section we compare MINLAGA to the other algorithms with respect to solution quality and running time. We focus primarily on comparing MINLAGA to simulated annealing, since simulated annealing usually produced the highest-quality solutions. See Table 4 for detailed statistics on the two algorithms' performance on all graphs. In terms of running time, MINLAGA and simulated annealing were comparable: both took much longer to run than any of the other tested algorithms. MINLAGA usually took longer than simulated annealing, especially on small graphs. On very large graphs, however, MINLAGA was faster. Since MINLAGA usually produced the second-highest-quality solutions, usually just behind simulated annealing, MINLAGA should be the algorithm of choice when near-optimal solutions are required for large graphs and running time is an issue. We now examine the results for each class of graphs in more detail.

Table 4: MINLAGA vs. Simulated Annealing*

Columns: Graph Name, Vertices; Best Solution Quality (SA, MINLAGA, % Diff.); Avg. Solution Quality (SA, MINLAGA, % Diff.); Total Running Time in Seconds (SA, MINLAGA)
Rows: one per test graph (see Table 3)

*SA was run 25 times per graph. MINLAGA was run 10 times per graph. SA was run only 5 times due to long running time.

25 4 EXPERIMENTAL RESULTS 19 Perhaps the most success for MINLAGA was realized on the five random graphs. The solutions produced by MINLAGA were all well within one percent of the best known solutions which were all produced by simulated annealing. Furthermore, MINLAGA s solutions were always better than those produced by any of the other algorithms tested. Unfortunately, MINLAGA consistently took longer to run than simulated annealing, though never more than three times as long. We suspect that on random graphs larger than those in the test set, MINLAGA would have more competitive running times. The random graphs in the test set all had exactly 1000 vertices, but MINLAGA generally had superior running times only for graphs of more than 1000 vertices. MINLAGA was also relatively successful at producing good solutions for the VLSI design graphs. Though the solution quality ranged from 7.34% to 19.52% above those produced by simulated annealing, they were again consistently better than those produced by any other algorithm tested. Furthermore, for four of the five graphs in this class, MINLAGA showed running times superior to those of simulated annealing. These four graphs had from 980 up to 1366 vertices. Other algorithms in addition to simulated annealing began to surpass MINLAGA on some of the graph drawing competition graphs. However, on one graph from this class MINLAGA consistently found the best solution over all, beating its nearest competitor by over 3%. That graph was gd96b. For the rest of the graphs in this class, MINLAGA was usually within 3% of the best solutions produced by simulated annealing. An exception was gd96a, for which MINLAGA was 8.24% away. It is also noteworthy that for gd95c, the best solution was produced by one of the hill-climbing algorithms, not simulated annealing. Its solution of 395 was far better than any of the others. With respect to running time, MINLAGA was once again the slowest algorithm for each graph. 
For the largest of these graphs, however, MINLAGA was only about 8% slower; that graph had 1076 vertices. Again, we suspect that for larger graphs MINLAGA would overtake simulated annealing.

MINLAGA did not perform very well on the graphs from the finite element discretization class. It is interesting to note that for three of the four graphs in this class, spectral sequencing produced the best solutions, and for the fourth it was just behind simulated annealing. This leads us to believe that something inherent in the structure of these graphs makes them particularly good candidates for spectral sequencing. One positive note is that MINLAGA's running time on each of these graphs was significantly less than that of simulated annealing. All of the graphs in this class were quite large, having at least 4253 vertices.

Finally, for the graphs with known optimal solutions, MINLAGA's success varied by graph. For hc10, two algorithms tied simulated annealing in finding the optimal solution; MINLAGA was only 0.05% behind them, but took more than twice as long as simulated annealing. For mesh33x33, MINLAGA ran in significantly less time than simulated annealing, but came in behind several of the other algorithms with respect to solution quality. For bintree10, MINLAGA regained its status of finding a solution superior to all but that produced by simulated annealing, and it ran only 13% slower than simulated annealing. It should be noted that for this graph, none of the heuristic algorithms tested came very close to the optimal linear arrangement.

Table 5: Results from Other Algorithms. Solution quality for each of the 22 test graphs under the Normal, Random, Spectral Sequencing, Hill Climbing E, Hill Climbing 2, and Hill Climbing 3 algorithms. [Numeric entries omitted.]

Table 6: Results from Other Algorithms. Solution quality for each of the 22 test graphs under the Greedy Normal, Greedy Random, Greedy RBS, Greedy BFS, and Greedy DFS algorithms. [Numeric entries omitted.]

5 Conclusion

In this paper we have presented a hybrid genetic algorithm, MINLAGA, for the minimum linear arrangement problem. Compared to the other algorithms tested, our algorithm performed well with respect to both solution quality and running time. During our experimental research, we noticed that the algorithm could be tuned to perform better on specific classes of graphs. The experimental results included in this paper, however, reflect a tuning that produced the best results overall for the diverse classes of graphs in the test set. One area for further study is therefore an algorithm that is self-tuning based on the characteristics of the input graph.

Since the local optimization step made up a large part of the running time of our algorithm, another area for further study is alternate local optimization schemes. A faster local optimizer could allow the algorithm to use higher stagnancy numbers, leading to more generations of the GA without the corresponding increase in running time. This may lead to a combination of superior results and superior running times. Early experimentation with weaker, faster local optimizers proved highly detrimental to solution quality, however, so more research into fast yet strong local optimization is warranted.

One final area worthy of further exploration is the encoding scheme. Since the chromosomes are permutations, rather than the more common 0-1 strings, crossover is a somewhat time-consuming process. Perhaps more complex encoding schemes could simplify crossover. The fitness function, as well, might benefit from an encoding that stores information beyond just a permutation of the vertex set.

References

[1] D. Adolphson, Single Machine Job Sequencing with Precedence Constraints, SIAM Journal on Computing, 6, 1977, pp.

[2] D. Adolphson and T.C. Hu, Optimal Linear Ordering, SIAM Journal on Applied Mathematics, 25(3), November 1973, pp.

[3] T.N. Bui and P.H. Eppley, A Hybrid Genetic Algorithm for the Maximum Clique Problem, Proceedings of the Sixth International Conference on Genetic Algorithms, 1995, pp.

[4] T.N. Bui and B.R. Moon, On Multi-Dimensional Encoding/Crossover, Proceedings of the Sixth International Conference on Genetic Algorithms, 1995, pp.

[5] A. Frieze and R. Kannan, The Regularity Lemma and Approximation Schemes for Dense Problems, 37th IEEE Symposium on Foundations of Computer Science, 1996, pp.

[6] L.H. Harper, Chassis Layout and Isoperimetric Problems, Technical Report SPS 37-66, vol. II, Jet Propulsion Laboratory, September.

[7] J.H. Holland, Adaptation in Natural and Artificial Systems, University of Michigan Press, Ann Arbor.

[8] M. Juvan and B. Mohar, Optimal Linear Labelings and Eigenvalues of Graphs, Discrete Applied Mathematics, 36(2), 1992, pp.

[9] S. Kirkpatrick, C.D. Gelatt, and M.P. Vecchi, Optimization by Simulated Annealing, Science, 220, May 1983, pp.

[10] G. Mitchison and R. Durbin, Optimal Numberings of an n × n Array, SIAM Journal on Algebraic and Discrete Methods, 7(4), 1986, pp.

[11] J. Pach, F. Shahrokhi, and M. Szegedy, Applications of the Crossing Number, Algorithmica, 1996, pp.

[12] J. Petit i Silvestre, Approximation Heuristics and Benchmarkings for the MinLA Problem, Proceedings of Algorithms and Experiments, 1998, pp.

[13] J. Petit i Silvestre, Experiments on the Minimum Linear Arrangement Problem, Report de recerca LSI R, Departament de Llenguatges i Sistemes Informàtics, Universitat Politècnica de Catalunya, 2001, available at jpetit/publications.

[14] R. Ravi, A. Agrawal, and P. Klein, Ordering Problems Approximated: Single-Processor Scheduling and Interval Graph Completion, 18th International Colloquium on Automata, Languages and Programming, volume 510 of Lecture Notes in Computer Science, Springer-Verlag, 1991, pp.


Partitioning Sets with Genetic Algorithms From: FLAIRS-00 Proceedings. Copyright 2000, AAAI (www.aaai.org). All rights reserved. Partitioning Sets with Genetic Algorithms William A. Greene Computer Science Department University of New Orleans

More information

1 Lab + Hwk 5: Particle Swarm Optimization

1 Lab + Hwk 5: Particle Swarm Optimization 1 Lab + Hwk 5: Particle Swarm Optimization This laboratory requires the following equipment: C programming tools (gcc, make), already installed in GR B001 Webots simulation software Webots User Guide Webots

More information

Introduction to Optimization

Introduction to Optimization Introduction to Optimization Approximation Algorithms and Heuristics November 21, 2016 École Centrale Paris, Châtenay-Malabry, France Dimo Brockhoff Inria Saclay Ile-de-France 2 Exercise: The Knapsack

More information

Optimal tree for Genetic Algorithms in the Traveling Salesman Problem (TSP).

Optimal tree for Genetic Algorithms in the Traveling Salesman Problem (TSP). Optimal tree for Genetic Algorithms in the Traveling Salesman Problem (TSP). Liew Sing liews_ryan@yahoo.com.sg April 1, 2012 Abstract In this paper, the author proposes optimal tree as a gauge for the

More information

Bi-Objective Optimization for Scheduling in Heterogeneous Computing Systems

Bi-Objective Optimization for Scheduling in Heterogeneous Computing Systems Bi-Objective Optimization for Scheduling in Heterogeneous Computing Systems Tony Maciejewski, Kyle Tarplee, Ryan Friese, and Howard Jay Siegel Department of Electrical and Computer Engineering Colorado

More information

Evolutionary Computation Part 2

Evolutionary Computation Part 2 Evolutionary Computation Part 2 CS454, Autumn 2017 Shin Yoo (with some slides borrowed from Seongmin Lee @ COINSE) Crossover Operators Offsprings inherit genes from their parents, but not in identical

More information

1 Lab 5: Particle Swarm Optimization

1 Lab 5: Particle Swarm Optimization 1 Lab 5: Particle Swarm Optimization This laboratory requires the following: (The development tools are installed in GR B0 01 already): C development tools (gcc, make, etc.) Webots simulation software

More information

Genetic algorithm based on number of children and height task for multiprocessor task Scheduling

Genetic algorithm based on number of children and height task for multiprocessor task Scheduling Genetic algorithm based on number of children and height task for multiprocessor task Scheduling Marjan Abdeyazdan 1,Vahid Arjmand 2,Amir masoud Rahmani 3, Hamid Raeis ghanavati 4 1 Department of Computer

More information

Optimization of Association Rule Mining through Genetic Algorithm

Optimization of Association Rule Mining through Genetic Algorithm Optimization of Association Rule Mining through Genetic Algorithm RUPALI HALDULAKAR School of Information Technology, Rajiv Gandhi Proudyogiki Vishwavidyalaya Bhopal, Madhya Pradesh India Prof. JITENDRA

More information

Mutations for Permutations

Mutations for Permutations Mutations for Permutations Insert mutation: Pick two allele values at random Move the second to follow the first, shifting the rest along to accommodate Note: this preserves most of the order and adjacency

More information

A Genetic Algorithm for Multiprocessor Task Scheduling

A Genetic Algorithm for Multiprocessor Task Scheduling A Genetic Algorithm for Multiprocessor Task Scheduling Tashniba Kaiser, Olawale Jegede, Ken Ferens, Douglas Buchanan Dept. of Electrical and Computer Engineering, University of Manitoba, Winnipeg, MB,

More information

A Genetic Algorithm Framework

A Genetic Algorithm Framework Fast, good, cheap. Pick any two. The Project Triangle 3 A Genetic Algorithm Framework In this chapter, we develop a genetic algorithm based framework to address the problem of designing optimal networks

More information

Similarity Templates or Schemata. CS 571 Evolutionary Computation

Similarity Templates or Schemata. CS 571 Evolutionary Computation Similarity Templates or Schemata CS 571 Evolutionary Computation Similarities among Strings in a Population A GA has a population of strings (solutions) that change from generation to generation. What

More information

CHAPTER 6 REAL-VALUED GENETIC ALGORITHMS

CHAPTER 6 REAL-VALUED GENETIC ALGORITHMS CHAPTER 6 REAL-VALUED GENETIC ALGORITHMS 6.1 Introduction Gradient-based algorithms have some weaknesses relative to engineering optimization. Specifically, it is difficult to use gradient-based algorithms

More information

Introduction to Computer Science and Programming for Astronomers

Introduction to Computer Science and Programming for Astronomers Introduction to Computer Science and Programming for Astronomers Lecture 9. István Szapudi Institute for Astronomy University of Hawaii March 21, 2018 Outline Reminder 1 Reminder 2 3 Reminder We have demonstrated

More information

An Evolutionary Algorithm with Stochastic Hill-Climbing for the Edge-Biconnectivity Augmentation Problem

An Evolutionary Algorithm with Stochastic Hill-Climbing for the Edge-Biconnectivity Augmentation Problem An Evolutionary Algorithm with Stochastic Hill-Climbing for the Edge-Biconnectivity Augmentation Problem Ivana Ljubić and Günther R. Raidl Institute for Computer Graphics and Algorithms, Vienna University

More information

HEURISTIC OPTIMIZATION USING COMPUTER SIMULATION: A STUDY OF STAFFING LEVELS IN A PHARMACEUTICAL MANUFACTURING LABORATORY

HEURISTIC OPTIMIZATION USING COMPUTER SIMULATION: A STUDY OF STAFFING LEVELS IN A PHARMACEUTICAL MANUFACTURING LABORATORY Proceedings of the 1998 Winter Simulation Conference D.J. Medeiros, E.F. Watson, J.S. Carson and M.S. Manivannan, eds. HEURISTIC OPTIMIZATION USING COMPUTER SIMULATION: A STUDY OF STAFFING LEVELS IN A

More information

Sparse Matrices Reordering using Evolutionary Algorithms: A Seeded Approach

Sparse Matrices Reordering using Evolutionary Algorithms: A Seeded Approach 1 Sparse Matrices Reordering using Evolutionary Algorithms: A Seeded Approach David Greiner, Gustavo Montero, Gabriel Winter Institute of Intelligent Systems and Numerical Applications in Engineering (IUSIANI)

More information

CHAPTER 1 at a glance

CHAPTER 1 at a glance CHAPTER 1 at a glance Introduction to Genetic Algorithms (GAs) GA terminology Genetic operators Crossover Mutation Inversion EDA problems solved by GAs 1 Chapter 1 INTRODUCTION The Genetic Algorithm (GA)

More information

CHAPTER 5 ENERGY MANAGEMENT USING FUZZY GENETIC APPROACH IN WSN

CHAPTER 5 ENERGY MANAGEMENT USING FUZZY GENETIC APPROACH IN WSN 97 CHAPTER 5 ENERGY MANAGEMENT USING FUZZY GENETIC APPROACH IN WSN 5.1 INTRODUCTION Fuzzy systems have been applied to the area of routing in ad hoc networks, aiming to obtain more adaptive and flexible

More information

Network Routing Protocol using Genetic Algorithms

Network Routing Protocol using Genetic Algorithms International Journal of Electrical & Computer Sciences IJECS-IJENS Vol:0 No:02 40 Network Routing Protocol using Genetic Algorithms Gihan Nagib and Wahied G. Ali Abstract This paper aims to develop a

More information

An experimental evaluation of a parallel genetic algorithm using MPI

An experimental evaluation of a parallel genetic algorithm using MPI 2009 13th Panhellenic Conference on Informatics An experimental evaluation of a parallel genetic algorithm using MPI E. Hadjikyriacou, N. Samaras, K. Margaritis Dept. of Applied Informatics University

More information

Applied Cloning Techniques for a Genetic Algorithm Used in Evolvable Hardware Design

Applied Cloning Techniques for a Genetic Algorithm Used in Evolvable Hardware Design Applied Cloning Techniques for a Genetic Algorithm Used in Evolvable Hardware Design Viet C. Trinh vtrinh@isl.ucf.edu Gregory A. Holifield greg.holifield@us.army.mil School of Electrical Engineering and

More information