ABSTRACT

The graph coloring problem is a practical way of representing many real-world problems, including time scheduling, frequency assignment, register allocation, and circuit board testing. The most important fact that makes graph coloring exciting is that finding the minimum number of colors for an arbitrary graph is NP-hard. This project implements combinatorial optimization algorithms (a genetic algorithm (GA) and a simulated annealing algorithm (SA)) and sequential/greedy algorithms (the highest order algorithm (HSO) and the sequential algorithm (SQ)) to find solutions to the graph coloring problem. Decision fusion is then applied to the implemented algorithms to find an optimized solution; the fusion rules weigh factors such as execution time and the availability of processing resources. Extensive testing and evaluation have been performed to determine the important parameters and the efficiency of the simulator. The simulator has consistently produced better results than SA, GA, SQ and HSO run independently.

TABLE OF CONTENTS

ABSTRACT
TABLE OF CONTENTS
LIST OF TABLES
LIST OF FIGURES
1. BACKGROUND AND RATIONALE
   1.1. Graph Coloring
        Significance of Graph Coloring
        Coloring a Graph
   1.2. Algorithms to Implement Graph Coloring
        The Sequential/Greedy Algorithm
        Genetic Algorithm
        Simulated Annealing Algorithm
        Highest Order Algorithm
        Information / Decision Fusion
2. IMPLEMENTING THE GRAPH COLORING PROBLEM AND OPTIMIZING THE SOLUTION USING DECISION FUSION
   2.1. Implementation of Algorithms
   2.2. Simulator and Decision Fusion Rules
3. SYSTEM DESIGN
   Implementation of Algorithms
        Sequential Algorithm
        Genetic Algorithm
             A representation of an individual and initial population
             Fitness Function
             Reproduction Methods
             Selection Criteria
             Terminating Conditions
             Pseudo-code algorithm
        Simulated Annealing Algorithm
             Description of System Configuration
             Generator for Random Changes in Configuration
             Objective Function
             Control parameters
             Pseudo-code Algorithm
        Highest Order Algorithm
   Decision Fusion
        Committee Method
   Optimizing the Solution
4. EVALUATION AND RESULTS
   Evaluation and Testing
        Testing the Implementation of Algorithms
        Testing the Simulator
   Results
        Evaluations on Implementations of Algorithms
             Genetic Algorithm Implementation
                  GA Implementation: Mutation Rate
                  GA Implementation: Crossover Length
                  GA Implementation: Genetic Operator Ratio
             Simulated Annealing Algorithm Implementation
                  SA Algorithm Implementation: Reproduction Procedure
                  SA Implementation: Number of Reproductions
             Simulator
                  Simulator: Average Solutions Comparison for Decision Fusion Rule
                  Simulator: Vote Comparison for Decision Fusion Rule
                  Simulator: Evaluating the Efficiency with Decision Fusion Rules
                  Simulator: Solution Comparisons
5. FUTURE WORK
6. CONCLUSION
ACKNOWLEDGEMENTS
Bibliography and References

LIST OF TABLES

Table 3.1 Implementation of Greedy/Sequential Algorithm
Table 3.2 Implementation of Genetic Algorithm
Table 3.3 Implementation of Simulated Annealing Algorithm
Table 3.4 Implementation of Highest Order Algorithm
Table 3.5 Simulator
Table 4.1 Comparison of Colors Generated for Different Mutation Rates
Table 4.2 Comparison of Colors Generated for Different Crossover Lengths
Table 4.3 Comparison of Colors Generated for Different Genetic Operator Ratios
Table 4.4 Comparison of Reproduction Procedures with Constant Iterations
Table 4.5 Comparison of Reproduction Procedures with Varying Iterations
Table 4.6 Comparison of Start Temperature
Table 4.7 Average Solution for Each Execution until 100% ATE
Table 4.8 Comparison of Average Solutions until 100% ATE
Table 4.9 Average Solution for Each Execution until 50% ATE
Table 4.10 Comparison of Average Solutions until 50% ATE
Table 4.11 Comparison of Average Votes After 100% ATE
Table 4.12 Comparison of Average Votes After 50% ATE
Table 4.13 Efficiency Evaluation of the Simulator for a Graph with 50 Vertices
Table 4.14 Efficiency Evaluation of the Simulator for a Graph with 100 Vertices
Table 4.15 Efficiency Evaluation of the Simulator for a Graph with 200 Vertices
Table 4.16 Efficiency Evaluation of the Simulator for a Graph with 500 Vertices
Table 4.17 Efficiency of the Simulator with Decision Fusion Rules
Table 4.18 Comparison of Solutions of GA, SA and Simulator

LIST OF FIGURES

Figure 1.1 Graph with 6 Vertices and 9 Edges
Figure 1.2 Genetic Algorithm Flow Chart
Figure 1.3 A Canonical Architecture for Information Fusion
Figure 2.1 A Screen Shot of Greedy Algorithm Results
Figure 2.2 A Screen Shot of Highest Order Algorithm Results
Figure 2.3 A Screen Shot of Genetic Algorithm Results
Figure 2.4 A Screen Shot of Simulated Annealing Algorithm Results
Figure 4.2 A Screen Shot to Test Greedy Algorithm Implementation
Figure 4.3 A Screen Shot to Test Greedy Algorithm Implementation
Figure 4.4 A Screen Shot to Test Highest Order Algorithm Implementation
Figure 4.5 A Screen Shot to Test Greedy Algorithm Implementation
Figure 4.6 A Screen Shot to Test Mutation Operation in Genetic Algorithm
Figure 4.7 A Screen Shot to Test Mutation Operation in Genetic Algorithm
Figure 4.8 A Screen Shot to Test Genetic Algorithm Implementation
Figure 4.9 A Screen Shot to Test Inversion Operation in Simulated Annealing
Figure 4.10 A Screen Shot to Test Translation Operation in Simulated Annealing
Figure 4.11 A Screen Shot to Test Simulated Algorithm Implementation
Figure 4.12 A Screen Shot to Test Integration in the Simulator
Figure 4.13 A Screen Shot of Black Box Test of the Simulator
Figure 4.14 A Comparison of Mutation Rate Parameter
Figure 4.15 A Comparison of Crossover Length Parameter
Figure 4.16 A Comparison of Genetic Operation Ratio (Vertices < 200)
Figure 4.17 A Comparison of Genetic Operation Ratio (Vertices < 500)
Figure 4.18 A Comparison of Genetic Operation Ratio
Figure 4.19 A Comparison of Reproduction Procedures with Varying Iterations
Figure 4.20 A Comparison of Reproduction Procedures with Constant Iterations
Figure 4.21 A Comparison of Start Temperature
Figure 4.22 Average Solutions Comparison for Graph with 100 Vertices
Figure 4.23 Average Solutions Comparison for Graph with 200 Vertices
Figure 4.24 Average Solutions Comparison for Graph with 500 Vertices
Figure 4.25 A Comparison of Average Votes Difference % (dp)
Figure 4.26 A Comparison of Decision Fusion Efficiency
Figure 4.27 A Comparison of GA, SA and Simulator Solutions

1. BACKGROUND AND RATIONALE

1.1. Graph Coloring

Graph coloring is the assignment of "colors" to certain objects in a graph. Such objects can be vertices, edges, faces, or a mixture of these. The most significant of these problems is vertex coloring, as all of the other object problems can be transformed into a vertex-coloring version [WFEGC 2006]. Graph coloring is a practical method of representing many real-world problems, including time scheduling, frequency assignment, register allocation, and circuit board testing [Whalen 2002].

Significance of Graph Coloring

The most important fact that makes graph coloring exciting is that finding the minimum number of colors for an arbitrary graph is NP-hard. Besides the classical types of problems, different limitations can also be set on the graph, on the way a color is assigned, on the number of colors to be used, or even on the color itself, making the problem even more complex [WFEGC 2006].

Coloring a Graph

Partitioning a set of objects into classes according to certain rules is a fundamental process in mathematics. A conceptually simple set of rules tells us, for each pair of objects, whether or not they are allowed in the same class [Jensen 1995]. The theory of graph coloring deals with exactly this situation. The objects form the set of vertices V(G) of a graph G, two vertices being joined by an edge in G whenever they are not allowed in the same class [Jensen 1995].

To distinguish the classes we use a set of colors C, and the division into classes is given by a coloring φ: V(G) → C, where φ(x) ≠ φ(y) for every edge xy in the edge set E(G) of G [Jensen 1995]. If C has cardinality k, then φ is a k-coloring, and when k is finite it is usually assumed that C = {1, 2, 3, ..., k}. For i ∈ C the set φ⁻¹(i) is the i-th color class. Thus each color class forms an independent set of vertices; that is, no two of them are joined by an edge. The minimum cardinal k for which G has a k-coloring is the chromatic number X(G) of G, and G is X(G)-chromatic [Jensen 1995]. Figure 1.1 shows a graph with 9 edges joining 6 vertices, colored with the minimum number of colors so that no two adjacent vertices have the same color.

Figure 1.1 Graph with 6 Vertices and 9 Edges

To color every vertex so that no two adjacent vertices share a color, this graph needs a minimum of 3 colors. In Figure 1.1 the number associated with each vertex is the color class assigned to it. For an arbitrary graph, finding the minimum number of colors is an NP-hard problem [Jensen 1995].
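The definition above can be checked mechanically. The following is a minimal sketch in Java (the language used for this project); the class and method names are illustrative and not part of the project code. It verifies that φ(x) ≠ φ(y) for every edge of a graph given as an incidence matrix:

```java
// Sketch of a proper-coloring check for the definition above.
// Names are illustrative, not taken from the project code.
public class ColoringCheck {

    // adj[i][j] == 1 when vertices i and j are joined by an edge;
    // color[i] is the color class assigned to vertex i.
    public static boolean isProperColoring(int[][] adj, int[] color) {
        for (int i = 0; i < adj.length; i++) {
            for (int j = i + 1; j < adj.length; j++) {
                if (adj[i][j] == 1 && color[i] == color[j]) {
                    return false; // an edge joins two vertices of one class
                }
            }
        }
        return true;
    }

    public static void main(String[] args) {
        // The triangle needs three distinct color classes.
        int[][] triangle = {{0, 1, 1}, {1, 0, 1}, {1, 1, 0}};
        System.out.println(isProperColoring(triangle, new int[]{1, 2, 3}));
        System.out.println(isProperColoring(triangle, new int[]{1, 2, 2}));
    }
}
```

Each color class {v : color[v] = i} accepted by this check is an independent set, exactly as in the definition of a k-coloring above.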

1.2. Algorithms to Implement Graph Coloring

The Sequential/Greedy Algorithm

The sequential/greedy algorithm always takes the best immediate, or local, choice at each step while constructing an answer, without consulting any further criteria or knowledge base. Such algorithms find the overall, or globally, optimal solution for some optimization problems, but may find less-than-optimal solutions for some instances of other problems [NIST 2006]. The sequential algorithm, also known as the greedy algorithm, has several variations specific to selected problems [Culberson 2002]. If there is no sequential algorithm that always finds the optimal solution for a problem, one may have to search (exponentially) many possible solutions to find the optimum one. Greedy algorithms are usually quicker, since they do not consider the details of possible alternatives [NIST 2006], whereas an exhaustive search needs exponentially many steps to compute the solution.

Genetic Algorithm

A genetic algorithm (GA) is an adaptive heuristic search technique used to find approximate solutions to combinatorial optimization problems. GAs are a particular class of evolutionary algorithms that use techniques inspired by evolutionary biology such as inheritance, mutation, natural selection, and recombination (or crossover) [WFEGA 2006][TW 1996]. They belong to the class of stochastic search methods (other stochastic search methods include simulated annealing, threshold acceptance, and some forms of branch and bound). Whereas most stochastic search methods operate on a single solution to the problem at hand, genetic algorithms operate on a population of solutions

[WM 2006]. The continuing price/performance improvements of computational systems have made them attractive for some types of optimization. In particular, genetic algorithms work very well on mixed (continuous and discrete) combinatorial problems. They are less susceptible to getting stuck at local optima than gradient search methods, but they tend to be computationally expensive [WM 2006]. The basic concept of a GA is to simulate the processes in natural systems necessary for evolution, specifically those that follow the principle first laid down by Charles Darwin of survival of the fittest. As such, GAs represent an intelligent exploitation of a random search within a defined search space to solve a problem [TW 1996]. Not only do GAs provide an alternative method of solving problems, they consistently outperform traditional methods on many of these problems. Many real-world problems involve finding optimal parameters, which might prove difficult for traditional methods but ideal for a GA. A GA presumes that a potential solution of any problem is an individual that can be represented by a set of parameters. These parameters are regarded as the genes of a chromosome and can be structured as a string of values in binary form [MKF 1999]. The next generation of the population is created through selection and reproduction of selected individuals via the genetic operators, crossover and mutation. A positive value, generally known as a fitness value, is used to reflect the degree of goodness of a chromosome for the problem, and is highly related to its objective value [MKF 1999]. Figure 1.2 shows the consolidated flow chart with the generations, operators and fitness function.

Figure 1.2 Genetic Algorithm Flow Chart

Simulated Annealing Algorithm

Simulated annealing (SA) is a generic probabilistic meta-algorithm for the global optimization problem, namely locating a good approximation to the global optimum of a given function in a large search space [WFESA 2006]. Informally, it is a technique for combinatorial optimization problems, such as minimizing functions of very many variables [Rutenbar 1989]. Simulated annealing is an analogy with thermodynamics, specifically with the way that liquids freeze and crystallize, or metals cool and anneal. At high temperatures, the molecules of a liquid move freely with respect to one another. If the liquid is cooled slowly, thermal mobility is lost. The atoms are often able to line themselves up and form a pure crystal that is completely ordered over a distance up to billions of times the size of an individual atom in all directions. This crystal is the state of minimum energy for this

system. The amazing fact is that, for slowly cooled systems, nature is able to find this minimum energy state. In fact, if a liquid metal is cooled quickly or quenched, it does not reach this state but rather ends up in a polycrystalline or amorphous state having higher energy. So the essence of the process is slow cooling, allowing ample time for redistribution of the atoms as they lose mobility. This is the technical definition of annealing, and it is essential for ensuring that a low-energy state will be achieved [Press 1992]. More specifically to computer science, simulated annealing techniques use an analogous set of controlled cooling operations for nonphysical optimization problems, in effect transforming a poor, unordered solution into a highly optimized, desirable solution. Thus, simulated annealing offers an appealing physical analogy for the solution of optimization problems and, more importantly, the potential to reshape mathematical insights from the domain of physics into insights for real optimization problems [Rutenbar 1989]. This algorithm is used for NP-hard optimization problems, as all known techniques for obtaining an exact solution require an exponentially increasing number of steps as the problems become larger [Rutenbar 1989]. The implementations of this method are strictly problem specific, and so the way in which one finds an optimization will vary from problem to problem [WFESA 2006].

Highest Order Algorithm

The highest order algorithm colors the graph by first identifying the connections (edges) of each vertex and then coloring the vertices in decreasing order of their number of connections. This algorithm belongs to the same class as the sequential algorithm, which

tends to give a near-optimal solution with minimal execution time. This algorithm does not always promise the best solution, but it is a good competitor to the other graph coloring algorithms.

Information / Decision Fusion

Information fusion strategies seek to combine the results of different classifiers or sensors to give results of better quality for a given problem than can be achieved by any single technique alone. Information fusion has been shown to benefit a number of applications including remote sensing, personal identity recognition, target detection, forecasting, and medical diagnosis [Peacock 2001]. According to Dr. Dasarathy, in his research paper [Dasarathy 1991], there is a distinction among the strategies of fusion, namely Data Fusion, Feature Fusion and Decision Fusion. His explanation, with respect to sensor fusion, is as follows. In general, sensor fusion can be accomplished at different levels: data fusion, feature fusion and decision fusion. Depending upon the specific sensors used, fusion at the data and feature levels may or may not always be practical. For example, data fusion would require compatible sensors that are appropriately registered to permit such data-level integration. Further, decision fusion is deemed more robust than fusion at the lower levels, as the failure of one of the sensors in the sensor suite does not signify total catastrophic failure of the entire system [Dasarathy 1991]. These comparative aspects are discussed in more detail in the article [Dasarathy 1990]. In the canonical information fusion architecture illustrated in Figure 1.3, a phenomenon is observed and processed by a number of individual techniques. Information from the individual techniques is combined at a Fusion Centre [Peacock

2001]. The Fusion Centre fuses all the data from the several techniques and outputs an application-specific result, such as an integrated or optimized result.

Figure 1.3 A Canonical Architecture for Information Fusion

The individual techniques might be sensors, different classifiers [Peacock 2001], or any process that produces a specific output which can or needs to be fused with the output of other, similar techniques. A wide variety of fusion methods, such as committee methods, clustering algorithms, weighted averages, techniques based on fuzzy set theory, Dempster-Shafer evidence theory and stochastic methods, have been proposed and implemented [Peacock 2001]. The remainder of this document gives detailed information about the implementation of the concepts and algorithms discussed in this section. The next section gives an overview of the implementation and presents a description of the results. The system design section provides a detailed description of the implementation and the simulator developed; it describes in detail the analysis, design and implementation phases of the project. The evaluation and results section describes the results of the evaluation and testing performed in the project.
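As a toy illustration of a committee-style fusion centre, one can imagine the fusion step as keeping only the techniques whose current result is within some slack of the best result seen so far. The following Java sketch is purely hypothetical (the class name, method and the slack rule are assumptions for illustration; the project's actual decision fusion rules are described in later sections):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Map;

// Hypothetical committee-style fusion centre: each technique reports the
// number of colors in its current best solution, and only techniques close
// to the overall best are allowed to continue. Illustrative only.
public class FusionCentre {

    // colorCounts maps a technique name to its current color count;
    // slack is how far behind the best a technique may lag and still continue.
    public static List<String> techniquesToContinue(Map<String, Integer> colorCounts,
                                                    int slack) {
        int best = Collections.min(colorCounts.values());
        List<String> keep = new ArrayList<>();
        for (Map.Entry<String, Integer> e : colorCounts.entrySet()) {
            if (e.getValue() <= best + slack) {
                keep.add(e.getKey());
            }
        }
        return keep;
    }
}
```

A committee rule of this shape is robust in the sense described above: if one technique fails or stalls, the fusion centre simply stops selecting it, and the remaining techniques carry on.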

2. IMPLEMENTING THE GRAPH COLORING PROBLEM AND OPTIMIZING THE SOLUTION USING DECISION FUSION

2.1. Implementation of Algorithms

The project implements the genetic algorithm, the simulated annealing algorithm, the highest order algorithm and the greedy algorithm to find a solution to the graph coloring problem. The greedy algorithm provides the initial input for the genetic algorithm and the simulated annealing algorithm. Each algorithm works differently and approaches a solution based on the given graph attributes, which are the number of vertices, the incidence matrix, the initial order of coloring and the number of colors. Each of the algorithms individually outputs a solution to the given graph, not necessarily an optimal one. For a given graph with 100 vertices, Figures 2.1 to 2.4 show the results of the individual implementations of the above-mentioned algorithms. Parts (a) and (b) of Figures 2.1 to 2.4 represent the left and right parts of the display screen respectively. The statistics show the time taken for the algorithm to run in milliseconds, the best order of vertices in which to color the graph and the number of colors with which the graph is colored. The figures for the genetic algorithm and simulated annealing implementations show the values of the important parameters used to run these algorithms. Figure 2.1 shows the results of the greedy algorithm implementation. Circled is the number of colors required to color the graph in sequential order, followed by the order of vertices.

Figure 2.1 A Screen Shot of Greedy Algorithm Results

Figure 2.2 shows the results of the highest order algorithm implementation. Circled are the execution time and the number of colors required to color the graph, followed by the order of vertices.

Figure 2.2 A Screen Shot of Highest Order Algorithm Results

Figure 2.3 shows the results of the genetic algorithm implementation. Circled are the number of colors required to color the graph, the execution time, the number of distinct generations created and the number of lifts generated to avoid local maxima, followed by the termination condition.

Figure 2.3 A Screen Shot of Genetic Algorithm Results

Figure 2.4 shows the results of the simulated annealing algorithm implementation. Circled are the number of colors required to color the graph, the execution time, the number of best generations stored, the start temperature, the final temperature, and the change in temperature, followed by the termination condition.

Figure 2.4 A Screen Shot of Simulated Annealing Algorithm Results

2.2. Simulator and Decision Fusion Rules

The simulator executes the implementations of the algorithms mentioned in section 2.1 in parallel. The simulator ensures that the implementations are synchronized and that the decision fusion rules are applied. The simulator applies decision fusion in the form of a set of rules or conditions that compare the progress of the simultaneously executing algorithms and decide whether the execution of each algorithm should continue. The decision fusion rules are applied to the implementations, and the simulator continues the execution of the algorithm(s) which satisfy the rules. The decision fusion rules are designed such that the algorithms which satisfy them will end up producing the optimal solution for any given graph and attributes. The simulator outputs the optimal number of colors found, the vertex order, the time taken to find the solution and the algorithm(s) that produced the solution.
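The greedy algorithm that seeds the other techniques (section 2.1) can be sketched as follows. This is an illustrative Java reimplementation, not the project's csq.java: it colors each vertex, in the given order, with the lowest-numbered color class not already used by a colored neighbour.

```java
import java.util.HashSet;
import java.util.Set;

// Illustrative greedy/sequential coloring sketch, not the project's csq.java.
public class GreedyColoring {

    // adj is the incidence matrix; order is the sequence in which the
    // vertices are colored. Returns color[v] in 1..k for every vertex.
    public static int[] color(int[][] adj, int[] order) {
        int n = adj.length;
        int[] color = new int[n]; // 0 means "not yet colored"
        for (int v : order) {
            Set<Integer> usedByNeighbours = new HashSet<>();
            for (int u = 0; u < n; u++) {
                if (adj[v][u] == 1 && color[u] != 0) {
                    usedByNeighbours.add(color[u]);
                }
            }
            int c = 1; // smallest color class not used by a neighbour
            while (usedByNeighbours.contains(c)) {
                c++;
            }
            color[v] = c;
        }
        return color;
    }

    // The number of colors used is the largest color class assigned.
    public static int numColors(int[] color) {
        int max = 0;
        for (int c : color) {
            max = Math.max(max, c);
        }
        return max;
    }
}
```

With the vertices supplied in ascending order this matches the sequential strategy; supplying vertices in decreasing order of degree would give the highest order variant.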

3. SYSTEM DESIGN

The Java programming language is used to develop the system. The two main reasons for selecting Java as the programming language are that it is an object-oriented language and that it provides the facility to execute several programs simultaneously using threads.

Implementation of Algorithms

Sequential Algorithm

The greedy algorithm, also known as the sequential algorithm, has several variations [Culberson 2002] specific to selected problems. In this project, the greedy algorithm takes each vertex in ascending order of occurrence (1..k, k being the last vertex) and tries to color the vertex with one of the colors used so far; if it is not possible to color it with any existing color, a new color class is created and the vertex is assigned the color of that class. Table 3.1 provides more details of the implementation.

Table 3.1 Implementation of Greedy/Sequential Algorithm

Program Name: csq.java
Purpose: Implements the sequential/greedy algorithm
Parameters
1. Number of vertices in the graph
2. Incidence matrix of the graph
Result Variables
1. sqtimeval: The time taken for execution of the sequential implementation
2. sqvorder: The vertex order for the solution; in this case it is the ascending order of vertices
3. sqnumcolors: The number of colors to color the graph using the above vertex order

Genetic Algorithm

A genetic algorithm (GA) presumes that a potential solution of any problem is an individual that can be represented by a set of parameters. These parameters are regarded

as the genes of a chromosome and can be structured as a string of values in binary form [MKF 1999]. The next generation of the population is created through selection and reproduction of selected individuals via the genetic operators, crossover and mutation. A positive value, generally known as a fitness value, is used to reflect the degree of goodness of a chromosome for the problem, and is highly related to its objective value [MKF 1999]. The remainder of this section describes the process of implementing the genetic algorithm.

A representation of an individual and initial population:

At each point during the process we maintain a "generation" of "individuals." Each individual is a data structure representing the "genetic structure" of a possible solution or hypothesis. Like a chromosome, the genetic structure of an individual is described using a fixed, finite alphabet. In a GA, the alphabet {0, 1} is usually used. This string is interpreted as a solution to the problem we are trying to solve [Dyer 2005]. A group of such initial solutions forms the initial population. The genetic algorithm implementation considers the order of vertices as the generation. The initial solution is taken from the sequential algorithm, and the next generations are developed from it. Described below is an example of the initial generation.

Example: the initial generation for an 8-vertex graph is the order produced by the sequential algorithm, i.e., the vertices in ascending order. A valid generation is considered to have all the vertices (from 1 to 8) of the graph.

Fitness Function:

Given an individual, a solution is to be assessed so that the individuals can be ranked. This is usually a real number. Unless the fitness function is handled properly, a GA

may have a tendency to converge towards local optima rather than the global optimum of the problem [WFEGA 2006]. The fitness function in this project is the function which finds the number of colors for a given vertex order. The quality of a generation is assessed by the number of colors required to color the graph: the fewer colors required, the better the quality of the generation.

Example: for an 8-vertex graph, a generation with which the graph can be colored using 3 colors is considered better than a generation with which the graph can be colored using 4 colors.

Reproduction Methods

The reproduction methods, or genetic operators, are used to form new generations. There are two basic methods of reproduction, called mutation and crossover. A mutation randomly changes one or more digits in the generation. How often to do mutation, how many digits to change, and how big a change to make are adjustable parameters [Dyer 2005]. In this project the mutation process changes the generation by randomly selecting two vertices and interchanging them, forming a new offspring. The parameter used is the mutation rate, which signifies the number of mutation operations performed on a given generation to form a new valid offspring. The default mutation rate is 3.

Example: for an 8-vertex graph, the individual may be changed by selecting two random positions (2 and 7) and interchanging the vertices at those positions,

giving a new offspring. This process is repeated three times, as the mutation rate is 3, before accepting a new generation. The crossover randomly picks one or more pairs of individuals as parents and randomly swaps segments of the parents. The rate of crossover, the number of parent pairs, the number of crossover points, and the positions of the crossover points are adjustable parameters [Dyer 2005]. In this project, two random points are selected in a generation and segments of fixed length are swapped, giving a new offspring. The validity of the offspring is checked by verifying the existence of all the vertices. If it is not a valid generation, the duplicated vertices are replaced by the missing vertices, making it a valid generation. The segment length of the crossover is the adjustable parameter in the crossover operation. The default segment length is 5.

Example: for an 8-vertex graph, the individual may be changed by selecting two random points (7 and 2) and replacing the segment starting at position 2 by the segment starting at position 7, forming a new offspring. The new offspring is then validated: the missing vertices replace the first occurrences of the repeating vertices, creating a new valid generation.

Selection Criteria:

From a population of individuals, the fitter individuals are given a better chance to survive into the next generation. The simple criterion "keep the best individuals" is not recommended: as it turns out, nature does not kill all the unfit genes. They usually become recessive for a long period of time, but then they may mutate to something useful. Therefore, there is a tradeoff between better individuals and diversity [Dyer 2005].
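The two order-based operators described above can be sketched as follows. This is an illustrative Java version, not the project's cga.java; the repair step makes a crossover offspring valid by replacing duplicated vertices with the missing ones, as described in the text.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.Random;

// Illustrative sketches of the order-based GA operators described above;
// not the project's cga.java. Vertices are numbered 1..n.
public class GaOperators {
    private static final Random RNG = new Random();

    // Mutation: interchange the vertices at two random positions.
    public static int[] mutate(int[] parent) {
        int[] child = parent.clone();
        int i = RNG.nextInt(child.length);
        int j = RNG.nextInt(child.length);
        int t = child[i]; child[i] = child[j]; child[j] = t;
        return child;
    }

    // Crossover: copy the segment of length len starting at src over the
    // segment starting at dst, then repair so every vertex appears once.
    public static int[] crossover(int[] parent, int src, int dst, int len) {
        int n = parent.length;
        int[] child = parent.clone();
        for (int k = 0; k < len; k++) {
            child[(dst + k) % n] = parent[(src + k) % n];
        }
        repair(child);
        return child;
    }

    // Replace repeated vertices (after their first occurrence) with the
    // vertices missing from the offspring.
    static void repair(int[] child) {
        int n = child.length;
        boolean[] present = new boolean[n + 1];
        for (int v : child) present[v] = true;
        Deque<Integer> missing = new ArrayDeque<>();
        for (int v = 1; v <= n; v++) {
            if (!present[v]) missing.add(v);
        }
        boolean[] seen = new boolean[n + 1];
        for (int i = 0; i < n; i++) {
            if (seen[child[i]]) child[i] = missing.poll();
            else seen[child[i]] = true;
        }
    }
}
```

Because the number of duplicated positions always equals the number of missing vertices, the repair step always produces a valid permutation of 1..n.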

This project stores the 100 best generations as the population for reproduction. Apart from this, a lift methodology is used in which the worst solution generated, or a randomly selected bad generation, is occasionally sent for reproduction to avoid local minima. A lift variable is used as the adjustable parameter.

Terminating Conditions:

This generational process is repeated until a termination condition is reached. The termination conditions used in this implementation are:

1. The termination flag is set, forcing the GA to terminate.
2. The same best fitness value (number of colors) is generated more than 1000 times.
3. The execution time exceeds the available time.

The process above is repeated to find an optimal solution. The next part of this section provides an algorithm that uses all the attributes defined above to form a basic implementation of a genetic algorithm.

Pseudo-code algorithm

The following is the pseudo-code algorithm used for the project [WFEGA 2006]:

1. Choose an initial population
2. Repeat
3.   Evaluate the individual fitnesses of a certain proportion of the population based on the fitness function
4.   Select pairs of best-ranking individuals to reproduce
5.   Apply the crossover operator or the mutation operator
6. Until the terminating condition is reached

Table 3.2 provides a list of the call parameters, adjustable parameters and result variables used in the implementation of the genetic algorithm.

Table 3.2 Implementation of Genetic Algorithm

Program Name: cga.java
Purpose: Implements the genetic algorithm
Parameters
1. Number of vertices in the graph
2. The initial generation (vertex order) from the greedy implementation
3. The fitness value of the initial generation (number of colors)
4. Incidence matrix of the graph
Adjustable Parameters
1. liftratio: Adjusts the lifts to avoid local maxima or minima; reduce the liftratio value to increase lift chances; default value is 40; 0 < liftratio < 50
2. gamutrate: Number of times to run the mutation operation; default value is 3
3. gacrosslen: The length of the segment to swap in the crossover operation; default is 5
4. gaopratio: The ratio used to select the genetic operations; default mutation : crossover = 7:3
Result Variables
1. gatimeval: The time taken for execution of the genetic algorithm implementation
2. gabestgen: The vertex order for the solution
3. gabestgencolors: The number of colors to color the graph using the above vertex order
4. gennum: The number of valid distinct generations created
5. gastorecounter: The total number of generations stored in the population colony
6. galiftcounter: Total number of lifts
7. gaopmut: Total number of mutation operations
8. gaopco: Total number of crossover operations
9. gatermin: The termination condition

Simulated Annealing Algorithm

Like any combinatorial optimization method, the simulated annealing algorithm seeks to find a configuration of parameters X = (X1, X2, ..., Xv) which minimizes a function f(X). This function is usually referred to as the cost or objective function; it is a measure of the goodness of a particular configuration of parameters. To implement this method, in general, each point of the search space is compared to a state of some physical system, and the function E(s) to be minimized is interpreted as the internal energy of the system in that state.
Therefore the goal is to bring the system, from an arbitrary initial state, to a state with the minimum possible energy [WFESA 2006].

In 1953, Metropolis and coworkers first incorporated the thermodynamic principles and the Boltzmann probability distribution stated below into numerical calculations:

Prob(E) ~ e^(−E/kT)

Offered a succession of options, a simulated thermodynamic system was assumed to change its configuration from energy E1 to energy E2 with probability

p = e^(−(E2 − E1)/kT)

If E2 < E1, this probability is greater than unity; in such cases the change is arbitrarily assigned a probability p = 1, i.e., the system always takes such an option. This general scheme, of always taking a downhill step while sometimes taking an uphill step, is known as the Metropolis algorithm [Press 1992]. The Metropolis algorithm is used as the basis for the implementation of the simulated annealing algorithm. The following are defined as part of the implementation:

1. A description of possible system configurations.
2. A generator of random changes in the configuration; these changes are the options presented to the system.
3. An objective function E (analog of energy) whose minimization is the goal of the procedure.
4. A control parameter T (analog of temperature) and an annealing schedule which tells how it is lowered from high to low, e.g., after how many random changes in configuration each downward step in T is taken, and how large that step is. The meaning of high and low in this context,

and the assignment of a schedule, may require physical insight and/or trial-and-error experiments [Press 1992].

The implementation of the simulated annealing algorithm in this project is divided into four parts based on the Metropolis algorithm described above.

Description of System Configuration

The physical system in the implementation is the graph that needs to be colored. The configuration parameters are the vertices of the graph, and a configuration is the vertex order in which the graph will be colored. The internal energy of the system is the number of colors required to color the graph; the energy function is denoted by E. All control parameter names start with the letter t. The initial configuration (vertex order) and the initial internal energy (number of colors) are obtained from the sequential algorithm implementation.

Generator for Random Changes in Configuration

The implementation considered three generators that randomly change the configuration: inversion, translation and switching. Inversion works by randomly selecting two points in the configuration (vertices in this case) and reversing the order of parameters between them, creating a new configuration of the system (a new order of vertices). Translation, on the other hand, works by selecting a segment of the configuration and placing it randomly between other parameters, resulting in a new configuration of the system. Switching works by randomly selecting two parameters in the configuration (vertices in this case) and interchanging them. After several comparisons of the three random generators, translation was chosen for the project. More information on the comparison is discussed in the next section of the document.
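The translation move described above can be sketched as follows. This is a minimal illustration in Java; the class and method names are mine, not the project's csa.java code.

```java
import java.util.Arrays;
import java.util.Random;

// Sketch of the translation generator: a randomly chosen segment of the
// vertex order is removed and re-inserted at another position, producing a
// new configuration. Illustrative only.
public class TranslationMove {
    // Remove order[from .. from+len-1] and re-insert it so that it starts
    // at index "to" in the resulting array.
    static int[] translate(int[] order, int from, int len, int to) {
        int n = order.length;
        int[] segment = Arrays.copyOfRange(order, from, from + len);
        int[] rest = new int[n - len];
        for (int i = 0, j = 0; i < n; i++)
            if (i < from || i >= from + len) rest[j++] = order[i];
        int[] result = new int[n];
        System.arraycopy(rest, 0, result, 0, to);
        System.arraycopy(segment, 0, result, to, len);
        System.arraycopy(rest, to, result, to + len, n - len - to);
        return result;
    }

    // Random variant as it would be used inside an annealing loop.
    static int[] translateRandom(int[] order, Random rnd) {
        int n = order.length;
        int len = 1 + rnd.nextInt(Math.max(1, n / 4));
        int from = rnd.nextInt(n - len + 1);
        int to = rnd.nextInt(n - len + 1);
        return translate(order, from, len, to);
    }

    public static void main(String[] args) {
        // Move the segment {2, 3} so that it starts at position 3.
        System.out.println(Arrays.toString(translate(new int[]{1, 2, 3, 4, 5, 6}, 1, 2, 3)));
        // -> [1, 4, 5, 2, 3, 6]
    }
}
```

Note that the result is always a permutation of the input, so no separate validity repair is needed for this move.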

Objective Function

The objective function is denoted by E and is given by

    E = e^(ΔT/CT) - 1

where ΔT is the change in temperature and CT is the present temperature of the system. This value is compared with a randomly generated value between 0 and 1, RS. When RS < E the configuration is accepted; otherwise it is rejected.

Control Parameters

This project considers the starting temperature, denoted by tstart, and the final temperature, denoted by tfinal, as the start and end states of the physical system (graph). The change in temperature is denoted by saalpha, the percentage by which the temperature is reduced. The number of configuration changes at each temperature is denoted by the variable iterations. The start temperature tstart is set to 100; the change in temperature saalpha is chosen as 0.9 and the iterations value as 100.

Pseudo-code Algorithm

The following pseudo-code implements the simulated annealing heuristic, as described above, starting from state S0 and continuing to a maximum of kmax steps or until a state with energy emax or less is found. The call neighbour(S) should generate a randomly chosen neighbour of a given state S; the call random() should return a random value in the range [0, 1). The annealing schedule is defined by the call temp(r), which should yield the temperature to use, given the fraction r of the time budget that has been expended so far [WFESA 2006].

S := S0
e := E(S)
k := 0
while k < kmax and e > emax do
    Sn := neighbour(S)
    en := E(Sn)
    if (en < e) or (random() < P(en - e, temp(k/kmax))) then
        S := Sn; e := en
    k := k + 1
return S

Table 3.3 provides a list of the call parameters, adjustable parameters and result variables used in the implementation of the simulated annealing algorithm.

Table 3.3 Implementation of Simulated Annealing Algorithm
Program Name: csa.java
Purpose: Implements the simulated annealing algorithm
Parameters
1. Number of vertices in the graph
2. The initial configuration (vertex order) from the greedy implementation
3. The fitness value of the configuration parameters (number of colors)
4. Incidence matrix of the graph
Adjustable Parameters
1. tstart: Initial temperature of the system to start the annealing process
2. tfinal: Final temperature of the system to end the process
3. saalpha: The change in temperature
4. iterations: The number of times to change the configuration at each temperature; default is 100
Result Variables
1. satimeval: The time taken to execute the simulated annealing algorithm implementation
2. sabestgen: The vertex order of the solution
3. sabestgencolors: The number of colors needed to color the graph using the above vertex order
4. sastorecounter: The total number of valid generations stored in the population colony
5. t: Current temperature

Highest Order Algorithm

In this project, the highest order algorithm orders the vertices in descending order of degree (number of edges) and tries to color each vertex with one of the colors used so

far; if it is not possible to color it with any existing color, a new color class is created and the vertex is assigned the color of that class. Table 3.4 provides more details of the implementation.

Table 3.4 Implementation of Highest Order Algorithm
Program Name: csqorder.java
Purpose: Implements the highest order algorithm
Parameters
1. Number of vertices in the graph
2. Incidence matrix of the graph
Result Variables
1. sqotimeval: The time taken to execute the highest order algorithm implementation
2. sqovorder: The vertex order of the solution; in this case, the descending order of degree
3. sqonumcolors: The number of colors needed to color the graph using the above vertex order

3.2. Decision Fusion

The simulator applies the decision fusion rules to the algorithm implementations, according to the decision fusion strategy, until only one algorithm remains executing, and that algorithm produces the optimal solution. In case no algorithm satisfies all the decision fusion rules, the solution of the last executing algorithm(s) is selected as the optimal solution. The fusion centre fuses the data from the several algorithms and outputs an application-specific result, such as an integrated or optimized result. Among the wide variety of fusion methods, a variation of the committee method is used. The next part of this section gives a detailed description of the committee method.

Committee Method

This method emphasizes enabling the different information sources to contribute to a result. It is a vote-based decision fusion method which groups individual experts or discriminating functions into a set termed a committee. In this approach, the

individual experts/discriminating functions cast votes for the correct hypothesis [Peacock 2001].

In this project the decision fusion rules are the main criteria used by the simulator to distinguish the several algorithms and their execution. A combination of the rank-based and value-based committee methods is used. A set of rules has been prepared based on the minimum number of colors used to color the graph, the time of execution, and the attributes (vertices, edges) of the given graph. The decision fusion rules are:

1. Accepted time of execution (ATE) for the simulator. The user decides on the value of ATE and enters it manually at the start of the execution of the simulator. After the ATE has elapsed, the algorithms are forced to complete and the current best solution is selected.

2. After every 5% of the ATE, the best solution by each algorithm is recorded. For a given complete execution of the simulator, the best solutions are recorded a maximum of 20 times. Starting at 50% of the ATE, at every further 5% the algorithm whose average number of colors (average solution) is greater than the sum of the overall average solution across all the algorithms and an auxiliary value is suspended. The auxiliary value is determined based on the difference between the average of solutions and the final solution. See section

3. Every 5% of the ATE, a vote is taken among the algorithms based on the current best solution generated so far. For a given complete execution of the simulator, the voting is performed a maximum of 20 times. At every vote, the algorithm with the best solution is given a vote value of 1 and

the other algorithm is given a vote value of 0. When both algorithms have the best solution, a vote value of 1 is given to both. Starting at 50% of the ATE, at every further 5% the algorithm with a difference percentage (dp) value greater than a threshold value is suspended. The difference percentage is calculated using the formula below:

    dp = (|votega - votesa| * 100) / sc

where dp is the difference percentage, votega and votesa are the total numbers of votes received so far by the genetic algorithm and the simulated annealing algorithm respectively, and sc is the total number of votes polled so far. The threshold value is determined by comparing the average votes polled for each algorithm with the final algorithm which generates the best solution. See section

Optimizing the Solution

The simulator executes the algorithms simultaneously and uses the prepared decision fusion rules to decide on the future of each algorithm's execution. If an algorithm does not satisfy a rule, it is rejected and stops running on the simulator; the rest of the algorithms continue running. This continues until one algorithm produces the optimal solution. The simulator is written in Java. Table 3.5 provides more information on the simulator program.

Table 3.5 Simulator
Program Name: mainproject.java
Purpose: Simulator which includes the implementation of all algorithms
Parameters
1. Available time of execution
2. Size of the graph
Result Variables
1. overallTime: The time taken to execute the simulator
2. overallbest: The best vertex order of all the implementations
3. overallbestcolors: The number of colors needed to color the graph using the above vertex order
4. overallbestalgo: The algorithm from which the solution is accepted
5. gavote, savote, sqovote, sqvote: The voting parameters; store the number of votes for each algorithm
6. gaav, saav, sqoav, sqav, totalav: Variables which store the average number of colors at regular intervals
7. gaperc, saperc, sqoperc: Store the percentage of success for each algorithm
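The vote-difference rule from section 3.2 can be sketched as follows. This is a minimal Java illustration; the class and method names are mine, not the simulator's code.

```java
// Sketch of the difference-percentage decision-fusion rule:
// dp = (|votega - votesa| * 100) / sc, where sc is the number of votes
// polled so far. The algorithm trailing by more than the threshold is
// suspended. Illustrative only.
public class VoteFusion {
    static double differencePercentage(int votega, int votesa, int sc) {
        return Math.abs(votega - votesa) * 100.0 / sc;
    }

    // True when the trailing algorithm should be suspended.
    static boolean suspend(int votega, int votesa, int sc, double threshold) {
        return differencePercentage(votega, votesa, sc) > threshold;
    }

    public static void main(String[] args) {
        // e.g. after 10 voting rounds: GA has 8 votes, SA has 2.
        System.out.println(differencePercentage(8, 2, 10)); // prints 60.0
    }
}
```

With a threshold of, say, 50, the trailing algorithm in this example would be suspended; the threshold itself is tuned experimentally, as the text describes.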

4. EVALUATION AND RESULTS

This chapter describes the reasonableness and usability of the project. A set of testing schemes is used to verify that the whole project performs the tasks as expected, followed by a series of evaluation procedures to test the implementation of the genetic, simulated annealing, highest order and sequential algorithms and their integration in the simulator.

4.1. Evaluation and Testing

Testing the Implementation of Algorithms

The genetic algorithm, simulated annealing algorithm, highest order algorithm and the sequential/greedy algorithm are implemented according to the procedures and pseudo-code given in the previous chapter. The algorithms were tested for consistency with the pseudo-code and checked for compile-time and run-time errors. Each implementation was tested to verify that it accepts the correct number of parameters and that it produces a solution, not necessarily optimal, to a given graph coloring problem.

Figure 4.1 is a screen shot of the test of the code that randomly generates a valid incidence matrix given the number of vertices. The test program was successfully run twice, randomly generating the incidence matrix. Parts (a) and (b) identify the two test runs of the program, with 5 and 4 vertices respectively. The number of vertices is identified in a circle, the invalid incidence matrix in an oval, and the valid incidence matrices generated in a rectangle.
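The matrix generator exercised in this test can be sketched as follows. This is an illustrative Java sketch, not the project's code; it assumes a valid matrix means a symmetric 0/1 matrix with an all-zero diagonal (an undirected graph without self-loops).

```java
import java.util.Random;

// Sketch of a random incidence (adjacency) matrix generator for an
// undirected graph: symmetric 0/1 entries, zero diagonal. Illustrative only.
public class RandomGraph {
    static int[][] randomMatrix(int vertices, double edgeProb, Random rnd) {
        int[][] m = new int[vertices][vertices];
        for (int i = 0; i < vertices; i++)
            for (int j = i + 1; j < vertices; j++)
                m[i][j] = m[j][i] = (rnd.nextDouble() < edgeProb) ? 1 : 0;
        return m;
    }

    // Validity check: symmetric with an all-zero diagonal.
    static boolean isValid(int[][] m) {
        for (int i = 0; i < m.length; i++)
            for (int j = 0; j < m.length; j++)
                if (m[i][i] != 0 || m[i][j] != m[j][i]) return false;
        return true;
    }

    public static void main(String[] args) {
        System.out.println(isValid(randomMatrix(5, 0.5, new Random())));
        // prints true: the generator is valid by construction
    }
}
```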

Figure 4.1 A Screen Shot to Test the Incidence Matrix Generation

Figures 4.2 and 4.3 are screen shots of the tests performed on the implementation of the sequential/greedy algorithm. The order of vertices to color the graph is the increasing order of vertices (1, 2, 3, ..., k, where k is the last vertex), which is generated automatically by the implementation. Figure 4.2 shows the test run of the greedy implementation on an 8-vertex predefined graph. The number of vertices, the incidence matrix, the coloring of the graph, the number of colors used to color the graph, and the execution time are identified in an oval.
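The sequential/greedy coloring being tested here can be sketched as follows. This is a minimal Java illustration using 0-based vertex indices; the names are mine, not the project's csq.java code.

```java
import java.util.Arrays;

// Sketch of sequential/greedy coloring: visit vertices in a given order and
// assign each the smallest color not used by an already colored neighbour.
// "adj" is the graph's incidence (adjacency) matrix. Illustrative only.
public class GreedyColoring {
    static int color(int[][] adj, int[] order) {
        int n = adj.length;
        int[] color = new int[n];
        Arrays.fill(color, -1);                 // -1 means "not yet colored"
        int numColors = 0;
        for (int v : order) {
            boolean[] used = new boolean[n];
            for (int u = 0; u < n; u++)
                if (adj[v][u] == 1 && color[u] >= 0) used[color[u]] = true;
            int c = 0;
            while (used[c]) c++;                // smallest free color
            color[v] = c;
            numColors = Math.max(numColors, c + 1);
        }
        return numColors;                       // number of colors used
    }

    public static void main(String[] args) {
        // Triangle (0-1-2) plus one pendant vertex 3: needs 3 colors.
        int[][] adj = {
            {0, 1, 1, 0},
            {1, 0, 1, 0},
            {1, 1, 0, 1},
            {0, 0, 1, 0}
        };
        System.out.println(color(adj, new int[]{0, 1, 2, 3})); // prints 3
    }
}
```

The number of colors returned by such a pass is exactly the fitness/energy value that the GA and SA implementations then try to reduce by reordering the vertices.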

Figure 4.2 A Screen Shot to Test Greedy Algorithm Implementation

Figure 4.3 shows the test run of the greedy implementation on a 100-vertex randomly generated graph. The order of vertices, the corresponding number of colors to color the graph and the execution time are identified in an oval. Part (b) is the continuation of the display of part (a) onto the right side.

Figure 4.3 A Screen Shot to Test Greedy Algorithm Implementation

Figures 4.4 and 4.5 are screen shots of the tests performed on the implementation of the highest order algorithm. The order of vertices to color the graph is the

descending order of degree for each vertex, which is generated automatically by the implementation.

Figure 4.4 shows the test run of the highest order implementation on an 8-vertex predefined graph. The number of vertices and the incidence matrix are identified by an oval pointed to by the arrow marks. The number of colors to color the graph and the execution time are identified in a rectangle.

Figure 4.4 A Screen Shot to Test Highest Order Algorithm Implementation

Figure 4.5 shows the test run of the highest order implementation on a 100-vertex randomly generated graph. The number of colors to color the graph and the execution time are identified in circles. Part (b) is the continuation of the display of part (a) onto the right side.
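The descending-degree ordering used by the highest order implementation can be sketched as follows. This is illustrative Java with 0-based vertex indices, not the project's csqorder.java.

```java
import java.util.Arrays;
import java.util.Comparator;

// Sketch of the highest order heuristic's ordering step: compute each
// vertex's degree from the incidence matrix and sort vertices by descending
// degree. The resulting order is then colored greedily. Illustrative only.
public class HighestOrder {
    static Integer[] descendingDegreeOrder(int[][] adj) {
        int n = adj.length;
        int[] degree = new int[n];
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++) degree[i] += adj[i][j];
        Integer[] order = new Integer[n];
        for (int i = 0; i < n; i++) order[i] = i;
        // Stable sort: ties keep their original relative order.
        Arrays.sort(order, Comparator.comparingInt((Integer v) -> -degree[v]));
        return order;
    }

    public static void main(String[] args) {
        int[][] adj = {
            {0, 1, 1, 0},
            {1, 0, 1, 0},
            {1, 1, 0, 1},
            {0, 0, 1, 0}
        };
        // Degrees are 2, 2, 3, 1, so vertex 2 comes first, vertex 3 last.
        System.out.println(Arrays.toString(descendingDegreeOrder(adj)));
        // prints [2, 0, 1, 3]
    }
}
```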

Figure 4.5 A Screen Shot to Test Highest Order Algorithm Implementation

Figures 4.6, 4.7 and 4.8 are screen shots of the tests performed on the implementation of the genetic algorithm. The order of vertices to color the graph and the corresponding number of colors are taken from the sequential algorithm implementation.

Figure 4.6 shows the unit testing of the mutation operation in the genetic algorithm implementation. The mutation rate, the random positions to interchange in the mutation operation and the valid mutation result are identified in oval boxes. The invalid result generated by the mutation operation is shown in a rectangle. When an invalid result is generated, the mutation is performed again until a valid result is obtained.

Figure 4.6 A Screen Shot to Test Mutation Operation in Genetic Algorithm
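The swap mutation exercised in this unit test can be sketched as follows. This is illustrative Java, not the project's cga.java; the retry-until-valid behaviour described above is approximated here by requiring two distinct positions, since interchanging two vertices always preserves a valid permutation.

```java
import java.util.Arrays;
import java.util.Random;

// Sketch of the mutation operator: two random positions in the vertex order
// are interchanged, producing a new generation. Illustrative only.
public class Mutation {
    // Interchange the vertices at positions i and j, returning a new order.
    static int[] swap(int[] parent, int i, int j) {
        int[] child = Arrays.copyOf(parent, parent.length);
        int tmp = child[i]; child[i] = child[j]; child[j] = tmp;
        return child;
    }

    // Pick two distinct random positions and swap them.
    static int[] mutate(int[] parent, Random rnd) {
        int n = parent.length;
        int i = rnd.nextInt(n), j;
        do { j = rnd.nextInt(n); } while (j == i);   // retry on equal positions
        return swap(parent, i, j);
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(swap(new int[]{1, 2, 3, 4, 5}, 0, 3)));
        // prints [4, 2, 3, 1, 5]
    }
}
```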

Figure 4.7 shows the unit testing of the crossover operation in the genetic algorithm implementation. The crossover schemata length and the random points to interchange are identified in oval boxes. A segment of the crossover schemata length starting from the first random point replaces the vertices at the second random point. The resulting offspring may be an invalid generation; when an invalid result is generated, the repeating vertices are identified and replaced by the missing vertices, making it a valid generation. The invalid result generated after the genetic operation is identified in a rectangle with an arrow, and the valid generation after the crossover operation in an oval box.

Figure 4.7 A Screen Shot to Test Crossover Operation in Genetic Algorithm

Figure 4.8 shows the testing of the implementation of the genetic algorithm. The test is run on a 100-vertex randomly generated graph. The best generation, the corresponding number of colors to color the graph, the time of execution, the number of distinct generations created, the number of lifts and the termination conditions are identified in oval boxes. Part (b) is the continuation of the display of part (a) onto the right side.
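The crossover-and-repair step can be sketched as follows. This is illustrative Java for vertices numbered 1..n, not the project's code; replacing the repeats with the missing vertices in ascending order is my assumption, since the exact repair order is not specified.

```java
import java.util.Arrays;

// Sketch of the crossover operator: a segment of the schemata length taken
// from the first random point overwrites the vertices at the second random
// point; duplicated vertices in the offspring are then replaced by the
// missing ones to restore a valid permutation. Illustrative only.
public class Crossover {
    static int[] crossover(int[] parent, int p1, int p2, int len) {
        int n = parent.length;
        int[] child = Arrays.copyOf(parent, n);
        System.arraycopy(parent, p1, child, p2, len); // may duplicate vertices
        return repair(child, n);
    }

    // Replace repeated vertices with the missing ones, in ascending order.
    static int[] repair(int[] child, int n) {
        boolean[] seen = new boolean[n + 1];
        boolean[] present = new boolean[n + 1];
        for (int v : child) present[v] = true;
        int missing = 1;
        for (int i = 0; i < n; i++) {
            if (seen[child[i]]) {
                while (missing <= n && present[missing]) missing++;
                child[i] = missing;
                present[missing] = true;
            } else {
                seen[child[i]] = true;
            }
        }
        return child;
    }

    public static void main(String[] args) {
        // Segment {4, 5} (positions 3..4) overwrites positions 0..1, giving
        // the invalid {4, 5, 3, 4, 5, 6}; repair restores a permutation.
        System.out.println(Arrays.toString(crossover(new int[]{1, 2, 3, 4, 5, 6}, 3, 0, 2)));
        // prints [4, 5, 3, 1, 2, 6]
    }
}
```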

Figure 4.8 A Screen Shot to Test Genetic Algorithm Implementation

Figures 4.9, 4.10 and 4.11 are screen shots of the tests performed on the implementation of the simulated annealing algorithm. The order of vertices to color the graph and the corresponding number of colors are taken from the sequential algorithm.

Figure 4.9 shows the unit testing of the inversion operation in the simulated annealing algorithm implementation. The test program was successfully run 5 times, performing the inversion operation on a predefined vertex order. In each case the random positions between which the inversion is performed and the changed valid generations are identified in oval boxes.
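The inversion move exercised in this test can be sketched as follows (illustrative Java; the names are mine):

```java
import java.util.Arrays;

// Sketch of the inversion operator: the order of vertices between two chosen
// positions (inclusive) is reversed, yielding a new valid configuration.
public class Inversion {
    static int[] invert(int[] order, int from, int to) {
        int[] result = Arrays.copyOf(order, order.length);
        for (int i = from, j = to; i < j; i++, j--) {
            int tmp = result[i]; result[i] = result[j]; result[j] = tmp;
        }
        return result;
    }

    public static void main(String[] args) {
        // Reverse positions 1..4 of the order.
        System.out.println(Arrays.toString(invert(new int[]{1, 2, 3, 4, 5, 6}, 1, 4)));
        // prints [1, 5, 4, 3, 2, 6]
    }
}
```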

Figure 4.9 A Screen Shot to Test Inversion Operation in Simulated Annealing

Figure 4.10 shows the unit testing of the translation operation in the simulated annealing implementation. The test program was successfully run 3 times, performing the translation operation on a predefined vertex order. In each case the random positions between which the translation is performed and the changed valid generations are identified in oval boxes.

Figure 4.10 A Screen Shot to Test Translation Operation in Simulated Annealing

Figure 4.11 shows the testing of the implementation of the simulated annealing algorithm. The test is run on a 100-vertex randomly generated graph. The best generation, the corresponding number of colors to color the graph, the time of execution, the number of generations stored, the start temperature, the final temperature, the change in temperature and the


More information

Sparse Matrices Reordering using Evolutionary Algorithms: A Seeded Approach

Sparse Matrices Reordering using Evolutionary Algorithms: A Seeded Approach 1 Sparse Matrices Reordering using Evolutionary Algorithms: A Seeded Approach David Greiner, Gustavo Montero, Gabriel Winter Institute of Intelligent Systems and Numerical Applications in Engineering (IUSIANI)

More information

Graph Coloring Algorithms for Assignment Problems in Radio Networks

Graph Coloring Algorithms for Assignment Problems in Radio Networks c 1995 by Lawrence Erlbaum Assoc. Inc. Pub., Hillsdale, NJ 07642 Applications of Neural Networks to Telecommunications 2 pp. 49 56 (1995). ISBN 0-8058-2084-1 Graph Coloring Algorithms for Assignment Problems

More information

CHAPTER 5 ENERGY MANAGEMENT USING FUZZY GENETIC APPROACH IN WSN

CHAPTER 5 ENERGY MANAGEMENT USING FUZZY GENETIC APPROACH IN WSN 97 CHAPTER 5 ENERGY MANAGEMENT USING FUZZY GENETIC APPROACH IN WSN 5.1 INTRODUCTION Fuzzy systems have been applied to the area of routing in ad hoc networks, aiming to obtain more adaptive and flexible

More information

x n+1 = x n f(x n) f (x n ), (1)

x n+1 = x n f(x n) f (x n ), (1) 1 Optimization The field of optimization is large and vastly important, with a deep history in computer science (among other places). Generally, an optimization problem is defined by having a score function

More information

Term Paper for EE 680 Computer Aided Design of Digital Systems I Timber Wolf Algorithm for Placement. Imran M. Rizvi John Antony K.

Term Paper for EE 680 Computer Aided Design of Digital Systems I Timber Wolf Algorithm for Placement. Imran M. Rizvi John Antony K. Term Paper for EE 680 Computer Aided Design of Digital Systems I Timber Wolf Algorithm for Placement By Imran M. Rizvi John Antony K. Manavalan TimberWolf Algorithm for Placement Abstract: Our goal was

More information

The k-means Algorithm and Genetic Algorithm

The k-means Algorithm and Genetic Algorithm The k-means Algorithm and Genetic Algorithm k-means algorithm Genetic algorithm Rough set approach Fuzzy set approaches Chapter 8 2 The K-Means Algorithm The K-Means algorithm is a simple yet effective

More information

Administrative. Local Search!

Administrative. Local Search! Administrative Local Search! CS311 David Kauchak Spring 2013 Assignment 2 due Tuesday before class Written problems 2 posted Class participation http://www.youtube.com/watch? v=irhfvdphfzq&list=uucdoqrpqlqkvctckzqa

More information

Copyright 2000, Kevin Wayne 1

Copyright 2000, Kevin Wayne 1 Guessing Game: NP-Complete? 1. LONGEST-PATH: Given a graph G = (V, E), does there exists a simple path of length at least k edges? YES. SHORTEST-PATH: Given a graph G = (V, E), does there exists a simple

More information

Automated Test Data Generation and Optimization Scheme Using Genetic Algorithm

Automated Test Data Generation and Optimization Scheme Using Genetic Algorithm 2011 International Conference on Software and Computer Applications IPCSIT vol.9 (2011) (2011) IACSIT Press, Singapore Automated Test Data Generation and Optimization Scheme Using Genetic Algorithm Roshni

More information

Evolutionary Computation. Chao Lan

Evolutionary Computation. Chao Lan Evolutionary Computation Chao Lan Outline Introduction Genetic Algorithm Evolutionary Strategy Genetic Programming Introduction Evolutionary strategy can jointly optimize multiple variables. - e.g., max

More information

Introduction to Genetic Algorithms. Based on Chapter 10 of Marsland Chapter 9 of Mitchell

Introduction to Genetic Algorithms. Based on Chapter 10 of Marsland Chapter 9 of Mitchell Introduction to Genetic Algorithms Based on Chapter 10 of Marsland Chapter 9 of Mitchell Genetic Algorithms - History Pioneered by John Holland in the 1970s Became popular in the late 1980s Based on ideas

More information

Ar#ficial)Intelligence!!

Ar#ficial)Intelligence!! Introduc*on! Ar#ficial)Intelligence!! Roman Barták Department of Theoretical Computer Science and Mathematical Logic We know how to use heuristics in search BFS, A*, IDA*, RBFS, SMA* Today: What if the

More information

INF Biologically inspired computing Lecture 1: Marsland chapter 9.1, Optimization and Search Jim Tørresen

INF Biologically inspired computing Lecture 1: Marsland chapter 9.1, Optimization and Search Jim Tørresen INF3490 - Biologically inspired computing Lecture 1: Marsland chapter 9.1, 9.4-9.6 2017 Optimization and Search Jim Tørresen Optimization and Search 2 Optimization and Search Methods (selection) 1. Exhaustive

More information

N-Queens problem. Administrative. Local Search

N-Queens problem. Administrative. Local Search Local Search CS151 David Kauchak Fall 2010 http://www.youtube.com/watch?v=4pcl6-mjrnk Some material borrowed from: Sara Owsley Sood and others Administrative N-Queens problem Assign 1 grading Assign 2

More information

Mutations for Permutations

Mutations for Permutations Mutations for Permutations Insert mutation: Pick two allele values at random Move the second to follow the first, shifting the rest along to accommodate Note: this preserves most of the order and adjacency

More information

A Genetic Algorithm for Multiprocessor Task Scheduling

A Genetic Algorithm for Multiprocessor Task Scheduling A Genetic Algorithm for Multiprocessor Task Scheduling Tashniba Kaiser, Olawale Jegede, Ken Ferens, Douglas Buchanan Dept. of Electrical and Computer Engineering, University of Manitoba, Winnipeg, MB,

More information

The Binary Genetic Algorithm. Universidad de los Andes-CODENSA

The Binary Genetic Algorithm. Universidad de los Andes-CODENSA The Binary Genetic Algorithm Universidad de los Andes-CODENSA 1. Genetic Algorithms: Natural Selection on a Computer Figure 1 shows the analogy between biological i l evolution and a binary GA. Both start

More information

Random Search Report An objective look at random search performance for 4 problem sets

Random Search Report An objective look at random search performance for 4 problem sets Random Search Report An objective look at random search performance for 4 problem sets Dudon Wai Georgia Institute of Technology CS 7641: Machine Learning Atlanta, GA dwai3@gatech.edu Abstract: This report

More information

Dr. Stephan Steigele Vorlesung im SS 2008

Dr. Stephan Steigele Vorlesung im SS 2008 Dr. Vorlesung im SS 2008 Bioinf / Universität Leipzig Recombination issues Genetic algorithms Holland s original GA is now known as the simple genetic algorithm (SGA) Other GAs use different: Representations

More information

Heuristic Optimisation

Heuristic Optimisation Heuristic Optimisation Revision Lecture Sándor Zoltán Németh http://web.mat.bham.ac.uk/s.z.nemeth s.nemeth@bham.ac.uk University of Birmingham S Z Németh (s.nemeth@bham.ac.uk) Heuristic Optimisation University

More information

A Parallel Architecture for the Generalized Traveling Salesman Problem

A Parallel Architecture for the Generalized Traveling Salesman Problem A Parallel Architecture for the Generalized Traveling Salesman Problem Max Scharrenbroich AMSC 663 Project Proposal Advisor: Dr. Bruce L. Golden R. H. Smith School of Business 1 Background and Introduction

More information

1. Introduction. 2. Motivation and Problem Definition. Volume 8 Issue 2, February Susmita Mohapatra

1. Introduction. 2. Motivation and Problem Definition. Volume 8 Issue 2, February Susmita Mohapatra Pattern Recall Analysis of the Hopfield Neural Network with a Genetic Algorithm Susmita Mohapatra Department of Computer Science, Utkal University, India Abstract: This paper is focused on the implementation

More information

Computational Intelligence

Computational Intelligence Computational Intelligence Module 6 Evolutionary Computation Ajith Abraham Ph.D. Q What is the most powerful problem solver in the Universe? ΑThe (human) brain that created the wheel, New York, wars and

More information

Simulated annealing/metropolis and genetic optimization

Simulated annealing/metropolis and genetic optimization Simulated annealing/metropolis and genetic optimization Eugeniy E. Mikhailov The College of William & Mary Lecture 18 Eugeniy Mikhailov (W&M) Practical Computing Lecture 18 1 / 8 Nature s way to find a

More information

Research Article Path Planning Using a Hybrid Evolutionary Algorithm Based on Tree Structure Encoding

Research Article Path Planning Using a Hybrid Evolutionary Algorithm Based on Tree Structure Encoding e Scientific World Journal, Article ID 746260, 8 pages http://dx.doi.org/10.1155/2014/746260 Research Article Path Planning Using a Hybrid Evolutionary Algorithm Based on Tree Structure Encoding Ming-Yi

More information

The Genetic Algorithm for finding the maxima of single-variable functions

The Genetic Algorithm for finding the maxima of single-variable functions Research Inventy: International Journal Of Engineering And Science Vol.4, Issue 3(March 2014), PP 46-54 Issn (e): 2278-4721, Issn (p):2319-6483, www.researchinventy.com The Genetic Algorithm for finding

More information

Evolutionary Algorithms. CS Evolutionary Algorithms 1

Evolutionary Algorithms. CS Evolutionary Algorithms 1 Evolutionary Algorithms CS 478 - Evolutionary Algorithms 1 Evolutionary Computation/Algorithms Genetic Algorithms l Simulate natural evolution of structures via selection and reproduction, based on performance

More information

Topological Machining Fixture Layout Synthesis Using Genetic Algorithms

Topological Machining Fixture Layout Synthesis Using Genetic Algorithms Topological Machining Fixture Layout Synthesis Using Genetic Algorithms Necmettin Kaya Uludag University, Mechanical Eng. Department, Bursa, Turkey Ferruh Öztürk Uludag University, Mechanical Eng. Department,

More information

Evolutionary Computation Part 2

Evolutionary Computation Part 2 Evolutionary Computation Part 2 CS454, Autumn 2017 Shin Yoo (with some slides borrowed from Seongmin Lee @ COINSE) Crossover Operators Offsprings inherit genes from their parents, but not in identical

More information

A Genetic Algorithm for Minimum Tetrahedralization of a Convex Polyhedron

A Genetic Algorithm for Minimum Tetrahedralization of a Convex Polyhedron A Genetic Algorithm for Minimum Tetrahedralization of a Convex Polyhedron Kiat-Choong Chen Ian Hsieh Cao An Wang Abstract A minimum tetrahedralization of a convex polyhedron is a partition of the convex

More information

Metaheuristic Development Methodology. Fall 2009 Instructor: Dr. Masoud Yaghini

Metaheuristic Development Methodology. Fall 2009 Instructor: Dr. Masoud Yaghini Metaheuristic Development Methodology Fall 2009 Instructor: Dr. Masoud Yaghini Phases and Steps Phases and Steps Phase 1: Understanding Problem Step 1: State the Problem Step 2: Review of Existing Solution

More information

Neural Network Weight Selection Using Genetic Algorithms

Neural Network Weight Selection Using Genetic Algorithms Neural Network Weight Selection Using Genetic Algorithms David Montana presented by: Carl Fink, Hongyi Chen, Jack Cheng, Xinglong Li, Bruce Lin, Chongjie Zhang April 12, 2005 1 Neural Networks Neural networks

More information

Machine Learning for Software Engineering

Machine Learning for Software Engineering Machine Learning for Software Engineering Single-State Meta-Heuristics Prof. Dr.-Ing. Norbert Siegmund Intelligent Software Systems 1 2 Recap: Goal is to Find the Optimum Challenges of general optimization

More information

Optimization in Brachytherapy. Gary A. Ezzell, Ph.D. Mayo Clinic Scottsdale

Optimization in Brachytherapy. Gary A. Ezzell, Ph.D. Mayo Clinic Scottsdale Optimization in Brachytherapy Gary A. Ezzell, Ph.D. Mayo Clinic Scottsdale Outline General concepts of optimization Classes of optimization techniques Concepts underlying some commonly available methods

More information

Genetic Algorithms and Genetic Programming Lecture 7

Genetic Algorithms and Genetic Programming Lecture 7 Genetic Algorithms and Genetic Programming Lecture 7 Gillian Hayes 13th October 2006 Lecture 7: The Building Block Hypothesis The Building Block Hypothesis Experimental evidence for the BBH The Royal Road

More information

Lecture 6: The Building Block Hypothesis. Genetic Algorithms and Genetic Programming Lecture 6. The Schema Theorem Reminder

Lecture 6: The Building Block Hypothesis. Genetic Algorithms and Genetic Programming Lecture 6. The Schema Theorem Reminder Lecture 6: The Building Block Hypothesis 1 Genetic Algorithms and Genetic Programming Lecture 6 Gillian Hayes 9th October 2007 The Building Block Hypothesis Experimental evidence for the BBH The Royal

More information

Similarity Templates or Schemata. CS 571 Evolutionary Computation

Similarity Templates or Schemata. CS 571 Evolutionary Computation Similarity Templates or Schemata CS 571 Evolutionary Computation Similarities among Strings in a Population A GA has a population of strings (solutions) that change from generation to generation. What

More information

Time Complexity Analysis of the Genetic Algorithm Clustering Method

Time Complexity Analysis of the Genetic Algorithm Clustering Method Time Complexity Analysis of the Genetic Algorithm Clustering Method Z. M. NOPIAH, M. I. KHAIRIR, S. ABDULLAH, M. N. BAHARIN, and A. ARIFIN Department of Mechanical and Materials Engineering Universiti

More information

Automatic Generation of Test Case based on GATS Algorithm *

Automatic Generation of Test Case based on GATS Algorithm * Automatic Generation of Test Case based on GATS Algorithm * Xiajiong Shen and Qian Wang Institute of Data and Knowledge Engineering Henan University Kaifeng, Henan Province 475001, China shenxj@henu.edu.cn

More information

CHAPTER 1 at a glance

CHAPTER 1 at a glance CHAPTER 1 at a glance Introduction to Genetic Algorithms (GAs) GA terminology Genetic operators Crossover Mutation Inversion EDA problems solved by GAs 1 Chapter 1 INTRODUCTION The Genetic Algorithm (GA)

More information

Simplicial Global Optimization

Simplicial Global Optimization Simplicial Global Optimization Julius Žilinskas Vilnius University, Lithuania September, 7 http://web.vu.lt/mii/j.zilinskas Global optimization Find f = min x A f (x) and x A, f (x ) = f, where A R n.

More information

Crew Scheduling Problem: A Column Generation Approach Improved by a Genetic Algorithm. Santos and Mateus (2007)

Crew Scheduling Problem: A Column Generation Approach Improved by a Genetic Algorithm. Santos and Mateus (2007) In the name of God Crew Scheduling Problem: A Column Generation Approach Improved by a Genetic Algorithm Spring 2009 Instructor: Dr. Masoud Yaghini Outlines Problem Definition Modeling As A Set Partitioning

More information

Artificial Intelligence

Artificial Intelligence Artificial Intelligence Information Systems and Machine Learning Lab (ISMLL) Tomáš Horváth 10 rd November, 2010 Informed Search and Exploration Example (again) Informed strategy we use a problem-specific

More information

HEURISTICS FOR THE NETWORK DESIGN PROBLEM

HEURISTICS FOR THE NETWORK DESIGN PROBLEM HEURISTICS FOR THE NETWORK DESIGN PROBLEM G. E. Cantarella Dept. of Civil Engineering University of Salerno E-mail: g.cantarella@unisa.it G. Pavone, A. Vitetta Dept. of Computer Science, Mathematics, Electronics

More information

The Parallel Software Design Process. Parallel Software Design

The Parallel Software Design Process. Parallel Software Design Parallel Software Design The Parallel Software Design Process Deborah Stacey, Chair Dept. of Comp. & Info Sci., University of Guelph dastacey@uoguelph.ca Why Parallel? Why NOT Parallel? Why Talk about

More information

Beyond Classical Search

Beyond Classical Search Beyond Classical Search Chapter 3 covered problems that considered the whole search space and produced a sequence of actions leading to a goal. Chapter 4 covers techniques (some developed outside of AI)

More information

Introduction to Artificial Intelligence 2 nd semester 2016/2017. Chapter 4: Beyond Classical Search

Introduction to Artificial Intelligence 2 nd semester 2016/2017. Chapter 4: Beyond Classical Search Introduction to Artificial Intelligence 2 nd semester 2016/2017 Chapter 4: Beyond Classical Search Mohamed B. Abubaker Palestine Technical College Deir El-Balah 1 Outlines local search algorithms and optimization

More information

Active contour: a parallel genetic algorithm approach

Active contour: a parallel genetic algorithm approach id-1 Active contour: a parallel genetic algorithm approach Florence Kussener 1 1 MathWorks, 2 rue de Paris 92196 Meudon Cedex, France Florence.Kussener@mathworks.fr Abstract This paper presents an algorithm

More information

METAHEURISTICS. Introduction. Introduction. Nature of metaheuristics. Local improvement procedure. Example: objective function

METAHEURISTICS. Introduction. Introduction. Nature of metaheuristics. Local improvement procedure. Example: objective function Introduction METAHEURISTICS Some problems are so complicated that are not possible to solve for an optimal solution. In these problems, it is still important to find a good feasible solution close to the

More information

Artificial Intelligence

Artificial Intelligence Artificial Intelligence Lesson 4 Local Search Local improvement, no paths Look around at states in the local neighborhood and choose the one with the best value Pros: Quick (usually linear) Sometimes enough

More information

Local Search (Ch )

Local Search (Ch ) Local Search (Ch. 4-4.1) Local search Before we tried to find a path from the start state to a goal state using a fringe set Now we will look at algorithms that do not care about a fringe, but just neighbors

More information

Using Genetic Algorithms in Integer Programming for Decision Support

Using Genetic Algorithms in Integer Programming for Decision Support Doi:10.5901/ajis.2014.v3n6p11 Abstract Using Genetic Algorithms in Integer Programming for Decision Support Dr. Youcef Souar Omar Mouffok Taher Moulay University Saida, Algeria Email:Syoucef12@yahoo.fr

More information

Grid Scheduling Strategy using GA (GSSGA)

Grid Scheduling Strategy using GA (GSSGA) F Kurus Malai Selvi et al,int.j.computer Technology & Applications,Vol 3 (5), 8-86 ISSN:2229-693 Grid Scheduling Strategy using GA () Dr.D.I.George Amalarethinam Director-MCA & Associate Professor of Computer

More information

GENETIC ALGORITHM VERSUS PARTICLE SWARM OPTIMIZATION IN N-QUEEN PROBLEM

GENETIC ALGORITHM VERSUS PARTICLE SWARM OPTIMIZATION IN N-QUEEN PROBLEM Journal of Al-Nahrain University Vol.10(2), December, 2007, pp.172-177 Science GENETIC ALGORITHM VERSUS PARTICLE SWARM OPTIMIZATION IN N-QUEEN PROBLEM * Azhar W. Hammad, ** Dr. Ban N. Thannoon Al-Nahrain

More information