
A Multi-objective Optimization Nested Evolutionary Algorithm for Locating Apoptotic Cellular Automata

by Carolyn Pugh

A Thesis Presented to The University of Guelph in partial fulfilment of requirements for the degree of Master of Science in Mathematics

Guelph, Ontario, Canada
© Carolyn Pugh, April, 2014

ABSTRACT

A MULTI-OBJECTIVE OPTIMIZATION NESTED EVOLUTIONARY ALGORITHM FOR LOCATING APOPTOTIC CELLULAR AUTOMATA

Carolyn Pugh, University of Guelph, 2014. Advisor: Dr. Daniel Ashlock

Real-world decisions frequently involve the consideration of multiple, often conflicting, factors. These problems usually have more than one optimal solution. Multi-objective Optimization Evolutionary Algorithms attempt to solve such problems by finding as many optimal solutions as possible at one time. This thesis proposes a new nested algorithm, the Multi-objective Optimization Nested Evolutionary Algorithm (MONEA), that aims to solve multi-objective optimization problems by breaking them down into single-objective optimization problems with a new type of function. It tests MONEA on two different problems and compares the results with a well-known algorithm, NSGA-II. The first problem tests MONEA's abilities on a trivial set of solutions; the second serves as an advanced problem whose solutions are unknown.

Acknowledgments

I would like to express my gratitude to my advisor, Dr. Daniel Ashlock, for his support and guidance throughout this project. His knowledge proved invaluable in the face of my countless questions and his many areas of research have helped to broaden my own. I would also like to thank my advisory committee, Professors Willms and McNicholas, for their time and expertise. In addition, I would like to thank my fellow members of the Ashlock Lab for welcoming me and assisting me throughout my time as a graduate student. I am grateful for their expertise and indulgence while I completed my studies. Acknowledgments also go to my family, for their patience and support throughout this process. I would also like to extend my appreciation to the Ontario Graduate Scholarship (OGS) Program for its financial support of this project.

Contents

1 Introduction
2 Background Information and Literature Review
   2.1 Preface
   2.2 Background Information (Evolutionary Computation, Multi-objective Optimization, Genetic Programming, Cellular Automata)
   2.3 Literature Review
3 Concept
   3.1 Basic Setup
   3.2 Algorithm Overview
4 Parameter Testing and Fine Tuning with a Trivial Test Function
   4.1 Initial Test Function
   4.2 Parameter Testing (Setup; Results and Discussion)
   4.3 Comparison with NSGA-II (Fair Comparison Issue; Setup; Results and Discussion)
5 Testing on Cellular Automata
   5.1 Problem Description
   5.2 Fair Comparison Issue
   5.3 Setup
   5.4 Results and Discussion
6 Conclusions and Future Work
   6.1 Conclusions
   6.2 Future Work

List of Figures

1.1 Domination and the Pareto Frontier for Two Objectives
2.1 Evolutionary Algorithm Pseudocode
2.2 Two-point Crossover on Binary Strings
2.3 A Function Stack with 12 Nodes
2.4 Apoptotic Cellular Automata Time History
2.5 Non-domination Ranking Process
3.1 MONEA Pseudocode
3.2 MONEA Conceptual Mapping
3.3 Pareto Frontier for Two Objectives with Weighted Average
3.4 Pareto Frontier for Two Objectives with PM Functions
4.1 Visual Comparison of MONEA and NSGA-II, Version 1
4.2 Visual Comparison of MONEA and NSGA-II, Version 2
5.1 Cellular Automata for All 63 Seeds and a Given Rule
5.2 Overall Survivor Statistic Averages for MONEA and NSGA-II
5.3 Algorithm Survivor Statistic Averages for MONEA and NSGA-II
5.4 Average Rank of Solutions from NSGA-II
5.5 Average Number of Non-dominated Solutions from MONEA and NSGA-II
5.6-5.10 Cellular Automata for All 63 Seeds and a Given Rule
6.1 Example Fire and Ice Maze

List of Tables

3.1 Intake Node Functions Used in MONEA
4.1 Optimal Parameters for MONEA
4.2 Survival Statistics
4.3 MONEA Survival Statistics for Two Objectives
4.4 NSGA-II Survival Statistics for Two Objectives
4.5 MONEA Survival Statistics for Three Objectives
4.6 NSGA-II Survival Statistics for Three Objectives
4.7 MONEA Survival Statistics for Six Objectives
4.8 NSGA-II Survival Statistics for Six Objectives
5.1 State to Colour Assignments for Cellular Automata

Chapter 1
Introduction

In the real world, when one wants to make a decision, multiple factors must usually be considered. When one wants to purchase a car, for example, she typically wishes to find a vehicle which is inexpensive, but also one which is solidly made and will not require a lot of repairs. Things like power and fuel efficiency are also possible considerations. It is more than likely that the cheapest car will not satisfy many of the other requirements, and so would not be a good choice, forcing one to compromise and consider slightly more expensive cars that are less likely to break down and that get better fuel efficiency. Without further specifications on exactly what the buyer wants in her car, she will be able to find multiple options that are essentially incomparable. For instance, consider two cars, A and B. Car A is two percent more fuel efficient than car B, but car B is one thousand dollars cheaper than car A. As far as she can determine, each of these cars is as good an option as the other, and here lies the crux of the problem. The more factors, or objectives, that must be considered, the more possible solutions will exist. On top of this, these objectives often come into conflict with one another, i.e. a car with more power is generally more expensive, adding further complexity to the problem. These types of problems are called Multi-objective Optimization Problems (MOPs). Vilfredo Pareto is largely credited with laying the groundwork for MOPs, which were first addressed in the context of economics in the late nineteenth century [9] [22].

Figure 1.1: An illustration of domination and the Pareto Frontier for two objectives.

For this reason, the set of optimal solutions for a problem is often referred to as the Pareto Frontier. Solutions in this set are referred to as non-dominated solutions. One solution dominates another if it is better than the other solution in terms of all objectives. If, for two solutions a and b and two objectives, a is better than b in one objective but b is better than a in the other, the two solutions are considered equivalent in terms of domination. This is illustrated for two objectives in Figure 1.1. It is important to note that this thesis considers algorithms to work outward towards the Pareto Frontier. This means that it interprets a MOP as a collection of maximization problems. If a specific problem originally concerned minimization, it will have been inverted to suit this interpretation. Although the Pareto Frontier is often a large and at times infinite set of solutions, the classical approach to solving optimization problems will only find one solution at a time, making it rather slow [6]. An alternative approach to solving MOPs is through Evolutionary Computation, which makes use of Evolutionary Algorithms. Evolutionary Algorithms (EA) are able to find multiple solutions to a problem in one run, which

makes them a natural mechanism for solving MOPs [6]. Although current EA provide a good approach to solving MOPs, there is still much room for improvement. As one can imagine, the number of points in the Pareto Frontier typically increases greatly with the number of objectives. A large number of objectives also causes the domination of points near the Pareto Frontier by points closer to it to become rare. This leads even EA to have more difficulty finding the Pareto Frontier for MOPs with a greater number of objectives [15]. For this reason, finding ways to solve MOPs with four or more objectives has become a very active area of research. This thesis focuses on this problem and proposes a new algorithm that is designed specifically for such problems. It introduces a new type of function that allows for a nested algorithm and tests it on two different MOPs.

This thesis has the following structure: Chapter 2 gives background information for the subjects covered and gives an overview of what has been done with multi-objective evolutionary algorithms in the literature. Chapter 3 introduces Pareto-monotone functions and gives an outline of the proposed algorithm. Chapter 4 tests the algorithm on a trivial test problem and compares the results with another algorithm. Chapter 5 tests the algorithm on a harder problem involving cellular automata in which the Pareto Frontier is unknown. Finally, Chapter 6 draws conclusions from the tests performed and outlines future areas of research.

Chapter 2
Background Information and Literature Review

2.1 Preface

The following chapter will first introduce some terms from evolutionary computation and multi-objective optimization that will be used throughout this thesis and then give a review of what has been done with Multi-objective Optimization Evolutionary Algorithms (MOEA) in the literature. In the Background Information section, the Evolutionary Computation information was drawn from [1], the Multi-objective Optimization information from [6], the Genetic Programming information from [5] and [18], and the Cellular Automata information from [23].

2.2 Background Information

Evolutionary Computation

Evolutionary Computation is a branch of computer science that implements a simplified version of Darwin's Theory of Evolution in an algorithm. It can be used to find solutions to many different types of problems. It works on populations of data structures and can be programmed to stop after a certain amount of time or when certain conditions are met. The basic structure of all evolutionary algorithms can be seen in Figure 2.1. The following are some basic definitions for terms in Evolutionary Computation that will be used in this thesis.

Figure 2.1: A basic outline for an Evolutionary Algorithm.

Definition 1 A population in an evolutionary algorithm is a collection of individual data structures that take part in evolution.

Definition 2 An individual is a single member of a population.

Definition 3 The mating event is where the algorithm simulates sexual reproduction between individuals.

Definition 4 Tournament selection is one of a number of selection operators. It picks which individuals will be involved in a mating event. Tournament selection chooses a specified number of population members and ranks them based on their fitness. The two members of the tournament with the best fitness become the parents and create two new children which preserve some of the parents' structure, while undergoing crossover and mutation so they are not clones. The children replace the two members of the tournament with the worst fitness.

There are many different selection operators, which can produce any number of children from one to an entire population. Brood selection in Genetic Programming is an example of a selection operator that uses an entire population. Tournament selection typically produces two children, but can be changed to produce one. Producing two children is the most common choice because crossover produces two children, so saving them both prevents information loss.
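As a concrete illustration of Definition 4 (this sketch is editorial rather than from the thesis; the fitness function, helper names and tournament size are hypothetical, and higher fitness is assumed to be better), one mating event driven by tournament selection can be written as follows.

```python
import random

def fitness(individual):
    # Hypothetical fitness: count of ones in a binary string (higher is better).
    return sum(individual)

def tournament_selection(population, size, make_children):
    """One mating event: pick `size` members at random, rank them by fitness,
    breed the two best, and overwrite the two worst with the children."""
    indices = random.sample(range(len(population)), size)
    indices.sort(key=lambda i: fitness(population[i]), reverse=True)
    child1, child2 = make_children(population[indices[0]], population[indices[1]])
    population[indices[-1]] = child1
    population[indices[-2]] = child2

# Usage with a placeholder breeder that simply copies the parents.
pop = [[random.randint(0, 1) for _ in range(8)] for _ in range(20)]
tournament_selection(pop, size=7, make_children=lambda a, b: (list(a), list(b)))
```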

Figure 2.2: Two-point crossover on binary strings.

Definition 5 The fitness of an individual is the measure of quality by which it is compared to other members of the population. Fitness is evaluated with a fitness function which is dependent on the individual algorithm, as is whether minimizing or maximizing the fitness function is desired.

Definition 6 The fitness landscape of a problem is the composition of all possible solutions together with their fitness.

There is typically a trade-off in EA between how well they excel at exploration, finding many diverse solutions, and exploitation, narrowing the search to sections of the fitness landscape that appear to have high-quality solutions and finding the optimal solution in that area.

Definition 7 Two-point crossover is a type of binary variation operator that is employed by evolutionary computation to ensure that the children produced maintain some of the structure of their parents. It randomly chooses two locations in the child. Before and after those locations, the child takes on the structure of one parent, while between those two locations, the structure of the other parent is used. The second child has the reverse structure to that of the first. This is depicted in Figure 2.2.

Definition 8 Mutation is a type of unary variation operator that introduces a small change to the child in a mating event. It acts in an attempt to maintain diversity in the population.
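A minimal Python sketch of the two variation operators just defined follows; it is an editorial illustration, and the helper names and the list-of-bits representation are assumptions.

```python
import random

def two_point_crossover(p1, p2):
    """Produce two children from two equal-length parent lists,
    swapping the segment between two randomly chosen cut points."""
    a, b = sorted(random.sample(range(len(p1) + 1), 2))
    c1 = p1[:a] + p2[a:b] + p1[b:]
    c2 = p2[:a] + p1[a:b] + p2[b:]
    return c1, c2

def point_mutation(child, values, max_mutations=1):
    """Discrete point mutation: replace up to max_mutations randomly
    chosen loci with another value drawn from the available set."""
    child = list(child)
    for _ in range(random.randint(1, max_mutations)):
        i = random.randrange(len(child))
        child[i] = random.choice(values)
    return child

parent1 = [0, 1, 1, 0, 1, 0, 1, 1]
parent2 = [1, 0, 0, 1, 0, 1, 0, 0]
kid1, kid2 = two_point_crossover(parent1, parent2)
kid1 = point_mutation(kid1, values=[0, 1])
```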

Definition 9 A point mutation is a type of mutation that modifies a single locus within the child. Real-valued mutation is a type of point mutation that adds a random number sampled from a distribution to the modified locus, whereas discrete mutation replaces one of the available discrete values with another. The size of a mutation is typically a parameter of the random distribution used to modify the locus. Examples include the standard deviation of a Gaussian and the radius of the interval for a uniform distribution.

Definition 10 An algorithm is elitist if the current best member of the population cannot be replaced.

Definition 11 A genetic algorithm is a type of EA that operates on data structures of a fixed size and uses both mutation and crossover as variation operators.

Definition 12 A steady-state genetic algorithm is one that proceeds one selection event at a time.

Definition 13 A generational genetic algorithm is one that updates the entire population at the same time.

Multi-objective Optimization

Multi-objective Optimization is a field that concerns problems where one is optimizing more than one objective. As a result, there are multiple different solutions that are all considered optimal. Whether the optimal solution involves minimizing or maximizing objectives is completely dependent on the individual problem. The following are some basic definitions for terms in Multi-objective Optimization that will be used in this thesis.

Definition 14 An objective, also referred to as a criterion, in multi-objective optimization is a function that is to be optimized.

Definition 15 A solution is considered Pareto-optimal if no single objective can be improved without a detrimental effect on one or more other objectives.

Definition 16 The Pareto Frontier is the set of all Pareto-optimal solutions.

Definition 17 One solution dominates another if it is better than the other in terms of all objectives. A solution is non-dominated if there is no solution that dominates it. A vector v Pareto-dominates w if, coordinate by coordinate, each entry of v is better than the corresponding entry of w.

Definition 18 Two vectors are Pareto-comparable if one vector Pareto-dominates the other.

Definition 19 A many-objective optimization problem is a multi-objective optimization problem that has four or more objectives.

The inherent difficulty in many-objective optimization problems is illustrated in the following analogy. Consider two fitness vectors for a MOP, v and w. For any one objective, one vector is better and the other is worse at that objective. In the best case scenario, the objectives are independent (this is rare, as most objectives in an interesting problem fight against one another) and so may be treated as flipped coins. Thus, for n objectives, there are two outcomes that allow for Pareto-comparison out of a total of 2^n outcomes. This means that the probability, under the optimistic assumption that the objectives are independent, of finding that one solution dominates another is 2/2^n = 2^(-(n-1)). As the number of objectives n increases, it is evident that this probability rapidly becomes incredibly small.

MOEA are usually evaluated on two fronts: i) how close the points they find are to the Pareto Frontier and ii) how well spread along the Pareto Frontier the points are. It is important to note that in applied problems, the actual Pareto Frontier is frequently unknown and so cannot be used as a comparison for either of these qualities with the results of an MOEA. The cellular automata problems that form the core of the research in this thesis are one such applied problem.

Figure 2.3: An example of a Function Stack with 12 nodes. Output is taken from node 0. The lowest four nodes, I_0 to I_3, are intake nodes that provide various weighted averages of entries of the fitness vector. The other nodes are PPM operations.

Genetic Programming

Genetic Programming is an area of Evolutionary Computation where a computer program is created automatically from a high-level definition of the problem that needs to be solved. It attempts to get a computer to solve a complex problem from a description of what to do without telling it exactly how to solve the problem. These computer programs are often represented by tree structures. A special type of Genetic Programming, called Cartesian Genetic Programming, uses a directed graph representation for the computer programs it attempts to create [20]. These graphs allow for more than one path to form between two nodes. A function stack is one example of a directed graph that can be used for Cartesian Genetic Programming and is depicted in Figure 2.3. The function stack shown computes I_0 + I_2 + I_1(I_0 I_2 + I_2^0) and does not use nodes 2, 3, 5, and I_3.

Cellular Automata

Cellular Automata (CA) are discrete models of computation. They can be divided into three parts:

1. A collection of cells, where each cell is associated with a neighbourhood of surrounding cells;
2. A set of states that cells can have;
3. A rule which maps the set of possible cell states of a neighbourhood to a new state for that neighbourhood's cell.

The CA in this thesis are based on those described in [4]. They are represented by a linear array of 201 cells, numbered 0, 1, ..., 200, that wraps at the ends. The neighbourhood of a cell i is the group of five cells surrounding it, {i-2, i-1, i, i+1, i+2}. The states that a cell can have are the numbers 0, 1, 2, 3, 4, 5, 6 and 7. The rules for updating the CA are specified as an array of 36 cell states. To compute a new state for cell i, a rule sums the states of the cells in i's neighbourhood. This sum will be in the range 0 <= n <= 35 and is used to index the rule array to determine the new state. When these CA are used in evolution, the target of evolution is the CA's set of updating rules.

Definition 20 A quiescent state is another term used for a dead state. These states are such that a cell which has a neighbourhood consisting of all quiescent states will always update to a quiescent state. The CA used in this thesis have the zero state as the quiescent state.

Definition 21 A CA is apoptotic if, after some finite number of updatings, all cells are mapped to the zero state. It is t-apoptotic if that finite number is less than or equal to t.
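The updating just described can be sketched in a few lines of Python; this is an editorial illustration, and the particular rule and seed shown are arbitrary.

```python
import random

NUM_CELLS = 201        # linear array of 201 cells, wrapping at the ends
RULE_LENGTH = 36       # neighbourhood sums range from 0 to 35
NUM_STATES = 8         # cell states 0..7, with 0 the quiescent state

def update(cells, rule):
    """One updating of the CA: each cell's new state is rule[s], where s is the
    sum of the five states in its neighbourhood {i-2, i-1, i, i+1, i+2}."""
    n = len(cells)
    return [rule[cells[(i - 2) % n] + cells[(i - 1) % n] + cells[i]
                 + cells[(i + 1) % n] + cells[(i + 2) % n]]
            for i in range(n)]

# An arbitrary rule; in evolution, this 36-entry array is the target of search.
rule = [random.randrange(NUM_STATES) for _ in range(RULE_LENGTH)]
cells = [0] * NUM_CELLS
cells[99:102] = [3, 5, 3]        # an ABA-style seed with B at the central cell
cells = update(cells, rule)
```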

Definition 22 When evaluating a CA, an initial state or seed for the cells must be supplied. In this thesis, a seed will have the form 0...0ABA0...0, where A and B can be any of the numbers 0 through 7 as long as A and B are not both zero. The seed supplies the initial conditions for the CA. Note that B is located in the centre of the linear array of the CA, cell number 100 for this thesis.

Definition 23 The fitness function that will be used in evolution is the apoptotic fitness of a CA. For a number of cells n and a time limit t (in terms of number of updatings or time steps), this is defined for a given seed as the number of live, or non-zero, cells that appear in its time history. If the CA is not t-apoptotic, then it is assigned a fitness of zero.

Definition 24 The time history of a CA is an array. The first row of the array is the initial state of the automaton and succeeding rows are the results of each updating. When evaluating an apoptotic CA, one does not look at the time history after the time limit.

Figure 2.4 shows a picture of a time history for an apoptotic cellular automaton. The rows of the image show the states of the CA, represented with 7 different non-white colours, from its initial conditions through 201 time steps. The quiescent state, represented by 0, is mapped to white.

Figure 2.4: An image of an apoptotic cellular automaton as seen through its time history.

2.3 Literature Review

Although research in evolutionary computation became widespread in the 1970s, the first credited use of an evolutionary algorithm for multi-objective optimization came in 1984 [6] [26]. David Schaffer's vector-evaluated genetic algorithm (VEGA) modified a single-objective genetic algorithm to find multiple trade-off solutions. His work was to be left on its own until Multi-objective Optimization Evolutionary Algorithms (MOEA)

were again brought to attention when David Goldberg sketched out a genetic algorithm that used the idea of domination [11]. This inspired the creation of several MOEA that were tested on real-world problems [10]. These MOEA included the Niched Pareto Genetic Algorithm (NPGA), which used non-domination in its selection process and a niching pressure to spread the solutions along the Pareto Frontier [12]. Another of the early but influential MOEA in the literature is the Non-dominated Sorting Genetic Algorithm (NSGA), which was proposed by N. Srinivas and K. Deb in 1994 [21]. It ranks points based on their non-domination and incorporates this rank in selection, in an attempt to find an even distribution of points along the Pareto Frontier. The best rank is zero, assigned to all of the non-dominated points. The set of points that are only dominated by non-dominated points are given a rank of one. This process is repeated until all points have been ranked. All of the points along the Pareto Frontier should be non-dominated, thus rewarding non-domination should help steer solutions towards the Pareto Frontier. NSGA uses the rank of a point as a surrogate for its fitness. The ranking process is illustrated in Figure 2.5.

About five years after this first wave of MOEA, it was shown that preserving currently non-dominated points in an archive to include in future generations was quite beneficial to finding solutions [27]. This technique is referred to as elitism and led to the creation of multiple new algorithms which adopted it, also known as elitist algorithms [8] [29] [17].
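The non-domination ranking used by NSGA, described above, can be sketched as follows; this is an editorial simplification that assumes maximization of every objective, and the function names are hypothetical.

```python
def dominates(v, w):
    """True if fitness vector v Pareto-dominates w (better in every objective)."""
    return all(a > b for a, b in zip(v, w))

def non_domination_ranks(points):
    """Assign rank 0 to non-dominated points, rank 1 to points dominated only
    by rank-0 points, and so on, until every point is ranked."""
    remaining = dict(enumerate(points))
    ranks, current = {}, 0
    while remaining:
        front = [i for i, p in remaining.items()
                 if not any(dominates(q, p) for j, q in remaining.items() if j != i)]
        for i in front:
            ranks[i] = current
            del remaining[i]
        current += 1
    return ranks

pts = [(3, 1), (2, 2), (1, 3), (1, 1), (0, 2)]
print(non_domination_ranks(pts))   # first three points get rank 0, the rest rank 1
```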

Figure 2.5: Different points sorted into ranks.

One of these elitist algorithms, the Strength Pareto Evolutionary Algorithm (SPEA), was proposed in 1999 and uses an external archive that stores all non-dominated points found in a population [29]. Each point in the population and the archive is given a strength value based on the number of points it dominates or is considered equal to in terms of domination. These strength values are used in the selection process, giving an advantage to non-dominated points as the population is combined with the archive for the mating selection. Another elitist algorithm that was proposed around the same time was the Pareto Archived Evolution Strategy (PAES) [17]. It also makes use of an archive of non-dominated solutions, but does not involve the archive solutions in the selection process, preserving them for reference only.

The new advancements in MOEA also inspired improvements to previously proposed algorithms. In 2002, a new version of NSGA, aptly named NSGA-II, was designed to take advantage of techniques like elitism [8]. NSGA-II decreased the computational complexity

of NSGA by simplifying the non-domination sorting process, as well as eliminating the need for a sharing function which was previously used to maintain the diversity of solutions. The sharing function required a user-set parameter and was replaced by a crowding measure that eliminated this requirement and decreased the diversity measure's complexity. Like SPEA, NSGA-II combined both archive points and population members into a common pool for mating selection. SPEA was also improved upon, with the result being SPEA2 [28]. SPEA2 introduced an improved fitness function, by which individual members of a population are judged, which considers density as well as the non-domination factors that SPEA used. It also improved the method of truncating the archive, in order to maintain a reasonable number of non-dominated points while preserving more diversity than SPEA.

Another classical approach to solving MOPs which has been incorporated into genetic algorithms is that of decomposition. MOEA/D, first introduced in 2007, was one of the first Pareto-based MOEA to do so [24]. It takes a MOP and separates it into multiple scalar optimization problems that are then optimized concurrently with an evolutionary algorithm. Subproblems are associated with these scalar problems and neighbourhood relationships are determined between them. Operating under the supposition that neighbouring subproblems should have similar solutions, information from close neighbours is shared in order to optimize the solutions. There have been a few advancements to the MOEA/D method; MOEA/D-DE [19] was introduced with new operators to improve exploration, including a differential evolution operator, as well as new measures to preserve diversity, and [25] suggested a method of dynamically allocating computational resources that improved the efficiency of MOEA/D [26].

Although a lot of research has been done on Pareto-based MOEA, there has also been work done to develop non-Pareto-based methods. These methods do not directly compare the non-domination/domination of population members with each other and so implicitly create a Pareto set [14]. One such example is the technique of Multiple Single

Objective Pareto Sampling (MSOPS), which was proposed in 2003. It was created as an attempt to tackle many-objective problems with which the popular MOEA of the time were having difficulties. It used target vectors to do a parallel search of multiple single objectives to solve a MOP [14]. MSOPS was improved upon in 2007, with the proposal of MSOPS-II, which attempted to reduce the amount of initial configuration required in the algorithm [13].

As this thesis suggests, some of the most recent work in the field of evolutionary computation for multi-objective optimization has been focusing on improving the efficiency of MOEA on problems with four or more criteria. This is due to the fact that most of the currently popular MOEA suffer from what is referred to as the curse of dimensionality [16]. This is the problem that occurs when a large number of objectives are used, as the domination of points near the Pareto Frontier by closer points becomes rare. This causes the selective pressure forcing the algorithms towards the Pareto Frontier to decrease dramatically, as does the efficiency of the algorithms [15]. At the time of writing this thesis, some of the most recent developments include the proposal of NSGA-III [7], which is based on the framework of NSGA-II and attempts to solve MOPs with up to 15 criteria, focusing mainly on real-valued problems.

Chapter 3
Concept

As was stated in the introduction, the motivation for this thesis is to produce a new Multi-objective Optimization Evolutionary Algorithm that can handle many-objective optimization problems. It will define a new type of function, Pareto-monotone, which will be used to partition a multi-objective problem into a space of single-objective problems. The Pareto-monotone functions will be used in a new nested evolutionary algorithm called the Multi-objective Optimization Nested Evolutionary Algorithm (MONEA) in an attempt to find a computationally intensive but practical solution to the curse of dimensionality.

3.1 Basic Setup

Suppose that simultaneous optimization of k distinct real-valued functions, f_1, f_2, ..., f_k, is being performed. A fitness vector v in R^k is any vector of the form (f_1(x), f_2(x), ..., f_k(x)), where x in R^n is a set of values from the parameter space being optimized against.

Definition 25 A function g: R^k -> R is said to be Pareto-monotone (PM) if whenever v Pareto-dominates w then g(v) > g(w).
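As a concrete, editorial illustration of Definition 25 (not part of the thesis), the sketch below tests Pareto-domination between fitness vectors and numerically spot-checks the PM property for a candidate function such as the coordinate sum.

```python
import random

def pareto_dominates(v, w):
    """v Pareto-dominates w when every coordinate of v exceeds that of w."""
    return all(a > b for a, b in zip(v, w))

def looks_pareto_monotone(g, dim, trials=10_000):
    """Spot-check the PM property g(v) > g(w) whenever v dominates w.
    This is only a numerical sanity check, not a proof."""
    for _ in range(trials):
        v = [random.uniform(0, 10) for _ in range(dim)]
        w = [random.uniform(0, 10) for _ in range(dim)]
        if pareto_dominates(v, w) and not g(v) > g(w):
            return False
    return True

print(pareto_dominates((3, 5), (1, 2)))      # True
print(pareto_dominates((3, 1), (1, 2)))      # False: the vectors are incomparable
print(looks_pareto_monotone(sum, dim=3))     # summation passes the PM check
```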

It is elementary to verify that the following functions are PM:

1. Projection onto a single coordinate,
2. Scaling by a positive constant (e.g. f(x) = cx for a constant c > 0),
3. Summation,
4. Odd powers and roots (these exist for negative numbers),
5. e^x.

Definition 26 A function is positive Pareto-monotone (PPM) if it is Pareto-monotone when its arguments are restricted to positive values and it is strictly positive for all such values. Note that PM functions are PPM.

It is elementary to verify that the following functions are PPM:

1. Product,
2. Any positive root or power,
3. The harmonic mean,
4. ln(x + 1).

The following claim provides a simple test to determine if a function is PM.

Claim 1 For some convex region R of n-dimensional space, a function f: R -> ℝ is PM if its partial derivatives are strictly positive on the interior of R.

Proof:

Suppose there exists a function f: R -> ℝ with partial derivatives that are positive in R and two vectors p and q such that q dominates p. Then d = (q - p)/||q - p|| is the unit vector from p to q, and s(t) = p + t d, with s(t_0) = p and s(t_1) = q, is the direct path from p to q. Then

f(q) - f(p) = ∫ from t_0 to t_1 of ∇f · s'(t) dt = ∫ from t_0 to t_1 of ∇f · d dt > 0,

so f(q) > f(p). Notice that the integral is positive because the gradient and the unit vector are both strictly positive, and it is apparent that f satisfies the definition of Pareto-monotonicity.

The functions listed with Definitions 25 and 26 yield a rich set of primitives with which to work. With the goal of building a genetic programming system which consists entirely of PM or PPM functions, Lemma 1 is required.

Definition 27 Notice that the relation "v Pareto-dominates w" is a partial order on ℝ^n. We will denote this partial order by (ℝ^n, ≻).

Lemma 1 The composition of PM functions is PM, and the composition of PPM functions is PPM.

Proof:

Notice that the definition of a PM function is equivalent to a homomorphism in the category of partial orders on ℝ^n whose image is in ℝ. It is easy to see that (ℝ, >) is a special case of (ℝ^n, ≻). Thus, since homomorphisms are closed under composition, so are PM functions. A similar proof for PPM functions can be devised by restricting to the positive orthant.

Lemma 1 allows for the preservation of dominance when a MOP is split up into multiple single-objective optimization problems (SOP) and when a SOP is composed of different PM or PPM functions. MONEA is designed to take advantage of these facts by using many different approaches in an attempt to locate the Pareto Frontier of a given MOP.

3.2 Algorithm Overview

The following paragraphs walk through the steps of MONEA. The outer loop of the algorithm evolves single-objective fitness functions. Each of these functions defines a different path to the Pareto Frontier. Each fitness function is a PPM combination of PPM functions specified by the algorithm, which are chosen at random at initialization. The algorithm then progresses to the inner loop, where the following actions take place for each function. The function is used as a fitness function for evolution on an inner loop population that runs a set number of mating events. Once this evolution is completed, an archive of solutions found with just that function is produced. This archive is compared to a master archive filled with all previously found non-dominated solutions. The master archive is initialized as empty, but will quickly fill with solutions after the first iteration of the outer loop. The fitness function is given points that contribute to a fitness value for that function. It is given one point for finding a new non-dominated solution not in the master archive and two points for every solution in the

master archive that the solution dominates. All the new solutions found by the function that are still non-dominated after comparison are saved into an individual archive. When this process has been completed for every fitness function, all the individual archives are merged into a temporary archive and the algorithm moves once again into the outer loop.

Back in the outer loop, each fitness function now has a fitness value associated with it. The algorithm then merges the temporary archive with the master archive and casts out any dominated solutions. Thus, the master archive is updated. The algorithm then breeds the fitness functions based on their fitness values. By this point, the algorithm has looped around and brings the new fitness functions into the inner loop to start the nested evolution again. This process is repeated until the stopping criteria are met. Pseudocode outlining MONEA can be seen in Figure 3.1, and a general picture of how the parameters are translated into objectives and combined to get a general fitness can be seen in Figure 3.2.

Figure 3.1: Pseudocode outlining MONEA.

The fitness functions in MONEA are represented by function stacks, which consist of linear lists of nodes.

Figure 3.2: This illustrates how n parameters are mapped to m objectives and then combined with PM (or PPM) functions to get a combined fitness.

The list of nodes consists of a collection of PPM functions followed by intake nodes. The intake nodes apply a PPM function to the fitness vector and produce scalar values. The remaining nodes apply PPM functions to these scalar values. The function stack thus encodes a PPM function. MONEA has three choices for the intake node PPM functions: a weighted average, a weighted geometric mean and a weighted harmonic mean, as seen in Table 3.1. From Definitions 25 and 26, Lemma 1 and the previously mentioned types of PM and PPM functions, it is easy to see that the three functions used in MONEA are PPM. The type of function used for any given node is initialized randomly and the weights are assigned randomly. Other nodes in MONEA combine any two of the values from the intake nodes in three possible ways: addition, multiplication and taking a positive power. Each fitness function is a PPM combination of all the PPM functions used in MONEA and serves as a different path to approach the Pareto Frontier. If the only PPM function used to search for solutions were a weighted average, the algorithm would search for solutions along a straight line approaching the Pareto Frontier. This is depicted for two objectives in Figure 3.3. The advantage of using multiple PPM (or PM) functions is that the paths to the Pareto Frontier have a lot more freedom. They can approach the Pareto Frontier in a variety of different ways, as is illustrated in Figure 3.4. The inner loop of MONEA is a steady-state genetic algorithm, which, as [1] states, is best for evolution with a fixed fitness function.

Figure 3.3: Approaching the Pareto Frontier for two objectives using only a weighted average.

Figure 3.4: Approaching the Pareto Frontier for two objectives using a composition of a variety of PM functions.

Table 3.1: Intake Node Functions used in MONEA. In the following equations, x_i are the values being acted on and w_i are the weights associated with them.

Weighted Average: (sum_{i=1}^{n} w_i x_i) / (sum_{i=1}^{n} w_i)
Weighted Geometric Mean: (prod_{i=1}^{n} x_i^{w_i})^{1 / sum_{i=1}^{n} w_i}
Weighted Harmonic Mean: (sum_{i=1}^{n} w_i) / (sum_{i=1}^{n} w_i / x_i)

In contrast, the evolution in the outer loop is generational because the fitness landscape radically changes each time the master archive is updated, since the fitness of the outer loop is based on the master archive. Two-point crossover and mutation are used for mating events in both the outer and inner loops of MONEA. The outer loop has a tournament size of four, the smallest possible size. This small tournament size makes the selection pressure as soft as possible in an effort not to decrease diversity. The inner loop uses a tournament size of seven, as selection pressure is not as much of an issue there. MONEA has been set up to stop after a pre-specified number of passes through the outer loop of the algorithm. In this thesis this stopping criterion will be referred to as generations.

Most MOEA have some sort of crowding measure to ensure that the solutions they find are well spread along the Pareto Frontier. MONEA is intentionally designed in such a way that the very structure of the algorithm encourages exploration without the need for an additional crowding measure and the calculations that would ensue. This is seen

in the generational aspects of MONEA. When the measure of what makes a fit fitness function changes with the update of the master archive, it forces fitness functions to look for new solutions on the Pareto Frontier in order to achieve high fitness values. This should result in a more complete picture of the Pareto Frontier being produced by MONEA.
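The nested structure described in this chapter can be summarized in the following editorial Python skeleton. It is a sketch, not the thesis implementation: the evolved function stack is reduced to a single weighted-average intake node, the inner steady-state genetic algorithm is replaced by plain random search, and the generational breeding of the fitness functions is left as a comment.

```python
import random

def dominates(v, w):
    return all(a > b for a, b in zip(v, w))

def merge_archive(archive, candidates):
    """Merge candidate fitness vectors into an archive, casting out dominated ones."""
    pool = list(archive) + list(candidates)
    return [p for p in pool if not any(dominates(q, p) for q in pool if q is not p)]

def random_weighted_average(k):
    """Stand-in for an evolved function stack: a single weighted-average intake node."""
    w = [random.random() + 0.01 for _ in range(k)]
    return lambda fv: sum(wi * xi for wi, xi in zip(w, fv)) / sum(w)

def evolve_inner(objectives, scalar_fn, mating_events, dim):
    """Placeholder for the steady-state inner GA: here, plain random search that
    keeps the fitness vectors of the best candidates found under scalar_fn."""
    found = [objectives([random.random() for _ in range(dim)])
             for _ in range(mating_events)]
    found.sort(key=scalar_fn, reverse=True)
    return found[:10]

def monea(objectives, num_objectives, dim, outer_pop=8, generations=5, mating_events=50):
    functions = [random_weighted_average(num_objectives) for _ in range(outer_pop)]
    master = []
    for _ in range(generations):
        temporary = []
        for g in functions:
            found = evolve_inner(objectives, g, mating_events, dim)
            new = [v for v in found if not any(dominates(q, v) for q in master)]
            # Reward: one point per new non-dominated solution, two per master-archive
            # member dominated by a found solution (this weighting is revisited in Chapter 4).
            score = len(new) + 2 * sum(1 for v in found for q in master if dominates(v, q))
            temporary.extend(new)
            # Placeholder: `score` would drive generational breeding of the function stacks
            # (tournament selection, two-point crossover and mutation on the stacks).
        master = merge_archive(master, temporary)
    return master

# Toy usage: two objectives that are simply the point's coordinates (illustration only).
front = monea(lambda x: (x[0], x[1]), num_objectives=2, dim=2)
print(len(front))
```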

Chapter 4
Parameter Testing and Fine Tuning with a Trivial Test Function

4.1 Initial Test Function

Initial analysis of MONEA was done on a simple test function where the Pareto Frontier was known, in order to analyze the results. The frontier of the test problem used was sum_{i=1}^{N} x_i^2 = 1, where N is equal to the number of objectives used. As this is only a test problem, the number of inner mating events was fixed for these runs. The design of MONEA restricts the solutions to the positive orthant because it uses PPM functions; thus this test problem has a Pareto Frontier that forms a quarter circle of radius one when using two objectives, an eighth of the surface of a sphere when using three objectives, and so on, getting harder to visualize as the number of objectives increases. In this case, the fitness functions are set to the coordinate values for points inside the Pareto Frontier and to zero for points outside of the Pareto Frontier. In other words, the solutions are being maximized (pushed outwards) towards the Pareto Frontier, but if they surpass it, then they are eliminated.
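Under the reading given above (a point's objective vector is its coordinates when it lies inside the frontier and zero otherwise), the test problem might be coded as follows; this is an editorial sketch and the function name is illustrative.

```python
import math

def test_objectives(x):
    """Objective vector for the trivial test problem: the point's coordinates
    if it lies inside the Pareto Frontier sum(x_i^2) = 1, otherwise all zeros.
    Points are restricted to the positive orthant."""
    if all(xi >= 0 for xi in x) and sum(xi * xi for xi in x) <= 1.0:
        return tuple(x)
    return tuple(0.0 for _ in x)

print(test_objectives((0.6, 0.7)))            # inside the quarter circle: returned unchanged
print(test_objectives((0.9, 0.9)))            # outside the quarter circle: zeros
print(math.hypot(0.6, 0.7) <= 1.0)            # True, confirming the first point is inside
```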

4.2 Parameter Testing

Setup

Initial parameter testing on MONEA was done for two objectives. The parameters that were studied included the population size for the inner loop and the population size for the outer loop. The number of nodes used for the fitness functions and the number of those nodes that were intake nodes were also examined, as were the size and maximum number of mutations in the inner loop.

Results and Discussion

The bulk of parameter testing was done on problems with two objectives, although some tests were done for a larger number of objectives. It was found that settings that worked well for two objectives were consistent with those that worked well for a larger number of objectives. All of the runs that completed without issue were able to find points within the Pareto Frontier. Better results are considered to come from runs where more points were found in the least amount of run time. From the parameter tests, a smaller mutation size and a smaller maximum number of mutations in the inner loop were found to yield the best results. The smaller mutation allows for smaller changes in solutions at a time. This permits the solutions to get close to the Pareto Frontier and then change just slightly, reducing the probability that the solutions will change too much and go beyond the Pareto Frontier, which would result in a fitness of zero.

The analysis of the number of nodes and intake nodes used was inconclusive. The number of intake nodes should not exceed half of the total number of nodes, but there was not much variation in the results when the number of intake nodes fell between a quarter and half of the total nodes. A higher number of nodes did cause an increase in

run time, but it did not show marked improvement in the number of solutions found that would make the extra time worthwhile. Increasing the population size for the inner and outer loops did increase the number of solutions found, but it also dramatically increased the run time of the algorithm. If the outer population is increased, even by one function, that adds an entire additional round of evolution in the inner loop, and if the inner population is increased, that is magnified by the population size of the outer loop. Thus it was determined that an intermediate population size would produce results that were reasonable both in the number of points produced and in run time. After parameter testing, an optimal set of parameter values was determined. They are described in Table 4.1.

Table 4.1: Optimal Parameters for MONEA
Total Number of Nodes: 12
Number of Intake Nodes: 4
Outer Loop Population Size: 120
Inner Loop Population Size: 40
Inner Loop Maximum Number of Mutations: 1
Inner Loop Mutation Size: 0.1

4.3 Comparison with NSGA-II

With MONEA finding so many solutions to the test problem, a comparison with the results of another MOEA will give some perspective on how MONEA has performed.

Fair Comparison Issue

The usual way to compare two different EA is to ensure that they both have an equal number of mating events. The drastically differing nature of MONEA from any other

MOEA makes this quite difficult. The algorithm that was chosen for comparison with MONEA is NSGA-II, as, at the time of writing, it is one of the most well known and widely used MOEA. If NSGA-II were to have the same number of mating events as MONEA, the run time of the program would be several orders of magnitude larger than that of MONEA, which, considering the fact that run time was one of the factors in what constituted better performance, would not exactly be a fair comparison. This is in part because of the time NSGA-II spends computing a crowding measure. Note also that not all mating events in MONEA are dedicated to the same type of evolution as they are in NSGA-II. Many of the mating events in MONEA are involved with improving the fitness functions themselves, whereas in NSGA-II all of the mating events are involved directly with finding solutions to the problem. The solution that this thesis uses to address the issue of fair comparison is a sort of compromise. The number of mating events for NSGA-II is increased until the run time of one experiment is approximately equal to that of an experiment of MONEA with the optimal parameter settings. NSGA-II also underwent similar parameter analysis to determine its optimal settings for the test problem, and these settings were used for the experiments with NSGA-II.

Setup

NSGA-II and MONEA were compared in experiments for two, three and six objectives. As stated previously, both MONEA and NSGA-II had optimized parameter settings. Ten runs of each algorithm were executed for each of the three different numbers of objectives. The resulting individual archives, which contained all of the non-dominated solutions found for each experiment, were then compared through a merging process. This involved taking two archives and merging all of the solutions together, casting out any solutions that became dominated. The merging software keeps track of how many solutions from each archive survive the merging process, which allows for the creation of survival statistics for analysis.
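A compact sketch of this merging step and of the two survival statistics defined in Table 4.2 below is given here; it illustrates the bookkeeping only, assumes maximization of every objective, and is not the thesis merging software.

```python
def dominates(v, w):
    return all(a > b for a, b in zip(v, w))

def merge_archives(archive_a, archive_b):
    """Merge two archives of fitness vectors, casting out dominated solutions,
    and report the two survival statistics for archive A."""
    pool = list(archive_a) + list(archive_b)
    survivors = [p for p in pool if not any(dominates(q, p) for q in pool if q is not p)]
    a_survivors = [p for p in survivors if p in archive_a]
    overall = 100.0 * len(a_survivors) / len(survivors)       # % overall survivors
    algorithm = 100.0 * len(a_survivors) / len(archive_a)     # % algorithm survivors
    return survivors, overall, algorithm

# Illustrative archives only; real archives come from the two algorithms' runs.
monea_archive = [(0.9, 0.4), (0.5, 0.85), (0.2, 0.97)]
nsga2_archive = [(0.8, 0.3), (0.55, 0.8), (0.1, 0.9)]
merged, pct_overall, pct_algorithm = merge_archives(monea_archive, nsga2_archive)
print(len(merged), round(pct_overall, 1), round(pct_algorithm, 1))
```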

Figure 4.1: Comparison of archives from MONEA and NSGA-II for two objectives, with points generated by MONEA plotted second and hence able to obscure points from NSGA-II.

Results and Discussion

In all cases, MONEA found significantly more solutions in the same amount of run time. NSGA-II produced archives with about 230 solutions per run, while MONEA's archives were considerably larger. It is also worth noting that the number of points found by MONEA increased as the number of objectives increased, while NSGA-II remained fairly steady. A comparison between MONEA and NSGA-II can be easily visualized for two objectives. Figures 4.1 and 4.2 are two different plots of the same two archives, one found with MONEA, one with NSGA-II, for two objectives. In Figure 4.1, solutions found with MONEA are depicted in green and NSGA-II solutions in red. When the green points

are put on top of the red points, they overpower the red so much as to make the red points invisible. In Figure 4.2, the same data is used, but this time MONEA solutions are coloured red and NSGA-II solutions green. When the NSGA-II solutions are put on top of the MONEA solutions, the situation becomes a bit clearer. There are many places where red points are still evident, showing sections of the Pareto Frontier that NSGA-II was unable to fill in but MONEA was. One should also note that the red is more apparent on the outer side of the curve, and since these algorithms are pushing outwards towards the Pareto Frontier, this implies that for these runs, the solutions from MONEA are of a better quality than those from NSGA-II. Another important observation is that although MONEA seems to overwhelm NSGA-II in this example, there is still red evident in the images, even in Figure 4.1. This means that NSGA-II was able to find points that MONEA was not. There are also parts of the Pareto Frontier that have yet to be filled in; thus, even combining results from the two algorithms, there are still solutions yet undiscovered.

When the number of objectives increases above two, it becomes harder to analyze the results visually. For this reason, archive merging software was used to compare archives of solutions found with MONEA and NSGA-II. Two survival statistics were created to examine these results. The first survival statistic compares the number of surviving solutions in a merged archive found using a specific algorithm (either MONEA or NSGA-II) to the total number of solutions in the merged archive. The second survival statistic compares the number of surviving solutions found using a specific algorithm to the number of solutions found by that algorithm before the merger. Both of these fractions are multiplied by 100 to turn them into percentages. The survival statistics are illustrated in Table 4.2. Comparisons were made between runs by MONEA and NSGA-II for two, three and six objectives. The survival statistics for MONEA with two, three and six objectives can be found in Tables 4.3, 4.5 and 4.7. In contrast, the survival statistics for NSGA-II with two, three and six objectives can be found in Tables 4.4, 4.6 and 4.8.

Figure 4.2: Comparison of archives from MONEA and NSGA-II for two objectives, with points generated by NSGA-II plotted second.

Table 4.2: Survival Statistics
Overall Survivors: (# of survivors from archive X / total # of survivors) x 100
Algorithm Survivors: (# of survivors from archive X / # of solutions in archive X) x 100

It is immediately apparent that solutions from MONEA fare much better than those from NSGA-II when they are merged together. This means that not only did MONEA find more points than NSGA-II, but it found more points that are not dominated by points from NSGA-II, which speaks to their high quality. As was apparent with the graphic example for two objectives, NSGA-II was still able to produce solutions that were not dominated by the solutions from the MONEA archives. Thus NSGA-II is still finding solutions that MONEA is not, and the Pareto Frontier is not entirely covered by either algorithm.

In the original design of MONEA, the rewards to the fitness of fitness functions that found new solutions not in the master archive and to those that eliminated existing solutions from the archive were close to equal (1 and 2 respectively, where higher fitness values are better). The results of these tests indicate that MONEA is very good at finding solutions very close to the Pareto Frontier. This suggests that redistributing the reward weight in favour of finding new solutions, i.e. exploration, has the potential to improve MONEA's performance. Additional experiments were done where the reward for finding new solutions was adjusted from one point to ten. When the archives from these experiments were compared with those previously found by MONEA, the changes proved beneficial. Although the old and new MONEA archives were approximately the same size before merging, the majority of the solutions in the merged archive came from the exploration-oriented MONEA archives. This new fitness reward weighting has been incorporated into MONEA for future use.

Table 4.3: MONEA Survival Statistics for Two Objectives (ten runs; columns: Run, % Overall Survivors, % Algorithm Survivors; average % overall survivors: 99.8)

Table 4.4: NSGA-II Survival Statistics for Two Objectives (ten runs; columns: Run, % Overall Survivors, % Algorithm Survivors; 95% CI on the % algorithm survivors average: ±3.18)

Table 4.5: MONEA Survival Statistics for Three Objectives (ten runs; columns: Run, % Overall Survivors, % Algorithm Survivors; average % overall survivors: 99.9)

Table 4.6: NSGA-II Survival Statistics for Three Objectives (ten runs; columns: Run, % Overall Survivors, % Algorithm Survivors; 95% CI on the % algorithm survivors average: ±3.12)

Table 4.7: MONEA Survival Statistics for Six Objectives (ten runs; columns: Run, % Overall Survivors, % Algorithm Survivors; average % overall survivors: 99.9)

Table 4.8: NSGA-II Survival Statistics for Six Objectives (ten runs; columns: Run, % Overall Survivors, % Algorithm Survivors; 95% CI on the % algorithm survivors average: ±2.21)

Chapter 5
Testing on Cellular Automata

5.1 Problem Description

The cellular automata that this thesis uses are as described in Chapter 2. A MOP can be devised for finding such CA with a given number of CA seeds to start the search. The seeds serve as the objectives for the MOP. Such a MOP serves as a good test for MOEA as it provides a clean test on a hard problem. The fitness functions for the problem all have exactly the same form, and since the seed values are of the form ABA, where A and B can be the numbers 0 through 7 provided A and B are not both zero, the problem can have up to 63 different objectives. As [4] states, the fitness landscape for CA such as the ones used here is rugose, or wrinkled, which makes this a challenging test problem. It is important to note that for the CA MOP, the Pareto Frontier is unknown and so results cannot simply be compared to a pre-existing set to determine their quality.

5.2 Fair Comparison Issue

The lack of a known Pareto Frontier for the CA MOP presents difficulty when analyzing the results. In the following experiments, results from MONEA are compared with those from NSGA-II. Again, the issue of fair comparison arises. The number of inner mating events of MONEA was set to 100. Even with this change, the numbers of mating events in MONEA and NSGA-II are not equal, and even if they were, as outlined in Chapter 4,

it would not be a fair comparison, as in MONEA not all mating events can be considered equal. To this end, additional tests were done with NSGA-II and will be discussed in Section 5.4. NSGA-II was also given additional runs for all of its experiments, which gives it more of an advantage in the comparisons.

5.3 Setup

Experiments with MONEA and NSGA-II were done for CA problems with 2, 3, 4, 5, 6, 7, 8 and 10 objectives. Since CA seeds were the objectives, a problem with two objectives used two different seeds, one with three objectives used three different seeds, and so on. Eight experiments were done for each algorithm at each objective level. The same set of seeds was used for both the NSGA-II and MONEA experiments. These seeds were a sampling of the 63 different possible seed values. Four additional experiments were done for MONEA and NSGA-II with additional initial conditions using two objectives. MONEA used the parameters that were described in Table 4.1 for 100 generations. Each different experiment using MONEA produced three separate runs, while NSGA-II produced five. The resulting archives for each run of a given algorithm with the same initial conditions were first merged together into one super archive for each set of seeds, with the same software described in Chapter 4. This software was then used to merge the super archives of MONEA and NSGA-II for each number of objectives. Analysis of the resulting data was done using the same survival statistics as described in Chapter 4.
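To make the seeds-as-objectives setup concrete, the following editorial sketch enumerates the 63 possible ABA seeds and evaluates a rule's fitness vector over a chosen subset of them; it reuses a compact version of the update rule from Chapter 2, implements the apoptotic fitness of Definition 23, and the 201-step time limit shown is an assumption.

```python
import random

def update(cells, rule):
    n = len(cells)
    return [rule[sum(cells[(i + k) % n] for k in (-2, -1, 0, 1, 2))] for i in range(n)]

def apoptotic_fitness(rule, seed, num_cells=201, time_limit=201):
    """Apoptotic fitness for one seed: the number of live (non-zero) cells in the
    time history, or 0 if the CA is not t-apoptotic within the time limit."""
    a, b = seed
    cells = [0] * num_cells
    cells[num_cells // 2 - 1:num_cells // 2 + 2] = [a, b, a]   # 0...0ABA0...0
    live = sum(1 for c in cells if c)
    for _ in range(time_limit):
        cells = update(cells, rule)
        live += sum(1 for c in cells if c)
        if not any(cells):                 # every cell quiescent: apoptosis reached
            return live
    return 0

# The 63 possible ABA seeds: A, B in 0..7, not both zero.
ALL_SEEDS = [(a, b) for a in range(8) for b in range(8) if (a, b) != (0, 0)]

def fitness_vector(rule, seeds):
    """One objective per seed: the rule's apoptotic fitness on that seed."""
    return tuple(apoptotic_fitness(rule, s) for s in seeds)

rule = [random.randrange(8) for _ in range(36)]
print(len(ALL_SEEDS))                        # 63
print(fitness_vector(rule, ALL_SEEDS[:3]))   # a 3-objective fitness vector
```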

Figure 5.1: Cellular Automata found for the rule using all 63 different seeds.

5.4 Results and Discussion

Both MONEA and NSGA-II were able to successfully find solutions to these CA problems at all objective levels tested. An example of the CA found with one of the surviving rules from the experiment with MONEA acting on 10 objectives can be seen in Figure 5.1. Each square in the figure contains a time history for a CA created with one of the 63 different initial conditions used. The green squares represent CA that did not stop in time, i.e. were not t-apoptotic, and the grey squares represent CA that did not grow further than their seeds. Table 5.1 gives the state-to-colour translation used for rendering automata.

Even with four extra experiments, the results for two objectives were largely inconclusive. The number of solutions that each algorithm was able to find was so small, under 10 solutions per archive and most closer to 5, that the data fluctuated from run to run, which made it difficult to draw an overall conclusion. For two objectives, both MONEA and NSGA-II found roughly the same number of solutions.

Table 5.1: State to Colour Assignments for Cellular Automata
State 0: White; 1: Blue; 2: Green; 3: Red; 4: Cyan; 5: Violet; 6: Yellow; 7: Black

As the number of objectives increased, both algorithms found increasingly more solutions; however, the rate at which MONEA found more solutions was much higher than that of NSGA-II (see Figures 5.2 and 5.5). For a smaller number of objectives, NSGA-II outperformed MONEA in terms of overall survivors and algorithm survivors. As the number of objectives increased, however, the curse of dimensionality kicked in and MONEA began to outperform NSGA-II. The trends can be seen in Figures 5.2 and 5.3. MONEA's percentage of overall survivors became higher than NSGA-II's at a lower objective level than its percentage of algorithm survivors did, but both eventually became the better statistics. MONEA's success in the overall survivors statistic can in part be explained by the large number of solutions MONEA was able to find. Although this higher number of solutions influenced the overall survivor statistic, the solutions themselves were of a high enough quality that they survived merging with those found by NSGA-II and were not dominated by any of those solutions.

Figure 5.4 serves as evidence that NSGA-II suffered from the curse of dimensionality. It depicts the average non-domination rank of solutions throughout individual runs of NSGA-II for 2 to 10 objectives. The higher the average rank is, the more solutions the non-dominated ones can dominate. From this figure, it is evident that although all runs started out with an average between 40 and 50, as soon as more than three objectives were involved, the average rank rapidly dropped to a fraction less than one.

Figure 5.2: Averages of the overall survivor statistics for each objective level are shown for MONEA and NSGA-II.

Figure 5.3: Averages of the algorithm survivor statistics for each objective level are shown for MONEA and NSGA-II.

Figure 5.4: Average rank of solutions for runs of NSGA-II.

This means that almost all of the solutions were non-dominated and the pressure to push towards the Pareto Frontier dropped drastically. Figure 5.5 shows the dramatic difference in the number of non-dominated solutions found by MONEA and NSGA-II as the number of objectives increased. This demonstrates that MONEA is much better at discovering new non-dominated points than NSGA-II.

Figure 5.5: Average number of non-dominated solutions found by MONEA and NSGA-II.

The algorithm survival statistics for MONEA tended to improve as the number of objectives increased and reached an average of 98.0 ± 1.35 percent for ten objectives. This shows that the solutions in the MONEA archive were diverse enough that the majority were not dominated by solutions from NSGA-II. In comparison, the NSGA-II survival

statistics tended to stay between 60 and 70 percent, as can be seen in Figure 5.3.

Figure 5.6: Cellular Automata found for the rule using all 63 different seeds.

Figures 5.6 through 5.10 show additional examples of CA created from surviving rules from the 10-objective MONEA experiments, in the same manner as Figure 5.1. After examining several examples, it becomes evident that the rules can be classified into two different categories: those that consistently produce a moderately sized CA for a majority of the 63 initial conditions (e.g. the automata shown in Figure 5.6), and those that produce a small number of large CA and only small CA for the majority of the remaining initial conditions (e.g. the automata shown in Figure 5.7). From examining the various CA in the individual figures, one can see that although a number of different shapes occur

From examining the various CA in the individual figures, one can see that although a number of different shapes occur within the various CA, each rule seems to generate CA with the same colour scheme regardless of the seeds used. Even with the variation in shapes, a general shape can also be identified for each rule. This demonstrates how each rule pushes towards a specific set of CA. The variety of colour and shape seen in the included examples also speaks to the diversity MONEA is able to obtain in its solutions.

Additional deep-time tests were done with NSGA-II on the CA MOP; that is, NSGA-II was allowed to run for an extremely large number of generations, far more than in any of the main experiments. When the solutions found by these deep-time runs were compared to the MONEA and NSGA-II archives from the main experiments, the main-experiment archives were decimated: few, if any, of their solutions survived the mergers. This demonstrates that neither algorithm was able to find the Pareto Frontier in the time given for the main experiments; there were still better solutions to be discovered. The reason the deep-time NSGA-II solutions were able to beat the MONEA solutions is likely that MONEA had a comparatively short internal loop: it was only given 100 mating events for the inner loop, whereas the deep-time NSGA-II runs were given vastly more. The internal loop is where mating events similar to those in NSGA-II take place in MONEA. The fact that these inner loops are relatively short means they were not given much time to fine-tune the solutions, unlike the deep-time NSGA-II runs.
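The nested structure referred to here, independent outer runs, each with its own scalarized single-objective problem and a short inner loop of mating events, can be illustrated with a deliberately simplified toy. The sketch below uses a random positively weighted sum of toy objectives as a stand-in for the PM functions and a steady-state inner loop of 100 mating events; apart from that inner-loop length, every detail (the toy objectives, the representation, the operators) is invented for illustration and is not the thesis's implementation.

    import random

    NUM_OBJECTIVES = 3

    def objectives(x):
        """Toy objectives to minimize: distances to a few fixed targets."""
        targets = [0.0, 0.5, 1.0]
        return [abs(x - t) for t in targets[:NUM_OBJECTIVES]]

    def dominates(a, b):
        return all(u <= v for u, v in zip(a, b)) and any(u < v for u, v in zip(a, b))

    def monea_outline(num_runs=20, inner_mating_events=100, pop_size=10):
        archive = []  # master archive of (solution, objective vector) pairs
        for _ in range(num_runs):  # runs are independent of one another
            weights = [random.random() + 0.01 for _ in range(NUM_OBJECTIVES)]
            scalar = lambda x: sum(w * f for w, f in zip(weights, objectives(x)))
            pop = [random.uniform(-1.0, 2.0) for _ in range(pop_size)]
            for _ in range(inner_mating_events):  # short inner loop, as in the experiments
                a, b = random.sample(pop, 2)
                child = (a + b) / 2 + random.gauss(0, 0.05)  # crossover plus mutation
                worst = max(range(pop_size), key=lambda i: scalar(pop[i]))
                if scalar(child) < scalar(pop[worst]):
                    pop[worst] = child
            best = min(pop, key=scalar)
            cand_obj = objectives(best)
            if not any(dominates(q, cand_obj) for _, q in archive):
                archive = [(x, q) for x, q in archive if not dominates(cand_obj, q)]
                archive.append((best, cand_obj))
        return archive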

Figure 5.7: Cellular Automata found for the rule using all 63 different seeds.

Figure 5.8: Cellular Automata found for the rule using all 63 different seeds.

Figure 5.9: Cellular Automata found for the rule using all 63 different seeds.

Figure 5.10: Cellular Automata found for the rule using all 63 different seeds.

Chapter 6

Conclusions and Future Work

6.1 Conclusions

This thesis has outlined and implemented a novel multi-objective optimization evolutionary algorithm, MONEA, and demonstrated its ability to find solutions for two separate MOPs. It introduced a new class of functions, PM and PPM, which preserve domination and allow a MOP to be broken down into many SOPs. The PM and PPM functions were represented through a new implementation of function stacks, utilizing intake nodes to interpret the objectives. MONEA was tested on two problems and compared to NSGA-II.

It was determined in Chapter 4 that MONEA works better with a smaller mutation size; the other aspects of the algorithm preserve enough diversity that a higher mutation size is not needed. Initial testing also showed that a stronger reward for exploration can increase this diversity even more. This was supported by the findings in Chapter 5, which show that MONEA is good at capturing a broader picture of the Pareto Frontier, i.e. it produces a more diverse set of solutions than NSGA-II. When experiments were done on a CA MOP, it was determined that MONEA did a better job of covering the Pareto Frontier, but required more computational resources than NSGA-II.
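The key property that makes the decomposition work is domination preservation: whenever one solution dominates another, the scalarized fitness of the dominating solution must not be worse. The sketch below illustrates that property with a strictly positive weighted sum, which is a standard scalarization that also possesses it; this is a generic illustration only, not the thesis's PM or PPM functions.

    import random

    def dominates(a, b):
        """True if objective vector a Pareto-dominates b (all objectives minimized)."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def weighted_sum(weights):
        """A scalarization with strictly positive weights preserves domination:
        if a dominates b, then weighted_sum(a) < weighted_sum(b)."""
        return lambda v: sum(w * x for w, x in zip(weights, v))

    # Empirical check on random points (a property check, not a proof).
    random.seed(0)
    f = weighted_sum([0.3, 0.5, 0.2])
    points = [[random.random() for _ in range(3)] for _ in range(200)]
    for a in points:
        for b in points:
            if dominates(a, b):
                assert f(a) < f(b)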

MONEA demonstrated an excellent ability to locate a diverse approximation to the Pareto Frontier, but there were still solutions closer to the Pareto Frontier that were yet to be found. The deep-time NSGA-II runs demonstrated the existence of better solutions, but MONEA's shorter inner loop length stopped it from reaching them.

6.2 Future Work

As this thesis introduces an entirely new MOEA, there are many more avenues of research to explore. From the conclusions made above, it is evident that MONEA and NSGA-II excel in different areas when solving MOPs. This naturally leads to the idea of combining them in a way that takes advantage of both algorithms' strengths. Further deep-time testing with MONEA needs to be done, in which the length of the inner loop is increased. There is also great potential for improved results if solutions from the master archive of MONEA were used as the initial population of NSGA-II for a given MOP. This may let NSGA-II's power leverage MONEA's search ability and lead to far better solutions.

MONEA should also be tested on additional MOPs. Discrete functions such as those found in [2] and [3] should provide tests similar to the CA problem from Chapter 5. For these problems, instead of evolving CA, the final solutions would generate mazes such as the one seen in Figure 6.1. Here the objectives would be multiple paths with different features: blue represents water, red represents fire and black represents stone, so a blue/black path would block some players that could go through fire but not water, and so on. Real-valued test problems also need to be tried with MONEA, and the soon-to-be-published NSGA-III should serve as a good comparison algorithm.

The large number of solutions that MONEA was able to find for the MOPs in both Chapters 4 and 5 suggests that some kind of archive filter may be a useful addition to the algorithm.

Figure 6.1: An example of a maze from [2] with different paths that can only be traversed by specific players, where blue represents water, red represents fire and black represents stone. This image is drawn from research performed by Cameron McGuinness.

The filter would need to be designed in such a way that it preserves the diversity MONEA has been shown to produce, while maintaining a smaller archive. The substantial master archives MONEA built up had a negative effect on the run time of the algorithm, so a smaller archive may improve its performance. Another way to improve MONEA's performance that should be investigated is parallel programming. The runs in MONEA are completely independent of one another, so the algorithm should see an immediate, nearly linear speedup if they were run in parallel.
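Since the runs are independent, parallelizing them is largely a matter of distributing them over worker processes. Below is a minimal sketch using Python's standard multiprocessing module, where single_monea_run is a hypothetical placeholder for one independent MONEA run.

    import random
    from multiprocessing import Pool

    def single_monea_run(seed):
        """Hypothetical placeholder for one independent MONEA run: in practice this
        would build its scalarizing function, run the inner loop, and return its
        best solutions."""
        random.seed(seed)
        return [random.random() for _ in range(3)]  # stand-in for one objective vector

    if __name__ == "__main__":
        with Pool() as pool:  # one worker per available CPU core by default
            results = pool.map(single_monea_run, range(100))  # 100 independent runs
        # The per-run results would then be merged into the master archive,
        # keeping only non-dominated solutions (merge not shown here).
        print(len(results), "independent runs completed")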
