A motivated definition of exploitation and exploration

Bart Naudts and Adriaan Schippers
Technical report, University of Antwerp, Belgium

1 INTRODUCTION

The terms exploration and exploitation are as old as evolutionary algorithms themselves, yet few papers are truly devoted to the subject (e.g., [1, 2, 3, 4, 5]), and there is even less agreement on the meaning of the terms. This paper can be seen as an exercise in formal reasoning whose goal is a well-understood, unambiguous definition of the terms exploitation and exploration. It is not our aim to distill a common denominator from the current usage of the two terms in the literature; we refer to [3] for a detailed overview. Instead, we start from a simple algorithm framework and try to give a precise definition of exploitation and exploration in this context. Then we discuss these definitions in great depth. Population-based algorithms and algorithms with any sort of memory remain outside the scope of this paper, and will be considered in further contributions.

The structure of the paper is as follows. In the second section, we set our context: a framework of simple evolutionary algorithms without population or memory. Then, in the third section, we define exploitation and exploration in the context of this framework. We proceed with examples (section 4) and an attempt at quantifying the different forms of exploitation and exploration encountered (section 5). The sixth section discusses some of the differences between our approach and the definitions found in the literature. We conclude this paper with a summary and some words on further work.

2 THE ALGORITHM FRAMEWORK

In this section we formally define an algorithm framework which will be used in the next section as a context for defining the terms exploitation and exploration. Examples of algorithm instances of this framework are given in section 4. Note that since we only consider algorithms without memory and population, it is artificial to fit genetic algorithms into the framework by interpreting them as single-element algorithms in population space.

Assume we are given a non-empty, finite search space S, whose elements we denote by s ∈ S. Implicitly associated with each element is a unique representation of this element. A neighborhood N(s) is defined for each element as a probability distribution on the set S. We require that there be a representational relationship between an element and its neighborhood in terms of likelihood, i.e., elements with higher probability should be more related; that not all mass of the distribution be given to the element itself; and that the neighborhoods be static, in the sense that they do not change during the run of the algorithm. The collection of neighborhoods is called the neighborhood structure. With each element s ∈ S we also associate an objective value f(s) ∈ R.

To give a non-trivial meaning to the terms exploration and exploitation, the fundamental assumption of black-box optimization has to be satisfied: there must exist a (possibly unknown yet exploitable¹) relationship between the neighborhood structure and the objective function. The goal of an algorithm instance of the framework is to find an element satisfying a given stopping criterion which depends on objective values only.

¹ Haha.
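As a concrete illustration of the objects involved (a sketch of ours, not part of the formal framework), a neighborhood structure over binary strings can be represented as a function mapping an element to a probability distribution over S; the Hamming-cube neighborhood used later in section 4.1 would read:

def hamming_cube(s):
    # Neighborhood of bit-string s (e.g. s = '0110'): probability 1/l on each
    # string at Hamming distance 1 from s; all other elements implicitly get 0.
    # Static: the distribution depends on s alone, never on the run so far.
    l = len(s)
    def flip(c):
        return '1' if c == '0' else '0'
    return {s[:i] + flip(s[i]) + s[i + 1:]: 1.0 / l for i in range(l)}

All three requirements hold for this structure: neighbors share all but one bit with s, s itself receives no mass, and the distribution never changes during a run.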

The algorithm framework is then defined as follows:

1. Choose an initial element s_0 ∈ S, and evaluate it to obtain f(s_0). Initialize the step counter i to 0.
2. If f(s_i) satisfies the stopping criterion, stop the algorithm.
3. Using s_i, generate the neighborhood N(s_i).
4. Construct a set T ⊆ S by drawing elements from the distribution N(s_i) and compute the objective values of the elements of T.
5. Given only f(s_i) and f(t) for all t ∈ T, select one element t from T ∪ {s_i}.
6. Accept t as s_{i+1}, the next element of the algorithm.
7. Increase i, and go to step 2.

Note that the step counter i is not used as a memory; one could as well use a 0/1 flipper as a counter. We could easily have done without step 3, since the neighborhood N(s_i) is defined for s_i anyway. Instead of this mathematical approach we favor the constructive approach, which stresses the fact that the element is used to obtain the neighborhood. This approach will also be more suitable for the definitions in the next section. Of course, in an instance of this framework, not all elements of the neighborhood have to be generated or constructed physically; constructing those of T suffices.

The selection of the subset T in step 4 can be deterministic or stochastic, but it cannot be biased by the representation of the elements of the neighborhood. Biasing the selection on the basis of the representation makes no sense because the neighborhoods are static and the algorithm has no memory; it would have the same effect as choosing different neighborhoods.

By construction of the framework, any problem-specific knowledge can only be put in either the neighborhood structure or the objective function. At the risk of being too restrictive, we have chosen to make the selection of the successor in step 5 fully representation independent. Given that step 3 is the only other step where information is used, we thus achieve a clear separation between the use of representational information (step 3) and objective information (step 5).

3 DEFINITION OF EXPLOITATION AND EXPLORATION

It is clear that the important steps of the algorithm are steps 3 up to 6.

3.1 Step 3

Step 3 is the explicit restriction of candidate successors to the (purely representational) neighborhood of the current element. Relying on the fundamental assumption of black-box optimization, we postulate that the action of step 3 is exploitative: the representation of the current element is used to generate candidate successors. This form of exploitation will be called representational exploitation. The probability distributions can be uniform over the whole space (e.g., random search, section 4.5), uniform over the Hamming cube (e.g., the single-bit-flip hill-climber, section 4.1), Gaussian (e.g., the (1+1)-EA on a real space, section 4.4), etc.

3.2 Step 4

Step 4 contains two actions: the selection of a subset for evaluation and the actual evaluation of the elements in the subset. As explained in the previous section, the selection of the set T is a process which is independent of the representation of the elements in the neighborhood. Examples are: always select all elements with non-zero probability (e.g., the single-bit-flip hill-climber, section 4.1), pick one element according to the distribution (e.g., the Metropolis algorithm, section 4.2, or random search, section 4.5), etc. We postulate that the actions of step 4 are explorative: new elements are selected for evaluation and then evaluated. Since this exploration is restricted to a neighborhood by representational exploitation, we call it neighborhood exploration.

3.3 Step 5

In step 5, the information obtained by the neighborhood exploration (step 4) is used to select a successor from the candidate elements. We postulate this to be exploitative, and we call this form of exploitation objective exploitation, since only objective information is used here.

3.4 Step 6

Step 6 is the acceptance of the selected element as the successor. It is postulated to be explorative, since it moves the algorithm from one known element to a (known) element with an unknown neighborhood. This action is called generational exploration. When the current element s_i is selected to become the next element, we say that there was no generational exploration.

3.5 Comments

Note that we do not explicitly define exploitation and exploration in terms of operators like mutation and selection, although steps 3 and 4 together could be interpreted as a mutation and step 5 as a selection operation. Neither do we relate exploitation and exploration to entities at sub-element level, because this can easily lead to problem-dependent definitions and statements. For example, we do not relate exploration to the recombination or generation of building blocks. When restricted to specific problem classes, the sub-element approach can give promising results (see [3]).
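The following minimal Python sketch (ours, not part of the formal framework) makes the postulated classification concrete; the plug-in names (neighborhood, select_subset, select_successor, stop) are illustrative choices, and select_successor may base its decision on objective values only, as required.

def run(s0, f, neighborhood, select_subset, select_successor, stop):
    s, fs = s0, f(s0)                   # step 1: initial element, evaluated
    while not stop(fs):                 # step 2: stop on objective values only
        dist = neighborhood(s)          # step 3: representational exploitation
        T = select_subset(dist)         # step 4: neighborhood exploration ...
        f_of = {t: f(t) for t in T}     # ... the drawn candidates are evaluated
        f_of[s] = fs                    # the successor comes from T plus s_i
        s2 = select_successor(s, f_of)  # step 5: objective exploitation; the
                                        # choice may use objective values only
        if s2 != s:                     # step 6: generational exploration
            s, fs = s2, f_of[s2]        # (none when s_i is retained)
    return s, fs                        # step 7: increase i, go to step 2

The instances of section 4 below differ only in the plugged-in functions.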

4 EXAMPLES

In this section we consider some examples which fit the framework of section 2. Recall that genetic algorithms only fit artificially into the framework; for this reason, they are omitted here.

4.1 A single-bit-flip hill-climber

Consider a single-bit-flip hill-climber on the space of binary strings of length l, so S = {0, 1}^l. We define the neighborhood of a string s ∈ S to consist of all strings at Hamming distance 1 from s (the Hamming cube of s). This is done by setting the probability of all elements of the Hamming cube to 1/l, and the probability of all other elements of S to 0. Representational exploitation is thus the restriction of the action radius of the hill-climber to the Hamming cube. The objective function f is the fitness function to be minimized or maximized. We define this hill-climber to perform full neighborhood exploration, i.e., all elements of the neighborhood with non-zero probability are evaluated before step 5 is started. Objective exploitation is only concerned with objective values: the element (or one of the elements) with the highest objective value is chosen to become the successor.

When a single-bit-flip hill-climber is stuck in a local optimum, either a string with a worse fitness value is accepted, or the algorithm is restarted. The first option simply requires a different type of objective exploitation. The second option requires the representational exploitation (step 3) to be performed using a distribution which is uniform on the whole space S.

To summarize, the single-bit-flip hill-climber performs:
- representational exploitation: restriction of the search direction to the Hamming cube;
- neighborhood exploration: all elements are selected and evaluated;
- objective exploitation: select the best element from the cube;
- generational exploration: always accept the element.
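In terms of the sketch given at the end of section 3 (our illustration, assuming f is to be maximized and reusing hamming_cube from section 2), the hill-climber's plug-ins are:

def sbf_select_subset(dist):
    # full neighborhood exploration: take every element with non-zero mass
    return [t for t, p in dist.items() if p > 0]

def sbf_select_successor(s, f_of):
    # objective exploitation at full strength: the best candidate wins
    # (the maximum ranges over T and the current element, so a strictly
    # worse neighbor is never accepted)
    return max(f_of, key=f_of.get)

# Example: maximize the number of ones in a string of length 8.
# run('00000000', lambda x: x.count('1'), hamming_cube,
#     sbf_select_subset, sbf_select_successor, stop=lambda v: v == 8)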
4.2 Metropolis algorithm

The Metropolis algorithm [6] is the core algorithm of simulated annealing: it contains everything but the annealing scheme, with the temperature (or inverse temperature β) fixed to a given value. Again we choose the space of binary strings of length l (S = {0, 1}^l), retain the Hamming cube as the neighborhood of an element as in the previous section, and take a fitness function f as objective function. As opposed to our single-bit-flip hill-climber, the Metropolis algorithm does not select its whole neighborhood for evaluation, but simply draws one element from the uniform distribution. Hence, while representational exploitation remains the same, neighborhood exploration contains a stochastic component which selects one element from the neighborhood. (Note that this corresponds to the arbitrary selection of one bit to be flipped.) On the objective side, we choose to let exploitation make the acceptance decision, with generational exploration in case of acceptance and no generational exploration in case of rejection.

To summarize, the Metropolis algorithm performs:
- representational exploitation: restriction of the search direction to the Hamming cube;
- neighborhood exploration: selection of one element out of the neighborhood and evaluation of this element;
- objective exploitation: acceptance decision based on the difference in objective value of the current and the new element; no representational selection;
- generational exploration: accept the new element in case of a positive acceptance decision; otherwise reject: no generational exploration.

4.3 A 1/l-mutation hill-climber

A 1/l-mutation hill-climber can be constructed in many variants, but the new aspect is its neighborhood structure. As an operator, this mutation loops over the bits and flips each of them with probability 1/l, with l the length of the strings. In the neighborhood of s, the probability of an element t thus equals

    P(t) = (1/l)^d (1 - 1/l)^(l-d),    (1)

with d the Hamming distance between s and t. The set T can then be composed in step 4 by drawing one or more elements from this probability distribution.
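As a sketch (ours), drawing a candidate from this neighborhood is the usual independent bit-flip loop, and expression (1) can be computed directly:

import random

def one_over_l_mutation(s):
    # draw one element from the 1/l-mutation neighborhood of s:
    # every bit is flipped independently with probability 1/l
    l = len(s)
    return ''.join(('1' if b == '0' else '0') if random.random() < 1.0 / l else b
                   for b in s)

def neighbor_probability(s, t):
    # closed form (1): P(t) = (1/l)^d * (1 - 1/l)^(l - d),
    # with d the Hamming distance between s and t
    l = len(s)
    d = sum(a != b for a, b in zip(s, t))
    return (1.0 / l) ** d * (1.0 - 1.0 / l) ** (l - d)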

4.4 A strictly elitist (1+1)-EA on the reals

Let us now give an example of a search algorithm on an n-dimensional Euclidean space. Let S = R^n, and let f : R^n → R be the objective function. The neighborhood of a point s ∈ S is given by an n-dimensional Gaussian distribution with mean s and variance σ ∈ R^n_+. Only one element is drawn from the neighborhood in step 4, and step 5 performs a strictly elitist selection: the new element is accepted only if its objective value is strictly greater than the objective value of the current element.

To summarize, our (1+1)-EA performs:
- representational exploitation: restriction of the search direction to a Gaussian distribution around the current point;
- neighborhood exploration: selection of one element out of the neighborhood and evaluation of this element;
- objective exploitation: acceptance decision based on a strictly elitist scheme;
- generational exploration: accept the new element in case of a positive acceptance decision; otherwise reject: no generational exploration.

Note that we avoid ESs where the mutation rate (and hence the variance of the Gaussian) is part of the element, because we would have to give up the requirement of a static neighborhood structure. Implicitly, we would also add the capacity of a memory to the algorithm.
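A sketch of the two distinctive plug-ins (ours, standard library only, assuming maximization; coordinates are mutated independently, so sigma here holds per-coordinate standard deviations):

import random

def gaussian_neighborhood_sample(s, sigma):
    # step 4: draw one point from an n-dimensional Gaussian centred at s
    return tuple(random.gauss(m, sd) for m, sd in zip(s, sigma))

def strictly_elitist_select(s, f_of):
    # step 5: keep the current point unless the new one is strictly better
    best = max(f_of, key=f_of.get)
    return best if f_of[best] > f_of[s] else s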
4.5 Random search

We have seen in section 4.1 that performing representational exploitation using a uniform distribution over the whole space can be used to perform a restart. Permanently doing this results in a random search algorithm. Note that this type of representational exploitation violates our requirement of having a relation between the representation of an element and the probable elements of its neighborhood; we say that representational exploitation is disabled or inactive.

A consequence of having representational exploitation permanently disabled is that objective exploitation becomes meaningless, since there is no motivation for preferring one element over any of the other generated elements. This does not hold the other way round: if objective exploitation is permanently disabled (i.e., an element is chosen to be the successor without using representational or objective information), one could still implement a random walker or an enumerative algorithm (see section 4.6).

To summarize, pure random search performs:
- no representational exploitation: elements are chosen stochastically from the whole search space;
- neighborhood exploration: evaluation of the selected elements;
- no objective exploitation: no grounds for making decisions;
- generational exploration without meaning: it is unimportant where the algorithm is.

4.6 Random walkers and enumerative algorithms

A random walker can be constructed most easily by taking the single-bit-flip hill-climber, picking one element in step 4, and always selecting this new element in step 5. The same effect is achieved by taking the single-bit-flip hill-climber again and performing a stochastic selection in step 5 which is independent of the objective values. Note that in the latter case more function evaluations are performed; the only effect, of course, is that the algorithm is slowed down. When one replaces the word "stochastically" by "deterministically" throughout the previous paragraph, one can easily construct a simple enumerative algorithm (a brute-force one that searches the whole space in a deterministic way).

Let us summarize the actions performed by the random walker or the enumerative algorithm:
- representational exploitation: restriction of the search direction to the Hamming cube (in the case of a random walker) or some very restricted neighborhood (in the case of an enumerative technique);
- neighborhood exploration: stochastic or deterministic selection of one element from the neighborhood, and evaluation of this element;
- inactive objective exploitation: simply select the new element;
- generational exploration: always accept the new element.

Note that the use of the terminology of spaces of bit-strings in this example is done without loss of generality.
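A sketch (ours) of the two step-5 rules, which ignore objective values either stochastically or deterministically:

import random

def walker_select_successor(s, f_of):
    # inactive objective exploitation: choose a new element regardless of f
    return random.choice([t for t in f_of if t != s])

def enumerator_select_successor(s, f_of):
    # the deterministic variant; combined with a deterministic neighborhood
    # (e.g. always proposing the next string in some fixed order) this turns
    # the walker into a brute-force enumeration of the whole space
    return next(t for t in f_of if t != s)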

5 A QUANTIFICATION

In this section we give some hints related to the quantification of the different forms of exploitation and exploration. In the case of exploration we can give formal definitions, but as long as none are given for exploitation, they have to be used with extreme care.

5.1 Representational exploitation

As the name indicates, a quantification of representational exploitation plays entirely at the representational level: no objective values are involved. To start the discussion, consider the following two extreme examples: the Hamming cube as a neighborhood structure, and the unbiased (uniform) generation of arbitrary elements (i.e., inactive representational exploitation). In the first case there is a strong representational relation between the current element and the elements of its neighborhood, while in the second case the elements are by definition representationally unrelated. These examples indicate that the correlation between the representation of the current element and that of the elements of its neighborhood could well be a first measure of representational exploitation. Put differently: there is more representational exploitation when the neighborhood plays a more restrictive role. In this way, the single-bit-flip hill-climber (section 4.1) would perform more representational exploitation than the 1/l-mutation hill-climber (section 4.3).

An alternative direction one might take is related to the intuitive use of the current element to generate the elements of the neighborhood. In this respect, both types of hill-climbers would perform an equal amount of representational exploitation, since the elements of T are generated in exactly the same way.

Finally, note that to obtain a formal quantification, one has to realize that the only metric (in the topological sense) defined on the search space S is the one induced by the neighborhood structure. In the case of the 1/l-mutation hill-climber, this metric is clearly the Hamming distance. The neighborhood structure of the single-bit-flip hill-climber can less immediately be extended to a metric, and it is not clear whether the result should also be the Hamming distance.

5.2 Neighborhood exploration

The amount of neighborhood exploration can be summarized by counting the number of distinct elements which are evaluated. This is well-defined, easily computable and intuitively correct. It also corresponds to the definition in the literature that the explorative power of an operator is given by the number of new elements it can generate [2]. Note that neighborhood exploration is entirely independent of success: the amount of exploration is not influenced by the objective values associated with the explored elements. Of course, the quality of exploration (e.g., does it lead to an interesting area?) is much harder to quantify (if possible at all), and can only be done on a problem-specific basis.

In the current algorithm framework, the amount of neighborhood exploration is strongly influenced by the neighborhood structure, which is static. In population-based algorithms, on the other hand, more interesting phenomena can occur, since the amount of neighborhood exploration can be directly related to the diversity of the population.
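This count is easy to keep outside the algorithm, e.g. by wrapping the objective function (a sketch of ours; the bookkeeping is measurement, not memory available to the algorithm itself):

def counting(f):
    # wrap an objective function so that every distinct evaluated element is
    # recorded; len(seen) is the amount of neighborhood exploration so far
    seen = set()
    def wrapped(s):
        seen.add(s)
        return f(s)
    return wrapped, seen

For instance, wrapped_f, seen = counting(lambda x: x.count('1')); after a run using wrapped_f, len(seen) gives the number of distinct elements evaluated.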
5.3 Objective exploitation

Next, we try to quantify objective exploitation. In this phase we only deal with objective information (the objective values of the current element and those of the elements of a subset of its neighborhood). Let us first look at the case of the Metropolis algorithm. In the Metropolis algorithm, the element u selected from the neighborhood is accepted in step 5 of the algorithm with probability

    min{1, exp(-β (f(s_i) - f(u)))},    (2)

where β is the fixed inverse temperature, suitably scaled. In the case of β = 0, this probability is always 1; this is typical for inactive objective exploitation. The other extreme, β = ∞, also corresponds to an extreme form of objective exploitation: the element with the best objective value is always selected. Conclusion: in the context of the Metropolis algorithm, the selection pressure β is a candidate quantifier of objective exploitation.

This quantification holds trivially for the single-bit-flip hill-climber: either objective exploitation is temporarily disabled to allow the hill-climber to escape from a local optimum, or it is at full strength to take the best element of the neighborhood. But selection pressure on its own might not be enough. It is easy to see that objective exploitation is restricted by the amount of information which is known at the moment of selection. The situation β = 0, inactive exploitation, is observed for any algorithm when all the objective values in the neighborhood are equal, i.e., when the algorithm is traversing a flat landscape. The quantification of objective exploitation we propose is therefore selection pressure, influenced by the amount of objective information present. A more formal definition, though, is deferred to a later contribution.

5.4 Generational exploration

We cannot quantify generational exploration: either it happens, or it does not. One could, however, keep a statistic of generational exploration over the run of an algorithm and look at the percentage of acceptance of new elements. In the case of the Metropolis algorithm, this average (called the acceptance rate) is a useful metric. An interesting question now is: how independent is the acceptance rate from the selection pressure, which is the proposed quantification of objective exploitation in the context of the Metropolis algorithm?
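A sketch (ours, under the maximization reading of rule (2) above) of the acceptance rule together with an empirical acceptance rate, with which the dependence on β asked about here can be probed numerically:

import math
import random

def metropolis_accept(f_s, f_u, beta):
    # rule (2): accept u with probability min(1, exp(-beta * (f_s - f_u)))
    exponent = -beta * (f_s - f_u)
    if exponent >= 0.0:
        return True                 # improvements and ties are always accepted
    return random.random() < math.exp(exponent)

def acceptance_rate(diffs, beta, trials=10000):
    # empirical acceptance rate over a sample of differences f(s_i) - f(u);
    # its relation to the selection pressure beta is the question of 5.4
    sample = random.choices(diffs, k=trials)
    return sum(metropolis_accept(d, 0.0, beta) for d in sample) / trials

For example, acceptance_rate([1.0, -0.5, 2.0], beta) traces how the rate falls as beta grows on a fixed sample of objective differences.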

6 A COMPARISON

In this section we compare our approach to some of the definitions found in the literature. Eiben and Schippers [3] list a number of questions and hypotheses concerning exploitation and exploration:

1. Using existing information equals exploitation. This would imply that even the use of bad material, or bad use of good information, is seen as exploitation. [...]
2. Selection is the source of exploitation. This is more or less a corollary of 1. The differences with our approach are clear: selection is one of the sources of exploitation, not the only one; representational exploitation is also a use of information. Moreover, we do not relate exploitation to success.
3. The operators mutation and recombination are purely explorative. [...] In our approach, the mutation operator (which can be seen as steps 3 and 4) is also exploitative.
5. What information (on alleles, schemata, individuals) is available and which part of this information is actually exploited? [...] This is highly problem dependent. [...] We have opted for problem independence and assume the fundamental assumption of black-box optimization to hold: the representation is chosen in such a way that it can guide the algorithm in its search for the optimum.

Now consider Beyer [1]: "Such a decomposition is of special interest if we want to quantify the relation between local exploitation and exploration. If we interpret exploitation as the ability to step into the local gradient direction and exploration as the ability to leave the gradient path then we might have a way to understand the evolutionary search behavior." Note that his definition is operator oriented (mutation on a real space), and also goal oriented, since the gradient is used to find local optima; this gradient is the only source of information used. A full comparison with our approach is difficult since a memory is needed to maintain an idea of the gradient over more than one generation. To our knowledge, Beyer was the first to propose a quantification of exploitation and exploration. It is also interesting to note that in his setup, new elements can be visited without exploration: this occurs when the new elements lie in the direction of the gradient.

De Jong and Spears [2] take the schemata/hyperplane approach to defining exploration, and consider mainly recombination instead of only mutation. They do not explicitly define exploitation.

Eshelman, Caruana and Schaffer [4] also follow the recombination and schema/hyperplane approach, and are most explicit in the exploitation of sub-individuals (sub-elements). Two quotes to situate their definitions: "If a schema is present in a parent (and not its mate) and survives in the offspring, it is said to have been exploited. Otherwise the trial that the offspring represents is accounted an exploratory one (with respect to that schema)." and "An exploitative trial is assumed to be testing the given schema in a new setting, i.e., it is assumed that some loci outside of those that define the schema have been altered between parent and child." They do not mention objective exploitation.

7 CONCLUSIONS AND FURTHER WORK

We have set a context in which we have defined exploitation and exploration rigorously. The context is simple: element-based black-box algorithms without memory and population. It contains many mutation-based hill-climbers, such as the single-bit-flip hill-climber, the Metropolis algorithm, random walkers and random search algorithms.
More complex algorithms, which are also more interesting in terms of exploitation and exploration, will be studied in further contributions.

Our algorithm framework clearly separates the use of representational information and objective information. This allows for a split definition of exploitation: exploitation of the representation and exploitation of the objective values. Exploration is also split in two: we distinguish between exploration of the neighborhood and exploration due to the evolutionary aspect of the search.

Consequences of this model and its restrictions are:
- Exploitation is entirely identified with the use of information.
- The amounts of exploitation and exploration can hardly change during the run of the algorithm, due to the static nature of the neighborhood structure. The restriction to static neighborhoods will be removed as soon as possible.
- Our setup does not allow for a definition of exploitation and exploration in terms of success, i.e., it is not goal oriented.
- Neither does this framework allow for a sub-element definition of exploitation and exploration.
- Objective exploitation is meaningless without representational exploitation, or, put differently: fitness values can only be used in a context.

An important point which we have barely touched upon is that of quantifications of exploitation and exploration. We have given hints and some intuitive definitions, which are far from definitive.

ACKNOWLEDGMENTS

The first author is a research assistant of the Fund for Scientific Research - Flanders (Belgium) (F.W.O.).

REFERENCES

[1] H.-G. Beyer. On the explorative power of ES/EP-like algorithms. In V. W. Porto, N. Saravanan, D. Waagen, and A. E. Eiben, editors, Proceedings of the 7th Annual Conference on Evolutionary Programming, LNCS 1447. Springer, 1998.
[2] K. A. De Jong and W. M. Spears. A formal analysis of the role of multi-point crossover in genetic algorithms. Annals of Mathematics and Artificial Intelligence, 5:1-26, 1992.
[3] A. E. Eiben and C. A. Schippers. On evolutionary exploration and exploitation. Fundamenta Informaticae, 35:35-50, 1998.
[4] L. Eshelman, R. A. Caruana, and J. D. Schaffer. Biases in the crossover landscape. In J. D. Schaffer, editor, Proceedings of the 3rd International Conference on Genetic Algorithms. Morgan Kaufmann Publishers, 1989.
[5] L. J. Eshelman and J. D. Schaffer. Crossover's niche. In S. Forrest, editor, Proceedings of the 5th International Conference on Genetic Algorithms. Morgan Kaufmann Publishers, 1993.
[6] N. Metropolis, A. Rosenbluth, M. Rosenbluth, A. Teller, and E. Teller. Equations of state calculations by fast computing machines. Journal of Chemical Physics, 21:1087-1092, 1953.
