Evolutionary Algorithms


Evolutionary Algorithms
Per Kristian Lehre
School of Computer Science, University of Nottingham
NATCOR 2016

Optimisation

Given a function f : X → R, find an x ∈ X such that f(x) ≥ f(y) for all y ∈ X.

A general problem with lots of applications!
- Can be solved efficiently in many special cases:
  - mathematical optimisation techniques
  - optimisation variants of problems in P
- Optimisation is thought to be hard in general:
  - approximation algorithms
  - exact exponential algorithms
  - problem-independent, randomised search heuristics, e.g. evolutionary algorithms

Evolution = Selection + Variation

Evolutionary Algorithms

Generate the initial population P(0) at random, and set t ← 0.
repeat
    Evaluate the fitness of each individual in P(t).
    Select parents from P(t) based on their fitness.
    Obtain population P(t+1) by applying crossover and mutation to the parents.
    Set t ← t + 1.
until termination criterion satisfied.

Basic idea from natural evolution and population genetics: survival of the fittest.
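The loop above maps directly onto code. Below is a minimal, hypothetical Python sketch of a generational EA; the function name `evolve`, the population size, the crossover rate, and the use of 2-tournament selection for the parent-selection step are illustrative assumptions, not details from the slides.

```python
import random

def evolve(fitness, n, pop_size=20, p_c=0.9, generations=200):
    """Minimal generational EA sketch: evaluate, select, recombine, mutate, repeat."""
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        fits = [fitness(x) for x in pop]          # evaluate P(t)

        def select():                             # parent selection (2-tournament)
            a, b = random.randrange(pop_size), random.randrange(pop_size)
            return pop[a] if fits[a] >= fits[b] else pop[b]

        nxt = []
        while len(nxt) < pop_size:
            x, y = select()[:], select()[:]
            if random.random() < p_c:             # one-point crossover
                p = random.randrange(1, n)
                x, y = x[:p] + y[p:], y[:p] + x[p:]
            for child in (x, y):                  # bitwise mutation, rate 1/n
                for i in range(n):
                    if random.random() < 1.0 / n:
                        child[i] = 1 - child[i]
                nxt.append(child)
        pop = nxt[:pop_size]                      # P(t+1)
    return max(pop, key=fitness)

best = evolve(sum, 20)  # OneMax: fitness = number of 1-bits
```

With the bit-sum (OneMax) as fitness, the loop should drive the population close to the all-ones string within a few hundred generations.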

Not A New Idea...

- Evolution Strategies (ES), a type of EA, were invented by Hans-Paul Schwefel and Ingo Rechenberg at TU Berlin in 1963.
- Optimisation of wing shape using Konrad Zuse's Z23.
- The American and German schools first met at PPSN in Dortmund, 1990.

(Der Spiegel, November 18th, 1964)

Nature Inspired Optimisation

Problems
- Continuous vs. combinatorial
- Single- vs. multi-objective
- Dynamic and stochastic
- ...

Algorithms
- Evolutionary Algorithms: Genetic Algorithms, Evolution Strategies, Genetic Programming, Estimation of Distribution Algorithms, ...
- Swarm Optimisation: Ant Colony Optimisation, PSO, Bee hives, ...
- ...

Outline

1 Introduction
  - Nature Inspired Optimisation
2 Evolutionary Algorithms
  - Representations
  - Genetic Operators
  - Selection Mechanisms
  - Diversity Mechanisms
  - Constraint Handling Techniques
3 Runtime Analysis
  - The Black Box Scenario and No Free Lunch
  - Runtime of the (1+1) EA on OneMax
  - Overview of Techniques and Results
4 Summary

A Simple Evolutionary Algorithm

Generate the initial population P(0) at random, and set t ← 0.
repeat
    Evaluate the fitness of each individual in P(t).
    Select parents from P(t) based on their fitness.
    Obtain population P(t+1) by applying crossover and mutation to the parents.
    Set t ← t + 1.
until termination criterion satisfied.

Basic idea from natural evolution and population genetics: survival of the fittest.

Representations

Representations of candidate solutions on which the genetic operators can operate.
- Bitstrings: commonly used in combinatorial optimisation.
- Other representations (trees, permutations, etc.) are possible in conjunction with specialised genetic operators.

In general, genotype-phenotype mappings φ : G → P, where
- G is the set of genotypes (chromosomes),
- P is the set of phenotypes (solutions).

Fitness function f : P → R.

Locality in Representations [Rothlauf, 2006]

(Figure: fitness over the genotype space G and, via the mapping φ, over the phenotype space P.)

Rule of thumb:
- Small genotypic change ⇒ small phenotypic change.
- Large genotypic change ⇒ large phenotypic change.

Exploration and Exploitation

Exploration of new parts of the search space:
- Mutation operators
- Recombination operators

Exploitation of promising genetic material:
- Selection mechanism

Mutation operators for bitstrings

The mutation operator introduces small, random changes to an individual's chromosome.

Local Mutation: one randomly chosen bit is flipped.

Global Mutation: each bit is flipped independently with a given probability p_m, called the per-bit mutation rate, which is often 1/n, where n is the chromosome length.

Pr[k bits flipped] = (n choose k) · p_m^k · (1 − p_m)^(n−k).

Mutation rate: note the difference between per-bit (gene) and per-chromosome (individual) mutation rates.
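Global mutation and the flip-count probability above can be sketched in a few lines of Python; the names `global_mutation` and `prob_k_flips` are illustrative, not from the slides.

```python
import math
import random

def global_mutation(x, p_m=None):
    """Flip each bit of x independently with per-bit rate p_m (default 1/n)."""
    n = len(x)
    p_m = 1.0 / n if p_m is None else p_m
    return [1 - b if random.random() < p_m else b for b in x]

def prob_k_flips(n, k, p_m):
    """Pr[k bits flipped] = C(n, k) * p_m^k * (1 - p_m)^(n - k)."""
    return math.comb(n, k) * p_m**k * (1 - p_m) ** (n - k)

# With p_m = 1/n, flipping exactly one bit is the most likely non-zero outcome:
p1 = prob_k_flips(100, 1, 1 / 100)
```

Summing `prob_k_flips` over k = 0, ..., n gives 1, which is a quick sanity check that the binomial formula is stated correctly.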

Recombination operators - One-point crossover

The recombination operator generates an offspring individual whose chromosome is composed from the parents' chromosomes.

Crossover rate: probability of applying crossover to the parents.

One-point crossover between parents x and y:
- Randomly select a crossover point p in {1, 2, ..., n}.
- Offspring 1 is x_1 ... x_p y_{p+1} ... y_n.
- Offspring 2 is y_1 ... y_p x_{p+1} ... x_n.

Example (p = 6)
Parent x: 101011 1010    Offspring 1: 101011 1110
Parent y: 010100 1110    Offspring 2: 010100 1010
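The operator is a straightforward slice-and-swap; this hypothetical sketch reproduces the slide's example with crossover point p = 6.

```python
import random

def one_point_crossover(x, y, p=None):
    """Split both parents at point p (random in {1,...,n-1} if not given) and swap tails."""
    n = len(x)
    p = random.randrange(1, n) if p is None else p
    return x[:p] + y[p:], y[:p] + x[p:]

# The slide's example, crossover point p = 6:
o1, o2 = one_point_crossover("1010111010", "0101001110", p=6)
```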

Recombination operators - Multi-point crossover

k-point crossover between parents x and y:
- Randomly select k crossover points p_1 < ... < p_k in {1, 2, ..., n}.
- Offspring 1 is x_1 ... x_{p_1} y_{p_1+1} ... y_{p_2} x_{p_2+1} ... x_{p_3} etc.
- Offspring 2 is y_1 ... y_{p_1} x_{p_1+1} ... x_{p_2} y_{p_2+1} ... y_{p_3} etc.

Example (2-point crossover)
Parent x: 101 011 1010    Offspring 1: 101 100 1010
Parent y: 010 100 1110    Offspring 2: 010 011 1110
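The k-point variant alternates between the parents at each crossover point. A hypothetical sketch, checked against the slide's 2-point example (points after positions 3 and 6):

```python
import random

def k_point_crossover(x, y, points=None, k=2):
    """Alternate source parent at each of k sorted crossover points."""
    n = len(x)
    if points is None:
        points = sorted(random.sample(range(1, n), k))
    o1, o2, swap, prev = "", "", False, 0
    for p in points + [n]:
        seg_x, seg_y = x[prev:p], y[prev:p]
        o1 += seg_y if swap else seg_x   # offspring 1 starts from parent x
        o2 += seg_x if swap else seg_y   # offspring 2 starts from parent y
        swap, prev = not swap, p
    return o1, o2

# The slide's 2-point example:
o1, o2 = k_point_crossover("1010111010", "0101001110", points=[3, 6])
```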

Recombination operators - Uniform crossover

Uniform crossover between parents x and y:
- Select a bitstring z of length n uniformly at random.
- For all i from 1 to n:
  - if z_i = 1 then bit i in offspring 1 is x_i, else y_i;
  - if z_i = 1 then bit i in offspring 2 is y_i, else x_i.

Example
Mask z:   1010001110
Parent x: 1010111010    Offspring 1: 1111001010
Parent y: 0101001110    Offspring 2: 0000111110
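Uniform crossover decides per position which parent contributes to which offspring. A hypothetical sketch, checked against the slide's example mask:

```python
import random

def uniform_crossover(x, y, z=None):
    """Mask z picks, per position, which parent feeds which offspring."""
    n = len(x)
    if z is None:
        z = [random.randint(0, 1) for _ in range(n)]
    o1 = "".join(x[i] if z[i] == 1 else y[i] for i in range(n))
    o2 = "".join(y[i] if z[i] == 1 else x[i] for i in range(n))
    return o1, o2

# The slide's example with mask z = 1010001110:
o1, o2 = uniform_crossover("1010111010", "0101001110",
                           z=[1, 0, 1, 0, 0, 0, 1, 1, 1, 0])
```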

Selection and Reproduction

- Selection emphasises the better solutions in a population:
  - one or more copies of good solutions;
  - inferior solutions are much less likely to be selected.
- Selection is not normally considered a search operator, but it influences the search significantly.
- Selection can be used either before or after the search operators.
- When selection is used before the search operators, the process of choosing the next generation from the union of all parents and offspring is sometimes called reproduction.
- The generational gap of an EA refers to the overlap between the old and new generations (i.e., individuals that did not go through any search operators). The two extremes are generational EAs and steady-state EAs. 1-elitism can be regarded as having a generational gap of 1.

Fitness Proportional Selection

Probability of selecting individual x from population P:

Pr[x] = f(x) / Σ_{y ∈ P} f(y).

- Uses raw fitness in computing selection probabilities.
- Does not allow negative fitness values.
- Also known as roulette wheel selection.

Weaknesses:
- Domination by super individuals in early generations.
- Slow convergence in later generations.
- Fitness scaling was often used in the early days to combat this problem: the fitness function f is replaced with a scaled fitness function f'.
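The roulette-wheel rule above can be sketched as follows (the function name and toy population are illustrative assumptions):

```python
import random

def fitness_proportional_select(pop, fitness):
    """Roulette wheel: Pr[x] = f(x) / sum of f over the population (raw, non-negative fitness)."""
    fits = [fitness(x) for x in pop]
    total = sum(fits)
    r = random.uniform(0, total)
    acc = 0.0
    for x, f in zip(pop, fits):
        acc += f
        if r <= acc:
            return x
    return pop[-1]  # guard against floating-point round-off

pop = [[1, 1, 1], [1, 0, 0], [0, 0, 1]]  # OneMax fitnesses 3, 1, 1
picked = fitness_proportional_select(pop, sum)
```

Here the fittest individual has selection probability 3/5, so over many draws it should be picked roughly 60% of the time.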

Ranking Selection

1 Sort the population from best to worst according to fitness: x^(λ−1), x^(λ−2), ..., x^(0).
2 Select the γ-ranked individual x^(γ) with probability Pr[γ], where Pr[γ] is a ranking function, e.g.
  - linear ranking
  - exponential ranking
  - power ranking
  - geometric ranking

Linear ranking

Population size λ, and rank γ, 0 ≤ γ ≤ λ − 1 (rank 0 = worst).

Pr_linear[γ] := (1/λ) · (α + (β − α) · γ/(λ − 1)),

where Σ_{γ=0}^{λ−1} Pr_linear[γ] = 1 implies α + β = 2, and 1 ≤ β ≤ 2.

In expectation,
- the best individual is reproduced β times,
- the worst individual is reproduced α times.
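The normalisation and the expected-copy counts can be checked numerically; this small sketch (names are illustrative) computes the linear ranking probabilities for given λ and β.

```python
def linear_ranking_probs(lam, beta):
    """Pr[gamma] = (1/lam) * (alpha + (beta - alpha) * gamma / (lam - 1)),
    with alpha = 2 - beta; rank 0 is the worst, rank lam-1 the best."""
    alpha = 2.0 - beta
    return [(alpha + (beta - alpha) * g / (lam - 1)) / lam for g in range(lam)]

probs = linear_ranking_probs(10, beta=1.5)
```

Multiplying a rank's probability by λ gives its expected number of copies: β for the best rank and α = 2 − β for the worst, exactly as the slide states.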

Other ranking functions

Power ranking: Pr_power[γ] := (α + (β − α) · (γ/(λ − 1))^k) / C.

Geometric ranking: Pr_geom[γ] := α · (1 − α)^(λ−1−γ) / C.

Exponential ranking: Pr_exp[γ] := (1 − e^(−γ)) / C.

Here C is a normalising factor and 0 < α < β.

Tournament Selection

Tournament selection with tournament size k:
- Randomly sample a subset P' of k individuals from population P.
- Select the individual in P' with the highest fitness.

Often, tournament size k = 2 is used.
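A minimal sketch of the two steps above (sampling without replacement, then taking the fittest contestant); the toy population with identity fitness is an illustrative assumption.

```python
import random

def tournament_select(pop, fitness, k=2):
    """Sample k distinct individuals uniformly at random; return the fittest."""
    contestants = random.sample(pop, k)
    return max(contestants, key=fitness)

pop = list(range(10))  # toy population; fitness = identity
winner = tournament_select(pop, fitness=lambda x: x, k=3)
```

With k = 3 drawn from {0, ..., 9}, the winner is the maximum of three distinct values and therefore never below 2; with k equal to the population size the best individual always wins.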

(µ + λ) and (µ, λ) selection

Origins in Evolution Strategies.

(µ + λ)-selection
- Parent population of size µ.
- Generate λ offspring from randomly chosen parents.
- Next population: the µ best among parents and offspring.

(µ, λ)-selection (where λ > µ)
- Parent population of size µ.
- Generate λ offspring from randomly chosen parents.
- Next population: the µ best among the offspring only.
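The difference between the two schemes is only which pool is truncated; a minimal sketch with illustrative numeric "individuals" whose fitness is their value:

```python
def plus_selection(parents, offspring, mu, fitness):
    """(mu + lambda): next population = mu best of parents AND offspring."""
    return sorted(parents + offspring, key=fitness, reverse=True)[:mu]

def comma_selection(parents, offspring, mu, fitness):
    """(mu, lambda): next population = mu best of the offspring ONLY."""
    return sorted(offspring, key=fitness, reverse=True)[:mu]

parents = [5, 4]
offspring = [6, 1, 0, 3]
```

With these toy values, plus-selection keeps the good parent 5, while comma-selection discards both parents and must accept the weaker offspring 3; this is the price (and the restart ability) of (µ, λ).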

Selection pressure

Degree to which selection emphasises the better individuals. How can selection pressure be measured and adjusted?

Take-over time τ [Goldberg and Deb, 1991, Bäck, 1994]:
1 Initial population with a unique fittest individual x.
2 Apply the selection operator repeatedly, with no other operators.
3 τ is the number of generations until the population consists of x only.

Higher take-over time ⇒ lower selection pressure.

- Fitness proportional (assuming fitness f(x) = exp(cx)): τ ≈ (λ ln λ)/c
- Linear ranking (1 < β < 2): τ ≈ 2 ln(λ − 1)/(β − 1)
- Tournament (tournament size k): τ ≈ (ln λ + ln ln λ)/ln k
- (µ, λ): τ = ln λ / ln(λ/µ)

Diversity Mechanisms

Fitness sharing:

g(x) := f(x) / Σ_{y : d(x,y) ≤ σ} s(x, y), where s(x, y) := 1 − d(x, y)/σ.

Crowding:
- Standard Crowding
- Deterministic Crowding

[Sareni and Krahenbuhl, 1998] [Friedrich et al., 2009]
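The sharing rule divides raw fitness by a niche count, so crowded individuals are discounted while isolated ones keep their raw fitness. A minimal sketch, assuming Hamming distance and an illustrative σ and population:

```python
def shared_fitness(pop, fitness, dist, sigma):
    """g(x) = f(x) / sum over y with d(x,y) <= sigma of (1 - d(x,y)/sigma).
    The niche count is always >= 1 because x contributes 1 - 0 = 1 itself."""
    out = []
    for x in pop:
        niche = sum(1 - dist(x, y) / sigma for y in pop if dist(x, y) <= sigma)
        out.append(fitness(x) / niche)
    return out

def hamming(a, b):
    return sum(u != v for u, v in zip(a, b))

pop = ["111", "110", "000"]
g = shared_fitness(pop, fitness=lambda s: s.count("1"), dist=hamming, sigma=2)
```

Here "111" and "110" share a niche and each loses a third of its raw fitness, while the isolated "000" keeps its raw value.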

Constraint Handling Techniques

Constrained optimisation:
- f : X → R, the objective function;
- g_i : X → R, inequality constraint(s);
- maximise f(x) while g_i(x) ≤ 0;
- points satisfying all constraints are feasible, the others infeasible.

Approaches [Coello, 2002]:
- Penalty approaches: death, static, dynamic, adaptive, ...
- Multi-objective optimisation
- Repair approaches
- Decoders

Analysis of Evolutionary Algorithms

Criteria for evaluating algorithms:
1 Correctness. Does the algorithm always give the correct output?
2 Computational Complexity. How much computational resource does the algorithm require to solve the problem?

The same criteria are applicable to evolutionary algorithms:
1 Correctness. Does the EA discover the global optimum in finite time?
2 Computational Complexity. Time (number of function evaluations) is the most relevant computational resource.

Computational Complexity of EAs

(Figure: runtime as a function of instance size.)

- Prediction of the resources needed for a given instance.
- Usually runtime as a function of instance size.
- Number of fitness evaluations before finding the optimum.
- Exponential runtime ⇒ inefficient algorithm.
- Polynomial runtime ⇒ efficient algorithm.

Black Box Scenario

The algorithm A submits queries x_1, x_2, x_3, ..., x_t and receives only the fitness values f(x_1), f(x_2), f(x_3), ..., f(x_t), where f is an unknown function from a function class F.

Worst case runtime: max_{f ∈ F} T_{A,f}.
Average case runtime: Σ_{f ∈ F} Pr[f] · T(A, f).

[Droste et al., 2006] (Photo: E. Gerhard, 1846.)

No Free Lunch Theorem ([Wolpert and Macready, 1997, Droste et al., 2002b])

Let F be a set of functions f : S → B, where S and B are finite sets and B is totally ordered. If F is closed under permutations, then the average case runtime over F is the same for all search heuristics.

- No search heuristic is best on all problems.
- Need to consider algorithms on specific problem classes F.
- Function classes closed under permutation are not interesting...

(NB! See [Auger and Teytaud, 2008] for continuous spaces.)

Expected Runtime and Success Probability

- The runtime T_{A,f} is a random variable.
- Expected runtime: E[T_{A,f}] = Σ_{t=1}^∞ t · Pr[T_{A,f} = t].
- Success probability within t(n) steps: Pr[T_{A,f} ≤ t(n)].

(1+1) Evolutionary Algorithm

1: Sample x uniformly at random from {0, 1}^n.
2: repeat
3:     x' ← x.
4:     Flip each bit of x' independently with probability 1/n.
5:     if f(x') ≥ f(x) then
6:         x ← x'.
7:     end if
8: until termination condition met.

Special case of the (µ+λ) EA. Starting point for rigorous runtime analysis of EAs, e.g. [Mühlenbein, 1992, Garnier et al., 1999, Droste et al., 2002a].
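The algorithm is short enough to transcribe directly; this sketch adds only an evaluation budget and a target value as stopping criteria, which are implementation conveniences rather than part of the slide.

```python
import random

def one_plus_one_ea(f, n, target, max_evals=200_000):
    """(1+1) EA: standard bit mutation with rate 1/n; accept if not worse."""
    x = [random.randint(0, 1) for _ in range(n)]
    fx, evals = f(x), 1
    while fx < target and evals < max_evals:
        y = [1 - b if random.random() < 1.0 / n else b for b in x]  # mutate copy
        fy = f(y)
        evals += 1
        if fy >= fx:          # accept if offspring is not worse
            x, fx = y, fy
    return x, evals

# OneMax: the optimum is the all-ones string with fitness n.
x, evals = one_plus_one_ea(sum, 30, target=30)
```

On OneMax with n = 30 the optimum is typically found within a few hundred evaluations, in line with the O(n log n) bound derived on the following slides.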

Artificial Fitness Levels - Upper bounds

The search space is partitioned into m subsets A_1, A_2, ..., A_m with increasing fitness, i.e. f(A_i) < f(A_j) for all i < j, and A_m contains only optimal solutions.

- p_i: probability of jumping from A_i to any A_j with j > i.
- T_i: time to jump from A_i to any A_j with j > i.

Expected runtime:

E[T] ≤ E[T_1 + T_2 + ... + T_m] = E[T_1] + E[T_2] + ... + E[T_m] ≤ 1/p_1 + 1/p_2 + ... + 1/p_m.

Artificial Fitness Levels - Upper bound on OneMax

OneMax(x) := x_1 + x_2 + ... + x_n.

Partition of the search space into fitness levels:
- A_i: all bitstrings with i 0-bits.
- p_i: probability of decreasing the number of 0-bits from within A_i, which is at least the probability of flipping exactly one 0-bit and no other bits:

p_i ≥ (i/n) · (1 − 1/n)^(n−1) ≥ i/(en),

using (1 − 1/n)^(n−1) ≥ 1/e.

Expected runtime:

E[T_OneMax] ≤ Σ_{i=1}^n 1/p_i ≤ Σ_{i=1}^n en/i = O(n ln n).
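The fitness-level bound Σ en/i = e·n·H_n can be compared against simulated runs of the (1+1) EA on OneMax; the helper names and the choice of n = 15 and 100 repetitions are illustrative.

```python
import math
import random

def onemax_bound(n):
    """Fitness-level upper bound: sum_{i=1}^{n} e*n/i = e * n * H_n."""
    return sum(math.e * n / i for i in range(1, n + 1))

def onemax_runtime(n):
    """Fitness evaluations of one (1+1) EA run until OneMax is maximised."""
    x = [random.randint(0, 1) for _ in range(n)]
    t = 1
    while sum(x) < n:
        y = [1 - b if random.random() < 1.0 / n else b for b in x]
        if sum(y) >= sum(x):
            x = y
        t += 1
    return t

n = 15
bound = onemax_bound(n)
mean_runtime = sum(onemax_runtime(n) for _ in range(100)) / 100
```

The empirical mean should sit comfortably below the bound, since the fitness-level argument only credits single-bit improvements.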

Artificial Fitness Levels - Exercise

Estimate an upper bound on the expected runtime of the (1+1) EA on

LeadingOnes(x) := Σ_{i=1}^n Π_{j=1}^i x_j.

A typical search point consists of the leading 1-bits, then the first 0-bit, then a random suffix:

x = 1111111111111111 0 ...

Artificial fitness levels, with level i given by the number of leading 1-bits:
- probability of leaving level i: p_i ≥ 1/(en) for all i (it suffices to flip the first 0-bit and no other bit).

E[T_LeadingOnes] ≤ Σ_{i=1}^n 1/p_i ≤ en² = O(n²).
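The e·n² bound from this exercise can be checked the same way as the OneMax bound; the helper names and the choice of n = 10 with 50 repetitions are illustrative.

```python
import math
import random

def leading_ones(x):
    """Number of leading 1-bits in the bitstring x."""
    count = 0
    for b in x:
        if b == 0:
            break
        count += 1
    return count

def leading_ones_runtime(n):
    """Fitness evaluations of one (1+1) EA run until LeadingOnes is maximised."""
    x = [random.randint(0, 1) for _ in range(n)]
    t = 1
    while leading_ones(x) < n:
        y = [1 - b if random.random() < 1.0 / n else b for b in x]
        if leading_ones(y) >= leading_ones(x):
            x = y
        t += 1
    return t

n = 10
bound = math.e * n * n  # e * n^2 from the fitness-level argument
mean_runtime = sum(leading_ones_runtime(n) for _ in range(50)) / 50
```

Again the empirical mean should fall well below the bound, reflecting the slack in the 1/(en) estimate of the level-leaving probabilities.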

Analytical Tool Box

- Artificial Fitness Levels [Wegener and Witt, 2005]
- Markov's Inequality, Chernoff Bounds [Motwani and Raghavan, 1995]
- Typical Runs
- Expected Multiplicative Weight Decrease [Neumann and Wegener, 2007]
- Drift Analysis [Hajek, 1982]
- Branching Processes [Lehre, 2010]
- Yao's Minimax Principle [Motwani and Raghavan, 1995]

State of the Art [Oliveto et al., 2007b]

OneMax
- (1+1) EA: O(n log n) [Mühlenbein, 1992]
- (1+λ) EA: O(λn + n log n) [Jansen et al., 2005]
- (µ+1) EA: O(µn + n log n) [Witt, 2006]
- 1-ANT: O(n²) w.h.p. [Neumann and Witt, 2006]
- (µ+1) IA: O(µn + n log n) [Zarges, 2009]

Linear Functions
- (1+1) EA: Θ(n log n) [Droste et al., 2002a] and [He and Yao, 2003]
- cGA: Θ(n^(2+ε)), ε > 0 const. [Droste, 2006]

Max. Matching
- (1+1) EA: e^Ω(n), PRAS [Giel and Wegener, 2003]

Sorting
- (1+1) EA: Θ(n² log n) [Scharnow et al., 2002]

SS Shortest Path
- (1+1) EA: O(n³ log(n·w_max)) [Baswana et al., 2009]
- MO (1+1) EA: O(n³) [Scharnow et al., 2002]

MST
- (1+1) EA: Θ(m² log(n·w_max)) [Neumann and Wegener, 2007]
- (1+λ) EA: O(nλ log(n·w_max)), λ = m²/n [Neumann and Wegener, 2007]
- 1-ANT: O(mn log(n·w_max)) [Neumann and Witt, 2008]

Max. Clique (rand. planar)
- (1+1) EA: Θ(n⁵) [Storch, 2006]
- (16n+1) RLS: Θ(n^(5/3)) [Storch, 2006]

Eulerian Cycle
- (1+1) EA: Θ(m² log m) [Doerr et al., 2007]

Partition
- (1+1) EA: PRAS, avg. [Witt, 2005]

Vertex Cover
- (1+1) EA: e^Ω(n), arbitrarily bad approximation [Friedrich et al., 2007] and [Oliveto et al., 2007a]

Set Cover
- (1+1) EA: e^Ω(n), arbitrarily bad approximation [Friedrich et al., 2007]
- SEMO: polynomial-time O(log n)-approximation [Friedrich et al., 2007]

Intersection of p ≥ 3 matroids
- (1+1) EA: 1/p-approximation in O(|E|^(p+2) log(|E|·w_max)) [Reichel and Skutella, 2008]

UIO/FSM conformance
- (1+1) EA: e^Ω(n) [Lehre and Yao, 2007]

Summary

Evolutionary Algorithms
- Representations
- Genetic Operators
- Selection Mechanisms

Runtime Analysis
- No Free Lunch Theorem
- Expected Runtime and Success Probability

References I

Auger, A. and Teytaud, O. (2008). Continuous lunches are free plus the design of optimal optimization algorithms. Algorithmica.

Bäck, T. (1994). Selective pressure in evolutionary algorithms: A characterization of selection mechanisms. In Proceedings of the 1st IEEE Conference on Evolutionary Computation (CEC 1994), pages 57–62. IEEE Press.

Baswana, S., Biswas, S., Doerr, B., Friedrich, T., Kurur, P. P., and Neumann, F. (2009). Computing single source shortest paths using single-objective fitness. In FOGA '09: Proceedings of the Tenth ACM SIGEVO Workshop on Foundations of Genetic Algorithms, pages 59–66, New York, NY, USA. ACM.

Coello, C. C. (2002). Theoretical and numerical constraint-handling techniques used with evolutionary algorithms: A survey of the state of the art. Computer Methods in Applied Mechanics and Engineering, 191(11-12):1245–1287.

References II

Doerr, B., Klein, C., and Storch, T. (2007). Faster evolutionary algorithms by superior graph representation. In Proceedings of the 1st IEEE Symposium on Foundations of Computational Intelligence (FOCI 2007), pages 245–250.

Droste, S. (2006). A rigorous analysis of the compact genetic algorithm for linear functions. Natural Computing, 5(3):257–283.

Droste, S., Jansen, T., and Wegener, I. (2002a). On the analysis of the (1+1) Evolutionary Algorithm. Theoretical Computer Science, 276:51–81.

Droste, S., Jansen, T., and Wegener, I. (2002b). Optimization with randomized search heuristics: the (A)NFL theorem, realistic scenarios, and difficult functions. Theoretical Computer Science, 287(1):131–144.

References III

Droste, S., Jansen, T., and Wegener, I. (2006). Upper and lower bounds for randomized search heuristics in black-box optimization. Theory of Computing Systems, 39(4):525–544.

Friedrich, T., Hebbinghaus, N., Neumann, F., He, J., and Witt, C. (2007). Approximating covering problems by randomized search heuristics using multi-objective models. In Proceedings of the 9th Annual Conference on Genetic and Evolutionary Computation (GECCO 2007), pages 797–804, New York, NY, USA. ACM Press.

Friedrich, T., Oliveto, P. S., Sudholt, D., and Witt, C. (2009). Analysis of diversity-preserving mechanisms for global exploration. Evolutionary Computation, 17(4):455–476.

Garnier, J., Kallel, L., and Schoenauer, M. (1999). Rigorous hitting times for binary mutations. Evolutionary Computation, 7(2):173–203.

References IV

Giel, O. and Wegener, I. (2003). Evolutionary algorithms and the maximum matching problem. In Proceedings of the 20th Annual Symposium on Theoretical Aspects of Computer Science (STACS 2003), pages 415–426.

Goldberg, D. E. and Deb, K. (1991). A comparative analysis of selection schemes used in genetic algorithms. In Foundations of Genetic Algorithms, pages 69–93. Morgan Kaufmann.

Hajek, B. (1982). Hitting-time and occupation-time bounds implied by drift analysis with applications. Advances in Applied Probability, 14(3):502–525.

He, J. and Yao, X. (2003). Towards an analytic framework for analysing the computation time of evolutionary algorithms. Artificial Intelligence, 145(1-2):59–97.

References V

Jansen, T., De Jong, K. A., and Wegener, I. (2005). On the choice of the offspring population size in evolutionary algorithms. Evolutionary Computation, 13(4):413–440.

Lehre, P. K. (2010). Negative drift in populations. In Proceedings of the 11th International Conference on Parallel Problem Solving From Nature (PPSN 2010). To appear.

Lehre, P. K. and Yao, X. (2007). Runtime analysis of (1+1) EA on computing unique input output sequences. In Proceedings of the 2007 IEEE Congress on Evolutionary Computation (CEC 2007), pages 1882–1889. IEEE Press.

Motwani, R. and Raghavan, P. (1995). Randomized Algorithms. Cambridge University Press.

References VI

Mühlenbein, H. (1992). How genetic algorithms really work I. Mutation and hillclimbing. In Proceedings of Parallel Problem Solving from Nature 2 (PPSN-II), pages 15–26. Elsevier.

Neumann, F. and Wegener, I. (2007). Randomized local search, evolutionary algorithms, and the minimum spanning tree problem. Theoretical Computer Science, 378(1):32–40.

Neumann, F. and Witt, C. (2006). Runtime analysis of a simple ant colony optimization algorithm. In Proceedings of the 17th International Symposium on Algorithms and Computation (ISAAC 2006), number 4288 in LNCS, pages 618–627.

Neumann, F. and Witt, C. (2008). Ant colony optimization and the minimum spanning tree problem. In Proceedings of Learning and Intelligent Optimization (LION 2008), pages 153–166.

References VII

Oliveto, P. S., He, J., and Yao, X. (2007a). Evolutionary algorithms and the vertex cover problem. In Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2007).

Oliveto, P. S., He, J., and Yao, X. (2007b). Time complexity of evolutionary algorithms for combinatorial optimization: A decade of results. International Journal of Automation and Computing, 4(1):100–106.

Reichel, J. and Skutella, M. (2008). Evolutionary algorithms and matroid optimization problems. Algorithmica.

Rothlauf, F. (2006). Representations for Genetic and Evolutionary Algorithms. Springer-Verlag Berlin Heidelberg.

Sareni, B. and Krahenbuhl, L. (1998). Fitness sharing and niching methods revisited. IEEE Transactions on Evolutionary Computation, 2(3):97–106.

References VIII

Scharnow, J., Tinnefeld, K., and Wegener, I. (2002). Fitness landscapes based on sorting and shortest paths problems. In Proceedings of the 7th Conference on Parallel Problem Solving from Nature (PPSN VII), number 2439 in LNCS, pages 54–63.

Storch, T. (2006). How randomized search heuristics find maximum cliques in planar graphs. In Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation (GECCO 2006), pages 567–574, New York, NY, USA. ACM Press.

Wegener, I. and Witt, C. (2005). On the analysis of a simple evolutionary algorithm on quadratic pseudo-boolean functions. Journal of Discrete Algorithms, 3(1):61–78.

Witt, C. (2005). Worst-case and average-case approximations by simple randomized search heuristics. In Proceedings of the 22nd Annual Symposium on Theoretical Aspects of Computer Science (STACS 2005), number 3404 in LNCS, pages 44–56.

References IX

Witt, C. (2006). Runtime analysis of the (µ + 1) EA on simple pseudo-boolean functions. Evolutionary Computation, 14(1):65–86.

Wolpert, D. H. and Macready, W. G. (1997). No free lunch theorems for optimization. IEEE Transactions on Evolutionary Computation, 1(1):67–82.

Zarges, C. (2009). On the utility of the population size for inversely fitness proportional mutation rates. In FOGA '09: Proceedings of the Tenth ACM SIGEVO Workshop on Foundations of Genetic Algorithms, pages 39–46, New York, NY, USA. ACM.