An Introduction to Evolutionary Algorithms Karthik Sindhya, PhD Postdoctoral Researcher Industrial Optimization Group Department of Mathematical Information Technology Karthik.sindhya@jyu.fi http://users.jyu.fi/~kasindhy/
Overview Nature Inspired Algorithms Differential Evolution algorithm Constraint handling Applications
Nature Inspired Algorithms Nature provides some of the most efficient ways to solve problems. Algorithms imitating processes in nature, or inspired by it, are called nature-inspired algorithms. What type of problems? Aircraft wing design
Nature Inspired Algorithms Wind turbine design Bionic car BBC Performance improvement of 40%: the tubercles reduce turbulence across the surface, increasing the angle of attack and decreasing drag. (Source: Popular Mechanics) Hexagonal plates resulting in door paneling one-third lighter than conventional paneling, but just as strong. (Source: Popular Mechanics)
Nature Inspired Algorithms Bullet train NATGEO Train's nose is designed after the beak of a kingfisher, which dives smoothly into water. (Source: Popular Mechanics)
Nature Inspired Algorithms for Optimization Optimization An act, process, or methodology of making something (as a design, system, or decision) as fully perfect, functional, or effective as possible. (http://www.merriamwebster.com/dictionary) Nature as an optimizer Birds: Minimize drag. Humpback whale: Maximize maneuverability (enhanced lift devices to control flow over the flipper and maintain lift at high angles of attack). Boxfish: Minimize drag and maximize rigidity of exoskeleton. Kingfisher: Minimize micro-pressure waves. Consider an optimization problem of the form
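The equation that follows "Consider an optimization problem of the form" did not survive extraction; a standard constrained form consistent with the problem characteristics on the next slides (reconstructed, not the original slide's notation) is:

```latex
\begin{aligned}
\min_{x} \quad & f(x) \\
\text{s.t.} \quad & g_j(x) \le 0, \quad j = 1, \dots, J, \\
                  & h_k(x) = 0,   \quad k = 1, \dots, K, \\
                  & x^{(l)} \le x \le x^{(u)}.
\end{aligned}
```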
Practical Optimization Problems Characteristics! Objective and constraint functions can be nondifferentiable. Constraints can be nonlinear. Discrete/discontinuous search space. Mixed variables (integer, real, Boolean, etc.). Large number of constraints and variables. Objective functions can be multimodal. Multimodal functions have more than one optimum, but may have either a single or multiple global optima. Computationally expensive objective functions and constraints.
Practical Optimization Problems Characteristics! [Diagram: the optimization algorithm sends a decision vector to a simulation model, which returns an objective vector.]
Traditional Optimization Techniques Problems! Different methods for different types of problems. Constraint handling, e.g. using the penalty method, is sensitive to penalty parameters. Often get stuck in local optima (lack a global perspective). Usually need knowledge of first/second-order derivatives of objective functions and constraints.
Nature Inspired Algorithms for Optimization Nature inspired algorithms Computational intelligence Fuzzy logic systems Neural networks
Nature Inspired Algorithms for Optimization Nature inspired algorithms Evolutionary algorithms Swarm optimization Genetic algorithm Particle swarm optimization Differential evolution Ant colony optimization... and many more.
Evolution [Images: the evolution of humans, Nokia phones and Macintosh computers.]
Evolutionary Algorithms Offspring are created by reproduction, mutation, etc. Charles Darwin: natural selection, a guided search procedure. Individuals suited to the environment survive, reproduce and pass their genetic traits to offspring. Populations adapt to their environment. Variations accumulate over time to generate new species.
Evolutionary Algorithms Terminology 1. Individual - carrier of the genetic information (chromosome). It is characterized by its state in the search space and its fitness (objective function value). 2. Population - pool of individuals which allows the application of genetic operators. 3. Fitness function - the term fitness function is often used as a synonym for objective function. 4. Generation - (natural) time unit of the EA, an iteration step of an evolutionary algorithm.
Evolutionary Algorithms [Figure: a population of individuals; parents undergo crossover and mutation to produce offspring.]
Evolutionary Algorithms
Step 1: t := 0
Step 2: initialize P(t)
Step 3: evaluate P(t)
Step 4: while not terminate do
    P'(t) := variation[P(t)];
    evaluate[P'(t)];
    P(t+1) := select[P'(t) U P(t)];
    t := t + 1;
od
Evolutionary algorithms = Selection + Crossover + Mutation
Reproduced from "Evolutionary Computation: Comments on the History and Current State", Bäck et al.
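The loop above can be sketched in Python; this is a minimal illustrative version (function and parameter names are mine, not the slide's), using Gaussian perturbation as the variation step and truncation selection over the union of parents and offspring:

```python
import random

def f(x):
    """Example objective to minimize (sphere function)."""
    return sum(xi ** 2 for xi in x)

def evolve(f, n_var=5, pop_size=20, n_gen=50, lb=-5.0, ub=5.0, seed=0):
    rng = random.Random(seed)
    # Steps 1-3: t := 0, initialize P(t); evaluation happens in sort below
    pop = [[rng.uniform(lb, ub) for _ in range(n_var)]
           for _ in range(pop_size)]
    for t in range(n_gen):  # Step 4: while not terminate do
        # P'(t) := variation[P(t)]  (Gaussian mutation, clipped to bounds)
        offspring = [[min(ub, max(lb, xi + rng.gauss(0, 0.5))) for xi in x]
                     for x in pop]
        # P(t+1) := select[P'(t) U P(t)]  (keep the pop_size best)
        union = pop + offspring
        union.sort(key=f)
        pop = union[:pop_size]
    return pop[0]

best = evolve(f)
```

Truncation selection here is the simplest choice; the selection slides that follow describe tournament and roulette-wheel alternatives.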
Evolutionary Algorithms [Figure: over the generations, the population mean approaches the optimum and the population variance reduces.]
Evolutionary Algorithms Robustness = breadth + efficiency. [Figure: efficiency vs. problem type - a robust scheme maintains moderate efficiency across problem types, unlike a random scheme (Goldberg, 1989).]
Evolutionary Algorithms Selection - roulette wheel, tournament, steady-state, etc. The motivation is to preserve the best (make multiple copies) and eliminate the worst. Crossover - simulated binary crossover, linear crossover, blend crossover, etc. Creates new solutions by combining more than one individual; a global search for new and hopefully better solutions. Mutation - polynomial mutation, random mutation, etc. Keeps diversity in the population, e.g. 010110 -> 010100 (bit-wise mutation).
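Two of the operators named above can be sketched as follows; a minimal illustration (function names and the mutation probability are mine), showing blend crossover for real-valued vectors and bit-wise mutation for binary strings like the 010110 -> 010100 example:

```python
import random

rng = random.Random(42)

def blend_crossover(p1, p2, alpha=0.5):
    """BLX-alpha: each child gene is drawn uniformly from the parents'
    interval, extended by alpha times its width on both sides."""
    child = []
    for a, b in zip(p1, p2):
        lo, hi = min(a, b), max(a, b)
        d = hi - lo
        child.append(rng.uniform(lo - alpha * d, hi + alpha * d))
    return child

def bitwise_mutation(bits, p_mut=1.0 / 6.0):
    """Flip each bit independently with probability p_mut, as in the
    slide's 010110 -> 010100 example (one bit flipped)."""
    out = []
    for b in bits:
        if rng.random() < p_mut:
            out.append('1' if b == '0' else '0')  # flip this bit
        else:
            out.append(b)
    return ''.join(out)

child = blend_crossover([0.0, 0.0], [1.0, 2.0])
mutant = bitwise_mutation('010110')
```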
Evolutionary Algorithms Tournament selection [Figure: two rounds of binary tournaments over fitness values 23, 30, 24, 37, 11, 9; the winner of each pairing advances, and the worst solutions, 37 and 30, are deleted from the population.]
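Binary tournament selection can be sketched as below (an illustrative implementation; the fitness values are the ones shown on the slide, treated as values to minimize, and the individual labels are mine):

```python
import random

def tournament_select(pop, fitness, rng, k=2):
    """Binary tournament (k=2): sample k distinct individuals at random
    and return the one with the best (lowest) fitness."""
    contenders = rng.sample(range(len(pop)), k)
    winner = min(contenders, key=lambda i: fitness[i])
    return pop[winner]

pop = ['a', 'b', 'c', 'd', 'e', 'f']
fit = [23, 30, 24, 37, 11, 9]  # fitness values from the slide
rng = random.Random(1)
mating_pool = [tournament_select(pop, fit, rng) for _ in range(6)]
```

Note that the worst individual (fitness 37) can never win a tournament against anyone, so it is always deleted from the population, matching the figure.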
Evolutionary Algorithms Roulette wheel selection (proportional selection) Weaker solutions can survive.
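Roulette-wheel (fitness-proportionate) selection can be sketched as follows; a minimal illustration (names and values are mine) in which each individual's slice of the wheel is proportional to its fitness, so weaker solutions can still survive:

```python
import random

def roulette_select(pop, fitness, rng):
    """Spin the wheel: pick individual i with probability
    fitness[i] / sum(fitness) (maximization convention)."""
    total = sum(fitness)
    r = rng.uniform(0, total)
    acc = 0.0
    for ind, fit in zip(pop, fitness):
        acc += fit
        if r <= acc:
            return ind
    return pop[-1]  # guard against floating-point round-off

# 'b' is nine times fitter than 'a', yet 'a' is still sometimes chosen
rng = random.Random(3)
draws = [roulette_select(['a', 'b'], [1.0, 9.0], rng) for _ in range(200)]
```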
Evolutionary Algorithms Concept of exploration vs. exploitation. Exploration - search for promising solutions (crossover and mutation operators). Exploitation - preferring the good solutions (selection operator). Excessive exploration leads to random search; excessive exploitation leads to premature convergence.
Evolutionary Algorithms [Figure: a good evolutionary algorithm balances exploration and exploitation.]
Evolutionary Algorithms
Classical gradient-based algorithms: convergence to an optimal solution usually depends on the starting solution; most algorithms tend to get stuck at a locally optimal solution; an algorithm efficient in solving one class of optimization problems may not be efficient in solving others; algorithms cannot be easily parallelized.
Evolutionary algorithms: convergence to an optimal solution is designed to be independent of the initial population; a search-based algorithm; the population helps to avoid getting stuck at a locally optimal solution; can be applied to a wide class of problems without major changes to the algorithm; can be easily parallelized.
Fitness Landscapes [Figure: four landscapes f(x) vs. x as seen by traditional gradient-based methods - ideal/best case, multimodal, nightmare, teaser.]
Fitness Landscapes [Figure: the same four landscapes f(x) vs. x as seen by population-based algorithms - ideal/best case, multimodal, nightmare, teaser.]
History of Evolutionary Algorithms GA: John Holland in 1962 (UMich). Evolution Strategies: Rechenberg and Schwefel in 1965 (Berlin). Evolutionary Programming: Larry Fogel in 1965 (California). First ICGA: 1985 at Carnegie Mellon University. First GA book: Goldberg (1989). First FOGA workshop: 1990 in Indiana (theory). First fusion of the streams under the name "evolutionary algorithms": 1990s. Journals: ECJ (MIT Press), IEEE TEC, Natural Computation (Elsevier). GECCO and CEC since 1999, PPSN since 1990. About 20 major conferences each year.
Differential Evolution Proposed by R. Storn and K. Price (1997): Storn, R., Price, K. (1997). "Differential evolution - a simple and efficient heuristic for global optimization over continuous spaces", Journal of Global Optimization 11: 341-359. A population-based approach for the minimization of functions. A maximization problem is converted to a minimization problem (max f(x) = -min(-f(x))). Parameters to be set: NP, population size ((5-10) x number of variables); F, scaling factor, [0, 2]; CR, crossover ratio, [0, 1]; NGEN, maximum number of generations.
Differential Evolution [Figure: for each target vector x_i in the population (x_1 ... x_5 with fitness f_1 ... f_5), mutation builds a donor vector v_1 = x_3 + F(x_4 - x_5); crossover mixes target and donor coordinates where rand < CR to form the trial vector; selection keeps the trial vector only if its fitness f_II is better than the target's f_I.]
Differential Evolution DE scheme DE/x/y/z. x: specifies the vector to be mutated, which currently is rand (a randomly chosen population vector). y: number of difference vectors used. z: denotes the crossover scheme; the current variant is bin (binomial), and exp (exponential) is also available. DE/rand/1/bin
Differential Evolution [Figure: in the (x_1, x_2) plane, mutation adds the scaled difference F(x_4 - x_5) to x_3 to obtain v_1 = x_3 + F(x_4 - x_5); crossover then combines the target vector with v_1 to produce trial points near the minimum.]
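The DE/rand/1/bin scheme described on these slides can be sketched as below; a minimal illustrative implementation (default parameter values are examples, not the slides' recommendations) with the slide's parameters NP, F, CR and NGEN, box-constraint repair by clipping, and the greedy target-vs-trial selection:

```python
import random

def de_rand_1_bin(f, bounds, np_=20, F=0.8, CR=0.9, n_gen=100, seed=0):
    """Minimize f over the box given by bounds = [(low, high), ...]."""
    rng = random.Random(seed)
    d = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(np_)]
    fit = [f(x) for x in pop]
    for _ in range(n_gen):
        for i in range(np_):
            # Mutation: v = x_r1 + F * (x_r2 - x_r3), r1, r2, r3 distinct, != i
            r1, r2, r3 = rng.sample([j for j in range(np_) if j != i], 3)
            v = [pop[r1][k] + F * (pop[r2][k] - pop[r3][k]) for k in range(d)]
            # Binomial crossover: take the donor coordinate where rand < CR;
            # jrand forces at least one donor coordinate into the trial vector
            jrand = rng.randrange(d)
            u = [v[k] if (rng.random() < CR or k == jrand) else pop[i][k]
                 for k in range(d)]
            # Box-constraint repair: clip to the bounds
            u = [min(max(u[k], bounds[k][0]), bounds[k][1]) for k in range(d)]
            # Selection: the trial vector replaces the target if not worse
            fu = f(u)
            if fu <= fit[i]:
                pop[i], fit[i] = u, fu
    best = min(range(np_), key=fit.__getitem__)
    return pop[best], fit[best]

x_best, f_best = de_rand_1_bin(lambda x: sum(xi ** 2 for xi in x),
                               [(-5.0, 5.0)] * 3)
```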
Constraint Handling Penalty parameter-less approach A feasible solution is preferred to an infeasible solution. When both solutions are feasible, choose the solution with the better function value. When both solutions are infeasible, choose the solution with the lower constraint violation.
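The three comparison rules above can be sketched as a single comparator (an illustrative helper; function names are mine, and constraint violation is taken as the sum of positive parts of inequality constraints g_j(x) <= 0):

```python
def constraint_violation(g_values):
    """Total violation of inequality constraints g_j(x) <= 0:
    the sum of the positive parts of the g_j values."""
    return sum(max(0.0, g) for g in g_values)

def better(f1, cv1, f2, cv2):
    """Penalty parameter-less comparison: True if solution 1
    (objective f1, violation cv1) is preferred over solution 2."""
    if cv1 == 0 and cv2 == 0:   # both feasible: better objective wins
        return f1 <= f2
    if cv1 == 0:                # feasible beats infeasible
        return True
    if cv2 == 0:
        return False
    return cv1 <= cv2           # both infeasible: smaller violation wins
```

The comparator slots directly into the selection step of an EA (e.g. DE's target-vs-trial comparison) without any penalty parameter to tune.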
Constraint Handling Box constraints If a variable is below its lower bound or above its upper bound, either set it to the violated bound or assign it a random value inside the bounds.
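Both repair strategies can be sketched in a few lines (illustrative helpers; function names are mine):

```python
import random

def repair_clip(x, bounds):
    """Strategy 1: set an out-of-bounds variable to the violated bound."""
    return [min(max(xi, lo), hi) for xi, (lo, hi) in zip(x, bounds)]

def repair_random(x, bounds, rng):
    """Strategy 2: replace an out-of-bounds variable with a random
    value inside the bounds; in-bounds variables are untouched."""
    return [xi if lo <= xi <= hi else rng.uniform(lo, hi)
            for xi, (lo, hi) in zip(x, bounds)]
```

Clipping is cheap but can pile solutions up on the bounds; random reinitialization preserves diversity at the cost of discarding the offending value.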
Limitations of Evolutionary Algorithms No guarantee of finding an optimal solution in finite time (only asymptotic convergence). They contain a number of parameters, and sometimes the result is highly dependent on the parameter settings; self-adaptive parameters are commonly used. Computationally very expensive; metamodels of the objective functions and constraints are commonly used.
Applications Application 1: tracking a suspect (Caldwell and Johnston, 1991). Objective function: a fitness rating on a nine-point scale.
Applications Optimization (Min/Max) of functions Airfoil optimization Evolving optimal structure Games