THE DEVELOPMENT OF THE POTENTIAL AND ACADEMIC PROGRAMMES OF WROCLAW UNIVERSITY OF TECHNOLOGY: METAHEURISTICS


METAHEURISTICS

1. Objectives

The goals of the laboratory workshop are as follows:
- to learn the basic properties of evolutionary computation techniques and other metaheuristics for solving various global optimization problems,
- to study the selected evolutionary strategies, genetic algorithms, and some metaheuristics such as ACO, PSO, and IWO,
- to train skills in coding global optimization algorithms in Matlab and in using Matlab's Genetic Algorithm and Direct Search Toolbox.

The workshop is scheduled for 3 academic hours.

2. Introduction

In computer science, "metaheuristic" designates a computational method that optimizes a problem by iteratively trying to improve a candidate solution with regard to a given measure of quality. Metaheuristics make few or no assumptions about the problem being optimized and can search very large spaces of candidate solutions. However, metaheuristics do not guarantee that an optimal solution is ever found. Other terms with a meaning similar to "metaheuristic" are: derivative-free, direct-search, black-box, or simply heuristic optimizer. Many metaheuristics implement some form of stochastic optimization.

Metaheuristics are used for combinatorial optimization, in which an optimal solution is sought over a discrete search space. They are also used for problems over real-valued search spaces, where the classic approach is to derive the gradient of the function to be optimized and then employ gradient descent or a quasi-Newton method. Metaheuristics use neither the gradient nor the Hessian matrix, so their advantage is that the function to be optimized need not be continuous or differentiable, and it may also be subject to constraints.
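As an illustration of the last point, a derivative-free solver can be applied directly to a nonsmooth objective. The following minimal sketch (assuming the Genetic Algorithm and Direct Search Toolbox is available) minimizes a nondifferentiable function with ga; the objective is an illustrative choice, not one of the workshop problems:

    % The objective has a kink at (1, -2), so classic gradient-based methods
    % do not apply directly; ga needs only function values.
    f = @(x) abs(x(1) - 1) + abs(x(2) + 2);   % global minimum at x = [1 -2]
    [x, fval] = ga(f, 2)                      % 2 design variables, unconstrained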

3. Preparation

The expected time needed for preparation for this workshop is 9 hours.

3.1. Reading

[1] T. Bäck, Evolutionary Algorithms in Theory and Practice, New York, Oxford University Press, 1996,
[2] The Handbook of Evolutionary Computation, Editors: T. Bäck, D. B. Fogel, Z. Michalewicz, New York, Oxford University Press, 1996,
[3] T. Weise, Global Optimization Algorithms - Theory and Application, e-book, 2009, http://www.it-weise.de/
[4] A. R. Mehrabian, C. Lucas, A Novel Numerical Optimization Algorithm Inspired from Weed Colonization, Ecological Informatics, Vol. 1, No. 4, 2006, pp. 355-366,
[5] J. Arabas, Wykłady z algorytmów ewolucyjnych (Lectures on Evolutionary Algorithms), WNT, 2004.

3.2. Problems

At the beginning of the laboratory workshop each student should know the answers to the following questions:
- What is an NP-hard problem?
- What is a fitness function?
- What are the fundamental groups of evolutionary strategies?
- What is a chromosome or genotype? What is a phenotype?
- What are the basic genetic operators?
- What are the roulette-wheel and tournament selections?
- What is the elitist strategy?
- What are typical crossover techniques?
- How can evolutionary algorithms be applied to constrained optimization problems?
- What is the fundamental strategy in the Simulated Annealing (SA) algorithm?
- What is the fundamental strategy in the Tabu Search (TS) algorithm?
- What is the fundamental strategy in the Ant Colony Optimization (ACO) algorithm?
- What is the fundamental strategy in the Particle Swarm Optimization (PSO) algorithm?
- What is the fundamental strategy in the Invasive Weed Optimization (IWO) algorithm?

3.3. Detailed preparation

Each group of students (2-3 persons) is expected to accomplish the following tasks:
1. formulate the optimization problems in mathematical form for some typical engineering problems (listed below),
2. code the selected global optimization algorithms in Matlab,
3. compare the results (convergence rate, elapsed time, etc.) obtained with the coded algorithms and with the functions ga(.), gamultiobj(.), and simulannealbnd(.) in Matlab (a minimal comparison harness is sketched after this list),
4. draw the conclusions.
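One way to organize task 3 is sketched below; my_solver is a hypothetical placeholder for one of the algorithms you code yourself (see "Algorithms to be coded" below), and the Rastrigin function of Problem 2 serves as the test objective:

    % Compare a hand-coded solver with Matlab's ga and simulannealbnd.
    fitness = @(x) sum(x.^2 - 10*cos(2*pi*x) + 10);   % Rastrigin, f(0) = 0
    nvars = 2;

    tic; [xga, fga] = ga(fitness, nvars); tga = toc;
    tic; [xsa, fsa] = simulannealbnd(fitness, rand(1, nvars)); tsa = toc;
    % tic; [xmy, fmy] = my_solver(fitness, nvars); tmy = toc;  % your own code

    fprintf('ga:             f = %g  (%.2f s)\n', fga, tga);
    fprintf('simulannealbnd: f = %g  (%.2f s)\n', fsa, tsa);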

Problems to be modeled

The following problems should be solved with the selected global optimization algorithms.

Problem 1: Find the global minimum of the Griewank function for n = 2, 3, ...:

f(x) = \sum_{i=1}^{n} \frac{x_i^2}{4000} - \prod_{i=1}^{n} \cos\left( \frac{x_i}{\sqrt{i}} \right) + 1.

Illustrate this function graphically on a 2D contour plot and a 3D surface plot (a Matlab plotting sketch is given after Problem 4). Show its fitness versus the evolutionary process.

Problem 2: Find the global minimum of the Rastrigin function for n = 2, 3, ...:

f(x) = \sum_{i=1}^{n} \left( x_i^2 - 10 \cos(2 \pi x_i) + 10 \right).

Illustrate this function graphically on a 2D contour plot and a 3D surface plot. Show its fitness versus the evolutionary process.

Problem 3: Find the global minimum of the function:

f(x) = \begin{cases} -\exp\left\{ -(x/20)^2 \right\} & \text{for } x \le 20, \\ -\exp\{-1\} + (x - 20)(x - 22) & \text{for } x > 20, \end{cases}

for x \in [-15, 24]. Illustrate this function graphically, and show its fitness versus the evolutionary process.

Problem 4: Find the global minimum of Rosenbrock's function:

f(x) = 100 \left( x_2 - x_1^2 \right)^2 + \left( 1 - x_1 \right)^2,

s.t.

x_1 x_2 + x_1 - x_2 + 1.5 \le 0,
10 - x_1 x_2 \le 0,
0 \le x_1 \le 1,
0 \le x_2 \le 13.

Illustrate this function graphically on a 2D contour plot and a 3D surface plot. Show its fitness versus the evolutionary process.
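Each problem above (and below) asks for a 2D contour plot, a 3D surface plot, and a fitness trace. A minimal sketch for Problem 1 (Griewank, n = 2) follows; the plot range and resolution are illustrative choices:

    % Griewank function for a row vector x; n is taken from numel(x).
    griewank = @(x) sum(x.^2)/4000 - prod(cos(x ./ sqrt(1:numel(x)))) + 1;

    % 2D contour and 3D surface plots over [-10, 10]^2.
    [X1, X2] = meshgrid(-10:0.1:10);
    F = arrayfun(@(a, b) griewank([a b]), X1, X2);
    figure; contour(X1, X2, F, 30); title('Griewank - contour plot');
    figure; surf(X1, X2, F, 'EdgeColor', 'none'); title('Griewank - surface plot');

    % Fitness versus the evolutionary process, via ga's built-in plot function.
    options = gaoptimset('PlotFcns', @gaplotbestf);
    [x, fval] = ga(griewank, 2, [], [], [], [], [], [], [], options);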

Problem 5: Find the global minima of the following function:

f(x) = \left( 4 - 2.1 x_1^2 + \frac{x_1^4}{3} \right) x_1^2 + x_1 x_2 + \left( 4 x_2^2 - 4 \right) x_2^2,

for -3 \le x_1 \le 3 and -2 \le x_2 \le 2. Illustrate this function graphically on a 2D contour plot and a 3D surface plot. Show its fitness versus the evolutionary process.

Problem 6: Find the global minimum of the Easom function:

f(x) = -\cos(x_1) \cos(x_2) \exp\left\{ -(x_1 - \pi)^2 - (x_2 - \pi)^2 \right\},

for -100 \le x_i \le 100, i = 1, 2. Illustrate this function graphically on a 2D contour plot and a 3D surface plot. Show its fitness versus the evolutionary process.

Problem 7: Find the global minimum of the modified Sombrero function:

f(x) = \frac{\cos\left( \sqrt{\sum_{i=1}^{n} x_i^2} \right)}{\sqrt[4]{\sum_{i=1}^{n} x_i^2} + 1},

for -10 \le x_i \le 10, i = 1, ..., n. Illustrate this function graphically on a 2D contour plot and a 3D surface plot. Show its fitness versus the evolutionary process.

Algorithms to be coded

Students are expected to code the following algorithms and apply them to the above-mentioned global optimization problems.

Algorithm 1: The evolutionary strategy (1+1): [1], Chapter 2, Section 2.1.7, pp. 83-84, Algorithm 5 (a starting-point sketch is given after this list),
Algorithm 2: The evolutionary strategy (μ, λ): [1], Chapter 2, Section 2.1.6, pp. 81-83, Algorithm 4,
Algorithm 3: The evolutionary strategy (μ + λ): [1], Chapter 2, Section 2.1.6, pp. 81-83, Algorithm 4,
Algorithm 4: The genetic algorithm (general form): [1], Chapter 2, Section 2.3.6, pp. 121-123, Algorithm 8. Test various options: encoding: binary or Gray; selection: roulette-wheel, linear ranking, stochastic universal sampling, or tournament; crossover: one-point, two-point, multi-point, or uniform; mutation: with or without a binary mask; elitism or pure succession; and different values of the parameters (e.g. the probability of mutation, the size of the base population, etc.),
Algorithm 5: The PSO algorithm: [3], Chapter 9, pp. 249-252, Algorithm 9.1,
Algorithm 6: The SA algorithm: [3], Chapter 12, pp. 263-267, Algorithm 12.1,
Algorithm 7: The TS algorithm: [3], Chapter 14, pp. 273-275, Algorithm 14.1,
Algorithm 8: The IWO algorithm: [4], pp. 355-358.
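As a starting point for Algorithm 1, the following is a minimal sketch of a (1+1) evolutionary strategy with the classic 1/5 success rule for step-size adaptation; the initialization range, adaptation constants, and the 50-iteration window are illustrative choices, not values prescribed by [1]:

    % Save as es_one_plus_one.m. One parent, one Gaussian-mutated offspring
    % per iteration; the better of the two survives.
    function [x, fx] = es_one_plus_one(fitness, n, maxIter)
    x = 10*rand(1, n) - 5;            % random initial point in [-5, 5]^n
    fx = fitness(x);
    sigma = 1.0;                      % initial mutation step size
    success = 0;
    for k = 1:maxIter
        y = x + sigma*randn(1, n);    % Gaussian mutation
        fy = fitness(y);
        if fy < fx                    % the offspring replaces the parent
            x = y; fx = fy; success = success + 1;
        end
        if mod(k, 50) == 0            % 1/5 success rule every 50 iterations
            if success/50 > 1/5
                sigma = sigma/0.85;   % many successes: enlarge the steps
            else
                sigma = sigma*0.85;   % few successes: shrink the steps
            end
            success = 0;
        end
    end
    end

For example, [x, fx] = es_one_plus_one(@(x) sum(x.^2 - 10*cos(2*pi*x) + 10), 2, 5000) applies the sketch to the Rastrigin function of Problem 2.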

4. Content of report

The report should contain:
- an introductory page,
- a detailed mathematical description of the analyzed problems,
- a basic description of the coded algorithms,
- the Matlab code (together with detailed end-line comments) of the analyzed algorithms,
- the results obtained with the coded algorithms,
- the results obtained with the Matlab functions included in the selected Matlab toolboxes,
- conclusions.

The section Results should present the final solutions (for a 2D case, the final solution marked on the feasible set), the convergence behavior (e.g. the error norm between the approximate solution and the exact one, or the residuals), the elapsed time, and the comparison with the results obtained with the functions included in the selected Matlab toolboxes.

5. Appendix

The Appendix contains the syntax of the ga function.

Syntax

GA attempts to solve problems of the form:
    min F(X)  subject to:  A*X <= B, Aeq*X = Beq (linear constraints)
     X                     C(X) <= 0, Ceq(X) = 0 (nonlinear constraints)
                           LB <= X <= UB

X = GA(FITNESSFCN,NVARS) finds a local unconstrained minimum X to the FITNESSFCN using GA. NVARS is the dimension (number of design variables) of FITNESSFCN. FITNESSFCN accepts a vector X of size 1-by-NVARS and returns a scalar evaluated at X.

X = GA(FITNESSFCN,NVARS,A,b) finds a local minimum X to the function FITNESSFCN, subject to the linear inequalities A*X <= B. Linear constraints are not satisfied when the PopulationType option is set to 'bitstring' or 'custom'. See the documentation for details.

X = GA(FITNESSFCN,NVARS,A,b,Aeq,beq) finds a local minimum X to the function FITNESSFCN, subject to the linear equalities Aeq*X = beq as well as A*X <= B. (Set A=[] and B=[] if no inequalities exist.) Linear constraints are not satisfied when the PopulationType option is set to 'bitstring' or 'custom'. See the documentation for details.

X = GA(FITNESSFCN,NVARS,A,b,Aeq,beq,lb,ub) defines a set of lower and upper bounds on the design variables, X, so that a solution is found in the range lb <= X <= ub. Use empty matrices for lb and ub if no bounds exist. Set lb(i) = -Inf if X(i) is unbounded below; set ub(i) = Inf if X(i) is unbounded above. Linear constraints are not satisfied when the PopulationType option is set to 'bitstring' or 'custom'. See the documentation for details.

X = GA(FITNESSFCN,NVARS,A,b,Aeq,beq,lb,ub,NONLCON) subjects the minimization to the constraints defined in NONLCON. The function NONLCON accepts X and returns the vectors C and Ceq, representing the nonlinear inequalities and equalities respectively. GA minimizes FITNESSFCN such that C(X) <= 0 and Ceq(X) = 0. (Set lb=[] and/or ub=[] if no bounds exist.) Nonlinear constraints are not satisfied when the PopulationType option is set to 'bitstring' or 'custom'. See the documentation for details.

X = GA(FITNESSFCN,NVARS,A,b,Aeq,beq,lb,ub,NONLCON,options) minimizes with the default optimization parameters replaced by values in the structure OPTIONS. OPTIONS can be created with the GAOPTIMSET function.

X = GA(PROBLEM) finds the minimum for PROBLEM. PROBLEM is a structure that has the following fields:
    fitnessfcn: <Fitness function>
    nvars: <Number of design variables>
    Aineq: <A matrix for inequality constraints>
    bineq: <b vector for inequality constraints>
    Aeq: <Aeq matrix for equality constraints>
    beq: <beq vector for equality constraints>
    lb: <Lower bound on X>
    ub: <Upper bound on X>
    nonlcon: <Nonlinear constraint function>
    options: <Options structure created with GAOPTIMSET>
    randstate: <State of the uniform random number generator>
    randnstate: <State of the normal random number generator>

[X,FVAL] = GA(FITNESSFCN,...) returns FVAL, the value of the fitness function FITNESSFCN at the solution X.

[X,FVAL,EXITFLAG] = GA(FITNESSFCN,...) returns EXITFLAG, which describes the exit condition of GA. Possible values of EXITFLAG and the corresponding exit conditions are:
     1  Average change in value of the fitness function over options.StallGenLimit generations less than options.TolFun, and constraint violation less than options.TolCon.

     3  The value of the fitness function did not change in options.StallGenLimit generations, and constraint violation less than options.TolCon.
     4  Magnitude of step smaller than machine precision, and constraint violation less than options.TolCon. This exit condition applies only to nonlinear constraints.
     5  Fitness limit reached, and constraint violation less than options.TolCon.
     0  Maximum number of generations exceeded.
    -1  Optimization terminated by the output or plot function.
    -2  No feasible point found.
    -4  Stall time limit exceeded.
    -5  Time limit exceeded.

[X,FVAL,EXITFLAG,OUTPUT] = GA(FITNESSFCN,...) returns a structure OUTPUT with the following information:
    randstate: <State of the function RAND used before GA started>
    randnstate: <State of the function RANDN used before GA started>
    generations: <Total generations, excluding HybridFcn iterations>
    funccount: <Total function evaluations>
    maxconstraint: <Maximum constraint violation>, if any
    message: <GA termination message>

[X,FVAL,EXITFLAG,OUTPUT,POPULATION] = GA(FITNESSFCN,...) returns the final POPULATION at termination.

[X,FVAL,EXITFLAG,OUTPUT,POPULATION,SCORES] = GA(FITNESSFCN,...) returns the SCORES of the final POPULATION.

Example: Unconstrained minimization of the 'rastriginsfcn' fitness function with numberOfVariables = 2:

    x = ga(@rastriginsfcn,2)

Display plotting functions while GA minimizes:

    options = gaoptimset('PlotFcns',...
      {@gaplotbestf,@gaplotbestindiv,@gaplotexpectation,@gaplotstopping});
    [x,fval,exitflag,output] = ga(@rastriginsfcn,2,[],[],[],[],[],[],[],options)

An example with inequality constraints and lower bounds:

    A = [1 1; -1 2; 2 1]; b = [2; 2; 3]; lb = zeros(2,1);
    % Use a mutation function which can handle constraints
    options = gaoptimset('MutationFcn',@mutationadaptfeasible);
    [x,fval,exitflag] = ga(@lincontest6,2,A,b,[],[],lb,[],[],options);

FITNESSFCN can also be an anonymous function:

    x = ga(@(x) 3*sin(x(1))+exp(x(2)),2)

If FITNESSFCN or NONLCON are parameterized, you can use anonymous functions to capture the problem-dependent parameters. Suppose you want to minimize the fitness given in the function myfit, subject to the nonlinear constraint myconstr, where these two functions are parameterized by their second arguments a1 and a2, respectively. Here myfit and myconstr are M-file functions such as

    function f = myfit(x,a1)
    f = exp(x(1))*(4*x(1)^2 + 2*x(2)^2 + 4*x(1)*x(2) + 2*x(2) + a1);

and

    function [c,ceq] = myconstr(x,a2)
    c = [1.5 + x(1)*x(2) - x(1) - x(2);
         -x(1)*x(2) - a2];
    % No nonlinear equality constraints:
    ceq = [];

To optimize for specific values of a1 and a2, first assign the values to these two parameters. Then create two one-argument anonymous functions that capture the values of a1 and a2 and call myfit and myconstr with two arguments. Finally, pass these anonymous functions to GA:

    a1 = 1; a2 = 10;   % define parameters first
    % Mutation function for constrained minimization
    options = gaoptimset('MutationFcn',@mutationadaptfeasible);
    x = ga(@(x)myfit(x,a1),2,[],[],[],[],[],[],@(x)myconstr(x,a2),options)
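The Appendix covers only ga, while task 3 in Section 3.3 also calls for gamultiobj and simulannealbnd. A minimal sketch of their documented calling forms follows; the objectives and bounds are illustrative:

    % Simulated annealing on the Rastrigin function (n = 2) within box bounds:
    % simulannealbnd(FUN, X0, LB, UB).
    [xsa, fsa] = simulannealbnd(@rastriginsfcn, [3 3], [-5 -5], [5 5]);

    % Multi-objective GA on a simple two-objective problem; gamultiobj uses a
    % ga-like interface and returns points approximating the Pareto front.
    twoObj = @(x) [x(1)^2 + x(2)^2, (x(1) - 1)^2 + x(2)^2];
    [xmo, fmo] = gamultiobj(twoObj, 2, [], [], [], [], [-5 -5], [5 5]);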