METAHEURISTICS

1. Objectives

The goals of the laboratory workshop are as follows:
- to learn the basic properties of evolutionary computation techniques and other metaheuristics for solving various global optimization problems,
- to study the selected evolutionary strategies, genetic algorithms, and some metaheuristics such as ACO, PSO, and IWO,
- to train the skills of coding global optimization algorithms in Matlab and of using Matlab's Genetic Algorithm and Direct Search Toolbox.

The workshop is scheduled for 3 academic hours.

2. Introduction

In computer science, a metaheuristic designates a computational method that optimizes a problem by iteratively trying to improve a candidate solution with regard to a given measure of quality. Metaheuristics make few or no assumptions about the problem being optimized and can search very large spaces of candidate solutions. However, metaheuristics do not guarantee that an optimal solution is ever found. Other terms with a similar meaning are: derivative-free, direct-search, black-box, or simply heuristic optimizer. Many metaheuristics implement some form of stochastic optimization.

Metaheuristics are used for combinatorial optimization, in which an optimal solution is sought over a discrete search space. They are also used for problems over real-valued search spaces, where the classic approach is to derive the gradient of the function to be optimized and then employ gradient descent or a quasi-Newton method. Metaheuristics use neither the gradient nor the Hessian matrix, so their advantage is that the function to be optimized need not be continuous or differentiable, and it may also be subject to constraints.

3. Preparation

The expected preparation time for this workshop is 9 hours.

3.1. Reading

[1] T. Back, Evolutionary Algorithms in Theory and Practice, New York, Oxford University Press, 1996.
[2] The Handbook of Evolutionary Computation, Editors: T. Back, D. B. Fogel, Z.
Michalewicz, New York, Oxford University Press, 1996.
Project co-financed by the European Union within the European Social Fund
[3] T. Weise, Global Optimization Algorithms - Theory and Application, e-book, 2009, http://www.it-weise.de/
[4] A. R. Mehrabian, C. Lucas, A Novel Numerical Optimization Algorithm Inspired from Weed Colonization, Ecological Informatics, Vol. 1, No. 4, 2006, pp. 355-366.
[5] J. Arabas, Wykłady z algorytmów ewolucyjnych (Lectures on Evolutionary Algorithms), WNT, 2004.

3.2. Problems

At the beginning of the laboratory workshop each student should know the answers to the following questions:
- What is an NP-hard problem?
- What is a fitness function?
- What are the fundamental groups of evolutionary strategies?
- What is a chromosome (genotype)?
- What is a phenotype?
- What are the basic genetic operators?
- What are the roulette-wheel and tournament selections?
- What is the elitist strategy?
- What are typical crossover techniques?
- How can evolutionary algorithms be applied to constrained optimization problems?
- What is the fundamental strategy of the Simulated Annealing (SA) algorithm?
- What is the fundamental strategy of the Tabu Search (TS) algorithm?
- What is the fundamental strategy of the Ant Colony Optimization (ACO) algorithm?
- What is the fundamental strategy of the Particle Swarm Optimization (PSO) algorithm?
- What is the fundamental strategy of the Invasive Weed Optimization (IWO) algorithm?

3.3. Detailed preparation

Each group of students (2-3 persons) is expected to accomplish the following tasks:
1. formulate the optimization problems in the matrix form for some typical engineering problems (listed below),
2. code the selected global optimization algorithms in Matlab,
3. compare the results (convergence rate, elapsed time, etc.) obtained with the coded algorithms and with the functions ga(.), gamultiobj(.), and simulannealbnd(.) in Matlab,
4. draw the conclusions.
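Several of the preparation questions above concern selection operators. The workshop's implementation language is Matlab, so the snippet below is only an algorithmic sketch in Python (function names are illustrative) of the two selection schemes named in the questions: roulette-wheel (fitness-proportionate) and tournament selection.

```python
import random

def roulette_wheel(population, fitness, rng=random):
    """Roulette-wheel selection: pick one individual with probability
    proportional to its (non-negative) fitness value."""
    total = sum(fitness)
    r = rng.uniform(0.0, total)          # spin the wheel once
    acc = 0.0
    for ind, f in zip(population, fitness):
        acc += f
        if r <= acc:
            return ind
    return population[-1]                # guard against round-off

def tournament(population, fitness, k=2, rng=random):
    """Tournament selection: draw k distinct individuals at random and
    return the fittest of them (larger fitness is better here)."""
    contestants = rng.sample(range(len(population)), k)
    best = max(contestants, key=lambda i: fitness[i])
    return population[best]
```

Both operators return a single parent; in a GA they are called repeatedly to fill the mating pool. Note that roulette-wheel selection assumes a maximized, non-negative fitness, so minimization problems first need a fitness transformation.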
Problems to be modeled

The following problems should be solved with the selected global optimization algorithms.

Problem 1: Find the global minimum of the Griewank function for n = 2, 3, ...:

    f(x) = 1 + sum_{i=1..n} x_i^2 / 4000 - prod_{i=1..n} cos( x_i / sqrt(i) ).

Illustrate this function graphically on a 2-D contour plot and a 3-D surface plot. Show its fitness versus the evolutionary process.

Problem 2: Find the global minimum of the Rastrigin function for n = 2, 3, ...:

    f(x) = sum_{i=1..n} ( x_i^2 - 10 cos( 2*pi*x_i ) + 10 ).

Illustrate this function graphically on a 2-D contour plot and a 3-D surface plot. Show its fitness versus the evolutionary process.

Problem 3: Find the global minimum of the function:

    f(x) = -exp( -(x/20)^2 )                for x <= 20,
    f(x) = -exp(-1) + (x - 20)*(x - 22)     for x > 20,

for x in [-15, 24]. Illustrate this function graphically, and show its fitness versus the evolutionary process.

Problem 4: Find the global minimum of Rosenbrock's function:

    f(x) = 100*( x_1^2 - x_2 )^2 + ( 1 - x_1 )^2,

    s.t. x_1*x_2 + x_1 - x_2 + 1.5 <= 0,
         10 - x_1*x_2 <= 0,
         0 <= x_1 <= 1,
         0 <= x_2 <= 13.

Illustrate this function graphically on a 2-D contour plot and a 3-D surface plot. Show its fitness versus the evolutionary process.

Problem 5: Find the global minima of the following function:
    f(x) = ( 4 - 2.1*x_1^2 + x_1^4/3 )*x_1^2 + x_1*x_2 + ( 4*x_2^2 - 4 )*x_2^2,

for -3 <= x_1 <= 3 and -2 <= x_2 <= 2. Illustrate this function graphically on a 2-D contour plot and a 3-D surface plot. Show its fitness versus the evolutionary process.

Problem 6: Find the global minimum of the Easom function:

    f(x) = -cos(x_1)*cos(x_2)*exp{ -( (x_1 - pi)^2 + (x_2 - pi)^2 ) },

for -100 <= x_i <= 100, i = 1, 2. Illustrate this function graphically on a 2-D contour plot and a 3-D surface plot. Show its fitness versus the evolutionary process.

Problem 7: Find the global minimum of the modified Sombrero function:

    f(x) = - prod_{i=1..n} cos(x_i) / ( sum_{i=1..n} x_i^2/4 + 1 ),

for -10 <= x_i <= 10, i = 1, ..., n. Illustrate this function graphically on a 2-D contour plot and a 3-D surface plot. Show its fitness versus the evolutionary process.

Algorithms to be coded

Students are expected to code the following algorithms and apply them to the above-mentioned global optimization problems.

Algorithm 1: The evolutionary strategy (1+1), [1], Chapter 2, Section 2.1.7, pp. 83-84, Algorithm 5,
Algorithm 2: The evolutionary strategy (mu, lambda), [1], Chapter 2, Section 2.1.6, pp. 81-83, Algorithm 4,
Algorithm 3: The evolutionary strategy (mu + lambda), [1], Chapter 2, Section 2.1.6, pp. 81-83, Algorithm 4,
Algorithm 4: The genetic algorithm (general form), [1], Chapter 2, Section 2.3.6, pp. 121-123, Algorithm 8. Test various options: encoding: binary or Gray; selection: roulette-wheel, linear ranking, stochastic universal sampling, or tournament; crossover: one-point, two-point, multi-point, or uniform; mutation: with or without a binary mask; elitism or pure succession; using different values of the parameters (e.g. probability of mutation, size of the base population, etc.).
Algorithm 5: The PSO algorithm, [3], Chapter 9, pp. 249-252, Algorithm 9.1,
Algorithm 6: The SA algorithm, [3], Chapter 12, pp. 263-267, Algorithm 12.1,
Algorithm 7: The TS algorithm, [3], Chapter 14, pp. 273-275, Algorithm 14.1,
Algorithm 8: The IWO algorithm, [4], pp. 355-358.

4. Content of report

The report should contain:
- introductory page,
- detailed mathematical description of the analyzed problems,
- a basic description of the coded algorithms,
- the Matlab code (together with detailed end-line comments) of the analyzed algorithms,
- the results obtained with the coded algorithms,
- the results obtained with the Matlab functions included in the selected Matlab toolboxes,
- conclusions.

The section Results should present the final solutions (for a 2-D case, the final solution marked on the feasible set), the convergence behavior (e.g. the error norm between the approximate solution and the exact one, or the residuals), the elapsed time, and the comparison to the results obtained with the functions included in the selected Matlab toolboxes.

5. Appendix

The Appendix contains the syntax of the ga function.

Syntax

GA attempts to solve problems of the form:
    min F(X)  subject to:  A*X <= B, Aeq*X = Beq (linear constraints)
     X                     C(X) <= 0, Ceq(X) = 0 (nonlinear constraints)
                           LB <= X <= UB

X = GA(FITNESSFCN,NVARS) finds a local unconstrained minimum X to the FITNESSFCN using GA. NVARS is the dimension (number of design variables) of FITNESSFCN. FITNESSFCN accepts a vector X of size 1-by-NVARS, and returns a scalar evaluated at X.

X = GA(FITNESSFCN,NVARS,A,b) finds a local minimum X to the function FITNESSFCN, subject to the linear inequalities A*X <= B. Linear constraints are not satisfied when the PopulationType option is set to 'bitstring' or 'custom'. See the documentation for details.
X = GA(FITNESSFCN,NVARS,A,b,Aeq,beq) finds a local minimum X to the function FITNESSFCN, subject to the linear equalities Aeq*X = beq as well as A*X <= B. (Set A=[] and B=[] if no inequalities exist.) Linear constraints are not satisfied when the PopulationType option is set to 'bitstring' or 'custom'. See the documentation for details. X = GA(FITNESSFCN,NVARS,A,b,Aeq,beq,lb,ub) defines a set of lower and upper bounds on the design variables, X, so that a solution is found in the range lb <= X <= ub. Use empty matrices for lb and ub if no bounds exist. Set lb(i) = -Inf if X(i) is unbounded below; set ub(i) = Inf if X(i) is unbounded above. Linear constraints are not satisfied when the PopulationType option is set to 'bitstring' or 'custom'. See the documentation for details. X = GA(FITNESSFCN,NVARS,A,b,Aeq,beq,lb,ub,NONLCON) subjects the minimization to the constraints defined in NONLCON. The function NONLCON accepts X and returns the vectors C and Ceq, representing the nonlinear inequalities and equalities respectively. GA minimizes FITNESSFCN such that C(X)<=0 and Ceq(X)=0. (Set lb=[] and/or ub=[] if no bounds exist.) Nonlinear constraints are not satisfied when the PopulationType option is set to 'bitstring' or 'custom'. See the documentation for details. X = GA(FITNESSFCN,NVARS,A,b,Aeq,beq,lb,ub,NONLCON,options) minimizes with the default optimization parameters replaced by values in the structure OPTIONS. OPTIONS can be created with the GAOPTIMSET function. X = GA(PROBLEM) finds the minimum for PROBLEM. 
PROBLEM is a structure that has the following fields:
    fitnessfcn: <Fitness function>
         nvars: <Number of design variables>
         Aineq: <A matrix for inequality constraints>
         bineq: <b vector for inequality constraints>
           Aeq: <Aeq matrix for equality constraints>
           beq: <beq vector for equality constraints>
            lb: <Lower bound on X>
            ub: <Upper bound on X>
       nonlcon: <Nonlinear constraint function>
       options: <Options structure created with GAOPTIMSET>
     randstate: <State of the uniform random number generator>
    randnstate: <State of the normal random number generator>

[X,FVAL] = GA(FITNESSFCN,...) returns FVAL, the value of the fitness function FITNESSFCN at the solution X.

[X,FVAL,EXITFLAG] = GA(FITNESSFCN,...) returns EXITFLAG, which describes the exit condition of GA. Possible values of EXITFLAG and the corresponding exit conditions are:

  1  Average change in value of the fitness function over options.StallGenLimit generations less than options.TolFun and constraint violation less than options.TolCon.
  3  The value of the fitness function did not change in options.StallGenLimit generations and constraint violation less than options.TolCon.
  4  Magnitude of step smaller than machine precision and constraint violation less than options.TolCon. This exit condition applies only to nonlinear constraints.
  5  Fitness limit reached and constraint violation less than options.TolCon.
  0  Maximum number of generations exceeded.
 -1  Optimization terminated by the output or plot function.
 -2  No feasible point found.
 -4  Stall time limit exceeded.
 -5  Time limit exceeded.

[X,FVAL,EXITFLAG,OUTPUT] = GA(FITNESSFCN,...) returns a structure OUTPUT with the following information:
        randstate: <State of the function RAND used before GA started>
       randnstate: <State of the function RANDN used before GA started>
      generations: <Total generations, excluding HybridFcn iterations>
        funccount: <Total function evaluations>
    maxconstraint: <Maximum constraint violation>, if any
          message: <GA termination message>

[X,FVAL,EXITFLAG,OUTPUT,POPULATION] = GA(FITNESSFCN,...) returns the final POPULATION at termination.

[X,FVAL,EXITFLAG,OUTPUT,POPULATION,SCORES] = GA(FITNESSFCN,...) returns the SCORES of the final POPULATION.

Example:
 Unconstrained minimization of the 'rastriginsfcn' fitness function of numberOfVariables = 2
   x = ga(@rastriginsfcn,2)

 Display plotting functions while GA minimizes
   options = gaoptimset('PlotFcns',...
     {@gaplotbestf,@gaplotbestindiv,@gaplotexpectation,@gaplotstopping});
   [x,fval,exitflag,output] = ga(@rastriginsfcn,2,[],[],[],[],[],[],[],options)

 An example with inequality constraints and lower bounds
   A = [1 1; -1 2; 2 1]; b = [2; 2; 3]; lb = zeros(2,1);
   % Use mutation function which can handle constraints
   options = gaoptimset('MutationFcn',@mutationadaptfeasible);
   [x,fval,exitflag] = ga(@lincontest6,2,A,b,[],[],lb,[],[],options);

 FITNESSFCN can also be an anonymous function:
   x = ga(@(x) 3*sin(x(1))+exp(x(2)),2)

 If FITNESSFCN or NONLCON are parameterized, you can use anonymous functions to capture the problem-dependent parameters. Suppose you want to minimize the fitness given in the function myfit, subject to the nonlinear constraint myconstr, where these two functions are parameterized by their second arguments a1 and a2, respectively. Here myfit and myconstr are M-file functions such as
   function f = myfit(x,a1)
   f = exp(x(1))*(4*x(1)^2 + 2*x(2)^2 + 4*x(1)*x(2) + 2*x(2) + a1);

 and

   function [c,ceq] = myconstr(x,a2)
   c = [1.5 + x(1)*x(2) - x(1) - x(2);
        -x(1)*x(2) - a2];
   % No nonlinear equality constraints:
   ceq = [];

 To optimize for specific values of a1 and a2, first assign the values to these two parameters. Then create two one-argument anonymous functions that capture the values of a1 and a2, and call myfit and myconstr with two arguments. Finally, pass these anonymous functions to GA:

   a1 = 1; a2 = 10;   % define parameters first
   % Mutation function for constrained minimization
   options = gaoptimset('MutationFcn',@mutationadaptfeasible);
   x = ga(@(x)myfit(x,a1),2,[],[],[],[],[],[],@(x)myconstr(x,a2),options)
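As a starting point for task 2 of the preparation, Algorithm 1, the (1+1) evolutionary strategy, can be prototyped in a few lines before it is coded in Matlab. The sketch below is in Python purely to illustrate the control flow; the function and parameter names are illustrative, and the step-size adaptation uses one common form of the 1/5 success rule rather than the exact schedule prescribed in [1]. It minimizes the 2-D Rastrigin function from Problem 2.

```python
import math
import random

def rastrigin(x):
    """Rastrigin function: f(x) = sum( x_i^2 - 10*cos(2*pi*x_i) + 10 )."""
    return sum(xi * xi - 10.0 * math.cos(2.0 * math.pi * xi) + 10.0 for xi in x)

def es_one_plus_one(f, x0, sigma=1.0, iters=20000, rng=random):
    """(1+1) evolutionary strategy: mutate the single parent with Gaussian
    noise, keep the better of parent and offspring (elitism), and adapt
    the step size sigma with a 1/5 success rule."""
    x, fx = list(x0), f(x0)
    successes = 0
    for t in range(1, iters + 1):
        y = [xi + sigma * rng.gauss(0.0, 1.0) for xi in x]
        fy = f(y)
        if fy < fx:                      # minimization: offspring replaces parent
            x, fx = y, fy
            successes += 1
        if t % 100 == 0:                 # adapt sigma every 100 generations
            rate = successes / 100.0
            sigma *= 1.22 if rate > 0.2 else 0.82
            successes = 0
    return x, fx
```

Because the parent is only ever replaced by a better offspring, the best fitness found is non-increasing; on a multimodal function such as Rastrigin the strategy may still converge to a local minimum, which is exactly the behavior the report should compare against the Matlab ga runs.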