Feeding the Fish: Weight Update Strategies for the Fish School Search Algorithm. Andreas Janecek

Transcription:

Feeding the Fish: Weight Update Strategies for the Fish School Search Algorithm. Andreas Janecek (andreas.janecek@univie.ac.at). International Conference on Swarm Intelligence (ICSI), Chongqing, China, Jun 14, 2011

Outline
1 Basic Fish School Search: Motivation, Basic Structure, Operators
2 New Update Strategies: Weight Update Strategies, Non-linear Step Size Decrease Strategy, Combined Strategy
3 Results: Fitness per Iteration, Final Results

What is Fish School Search?
FSS is a recently developed swarm intelligence algorithm based on the social behavior of schools of fish.
Connection to fish swarms in biology:
- By living in swarms, the fish improve the survivability of the whole group due to mutual protection against enemies.
- The fish perform collective tasks in order to achieve synergy (e.g. finding locations with lots of food).
Comparable to real fish that swim in an aquarium in order to find food, the artificial fish search (swim) the search space (aquarium) for the best solutions (locations with most food).


Inventors
Carmelo J. A. Bastos Filho and Fernando Buarque de Lima Neto
Computational Intelligence Research Group (CIRG) at University of Pernambuco, Recife-PE, Brazil
First publications:
BASTOS-FILHO; LIMA NETO, et al. Fish School Search: An Overview. In: CHIONG, Raymond (Ed.). Nature-Inspired Algorithms for Optimisation. Series: Studies in Computational Intelligence, Vol. 193, pp. 261-277. Berlin: Springer, 2009.
BASTOS-FILHO; LIMA NETO, et al. On the Influence of the Swimming Operators in the Fish School Search Algorithm. In: IEEE International Conference on Systems, Man, and Cybernetics - SMC2009, 2009, San Antonio, USA.


Properties of FSS
- Autonomy: no global information needs to be stored; only local computations
- Low communication: minimum centralized control
- Scalability / parallelism
- Non-monotonicity: distinct diversity mechanisms
- Simple computations

Outline: Basic Structure

FSS Pseudo Code
    initialize all fish randomly;
    while stop criterion is not met do
        for each fish do
            individual movement + evaluate fitness function;
            feeding operator;
        for each fish do
            instinctive movement;
        calculate barycentre;
        for each fish do
            volitive movement;
            evaluate fitness function;
        update step_ind;

FSS Pseudo Code (cont'd)
Feeding operator: updates the fish weight according to the success of the current movement.
Swimming operators: move the fish according to the feeding operator:
1 Individual movement
2 Collective instinctive movement
3 Collective volitive movement

Variables
Population size (size of the fish school): pop (1 <= i <= pop)
Problem dimension: dim (1 <= j <= dim)
Time (i.e. iteration): t
Weight of fish i: w_i
Position of fish i: x_i
Fitness of fish i: f(x_i)

Outline: Operators (jump directly to new update strategies)


Operators (1): Individual Movement
In each iteration, each fish randomly chooses a new position.
This position is determined by adding to each dimension j of the current position x a random number multiplied by a predetermined step (step_ind):
n_j(t) = x_j(t) + randu(-1,1) * step_ind
(randu(-1,1) is a random number from a uniform distribution in the interval [-1,1])
The parameter step_ind decreases linearly over the iterations:
step_ind(t+1) = step_ind(t) - (step_ind_initial - step_ind_final) / (number of iterations)


Operators (1): Individual Movement (cont'd)
Remember the individual movement equation from before: n_j(t) = x_j(t) + randu(-1,1) * step_ind
The movement only occurs if the new position n has a better fitness than the current position x, and if n lies within the aquarium boundaries.
The fitness difference (Δf) and the displacement (Δx) are evaluated according to:
Δf = f(n) - f(x)
Δx = n - x
If no individual movement occurs, Δf = 0 and Δx = 0.
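The individual movement and its bookkeeping of Δf and Δx can be sketched in Python (a minimal sketch assuming minimisation; the function name individual_movement and the bounds tuple are illustrative, not from the slides):

```python
import random

def individual_movement(x, fitness, step_ind, bounds, rng):
    """One fish's individual movement: accept the trial point n only if it
    improves fitness and stays inside the aquarium (search bounds)."""
    lo, hi = bounds
    fx = fitness(x)
    n = [xj + rng.uniform(-1.0, 1.0) * step_ind for xj in x]
    if all(lo <= v <= hi for v in n):
        fn = fitness(n)
        if fn < fx:  # minimisation: move only on improvement
            return n, fx - fn, [nj - xj for nj, xj in zip(n, x)]
    return list(x), 0.0, [0.0] * len(x)  # no movement: delta_f = 0, delta_x = 0
```

Note that at the optimum no trial point can improve the fitness, so the fish stays put and reports Δf = 0 and Δx = 0, exactly as the slide states.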

[Figure: fish positions in the search space before and after the individual movement]


Operators (2): Feeding
Fish can increase their weight depending on the success of the individual movement according to:
w_i(t+1) = w_i(t) + Δf(i) / max(Δf)
- w_i(t) is the weight of fish i
- Δf(i) is the difference between the fitness at the current and the new location
- max(Δf) is the maximum Δf of all fish
An additional parameter w_scale limits the weight of a fish (1 <= w_i <= w_scale).
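The feeding update, including the clamp 1 <= w_i <= w_scale, can be sketched as follows (the helper name feeding is illustrative):

```python
def feeding(weights, delta_f, w_scale):
    """Feeding operator: each fish gains weight proportional to its fitness
    improvement, normalised by the largest improvement in the school.
    Weights are clamped to the interval [1, w_scale]."""
    mx = max(delta_f)
    if mx == 0:  # no fish improved: weights stay unchanged
        return list(weights)
    return [min(w_scale, max(1.0, w + d / mx))
            for w, d in zip(weights, delta_f)]
```

For example, with weights [10, 10] and improvements [2, 4], the fish gain 0.5 and 1.0 weight units respectively.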



Operators (3): Collective Instinctive Movement
First, a weighted average m(t) of the individual movements, based on the success of the individual movements of all fish, is computed.
All fish that successfully performed individual movements (Δx != 0) influence the resulting direction of the school movement.
The resulting direction m(t) is evaluated by:
m(t) = [ Σ_{i=1..N} Δx_i * Δf_i ] / [ Σ_{i=1..N} Δf_i ]
Finally, all fish update their positions according to m(t):
x_i(t+1) = x_i(t) + m(t)
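A sketch of the instinctive movement (names illustrative; delta_x and delta_f are the Δx and Δf values produced by the individual movement step):

```python
def instinctive_movement(positions, delta_x, delta_f):
    """Collective instinctive movement: shift every fish along the
    improvement-weighted average m of the successful individual moves."""
    s = sum(delta_f)
    if s == 0:  # no fish improved: no collective drift
        return [list(x) for x in positions]
    dim = len(positions[0])
    m = [sum(dx[j] * df for dx, df in zip(delta_x, delta_f)) / s
         for j in range(dim)]
    return [[xj + mj for xj, mj in zip(x, m)] for x in positions]
```

Fish with Δf = 0 contribute nothing to m(t), but every fish is shifted by it, which is what makes the movement collective.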


[Figure: fish positions in the search space before and after the collective instinctive movement]


Operators (4): Collective Volitive Movement
Depending on the overall success rate of the whole school of fish, this movement is either a...
- contraction of the swarm towards the barycenter of all fish, or
- dilation of the swarm away from the barycenter
[Figure: contraction vs. dilation of the school around the barycenter]

Operators (4): Collective Volitive Movement (cont'd)
If the overall weight increased after the individual movement step, the radius of the fish school is contracted in order to increase the exploitation ability.
If the overall weight decreased after the individual movement step, the radius of the fish school is dilated in order to cover a bigger area of the search space.
Barycenter (center of mass / gravity): can be calculated based on the location x_i and the weight w_i of each fish:
b(t) = [ Σ_{i=1..N} x_i * w_i(t) ] / [ Σ_{i=1..N} w_i(t) ]


Operators (4): Collective Volitive Movement (cont'd)
When the total weight of the school increased in the current iteration, all fish must update their location according to:
x(t+1) = x(t) - step_vol * randu(0,1) * (x(t) - b(t)) / distance(x(t), b(t))
When the total weight decreased in the current iteration, the update is:
x(t+1) = x(t) + step_vol * randu(0,1) * (x(t) - b(t)) / distance(x(t), b(t))
distance() is a function which returns the Euclidean distance between x and b.
step_vol is a predetermined step used to control the displacement from/to the barycenter (step_vol = 2 * step_ind is a good choice).
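Both update rules differ only in sign, so the volitive movement can be sketched as one function (a sketch; names and the zero-distance guard are illustrative):

```python
import math
import random

def volitive_movement(positions, weights, step_vol, weight_increased, rng):
    """Collective volitive movement: contract towards the barycentre b when
    the school's total weight grew, otherwise dilate away from it."""
    tw = sum(weights)
    dim = len(positions[0])
    b = [sum(x[j] * w for x, w in zip(positions, weights)) / tw
         for j in range(dim)]
    sign = -1.0 if weight_increased else 1.0  # minus sign contracts
    out = []
    for x in positions:
        d = math.dist(x, b) or 1.0  # guard against a fish sitting on b
        r = rng.uniform(0.0, 1.0)
        out.append([xj + sign * step_vol * r * (xj - bj) / d
                    for xj, bj in zip(x, b)])
    return out
```

Since (x - b)/distance(x, b) is a unit vector, each fish moves at most step_vol per iteration along the line through the barycentre.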



[Figure: fish positions and barycenter before and after the collective volitive movement]
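Putting the four operators together, the FSS pseudo code can be sketched as a small self-contained Python program (a sketch under assumptions: minimisation of the sphere function, bounds checked only during the individual movement, step_vol = 2 * step_ind; all names and default parameter values are illustrative, not from the slides):

```python
import math
import random

def sphere(x):
    return sum(v * v for v in x)

def fss(fitness, dim=2, pop=10, iters=200, bounds=(-10.0, 10.0),
        step_ind_init=1.0, step_ind_final=0.01, w_scale=500.0, seed=0):
    rng = random.Random(seed)
    lo, hi = bounds
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    W = [w_scale / 2.0] * pop
    f = [fitness(x) for x in X]
    step_ind = step_ind_init
    for _ in range(iters):
        # (1) individual movement: move only on improvement inside bounds
        dx = [[0.0] * dim for _ in range(pop)]
        df = [0.0] * pop
        for i in range(pop):
            n = [X[i][j] + rng.uniform(-1.0, 1.0) * step_ind for j in range(dim)]
            if all(lo <= v <= hi for v in n):
                fn = fitness(n)
                if fn < f[i]:
                    df[i] = f[i] - fn
                    dx[i] = [n[j] - X[i][j] for j in range(dim)]
                    X[i], f[i] = n, fn
        # (2) feeding: weight gain proportional to normalised improvement
        w_before = sum(W)
        mx = max(df) or 1.0
        W = [min(w_scale, max(1.0, w + d / mx)) for w, d in zip(W, df)]
        # (3) collective instinctive movement along the weighted average step
        s = sum(df) or 1.0
        m = [sum(dx[i][j] * df[i] for i in range(pop)) / s for j in range(dim)]
        X = [[xj + mj for xj, mj in zip(x, m)] for x in X]
        # (4) collective volitive movement: contract if total weight grew
        tw = sum(W)
        b = [sum(X[i][j] * W[i] for i in range(pop)) / tw for j in range(dim)]
        sign = -1.0 if tw > w_before else 1.0
        step_vol = 2.0 * step_ind
        for i in range(pop):
            d = math.dist(X[i], b) or 1.0
            r = rng.uniform(0.0, 1.0)
            X[i] = [X[i][j] + sign * step_vol * r * (X[i][j] - b[j]) / d
                    for j in range(dim)]
            f[i] = fitness(X[i])
        # linear decrease of step_ind
        step_ind -= (step_ind_init - step_ind_final) / iters
    return min(f)
```

On a simple benchmark like the sphere function this sketch steadily drives the best fitness in the school towards zero as the step size shrinks.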

Outline
1 Basic Fish School Search
2 New Update Strategies: Weight Update Strategies, Non-linear Step Size Decrease Strategy, Combined Strategy
3 Results

New Update Strategies
Contribution of this work [1]: investigation and comparison of different newly developed update strategies for FSS.
- Weight update strategies: aim at adjusting the weight of each fish in each iteration (S1, S2)
- Step size parameter update strategy: non-linear update of the step size parameters step_ind and step_vol (S3)
- Combined strategy: combination of S2, S3, and an additional parameter (S4)
[1] Andreas G.K. Janecek and Ying Tan. Feeding the Fish - Weight Update Strategies for the Fish School Search Algorithm. In ICSI 2011: Second International Conference on Swarm Intelligence, pages 553-562. Springer LNCS 6729, 2011.

Outline: Weight Update Strategies

FSS Pseudo Code: Operators Involving Weight Updates
    initialize all fish randomly;
    while stop criterion is not met do
        for each fish do
            individual movement + evaluate fitness function;
            feeding operator;
        for each fish do
            instinctive movement;
        calculate barycentre;
        for each fish do
            volitive movement;
            evaluate fitness function;
        update step_ind;

Strategy S1: Linear Decrease of Weights
The weights of all fish are decreased linearly in each iteration, by a pre-defined factor lin.
After the weight update in the feeding operator,
w_i(t+1) = w_i(t) + Δf(i) / max(Δf),
the weight of each fish is reduced by: w_i = w_i - lin
All weights smaller than 1 are set to 1.
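Strategy S1 is a one-line decay applied after feeding (the function name is illustrative):

```python
def s1_linear_weight_decay(weights, lin):
    """Strategy S1 (sketch): after feeding, reduce every weight by the
    pre-defined factor lin; weights smaller than 1 are set to 1."""
    return [max(1.0, w - lin) for w in weights]
```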

Strategy S2: Fitness-Based Decrease of Weights
Fish in poor regions lose weight more quickly.
Let f(x) be a vector containing the fitness values of all fish.
The weight of each fish is decreased by f_fit_based = normalize(f(x)), where normalize() is a function that scales f(x) into the range [0,1].
Set w_i = w_i - f_fit_based; for w_i < 1: w_i = 1.
Scaling in order to improve results: f_fit_based needs to be scaled by a constant c_fit (between 3 and 5):
f_fit_based = (f_fit_based .^ 2) / c_fit   (element-wise square)
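Strategy S2 can be sketched as follows (a sketch assuming minimisation, so higher fitness values mean poorer regions; the function name and min-max normalisation are illustrative choices for normalize()):

```python
def s2_fitness_based_decay(weights, fitnesses, c_fit=4.0):
    """Strategy S2 (sketch): fish at poorer (here: higher) fitness lose more
    weight. Fitness is min-max normalised to [0, 1], squared element-wise,
    and divided by c_fit (the slides suggest values between 3 and 5)."""
    lo, hi = min(fitnesses), max(fitnesses)
    span = (hi - lo) or 1.0  # avoid division by zero when all fitnesses tie
    decay = [((f - lo) / span) ** 2 / c_fit for f in fitnesses]
    return [max(1.0, w - d) for w, d in zip(weights, decay)]
```

Squaring the normalised fitness before dividing by c_fit makes the decay much gentler for average fish than for the worst one.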


Outline: Non-linear Step Size Decrease Strategy

FSS Pseudo Code: Operators Involving Step Size Parameters
    initialize all fish randomly;
    while stop criterion is not met do
        for each fish do
            individual movement + evaluate fitness function;
            feeding operator;
        for each fish do
            instinctive movement;
        calculate barycentre;
        for each fish do
            volitive movement;
            evaluate fitness function;
        update step_ind;

Revision: step_ind and step_vol
- Individual movement: n_j(t) = x_j(t) + randu(-1,1) * step_ind
- Collective volitive movement: x(t+1) = x(t) -/+ step_vol * randu(0,1) * (x(t) - b(t)) / distance(x(t), b(t))
- Linear decrease of step_ind: step_ind(t+1) = step_ind(t) - (step_ind_initial - step_ind_final) / (number of iterations)

Strategy S3: Non-linear Decrease of step_ind and step_vol
Non-linear decrease of the step size parameters, based on the shape of an ellipse.
The algorithm is forced to converge earlier to the (ideally global) minimum.
The area around the optimum can be searched in more detail.
step_vol vs. step_ind: remember that step_vol = 2 * step_ind.


[Figure: linear and non-linear decrease of step_ind and step_vol (y-axis: percent of search space amplitude, from 10% down to 0.001%; x-axis: iterations 0 to 5000). Bold red curve: new non-linear step size parameter step_ind_nonlin; dotted line (c): linear decrease of step_ind]

Let a be the maximum number of iterations.
Let b be the distance between step_ind_initial and step_ind_final.
Let t be the number of the current iteration.
In each iteration, step_ind_nonlin(t) is calculated by:
step_ind_nonlin(t) = step_ind_initial - b + sqrt((1 - t^2/a^2) * b^2)
Derived from the canonical ellipse equation x^2/a^2 + y^2/b^2 = 1, where x is replaced with t and y with step_ind_nonlin(t).
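Strategy S3's ellipse-shaped decrease can be sketched as follows (a sketch: the formula is written in the equivalent form step_ind_final + b * sqrt(1 - t^2/a^2), with b = step_ind_initial - step_ind_final; names are illustrative):

```python
import math

def s3_nonlinear_step(t, iters, step_init, step_final):
    """Strategy S3 (sketch): decrease the step size along a quarter ellipse
    with semi-axes a = iters and b = step_init - step_final, so the step
    starts at step_init (t = 0) and reaches step_final (t = iters)."""
    a, b = float(iters), step_init - step_final
    return step_final + b * math.sqrt(max(0.0, 1.0 - (t * t) / (a * a)))
```

Early iterations keep the step almost constant (the flat top of the ellipse), while late iterations shrink it rapidly, which matches the intended earlier convergence and finer search around the optimum.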

[Figure: behavior of step_ind, linear vs. non-linear decrease]
Basic FSS: step_ind(t+1) = step_ind(t) - (step_ind_initial - step_ind_final) / (number of iterations)
Non-linear: step_nonlinear(t) = step_ind_initial - b + sqrt((1 - t^2/a^2) * b^2)

Outline: Combined Strategy

Strategy S4: Combination of S2, S3 and a Dilation Multiplier
Combines S2, S3, and a newly introduced dilation multiplier c_dil.
c_dil allows covering a bigger area of the search space when a dilation occurs in the collective volitive movement (i.e. when the total weight of the school decreased in the current iteration).
The general idea behind the dilation multiplier is to help the algorithm jump out of local minima.
S2 and S3 are applied in every iteration.
In case of a dilation, all weights are reset to their initial weight.

S4: Combination of S2, S3 and a Dilation Multiplier
    while stop criterion is not met do
        apply basic FSS operators including S2 and S3;
        % contraction or dilation?
        if (w(t) > w(t-1)) then
            % contraction: standard collective volitive movement
        else
            % in case of dilation apply the following movement:
            w(t) = 1;
            x(t+1) = x(t) + c_dil * step_vol * randu(0,1) * (x(t) - b(t)) / distance(x(t), b(t));
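The dilation branch of S4 can be sketched on its own (only the dilation case; the contraction case is the standard volitive movement, and the weight reset happens elsewhere in the loop; names are illustrative):

```python
import math
import random

def s4_dilation(positions, barycentre, step_vol, c_dil, rng):
    """Strategy S4's dilation step (sketch): when the school's total weight
    decreased, move away from the barycentre with the extra multiplier
    c_dil to help the school jump out of local minima."""
    out = []
    for x in positions:
        d = math.dist(x, barycentre) or 1.0  # guard against zero distance
        r = rng.uniform(0.0, 1.0)
        out.append([xj + c_dil * step_vol * r * (xj - bj) / d
                    for xj, bj in zip(x, barycentre)])
    return out
```

With c_dil between 4 and 5 (the values evaluated later in the talk), a dilation step can be several times larger than a standard volitive move.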

Outline
1 Basic Fish School Search
2 New Update Strategies
3 Results: Fitness per Iteration, Final Results

Outline: Fitness per Iteration

Strategy S1: Linear Decrease of Weights
[Figure: average (mean) weight of all fish per iteration for basic FSS and lin = 0.0250, 0.0500, 0.0750]
Weights are decreased linearly: w_i = w_i - lin
lin ranges from 0.0125 to 0.0750 (abbreviated as "lin 0.0XXX" in the figures)

[Figure: fitness per iteration on the Ackley, Griewank, Rastrigin, and Rosenbrock functions for basic FSS and lin = 0.0250, 0.0500, 0.0750]

Strategy S2: Fitness-Based Decrease of Weights
[Figure: average (mean) weight of all fish per iteration for basic FSS and c_fit = 3, 4, 5]
f_fit_based = normalize(f(x))
Scaling: f_fit_based = (f_fit_based .^ 2) / c_fit
w_i = w_i - f_fit_based (abbreviated as "c_fit X" in the figures)

[Figure: fitness per iteration on the Ackley, Griewank, Rastrigin, and Rosenbrock functions for basic FSS and c_fit = 3, 4, 5]

S3: Non-linear Decrease of step_ind and step_vol
[Figure: behavior of step_ind for basic FSS, interpolated, and non-linear decrease]
step_nonlinear(t) is abbreviated as "non-linear".
"Interpolated": interpolation of basic FSS and non-linear: step_interpol(t) = step_ind(t) - [step_ind(t) - step_nonlinear(t)] / 2

Figure: Fitness per iteration of basic FSS, interpolated, and non-linear step schedules (strategy S3) on the Ackley, Griewank, Rastrigin, and Rosenbrock functions over 5 000 iterations.

S4: Combination of S2, S3 and a Dilation Multiplier

Comparison to basic FSS and non-linear from strategy S3.
The dilation multiplier parameter is abbreviated as c_dil X (here c_dil 4 and c_dil 5).
Since the weight of all fish is reset to 1 if a dilation occurs, the average (mean) weight per iteration is relatively low.

Figure: Average (mean) weight of all fish per iteration for c_dil 4 and c_dil 5.
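A sketch of the dilation step in S4. The trigger condition (the school failed to gain weight during an iteration) and the way c_dil enlarges the volitive step are assumptions; the slides state only that the weights are reset to 1 when a dilation occurs and that a bigger area of the search space is then covered:

```python
def s4_dilation(weights, step_vol, school_gained_weight, c_dil=5):
    """Strategy S4 sketch: when a dilation occurs in the collective-volitive
    movement (assumed here: the school did not gain weight), enlarge
    step_vol by the dilation multiplier c_dil so a bigger area is covered,
    and reset all fish weights to 1."""
    if not school_gained_weight:
        return [1.0] * len(weights), step_vol * c_dil
    return weights, step_vol
```

Because every dilation resets the weights, the school's average weight stays low over the run, as the figure on this slide shows.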

Figure: Fitness per iteration of basic FSS, non-linear (S3), and the combined strategy S4 (c_dil 4, c_dil 5) on the Ackley, Griewank, Rastrigin, and Rosenbrock functions over 5 000 iterations.

Outline

Basic Fish School Search
Fitness per Iteration
Final Results

Final Fitness after given Number of Iterations

Table: Mean value and standard deviation (in parentheses) for 15 trials after 5 000 iterations for the benchmark functions

Function         basic FSS         S1                S2                S3                S4
F_Ackley(x)      0.0100 (0.0019)   0.0100 (0.0023)   0.1270 (0.0043)   0.0007 (6.7e-05)  0.0007 (5.3e-05)
F_Griewank(x)    0.0233 (0.0098)   0.0172 (0.0061)   0.7501 (0.1393)   0.0058 (0.0048)   3.2e-05 (5.7e-06)
F_Rastrigin(x)   67.126 (15.834)   36.879 (8.0181)   30.745 (11.801)   70.443 (19.465)   48.156 (13.780)
F_Rosenb.(x)     27.574 (1.2501)   28.498 (1.3876)   26.277 (2.3244)   22.775 (2.5801)   23.718 (2.5353)
Runtime          1.0               1.002             1.004             1.014             1.024


Conclusion: Update Steps for FSS

Weight update strategies S1, S2: linear and fitness-based decrease of weights
- Significant improvement of the final results on the Rastrigin function

Non-linear decrease of the step size parameters (S3)
- Forces the algorithm to converge earlier to the (ideally global) minimum
- The area around the optimum can be searched in more detail
- Significant improvement in terms of fitness per iteration

Combined strategy S4: S2, S3 and a dilation multiplier
- Covers a bigger area of the search space when a dilation occurs in the collective-volitive movement
- Overall best results achieved with this strategy



Further References

For more information about Fish School Search, please visit the official Fish School Search website maintained by Prof. C. Bastos Filho and Prof. F. Lima Neto at the University of Pernambuco: http://www.fbln.pro.br/fss
Other research groups around the world working on FSS are also listed there: http://www.fbln.pro.br/fss/links.htm
Implementations/Versions: implementations of FSS in different programming languages can be downloaded from http://www.fbln.pro.br/fss/versions.htm

Contact: andreas.janecek@univie.ac.at

Complete paper: Andreas G.K. Janecek and Ying Tan. Feeding the Fish - Weight Update Strategies for the Fish School Search Algorithm. In ICSI 2011: Second International Conference on Swarm Intelligence, pages 553-562. Springer LNCS 6729, 2011.
Download link: http://www.cil.pku.edu.cn/publications/papers/2011/icsi2011andreas2.pdf