An evolutionary annealing-simplex algorithm for global optimisation of water resource systems


FIFTH INTERNATIONAL CONFERENCE ON HYDROINFORMATICS, 1-5 July 2002, Cardiff, UK
C05 - Evolutionary algorithms in hydroinformatics

An evolutionary annealing-simplex algorithm for global optimisation of water resource systems
Andreas Efstratiadis and Demetris Koutsoyiannis
Department of Water Resources, National Technical University of Athens
A. Efstratiadis and D. Koutsoyiannis, An evolutionary annealing-simplex algorithm for global optimisation

Parts of the presentation
1. Outline of the global optimisation problem
2. Overview of global optimisation techniques
3. The evolutionary annealing-simplex algorithm
4. Evaluation of the algorithm: mathematical applications and real-world applications
5. Conclusions

Posing the problem

The global optimisation problem: find a real vector x*, defined in the n-dimensional continuous space D = [a, b] ⊂ R^n, that minimises a real, nonlinear function f, i.e.:
f(x*) = min f(x), a < x < b

Main assumptions
1. The objective function is non-convex; due to non-convexity, the search space is rough and multimodal.
2. No external constraints are imposed on the problem; the mathematical constraints are handled either through penalty methods or via simulation.
3. The analytical expression of the objective function is unknown; consequently, the analytical expression of the partial derivatives is also unknown.

Troubles encountered
1. Convergence to a local optimum: it is generally easy to locate a local optimum, but very difficult or even impossible to get out of it.
2. Extremely large number of trials to locate the global optimum: to avoid getting trapped by local optima, a detailed exploration of the search space may be required.
3. The curse of dimensionality: the theoretical time to solve a nonlinear problem can increase exponentially with its dimension.
4. The practical aspect of real-world applications: in real-world problems, a highly accurate solution is neither possible, because of uncertainties and inaccuracies in the underlying model or data, nor feasible, due to the unacceptably high computational effort required to attain it when the function evaluation is time-consuming.

Local vs. global optimisation

Local methods
- Evolving pattern: single point (usually)
- Transition rules: deterministic (subsequent line minimisations)
- Location of the global optimum: not guaranteed
- Typical categories of algorithms: gradient-based methods (steepest descent, conjugate gradient, Newton, quasi-Newton) and direct search methods (downhill simplex, rotating directions)

Global methods
- Evolving pattern: random sample (population) of points
- Transition rules: deterministic and stochastic
- Location of the global optimum: asymptotically guaranteed
- Typical categories of algorithms: set covering; pure, adaptive and controlled random search; two-phase algorithms (multistart); evolutionary and genetic algorithms; simulated annealing; tabu search; heuristics

Effectiveness vs. efficiency

Effectiveness
- Definition: probability of locating the global optimum, starting from any random initial point (or population of points)
- Performance measure: number of successes out of a predefined number of independent runs of the algorithm
- Example: the exhaustive character of grid search methods ensures a high probability of locating the global optimum; however, this usually requires an extremely large number of function evaluations

Efficiency
- Definition: convergence speed
- Performance measure: average number of function evaluations to converge
- Counterexample: the gradient-based concept of local search methods ensures quick convergence to the nearest optimum, without guaranteeing that this is the global one

Nonlinear optimisation: the story so far
- 1950s: classical approach, based on methods of calculus
- 1960s: foundation of the modern nonlinear optimisation theory; development of gradient-based algorithms
- 1970s: dominance of direct search (derivative-free) methods; development of primitive stochastic optimisation methods
- 1980s: a variety of global optimisation methods are developed, following the expansion of artificial intelligence; direct search methods are gradually abandoned; evolutionary algorithms come to the fore
- 1990s: the era of integration: old methods become popular again, and diverse approaches are combined into heuristic schemes

The downhill simplex method (*)

General concept: instead of improving the objective function by calculating or approximating its gradient, the direction of improvement is selected just by comparing the function values at adjacent points, using an evolving pattern of n+1 points that span the n-dimensional space.

Possible simplex operations:
(A) Reflection through the actual worst vertex
(B) Expansion or (C) contraction along the direction of reflection
(D) Shrinkage towards the actual best vertex

(*) Nelder, J. A., and R. Mead, A simplex method for function minimization, Computer Journal, 7(4), 308-313, 1965
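As a concrete illustration of the four operations above, here is a minimal Python sketch of the classical Nelder-Mead method; the coefficient values (alpha, gamma, rho, sigma), the initial-simplex construction and the stopping rule are conventional textbook choices, not taken from the presentation:

```python
import numpy as np

def nelder_mead(f, x0, step=0.5, max_iter=500, tol=1e-10,
                alpha=1.0, gamma=2.0, rho=0.5, sigma=0.5):
    """Minimal Nelder-Mead sketch: improve f only by comparing values at
    the n+1 vertices of a simplex (no derivatives). Coefficients:
    alpha = reflection, gamma = expansion, rho = contraction,
    sigma = shrinkage."""
    n = len(x0)
    # Initial simplex: x0 plus a perturbation along each coordinate axis
    simplex = [np.asarray(x0, dtype=float)]
    for i in range(n):
        v = np.array(x0, dtype=float)
        v[i] += step
        simplex.append(v)
    fvals = [f(v) for v in simplex]

    for _ in range(max_iter):
        # Sort vertices from best (lowest f) to worst
        order = np.argsort(fvals)
        simplex = [simplex[i] for i in order]
        fvals = [fvals[i] for i in order]
        if abs(fvals[-1] - fvals[0]) < tol:
            break
        centroid = np.mean(simplex[:-1], axis=0)  # centroid excluding worst

        # (A) Reflection through the actual worst vertex
        xr = centroid + alpha * (centroid - simplex[-1])
        fr = f(xr)
        if fvals[0] <= fr < fvals[-2]:
            simplex[-1], fvals[-1] = xr, fr
        elif fr < fvals[0]:
            # (B) Expansion along the direction of reflection
            xe = centroid + gamma * (xr - centroid)
            fe = f(xe)
            simplex[-1], fvals[-1] = (xe, fe) if fe < fr else (xr, fr)
        else:
            # (C) Contraction along the direction of reflection
            xc = centroid + rho * (simplex[-1] - centroid)
            fc = f(xc)
            if fc < fvals[-1]:
                simplex[-1], fvals[-1] = xc, fc
            else:
                # (D) Shrinkage towards the actual best vertex
                simplex = [simplex[0] + sigma * (v - simplex[0])
                           for v in simplex]
                fvals = [f(v) for v in simplex]

    best = int(np.argmin(fvals))
    return simplex[best], fvals[best]
```

On a smooth unimodal function such as the sphere function, this converges quickly to the (here unique) optimum, which is exactly the efficiency advantage noted above.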

Stochastic optimisation techniques

Pure random search: a pre-specified number of points are randomly generated in the feasible space, and the best of them is taken as an estimator of the global optimum.

Adaptive random search: each next point is generated as a random perturbation around the current one, and it is accepted only if it improves the function.

Multistart strategy: a local optimisation algorithm is run several times, starting from different, randomly selected locations (ideally, we wish to start in every region of attraction of a local optimum).

Due to their almost exclusively random character, stochastic methods manage to escape from local optima and effectively cope with rough spaces; on the other hand, the lack of determinism within the search procedure leads to a very slow convergence rate.

Evolutionary algorithms

Inspiration: modelling the search process of natural evolution.

Main concepts:
- Representation of control variables in a chromosome-like (usually binary-string) format
- Search through a population of points, not a single point
- Application of genetic operators to create new generations

Genetic operators:
- The selection operator aims at improving the average genetic characteristics of the population, by giving its better individuals a higher probability of survival
- Through the crossover operator, two parents exchange part of their characteristics to generate more powerful offspring
- The mutation operator aims at introducing new characteristics into the population, in order to enhance its diversity
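Adaptive random search, the simplest of the techniques above, can be sketched in a few lines; the step size and iteration budget are illustrative choices:

```python
import random

def adaptive_random_search(f, x0, step=0.5, n_iter=3000, seed=0):
    """Sketch of adaptive random search: each candidate is a random
    perturbation around the current point and is accepted only if it
    improves the function value."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(n_iter):
        # Random perturbation around the current point
        cand = [xi + rng.uniform(-step, step) for xi in x]
        fc = f(cand)
        if fc < fx:           # accept only improving moves
            x, fx = cand, fc
    return x, fx
```

Because only improving moves are accepted, this variant is greedy; it illustrates why purely stochastic methods converge slowly: near an optimum, almost every random perturbation is rejected.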

The simulated annealing strategy

The annealing process in thermodynamics: for slowly cooled thermodynamic systems (e.g., metals), nature is able to find the minimum energy state, while the system may end in an amorphous state of higher energy if it is cooled quickly.

Energy minimisation strategy: the system is sometimes allowed to go uphill, so as to get out of a local energy minimum in favour of finding a better, more global one.

Mathematical formulation: the system's energy state E depends on the actual temperature T and is distributed according to the Boltzmann probability function:
Prob(E) ∝ exp(−E / kT)

Simulated annealing and global optimisation: the annealing strategy was transferred into optimisation algorithms by introducing a control parameter T (an analogue of temperature), an annealing cooling schedule describing the gradual reduction of T, and a stochastic criterion for accepting non-optimal (uphill) moves.

The shuffled complex evolution method (*)

General concept: instead of using multiple simplexes that start from random locations and work fully independently, it is more efficient to let them both evolve individually and exchange information (an analogy with a research project, where many scientists, even when working independently or in small groups, organise frequent meetings to discuss their progress).

Description of the algorithm:
- A random set of points (a "population") is sampled and partitioned into a number of complexes
- Each complex is allowed to evolve in the direction of global improvement, using competitive evolution techniques based on the downhill simplex method
- At periodic stages, the entire set of points is shuffled and reassigned to new complexes, to enable information sharing

(*) Duan, Q., S. Sorooshian, and V. Gupta, Effective and efficient global optimization for conceptual rainfall-runoff models, Water Resources Research, 28(4), 1015-1031, 1992
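The Boltzmann acceptance rule above translates directly into a simulated-annealing loop. The sketch below is a minimal one-dimensional version; the step size, initial temperature, cooling factor and cooling interval are illustrative choices, not parameters of the presented algorithm:

```python
import math
import random

def anneal(f, x0, step=0.5, T0=2.0, lam=0.95, n_steps=3000, seed=42):
    """Minimal simulated-annealing sketch. Downhill moves are always
    accepted; uphill moves are accepted with the Boltzmann probability
    exp(-dE / T), so the search can climb out of local minima while T
    is still high."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    T = T0
    for k in range(1, n_steps + 1):
        cand = x + rng.uniform(-step, step)   # random perturbation
        fc = f(cand)
        dE = fc - fx                          # energy difference
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        if k % 20 == 0:
            T *= lam                          # geometric cooling schedule
    return best_x, best_f
```

Run on a double-well function such as f(x) = (x² − 1)², the loop settles into one of the two minima at x = ±1 despite the intervening barrier, illustrating the escape mechanism.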

The concept of annealing-simplex methods

Simulated annealing
- Advantage: capability of escaping from local optima by accepting some uphill moves according to probabilistic criteria (effectiveness)
- Disadvantage: too slow a convergence rate (the slower the convergence, the more probable it is to locate the global optimum)

Downhill simplex method
- Advantage: quick and easy location of the local optimum in whose region of attraction the starting point lies (efficiency)
- Disadvantages: inability to get out of a local optimum after converging to it; bad performance on rough or ill-posed search spaces

Hence: incorporation of a simulated annealing strategy within a downhill simplex searching scheme.

The evolutionary annealing-simplex algorithm

Motivation: formulation of a probabilistic heuristic algorithm that joins concepts from different methodological approaches, in order to ensure both effectiveness and efficiency.

Main concepts:
- An evolutionary searching technique is introduced
- The evolution follows a variety of combined (deterministic and stochastic) transition rules, most of them based on the downhill simplex scheme
- An adaptive annealing cooling schedule regulates the temperature, which determines the degree of randomness through the evolution procedure

Input arguments:
- the size of the population, m
- the parameters of the annealing schedule
- the mutation probability, p_m

Flowchart of a typical iteration step
1. Formulate a simplex S = {x_1, ..., x_{n+1}} by randomly sampling its vertices from the actual population, and assign x_1 to the best and x_{n+1} to the worst vertex.
2. From the subset S − {x_1}, select a vertex w to reject, according to the modified function g(x) = f(x) + uT (u: unit uniform number, T: actual temperature), and generate a new vertex r by reflecting the simplex through w.
3. If f(r) < f(w): move downhill, either by expanding or by outside contracting the simplex.
4. Otherwise, if g(r) < g(w): accept r and try some uphill steps, to escape from the local optimum; if any uphill move succeeds, generate a random point by applying the mutation operator.
5. Otherwise: reject r, reduce T by a factor λ, and move downhill, either by inside contracting or by shrinking the simplex.
6. If a new point is better than the actual vertices, employ some subsequent line minimisations, to improve the local search speed.

Further investigation (1): the modified objective function

Coupling of determinism and randomness: by adding a stochastic component (proportional to the actual temperature) to the objective function, the algorithm behaves as between random search and downhill search.

Transforming the search space: using the modified function as the criterion to accept or reject a newly generated point provides more flexibility, either by smoothing a rough search space or by aggravating a flat surface.

[Figure: the original function f(x) and the modified function g(x) = f(x) + uT, illustrating how the noise band reshapes the landscape around the local and the global minimum.]
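The noisy acceptance criterion g(x) = f(x) + uT can be sketched as a small helper; the function name and signature are hypothetical, for illustration only:

```python
import random

def accept_noisy(f, x_new, x_old, T, rng=None):
    """Sketch of the modified acceptance criterion g(x) = f(x) + u*T
    (u: unit uniform number, T: actual temperature). At T = 0 this is a
    pure downhill comparison; as T grows, the comparison becomes
    increasingly random, interpolating between downhill and random
    search."""
    rng = rng or random.Random()
    g_new = f(x_new) + rng.random() * T   # independent noise per point
    g_old = f(x_old) + rng.random() * T
    return g_new < g_old                  # accept x_new if it wins
```

With T = 0 the decision is fully deterministic; with T much larger than the local variation of f, acceptance is close to a coin flip, which is the "smoothing a rough search space" effect described above.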

Further investigation (2): the generalised downhill simplex procedure

The quasi-stochastic operator: in order to increase randomness, the generation of new vertices within the reflection, expansion and contraction steps is implemented according to a transition rule of the form:
x_new = g + (a + b u) d
where g is the centroid of the simplex, a and b are appropriate scale parameters, d is the direction of improvement, specified according to the original Nelder-Mead formula, and u is a unit uniform number (for u = 0.5, the generalised and the original Nelder-Mead methods become identical).

A deeper insight: implementing simplex movements whose lengths are randomised prevents the simplex from cycling back to the same vertices.

Further investigation (3): escaping from local optima by hill-climbing

The hill-climbing strategy: according to a probabilistic criterion, a pre-specified number of subsequent uphill moves may be taken along the direction of reflection, in order to surpass the ridge and explore a neighbouring valley where another local optimum is located.

The acceptance criterion: to accept a new point, it suffices to check that it is better than the previous one, which guarantees the crossing of the ridge.

[Figure: the simplex reflects away from the actual local minimum (region of attraction A) across the ridge towards a neighbouring local minimum (region of attraction B); trial point 1 is rejected, while trial points 2 and 3 are accepted.]
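The quasi-stochastic transition rule can be sketched for the reflection step as follows. The defaults a = 0, b = 2 are an assumption chosen so that u = 0.5 gives the step coefficient 1 and reproduces the classical Nelder-Mead reflection x_new = 2g − x_worst; the function name is illustrative:

```python
import random
import numpy as np

def quasi_stochastic_reflection(simplex, f, a=0.0, b=2.0, rng=None):
    """Sketch of the generalised simplex move x_new = g + (a + b*u) d,
    with g the centroid of the simplex (worst vertex excluded),
    d = g - x_worst the Nelder-Mead direction of improvement, and u a
    unit uniform number. Randomising the step length prevents the
    simplex from cycling back to the same vertices."""
    rng = rng or random.Random()
    pts = sorted(simplex, key=f)      # vertices from best to worst
    worst = pts[-1]
    g = np.mean(pts[:-1], axis=0)     # centroid excluding the worst vertex
    d = g - worst                     # direction of improvement
    u = rng.random()                  # unit uniform number
    return g + (a + b * u) * d
```

Analogous randomised coefficients would apply to the expansion and contraction steps; only the deterministic mean step length differs.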

Further investigation (4): the role of temperature

The adaptive annealing cooling schedule: at the initial stages, we prefer high values of T, in order to escape easily from local optima; after the global optimum is approached, we prefer low values, to accelerate convergence.

The importance of an appropriate annealing schedule: very large values of T drastically reduce the efficiency of the algorithm, while very low ones reduce its effectiveness (the algorithm becomes too deterministic).

The adaptive strategy: to avoid extremely high temperatures, at the beginning of each iteration cycle T is regulated according to the rule:
T ≤ ξ (f_max − f_min), ξ ≤ 1
On the other hand, whenever a local optimum is reached (recognised by the decrease of the simplex volume), T is slightly reduced by a factor λ (typically λ = 0.90-0.99).

Mathematical applications
- 8 typical benchmark functions were tested, implementing 100 independent runs for each one
- The performance of 5 optimisation techniques was evaluated (in terms of effectiveness and efficiency) via Monte Carlo simulation
- The influence of several algorithmic parameters was examined
- The best performance was exhibited by the SCE and the EAS methods

[Figure: mean effectiveness (%) and mean efficiency (average number of function evaluations, logarithmic scale) for the five methods.]

Abbreviations
- NM: Nelder-Mead method
- M-NM: multistarts (20) of the Nelder-Mead method
- GA: simple, binary-coded genetic algorithm
- SCE: shuffled complex evolution method
- EAS: evolutionary annealing-simplex algorithm
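The two parts of the adaptive strategy can be sketched as a single temperature-update helper; the function name and the default values of ξ are illustrative assumptions, while λ lies within the typical range stated above:

```python
def regulate_temperature(T, f_values, xi=0.5, lam=0.95,
                         local_optimum=False):
    """Sketch of the adaptive annealing schedule. At the start of each
    iteration cycle, T is capped by the range of function values in the
    population, T <= xi * (f_max - f_min) with xi <= 1, so it can never
    become extremely high; whenever a local optimum is detected (the
    simplex volume decreases), T is further reduced by the factor lam
    (typically 0.90-0.99)."""
    f_min, f_max = min(f_values), max(f_values)
    T = min(T, xi * (f_max - f_min))   # range-based cap on temperature
    if local_optimum:
        T *= lam                       # slow geometric cooling
    return T
```

Note how the cap self-adapts: as the population converges and f_max − f_min shrinks, the admissible temperature, and hence the randomness, shrinks with it.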

Real-world applications: brief description

Problem A
- Objective: calibration of a simple water balance model, based on a Thornthwaite approach
- Control variables: 6 (4 parameters of the model and 2 initial conditions)
- Main difficulties: the conceptual character of the model, the quality of data, the interdependence between parameters, and the bias introduced by using a single measure for calibration

Problem B
- Objective: maximisation of the mean annual energy profit of a hypothetical system of two parallel reservoirs
- Control variables: 384 (the target releases from each reservoir, for a monthly simulation period of 16 years)
- Main difficulties: the flat-type response surface, due to the use of desirable rather than real magnitudes as control variables, which makes locating the gradient extremely difficult

Real-world applications: main conclusions
- The SCE method and the evolutionary annealing-simplex algorithm handled both real-world problems successfully; the latter was slightly more effective, while the former was slightly more efficient
- The role of the population size proved particularly crucial, drastically affecting the performance of the two algorithms

[Figure: number of failures out of 100 versus average number of function evaluations, for population sizes of 1 to 8 complexes, for the two algorithms.]

An interesting note: in the model calibration problem, for both algorithms there exists an optimal population size (determined experimentally) for which effectiveness is maximised.

Conclusive remarks
1. The current trend in global optimisation is the combination of ideas obtained from diverse methodological approaches, even old ones.
2. Heuristic schemes manage to handle the typical shortcomings of global optimisation applications by effectively balancing determinism and randomness within the search procedure.
3. A specific characteristic of heuristic methods is the existence of a variety of input arguments (e.g., the population size) that strongly affect their performance and have to be specified by the user.
4. The proposed evolutionary annealing-simplex algorithm uses a Nelder-Mead scheme as its basis and improves its flexibility by introducing new types of simplex movements; moreover, the adaptive annealing schedule regulates randomness and provides more chances to escape from local optima and handle hard search spaces.
5. To increase the effectiveness and efficiency of the proposed method, several improvements could be made towards the parallelisation of the algorithm, the development of criteria for automatic regulation of its input arguments, and the incorporation of specific rules to handle the multi-objective nature of most hydroinformatics applications.