Introduction to Design Optimization: Search Methods


1-D Optimization The Search We do not know the curve. Given α, we can calculate f(α). By inspecting some points, we try to find the approximate shape of the curve and to locate the minimum, α*, numerically, using as few function evaluations as possible. The procedure can be divided into two parts: a) finding the range or region known to contain α*; b) calculating the value of α* as accurately as required, or as accurately as possible, by narrowing down that range.
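Part (b), narrowing a range known to contain α*, can be illustrated with a golden-section search, one classical interval-narrowing method (not detailed on this slide). A minimal Python sketch; the test function, interval, and tolerance are illustrative assumptions:

```python
# Golden-section search: shrink the bracketing interval [a, b] around the
# minimizer alpha* by a constant factor (1/phi ~ 0.618) per iteration.
# Assumes f is unimodal on [a, b].

def golden_section_min(f, a, b, tol=1e-6):
    """Return an estimate of the minimizer of f on [a, b]."""
    inv_phi = (5 ** 0.5 - 1) / 2          # 1/phi, about 0.618
    c = b - inv_phi * (b - a)             # interior point nearer a
    d = a + inv_phi * (b - a)             # interior point nearer b
    while (b - a) > tol:
        if f(c) < f(d):                   # minimum lies in [a, d]
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:                             # minimum lies in [c, b]
            a, c = c, d
            d = a + inv_phi * (b - a)
    return (a + b) / 2

# Illustrative example: f(alpha) = (alpha - 2)^2 has its minimum at alpha* = 2.
alpha_star = golden_section_min(lambda x: (x - 2) ** 2, 0.0, 5.0)
```

Each iteration removes a fixed fraction of the interval, so the number of function evaluations needed for a given accuracy is known in advance.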

N-Dimensional Search The problem now has N design variables. The multiple-design-variable optimization (minimization) problem is solved using the 1-D search methods. This is carried out by first choosing a direction of search: either deal with one variable at a time, in sequential order (e.g. x_1, x_2, ..., x_N), which is easy but takes a long time, or introduce a new variable/direction S that changes all variables simultaneously, which is more complex but quicker. Then decide how far to go in the search direction: a small step ε = Δx, or a large step, determining α by a 1-D search.

Search Methods Typical approaches include: Quadratic Interpolation (interpolation based), Cubic Interpolation, the Newton-Raphson scheme (derivative based), Fibonacci search (pattern search based), guided random, and random. Iterative optimization process: start point α_0 → OPTIMIZATION → estimated point α_k → new start point α_{k+1}. Repeat this process until the stopping rules are satisfied; then α* = α_n.

N-D Search Methods
Calculus based: cubic, gradient based, Newton-Raphson, ...
Guided random: Genetic Algorithm, Simulated Annealing
Random: Monte Carlo
Enumerative

Calculus Based N-D Search Methods Indirect method: knowing the objective function, set the gradient to zero. If we need to treat the function as a black box we cannot do this; we only know F(X) at the points where we evaluate the function. Direct methods: the steepest descent method and different flavors of Newton methods. Guided random search (combinatorial techniques): the genetic method and simulated annealing. Random: Monte Carlo. Enumerative method: scan the whole domain; this is simple but time consuming.

Steepest Descent or Gradient Descent The gradient of a scalar field is a vector field that points in the direction of the greatest rate of increase of the scalar field, and whose magnitude is that greatest rate of change. This means that if we move in the negative gradient direction we go downhill and find a minimum; this is the same path a river would follow. Given a point X_k in the domain, the next point X_{k+1} is chosen by stepping from X_k in the direction of -∇F(X_k). [Figure: successive iterates X_0, X_1, X_2, X_3 crossing the isolines; the gradient is always normal to an isoline.]
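Moving repeatedly in the negative gradient direction can be sketched in a few lines. The test function F, its gradient, the fixed step size, and the iteration count below are illustrative assumptions (in practice the step length α is often chosen by a 1-D search at each iteration):

```python
# Steepest (gradient) descent with a fixed step size.
import numpy as np

def gradient_descent(grad_F, x0, step=0.1, iters=200):
    """Repeatedly step in the direction of -grad F from x0."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad_F(x)   # move downhill, normal to the isoline
    return x

# Illustrative test: F(x, y) = x^2 + 4 y^2 has its minimum at the origin.
grad_F = lambda x: np.array([2 * x[0], 8 * x[1]])
x_min = gradient_descent(grad_F, [3.0, 2.0])
```

With a fixed step the method converges only if the step is small enough relative to the curvature, which is exactly why badly scaled functions (next slide) cause trouble.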

Steepest descent and ill-conditioned, badly scaled functions Gradient descent has problems with ill-conditioned functions such as the Rosenbrock function shown here. The function has a narrow curved valley that contains the minimum, and the bottom of the valley is very flat. Because of the curved, flat valley the optimization zig-zags slowly towards the minimum with small step sizes.

Newton-Raphson Method The Newton-Raphson method is defined by the recurrence relation: x_{n+1} = x_n - f(x_n) / f'(x_n). An illustration of one iteration of Newton's method (the function f is shown in blue and the tangent line in red): we see that x_{n+1} is a better approximation than x_n for the root x of the function f.
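A minimal sketch of the Newton-Raphson recurrence, assuming f and its derivative are supplied as functions; the tolerance, iteration cap, and test function are illustrative:

```python
# Newton-Raphson root finding: follow the tangent line at x_n down to the
# axis to obtain x_{n+1} = x_n - f(x_n) / f'(x_n).

def newton_raphson(f, fprime, x0, tol=1e-10, max_iter=50):
    x = x0
    for _ in range(max_iter):
        x_new = x - f(x) / fprime(x)   # tangent-line update
        if abs(x_new - x) < tol:       # stop when the step is tiny
            return x_new
        x = x_new
    return x

# Illustrative test: the positive root of f(x) = x^2 - 2 is sqrt(2).
root = newton_raphson(lambda x: x * x - 2, lambda x: 2 * x, 1.0)
```

Near a simple root the error roughly squares at each step (quadratic convergence), but the method needs the derivative f' and a reasonable starting point.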

Secant Method The secant method is defined by the recurrence relation: x_n = x_{n-1} - f(x_{n-1}) · (x_{n-1} - x_{n-2}) / (f(x_{n-1}) - f(x_{n-2})). The first two iterations of the secant method (the red curve shows the function f and the blue lines are the secants).
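The secant recurrence replaces the derivative in Newton's method with a finite-difference slope through the last two iterates. A minimal sketch; the tolerance and test function are illustrative:

```python
# Secant method: follow the chord through (x0, f(x0)) and (x1, f(x1))
# down to the axis; no derivative of f is required.

def secant(f, x0, x1, tol=1e-10, max_iter=50):
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)   # secant-line update
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2                        # slide the two-point window
    return x1

# Illustrative test: the positive root of f(x) = x^2 - 2 is sqrt(2).
root = secant(lambda x: x * x - 2, 1.0, 2.0)
```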

Quadratic Interpolation Method Quadratic interpolation uses a quadratic function, H(α) = a + bα + cα², to approximate the unknown objective function f(α). The parameters of the quadratic are determined from several points of the objective function f(α). The known optimum of the interpolating quadratic is used to provide an estimated optimum of the objective function through an iterative process; the estimated optimum approaches the true optimum. The method requires a proper range to be found before it starts. It is relatively efficient, but sensitive to the shape of the objective function.
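Fitting H(α) = a + bα + cα² through three sample points and taking its vertex as the estimated optimum can be sketched as follows; the sample points and test function are illustrative assumptions:

```python
# One step of quadratic interpolation: fit H(alpha) = a + b*alpha + c*alpha^2
# through three samples of f, then return the vertex -b / (2c) of H as the
# estimated minimizer.
import numpy as np

def quadratic_interp_min(f, a0, a1, a2):
    A = np.array([[1.0, a0, a0 ** 2],
                  [1.0, a1, a1 ** 2],
                  [1.0, a2, a2 ** 2]])
    rhs = np.array([f(a0), f(a1), f(a2)])
    a, b, c = np.linalg.solve(A, rhs)   # coefficients of H
    return -b / (2 * c)                 # vertex of the fitted parabola

# Illustrative test: f(alpha) = (alpha - 3)^2 + 1 is itself quadratic,
# so a single fit recovers the true minimizer alpha* = 3 exactly.
est = quadratic_interp_min(lambda x: (x - 3) ** 2 + 1, 0.0, 2.0, 5.0)
```

In the iterative version, the estimate replaces one of the three bracketing points and the fit is repeated until the estimates converge.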

Combinatory Search: Genetic Algorithm Valid for discrete variables; one of the best all-purpose search methods. It emulates genetic evolution driven by survival of the fittest. Each variable value maps to a GENE, a binary string covering the variable's range. A vector of variables X maps to a CHROMOSOME, a concatenation of randomly combined genes (strings), one per variable (one value per variable). A chromosome X_i is a point in the X domain and is also called the genotype. The objective function F(X) is the phenotype: F(X_i) is the point in the objective-function domain corresponding to X_i.

Genetic Algorithm Construction of a chromosome: for X_i = (x_i, y_i, z_i), the genes encoding x_1, y_1, z_1 are concatenated to form the chromosome X_1. [Figure: the three gene strings joined end to end.]

Genetic Algorithm 1) Construct a population of genotypes (chromosomes) and evaluate the phenotypes (objective function). 2) Associate a fitness value between 0 and 1 with each phenotype using a fitness function. This function normalizes the phenotype (objective function) and assigns to its genotype an estimate (between 0 and 1) of its ability to survive. 3) Reproduction. The ability of a genotype to reproduce follows a probabilistic law biased by the value given by the fitness function. Given two candidates for reproduction, reproduction is done as follows: a) Cloning: the offspring is identical to the parent. b) Crossover: the chromosomes are split in two (head and tail) at a random point between genes and rejoined, swapping the tails. When crossover is performed, mutation also takes place: each gene is slightly changed to explore more possible outcomes. 4) Convergence. The algorithm stops when all genes in all individuals are at the 95th percentile.
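Steps 1-4 can be sketched as a minimal binary GA. Everything below is an illustrative assumption, not taken from the slides: the test objective x(31 - x), the 5-bit encoding, the population size, the crossover and mutation scheme, and the use of raw objective values (rather than a normalized 0-1 fitness) as selection weights:

```python
# Minimal binary genetic algorithm: maximize f(x) = x * (31 - x)
# for x encoded as a 5-bit gene (x in 0..31; optimum at x = 15 or 16).
import random

random.seed(0)                 # fixed seed so the sketch is repeatable
BITS, POP, GENS = 5, 20, 40

def decode(chrom):             # genotype (bit list) -> variable value x
    return int("".join(map(str, chrom)), 2)

def phenotype(chrom):          # objective function value for a chromosome
    x = decode(chrom)
    return x * (31 - x)

# Step 1: initial population of random genotypes.
pop = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(POP)]

for _ in range(GENS):
    # Step 2 (simplified): selection weights from the phenotype;
    # +1 keeps every weight positive.
    fits = [phenotype(c) + 1 for c in pop]
    new_pop = []
    while len(new_pop) < POP:
        # Step 3: fitness-biased (roulette) choice of two parents ...
        p1, p2 = random.choices(pop, weights=fits, k=2)
        cut = random.randint(1, BITS - 1)     # ... crossover: swap tails
        child = p1[:cut] + p2[cut:]
        if random.random() < 0.1:             # ... mutation: flip one bit
            i = random.randrange(BITS)
            child[i] = 1 - child[i]
        new_pop.append(child)
    pop = new_pop                             # Step 4: iterate to convergence

best = max(pop, key=phenotype)
```

After a few dozen generations the best chromosome decodes to a value near the optimum x = 15 or 16.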

Genetic Algorithm. Example [The worked example and its results appear only as figures in the original slides.]

Simulated Annealing (wikipedia) The name and inspiration come from annealing in metallurgy, a technique involving heating and controlled cooling of a material to increase the size of its crystals and reduce their defects. The heat causes the atoms to become unstuck from their initial positions (a local minimum of the internal energy) and wander randomly through states of higher energy; the slow cooling gives them more chances of finding configurations with lower internal energy than the initial one. In the simulated annealing (SA) method, each point s of the search space is analogous to a state of some physical system, and the function E(s) to be minimized is analogous to the internal energy of the system in that state. The goal is to bring the system, from an arbitrary initial state, to a state with the minimum possible energy.

By analogy with this physical process, each step of the SA algorithm attempts to replace the current solution by a random solution (chosen according to a candidate distribution, often constructed to sample from solutions near the current solution). The new solution may then be accepted with a probability that depends both on the difference between the corresponding function values and also on a global parameter T (called the temperature), that is gradually decreased during the process. The dependency is such that the choice between the previous and current solution is almost random when T is large, but increasingly selects the better or "downhill" solution (for a minimization problem) as T goes to zero. The allowance for "uphill" moves potentially saves the method from becoming stuck at local optima.
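The acceptance rule described above can be sketched as follows. The energy function E, the neighbour move, the starting temperature, the geometric cooling schedule, and the iteration count are all illustrative assumptions:

```python
# Simulated annealing sketch: Metropolis acceptance with geometric cooling.
import math
import random

random.seed(1)                 # fixed seed so the sketch is repeatable

def simulated_annealing(E, s0, T0=10.0, cooling=0.99, iters=2000):
    s, T = s0, T0
    best = s
    for _ in range(iters):
        s_new = s + random.uniform(-1.0, 1.0)    # candidate near current s
        dE = E(s_new) - E(s)
        # Accept downhill moves always; accept uphill moves with
        # probability exp(-dE / T), which shrinks as T decreases.
        if dE < 0 or random.random() < math.exp(-dE / T):
            s = s_new
        if E(s) < E(best):                       # remember the best state seen
            best = s
        T *= cooling                             # gradual cooling
    return best

# Illustrative test: E(s) = s^4 - 3 s^2 + s has two wells, with the
# global minimum near s = -1.3 and a local minimum near s = 1.1.
s_min = simulated_annealing(lambda s: s ** 4 - 3 * s ** 2 + s, s0=2.0)
```

Early on, the high temperature lets the walk climb out of the shallow well; as T → 0 the rule degenerates to pure downhill search.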

Monte Carlo Method Monte Carlo methods vary, but tend to follow a particular pattern: Define a domain of possible inputs. Generate inputs randomly from a probability distribution over the domain. Perform a deterministic computation on the inputs. Aggregate the results.

For example, consider a circle inscribed in a unit square. Given that the circle and the square have a ratio of areas that is π/4, the value of π can be approximated using a Monte Carlo method: Draw a square on the ground, then inscribe a circle within it. Uniformly scatter some objects of uniform size (grains of rice or sand) over the square. Count the number of objects inside the circle and the total number of objects. The ratio of the two counts is an estimate of the ratio of the two areas, which is π/4. Multiply the result by 4 to estimate π. In this procedure the domain of inputs is the square that circumscribes our circle. We generate random inputs by scattering grains over the square then perform a computation on each input (test whether it falls within the circle). Finally, we aggregate the results to obtain our final result, the approximation of π.

To get an accurate approximation for π this procedure should have two other common properties of Monte Carlo methods. First, the inputs should truly be random. If grains are purposefully dropped into only the center of the circle, they will not be uniformly distributed, and so our approximation will be poor. Second, there should be a large number of inputs. The approximation will generally be poor if only a few grains are randomly dropped into the whole square. On average, the approximation improves as more grains are dropped.
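The π procedure above can be sketched directly, sampling points in the unit square instead of scattering grains; the sample count and seed are illustrative choices:

```python
# Monte Carlo estimate of pi: sample uniform points in the unit square and
# count the fraction that fall inside the quarter circle of radius 1.
import random

random.seed(42)                # fixed seed so the sketch is repeatable

def estimate_pi(n):
    inside = 0
    for _ in range(n):
        x, y = random.random(), random.random()   # uniform point in square
        if x * x + y * y <= 1.0:                  # inside the quarter circle?
            inside += 1
    return 4.0 * inside / n                       # area ratio is pi / 4

pi_hat = estimate_pi(100_000)
```

With 100,000 samples the standard error of the estimate is about 0.005, illustrating the second property above: accuracy improves only slowly (as 1/√n) with the number of inputs.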