Introduction to Design Optimization: Search Methods


1-D Optimization: The Search. We don't know the curve. Given α, we can calculate f(α). By inspecting some points we try to infer the approximate shape of the curve and to find the minimum α* numerically, using as few function evaluations as possible. The procedure can be divided into two parts: a) Finding the range or region known to contain α*. b) Calculating the value of α* as accurately as required, or as accurately as possible within the range, by progressively narrowing the range down.

Search Methods. Typical approaches include: Quadratic Interpolation and Cubic Interpolation (interpolation based); the Newton-Raphson scheme (derivative based); Fibonacci Search and Golden Section Search (pattern search based). Iterative optimization process: start point α_0 → OPTIMIZATION → estimated point α_k → new start point α_{k+1}. Repeat this process until the stopping rules are satisfied, then α* = α_n.
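Golden section search, listed above, can be sketched as follows. This is a minimal illustration, assuming a unimodal f on the given interval; the test function and tolerance are chosen for the example:

```python
import math

def golden_section(f, a, b, tol=1e-6):
    """Minimize a unimodal f on [a, b] by shrinking the bracket
    in the golden ratio at every step."""
    invphi = (math.sqrt(5) - 1) / 2        # 1/phi ~ 0.618
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    while (b - a) > tol:
        if f(c) < f(d):                    # minimum lies in [a, d]
            b, d = d, c
            c = b - invphi * (b - a)
        else:                              # minimum lies in [c, b]
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2

# Illustrative use: minimum of (x - 2)^2 on [0, 5]
x_star = golden_section(lambda x: (x - 2) ** 2, 0.0, 5.0)
```

Each iteration needs only one new function evaluation, which is why this pattern search is economical when f is expensive.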

Iterative Process for Locating the Range. Pick a start point α_0 and a range; then shrink the range, double the range, or periodically change the sign of the step as needed. Typical stopping rules:

|f(α_new) − f(α_last)| / |f(α_new)| < ε   or   |α_new − α_last| / |α_new| < ε
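The range-locating loop (start from a point, step downhill, and keep doubling the step until the function turns upward) can be sketched as below. The start point, step size, and growth factor are illustrative defaults, not values from the lecture:

```python
def bracket_minimum(f, a=0.0, step=0.1, grow=2.0, max_iter=60):
    """Return an interval (lo, hi) known to contain a minimizer of f,
    found by expanding the step until f increases."""
    b = a + step
    fa, fb = f(a), f(b)
    if fb > fa:                    # reverse direction so we start downhill
        a, b = b, a
        fa, fb = fb, fa
    for _ in range(max_iter):
        c = b + grow * (b - a)     # double (grow) the range
        fc = f(c)
        if fc > fb:                # f turned upward: [a, c] brackets a minimum
            return (min(a, c), max(a, c))
        a, b, fa, fb = b, c, fb, fc
    raise RuntimeError("no bracket found")

# Illustrative use: bracket the minimum of (x - 3)^2
lo, hi = bracket_minimum(lambda x: (x - 3.0) ** 2)
```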

Quadratic Interpolation Method. f(α) ≈ H(α) = a + bα + cα². Quadratic interpolation uses a quadratic function H(α) to approximate the unknown objective function f(α). The parameters of the quadratic are determined from several sampled points of the objective function f(α). The known minimum of the interpolating quadratic is used as an estimated optimum of the objective function, and through an iterative process the estimated optimum approaches the true optimum. The method requires a proper range to be found before it starts. It is relatively efficient, but sensitive to the shape of the objective function.

Fitting H(α) through three points α_1, α_2, α_3 with H(α_i) = f(α_i) and minimizing H gives the estimate

α_H* = (1/2) · [(α_2² − α_3²) f(α_1) + (α_3² − α_1²) f(α_2) + (α_1² − α_2²) f(α_3)] / [(α_2 − α_3) f(α_1) + (α_3 − α_1) f(α_2) + (α_1 − α_2) f(α_3)]

Point update schemes are based on the relation between the center point α_2, f(α_2) and the present optimum estimate α*, f(α*): whichever of the two has the lower function value becomes the new center point, the two points bracketing it become the new endpoints, and the remaining point is discarded. The three retained points define the next interpolation.
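The quadratic interpolation iteration described above can be sketched as follows. The three-point update rule here (keep the best point and its two neighbors) is one common variant, and the test function and starting points are illustrative:

```python
def quadratic_interp_min(f, a1, a2, a3, tol=1e-6, max_iter=100):
    """Estimate the minimizer of f by repeatedly fitting a parabola
    through three points and jumping to the parabola's minimum."""
    for _ in range(max_iter):
        f1, f2, f3 = f(a1), f(a2), f(a3)
        num = (a2**2 - a3**2) * f1 + (a3**2 - a1**2) * f2 + (a1**2 - a2**2) * f3
        den = (a2 - a3) * f1 + (a3 - a1) * f2 + (a1 - a2) * f3
        if den == 0:                       # collinear points: cannot fit
            return a2
        a_star = 0.5 * num / den           # minimum of the fitted parabola
        if abs(a_star - a2) < tol:
            return a_star
        # point update: keep the best point and the two points around it
        pts = sorted([a1, a2, a3, a_star])
        i = pts.index(min(pts, key=f))
        if i == 0:
            a1, a2, a3 = pts[0], pts[1], pts[2]
        elif i == 3:
            a1, a2, a3 = pts[1], pts[2], pts[3]
        else:
            a1, a2, a3 = pts[i - 1], pts[i], pts[i + 1]
    return a2

# Illustrative use: minimum of (x - 1.5)^2 from points 0, 1, 4
x_star = quadratic_interp_min(lambda x: (x - 1.5) ** 2, 0.0, 1.0, 4.0)
```

For a true quadratic the very first fit is exact, which is the sense in which the method is efficient near a smooth minimum.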

N-Dimensional Search. The problem now has N design variables. The multiple-design-variable optimization (minimization) problem is solved using the 1-D search methods discussed previously. This is carried out by first choosing a direction of search: either deal with one variable at a time, in sequential order (e.g. x_1, x_2, ..., x_N), which is easy but takes a long time; or introduce a new variable/direction S that changes all variables simultaneously, which is more complex but quicker. Then decide how far to go in the search direction: a small step ε = Δx, or a large step with α determined by a 1-D search.
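The first option, handling one variable at a time with a 1-D search along each axis, can be sketched as below. The inner golden-section line search, its interval, and the test function are illustrative choices:

```python
import math

def line_min(g, a=-10.0, b=10.0, tol=1e-8):
    """1-D golden-section minimization of g on [a, b]."""
    invphi = (math.sqrt(5) - 1) / 2
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while (b - a) > tol:
        if g(c) < g(d):
            b, d = d, c
            c = b - invphi * (b - a)
        else:
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2

def coordinate_search(f, x0, sweeps=20):
    """Minimize f(x_1, ..., x_N) by cycling through the variables,
    running a 1-D search on each with the others held fixed."""
    x = list(x0)
    for _ in range(sweeps):
        for i in range(len(x)):
            x[i] = line_min(lambda t: f(x[:i] + [t] + x[i + 1:]))
    return x

# Illustrative use on f(x1, x2) = (x1-1)^2 + (x2+2)^2 + x1*x2,
# whose minimum is at (8/3, -10/3)
x_min = coordinate_search(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2 + v[0] * v[1],
                          [0.0, 0.0])
```

When the variables interact (the x1*x2 term), each sweep only partly corrects the others, which is exactly why this sequential scheme is slow compared with a simultaneous search direction.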

N-D Search Methods

Calculus-based methods. Indirect: knowing the objective function, set the gradient to zero. If we need to treat the function as a black box we cannot do this, since we only know F(X) at the points where we evaluate the function. Direct: follow the gradient numerically, e.g. the steepest descent method and different flavors of Newton methods. Guided random search / combinatorial techniques: e.g. the genetic algorithm. Enumerative methods: scan the whole domain; this is simple but time-consuming.

Steepest Descent or Gradient Descent. The gradient of a scalar field is a vector field which points in the direction of the greatest rate of increase of the scalar field, and whose magnitude is that greatest rate of change. This means that if we move in its negative direction we go downhill and should find a minimum; this is the same path a river would follow. Given a point X_k in the domain, the next point is chosen as

X_{k+1} = X_k − α ∇F(X_k)

The gradient is always normal to the isolines (contours) of F, so the successive iterates X_0, X_1, X_2, X_3 cross the isolines at right angles.
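The update rule above can be sketched as follows. The fixed step length α, the tolerance, and the quadratic test function are illustrative; in practice α would often come from a 1-D line search as described earlier:

```python
def gradient_descent(grad, x0, alpha=0.1, tol=1e-8, max_iter=10000):
    """Steepest descent with a fixed step: X_{k+1} = X_k - alpha * grad F(X_k)."""
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        if sum(gi * gi for gi in g) ** 0.5 < tol:   # stop when the gradient vanishes
            break
        x = [xi - alpha * gi for xi, gi in zip(x, g)]
    return x

# Illustrative use: F(x, y) = x^2 + 3*y^2 with gradient (2x, 6y), minimum at the origin
x_min = gradient_descent(lambda v: [2 * v[0], 6 * v[1]], [3.0, -2.0])
```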

Steepest descent and ill-conditioned functions. Gradient descent has problems with ill-conditioned functions such as the Rosenbrock function. That function has a narrow curved valley which contains the minimum, and the bottom of the valley is very flat. Because of the curved, flat valley the optimization zig-zags slowly towards the minimum with small step sizes.
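The slow zig-zag behavior can be demonstrated directly on the Rosenbrock function f(x, y) = (1 − x)² + 100(y − x²)², whose minimum is at (1, 1). The step size and iteration count below are illustrative; even tens of thousands of fixed-size steps leave the iterate crawling along the valley floor:

```python
def rosen(x, y):
    """Rosenbrock function: narrow curved valley, minimum at (1, 1)."""
    return (1 - x) ** 2 + 100 * (y - x * x) ** 2

def rosen_grad(x, y):
    return (-2 * (1 - x) - 400 * x * (y - x * x),
            200 * (y - x * x))

# Fixed-step gradient descent from the classic start point (-1.2, 1.0)
x, y = -1.2, 1.0
f_start = rosen(x, y)
for _ in range(50000):
    gx, gy = rosen_grad(x, y)
    x, y = x - 1e-3 * gx, y - 1e-3 * gy
f_end = rosen(x, y)
```

The function value drops quickly into the valley, but progress along the flat valley floor toward (1, 1) is far slower, which is the ill-conditioning the slide describes.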

Newton-Raphson Method. The Newton-Raphson method is defined by the recurrence relation

x_{n+1} = x_n − f(x_n) / f′(x_n)

Geometrically, each iteration follows the tangent line to f at x_n down to the x-axis; x_{n+1} is a better approximation than x_n for the root x of the function f.
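The recurrence can be sketched as follows. The test equation x² − 2 = 0 (root √2), start point, and tolerance are illustrative:

```python
def newton_raphson(f, fprime, x0, tol=1e-10, max_iter=50):
    """Root finding via x_{n+1} = x_n - f(x_n) / f'(x_n)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        x = x - fx / fprime(x)     # follow the tangent line to the x-axis
    return x

# Illustrative use: the root of x^2 - 2 (i.e. sqrt(2)), starting from x0 = 1
root = newton_raphson(lambda x: x * x - 2, lambda x: 2 * x, 1.0)
```

Note that the method needs the derivative f′; when it is unavailable, the secant method below replaces it with a finite-difference slope.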

Secant Method. The secant method is defined by the recurrence relation

x_{n+1} = x_n − f(x_n) · (x_n − x_{n−1}) / (f(x_n) − f(x_{n−1}))

It replaces the derivative in Newton's method with the slope of the secant line through the last two iterates.
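A sketch of the recurrence, on the same illustrative equation x² − 2 = 0; the two starting points and tolerance are example choices:

```python
def secant(f, x0, x1, tol=1e-10, max_iter=50):
    """Derivative-free root finding: Newton's method with f' replaced
    by the slope of the secant through the last two iterates."""
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        if abs(f1) < tol:
            return x1
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        x0, f0, x1, f1 = x1, f1, x2, f(x2)
    return x1

# Illustrative use: the root of x^2 - 2, starting from x0 = 1, x1 = 2
root = secant(lambda x: x * x - 2, 1.0, 2.0)
```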

Combinatorial Search: Genetic Algorithm. Valid for discrete variables, and one of the best all-purpose search methods. It emulates genetic evolution driven by survival of the fittest. Each variable value → a GENE, a binary string encoding a value in the variable's range. The vector of variables X → a CHROMOSOME, a concatenation of a random combination of genes (strings), one per variable (one value per variable). A chromosome X_i is a point in the X domain and is also called the genotype. The objective function F(X) → the phenotype; F(X_i) is the point in the objective-function domain corresponding to X_i.

Genetic Algorithm. Construction of a chromosome: the genes x_i, y_i, z_i are concatenated to form the chromosome X_i = (x_i, y_i, z_i).

Genetic Algorithm. 1) Construct a population of genotypes (chromosomes) and evaluate the phenotypes (objective function values). 2) Associate a fitness value between 0 and 1 with each phenotype using a fitness function. This function normalizes the phenotype (objective function) and assigns to its genotype an estimate (between 0 and 1) of its ability to survive. 3) Reproduction. The ability of a genotype to reproduce is a probabilistic law biased by the value given by the fitness function. Given two candidates for reproduction, either: a) Cloning, where the offspring is the same as the parent; or b) Crossover, where the chromosomes are split in two (head and tail) at a random point between genes and rejoined, swapping the tails. When crossover is performed, mutation also takes place: each gene is slightly changed to explore more possible outcomes. 4) Convergence. The algorithm stops when the population has converged, e.g. when 95% of the individuals share the same genes.
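The steps above can be sketched as a minimal binary-coded GA. The test function f(x) = x·sin(10πx) + 1, population size, mutation rate, and fixed generation count (used here instead of the 95% convergence test, for simplicity) are all illustrative assumptions:

```python
import math
import random

random.seed(0)
BITS = 16                  # bits per gene; here one gene encodes one variable
POP, GENS = 40, 60         # population size and (illustrative) generation budget
LO, HI = 0.0, 1.0          # range of the variable

def decode(chrom):
    """Genotype -> variable value: map the binary string into [LO, HI]."""
    return LO + int(chrom, 2) / (2 ** BITS - 1) * (HI - LO)

def fitness(chrom):
    """Phenotype of the hypothetical test function f(x) = x*sin(10*pi*x) + 1,
    which is positive on [0, 1] so it can serve directly as a selection weight."""
    x = decode(chrom)
    return x * math.sin(10 * math.pi * x) + 1.0

def crossover(a, b, p_mut=0.01):
    """Split head/tail at a random point between genes, swap tails,
    then mutate: flip each bit with small probability."""
    cut = random.randrange(1, BITS)
    child = a[:cut] + b[cut:]
    return "".join(bit if random.random() > p_mut else str(1 - int(bit))
                   for bit in child)

# 1) random initial population of genotypes
pop = ["".join(random.choice("01") for _ in range(BITS)) for _ in range(POP)]
for _ in range(GENS):
    # 2)-3) fitness-biased (roulette-wheel) reproduction with crossover + mutation
    weights = [fitness(c) for c in pop]
    pop = [crossover(*random.choices(pop, weights=weights, k=2))
           for _ in range(POP)]

best = max(pop, key=fitness)
```

Selection pressure pushes the population toward the fitter peaks of f, while mutation keeps exploring; a random search would sample the range uniformly instead.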

Genetic Algorithm: Example. (The worked example and its results were presented as figures in the original slides.)