Constrained Functions of N Variables: Non-Gradient Based Methods


Constrained Functions of N Variables: Non-Gradient Based Methods
Gerhard Venter, Stellenbosch University

Outline
- Constrained Optimization
- Non-gradient based methods
- Genetic Algorithms (GA)
- Particle Swarm Optimization (PSO)
- Parallelization

Course Outline
Until now:
- Chapter 1: Basic concepts
- Chapter 2: Functions of one variable
- Chapter 3: Unconstrained functions of N variables
- Chapter 4: Linear programming
- Chapter 5: Sequential unconstrained minimization techniques
Today: Constrained functions of N variables
- Chapter 6: Non-gradient based optimization

Constrained Optimization: Standard Form
Find the set of design variables that will:
Minimize: F(X)
Subject to:
- g_j(X) <= 0, j = 1, ..., m
- h_k(X) = 0, k = 1, ..., l
- X_i^L <= X_i <= X_i^U, i = 1, ..., n

Kuhn-Tucker Conditions
The Kuhn-Tucker conditions provide necessary conditions for a (local) optimum:
1. X* is feasible
2. λ_j g_j(X*) = 0, j = 1, ..., m
3. ∇F(X*) + Σ_{j=1}^{m} λ_j ∇g_j(X*) + Σ_{k=1}^{l} λ_{m+k} ∇h_k(X*) = 0,
   with λ_j >= 0 for j = 1, ..., m and λ_{m+k} unrestricted in sign
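These conditions can be checked numerically at a candidate point. Below is a minimal sketch for a hypothetical one-variable problem, minimize F(x) = x^2 subject to g(x) = 1 - x <= 0 (an illustrative example, not one from the slides):

```python
# Kuhn-Tucker check for a hypothetical problem:
# minimize F(x) = x^2  subject to  g(x) = 1 - x <= 0.
# Candidate optimum x* = 1 with multiplier lambda = 2.

def kkt_satisfied(x, lam, tol=1e-9):
    g = 1.0 - x                    # inequality constraint, g(x) <= 0
    dF = 2.0 * x                   # gradient of F
    dg = -1.0                      # gradient of g
    feasible = g <= tol
    complementary = abs(lam * g) <= tol      # lambda_j * g_j(X*) = 0
    stationary = abs(dF + lam * dg) <= tol   # grad F + lambda * grad g = 0
    nonnegative = lam >= -tol                # lambda_j >= 0
    return feasible and complementary and stationary and nonnegative

print(kkt_satisfied(1.0, 2.0))   # True: x* = 1 satisfies the conditions
print(kkt_satisfied(0.5, 0.0))   # False: the point is infeasible
```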

Non-Gradient Based Methods
Simplest is a random search:
- Easy to implement
- Very robust
- Very inefficient
Improve random search by adding some logic:
- Example: DOE search with move-limits
- Referred to as a structured random search
We will consider two structured random searches in this class
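A pure random search can be sketched in a few lines; the sphere test function and bounds below are illustrative assumptions, not from the slides:

```python
import random

def random_search(f, bounds, n_samples=10_000, seed=0):
    """Pure random search: sample uniformly inside the bounds and
    keep the best point. Easy to implement and robust, but very
    inefficient compared to gradient-based methods."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(n_samples):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Hypothetical test function: sphere, with minimum 0 at the origin
x, fx = random_search(lambda x: sum(v * v for v in x), [(-5, 5), (-5, 5)])
```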

Non-Gradient Based Methods
Non-gradient based optimization algorithms have gained a lot of attention recently:
- Easy to program
- Global properties
- Require no gradient information
- High computational cost
- Tuned for each problem
Typically based on some physical phenomenon:
- Genetic algorithms
- Simulated annealing
- Particle swarm optimization

Non-Gradient Based Methods an be classified as a structured random search Does not move from one design point to the next - makes use of a population of design points Numerically robust Increased changes of finding global or near global optimum designs Provides a number of good designs instead of a single optimum 8

Genetic Algorithms
Derived from biology, making use of Darwin's principle of survival of the fittest:
- First developed in 1975 by Holland
- Population adapts to its underlying environment through evolution
- Characteristics stored in chromosomal strings
- Structured but random exchange of genetic information
- Better designs have increased chances of reproducing

Genetic Operators
Start with an initial population and reproduce by using:
- Selection
- Cross-over
- Occasional mutation
- Elitist strategy
Design variables are encoded in bit-strings that are equivalent to chromosomal strings

Design Variable Encoding
Traditionally binary numbers were used. Example:
F(X), X = {X_1, X_2, X_3, X_4}
0 <= X_1, X_4 <= 15
0 <= X_2 <= 7
0 <= X_3 <= 3
Chromosome: [0110 | 101 | 11 | 1011]
decodes to:  X_1 = 6, X_2 = 5, X_3 = 3, X_4 = 11
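Decoding such a chromosome is just a matter of slicing the bit string into fields; a minimal sketch that reproduces the example above:

```python
def decode(bits, field_widths):
    """Split a binary chromosome into fields and decode each field
    to an integer design variable value."""
    values, pos = [], 0
    for w in field_widths:
        values.append(int(bits[pos:pos + w], 2))
        pos += w
    return values

# The slide's chromosome: 4 bits for X1, 3 for X2, 2 for X3, 4 for X4
print(decode("0110101111011", [4, 3, 2, 4]))   # [6, 5, 3, 11]
```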

Binary Encoding
Mainly used for historical reasons:
- Number of bits required, m: 2^m >= (X_u - X_l)/X_incr + 1
- Used to represent integer and real numbers
- Smallest possible alphabet
- Added complexity that is not needed - we can use actual values (integer or real) in our bit string
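The bit-count formula can be evaluated directly; a small sketch (the function name is illustrative):

```python
import math

def bits_required(x_upper, x_lower, x_incr):
    """Smallest m satisfying 2**m >= (x_upper - x_lower)/x_incr + 1,
    i.e. enough bits to represent every value on the grid."""
    n_values = (x_upper - x_lower) / x_incr + 1
    return math.ceil(math.log2(n_values))

print(bits_required(15, 0, 1))   # 4 bits encode the 16 values 0..15
print(bits_required(7, 0, 1))    # 3 bits encode 0..7
```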

Real Number Encoding
Laminate stacking sequence optimization:
- Layers can only be stacked using 0, 45, -45, or 90 degrees
- Bit string for a 16-ply laminate [±45/0_4/90_2]_s

Real Number Encoding
- Full string: 4^16 = 4.3×10^9 combinations, e.g. [2311114444111132]
- Use symmetry: 4^8 = 65,536 combinations, e.g. [23111144]
- Balanced with stack variables: 3^4 = 81 combinations, e.g. [2113]

Typical Genetic Algorithm (flowchart)
Start → Create Initial Population → Perform analyses → Converged?
- No: Selection → Cross-over → Mutation → Elitist Strategy → Perform analyses again
- Yes: End

Initial Population
- Fixed size population
- Fixed chromosome length
- No fixed rule for determining population size
- Randomly distributed throughout the design space

Selection and Fitness
- Selection gives more fit designs a higher chance of passing their genes to the next generation
- Fitness is based on the objective function
- Use the SUMT approach for constrained problems
Selection using the roulette wheel approach:
r_i = F_i / Σ_{j=1}^{n_s} F_j

Selection
Roulette wheel based on objective values:
- All objective function values must be positive
- Little differentiation between designs at the end of the optimization
Use a roulette wheel based on rank instead:
- Can take positive and negative objective function values
- Better differentiation between designs
r_i = 2(n_s - i + 1) / ((n_s + 1) n_s)

Apply Selection
- Create a roulette wheel with n_s segments
- Create a random number between 0 and 1
- Determine the segment on the roulette wheel that contains the random number
- The segment ID corresponds to the design ID
- Repeat to obtain two parents for reproduction through cross-over
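The rank-based roulette wheel can be sketched as follows; passing the random number generator in as a parameter is an implementation convenience, not part of the slides:

```python
import bisect
import random

def rank_roulette_select(designs, objective, rng=random):
    """Roulette-wheel selection based on rank (for minimization):
    the best design gets the largest segment,
    r_i = 2(n_s - i + 1) / ((n_s + 1) n_s)."""
    ns = len(designs)
    ranked = sorted(designs, key=objective)      # rank i = 1 is the best
    segments = [2 * (ns - i + 1) / ((ns + 1) * ns) for i in range(1, ns + 1)]
    wheel, total = [], 0.0
    for s in segments:                           # cumulative wheel, ends at 1.0
        total += s
        wheel.append(total)
    r = rng.random()                             # random number between 0 and 1
    return ranked[min(bisect.bisect_left(wheel, r), ns - 1)]

# Repeat twice to obtain two parents for cross-over
parents = [rank_roulette_select([3.2, 1.5, 2.7], lambda x: x) for _ in range(2)]
```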

Cross-over
- Select two parents for reproduction
- Apply probability of cross-over, p_c (typically close to 1)
- Mix parent genes using cross-over to obtain one (or two) children
One point cross-over:
Parent 1: 82 43 17 11 8
Parent 2: 85 31 62 29 66
Child:    85 31 17 11 8
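A sketch of one point cross-over that reproduces the example above, returning both possible children:

```python
import random

def one_point_crossover(parent1, parent2, point=None):
    """One point cross-over: swap the tails of two parent strings
    at a randomly chosen cross-over point."""
    if point is None:
        point = random.randint(1, len(parent1) - 1)
    child1 = parent2[:point] + parent1[point:]
    child2 = parent1[:point] + parent2[point:]
    return child1, child2

# Reproduces the slide's child (cross-over point after the second gene)
c1, c2 = one_point_crossover([82, 43, 17, 11, 8], [85, 31, 62, 29, 66], point=2)
print(c1)   # [85, 31, 17, 11, 8]
```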

Cross-over
Two point cross-over:
Parent 1: 82 43 17 11 8
Parent 2: 85 31 62 29 66
Child:    82 43 62 29 8
Uniform cross-over:
Parent 1: 82 43 17 11 8
Parent 2: 85 31 62 29 66
Random:   0.71 0.13 0.01 0.58 0.30
Child:    85 43 17 29 8
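Uniform cross-over can be sketched as below; the rule that a random number above 0.5 takes the gene from Parent 2 is inferred from the slide's numbers:

```python
def uniform_crossover(parent1, parent2, randoms):
    """Uniform cross-over: each gene comes from Parent 2 when the
    corresponding random number exceeds 0.5, else from Parent 1."""
    return [b if r > 0.5 else a for a, b, r in zip(parent1, parent2, randoms)]

child = uniform_crossover([82, 43, 17, 11, 8],
                          [85, 31, 62, 29, 66],
                          [0.71, 0.13, 0.01, 0.58, 0.30])
print(child)   # [85, 43, 17, 29, 8]
```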

Cross-over
One point averaging cross-over (especially useful for real numbers):
- Generate a random number between 1 and the length of the string, say 3.2
- Elements 1 and 2 come from Parent 2, elements 4 and 5 from Parent 1, and element 3 is blended: C_3 = 0.2 P1_3 + 0.8 P2_3 = 53
Parent 1: 82 43 17 11 8
Parent 2: 85 31 62 29 66
Child:    85 31 53 11 8
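A sketch of one point averaging cross-over that reproduces the child shown above (the complementary sibling child would swap the parents' roles):

```python
import random

def averaging_crossover(p1, p2, point=None):
    """One point averaging cross-over for real-valued strings.
    A random point between 1 and len(p1) splits the string; the gene
    containing the split is a weighted average of the parents' genes,
    with weights given by the fractional part of the point."""
    n = len(p1)
    if point is None:
        point = random.uniform(1, n)
    k = int(point)          # 1-based index of the gene being blended
    frac = point - k        # fractional part -> blend weight
    child = list(p2[:k - 1])                                  # before the split
    child.append(frac * p1[k - 1] + (1 - frac) * p2[k - 1])   # blended gene
    child.extend(p1[k:])                                      # after the split
    return child

# Reproduces the slide's example with point = 3.2
print(averaging_crossover([82, 43, 17, 11, 8], [85, 31, 62, 29, 66], point=3.2))
```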

Mutation
- Prevents premature loss of genetic information by introducing randomness into the population
- Important to create information not contained in the initial population
- Apply probability of performing mutation, p_m (typically small)
- Randomly select a gene to mutate, then randomly modify that gene
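A mutation operator along these lines might look like the sketch below; passing the allowed gene values in explicitly is an assumption that mirrors the real-number laminate encoding used earlier:

```python
import random

def mutate(chromosome, p_m, gene_values, rng=random):
    """With probability p_m, randomly select one gene and replace it
    with a random value from the allowed set."""
    child = list(chromosome)
    if rng.random() < p_m:
        i = rng.randrange(len(child))
        child[i] = rng.choice(gene_values)
    return child

# Hypothetical laminate chromosome with ply angles as gene values
child = mutate([0, 45, 90, 90], p_m=0.05, gene_values=[0, 45, -45, 90])
```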

Cost/Reliability
The algorithm is very costly. Cost is determined by:
- Problem size: number of design variables and number of combinations
- Constraint definitions
Reliability can be improved by:
- Performing more generations
- Increasing the population size
- Repeating the optimization run

Typical Applications
- Analysis is cheap to perform
- Problems that contain many local minima (not that common in engineering applications)
- Problems that have no gradient information, e.g. selecting different cross-sections for a truss structure
- Combinatorial type problems
- Composite laminate design: find optimum thickness and ply orientation
- D-optimal design of experiments

Particle Swarm Optimization
- Particle Swarm Optimization (PSO) is a fairly recent addition to the family of non-gradient based optimization algorithms
- PSO is based on a simplified social model that is closely tied to swarming theory; an example is a swarm of bees searching for a food source
- Uses knowledge of individual particles and of the swarm as a whole
- Basic assumption: a population of individuals adapts by returning to promising regions found previously

Algorithm Overview
[Figure: initial and final swarm plotted in the (A_1, A_2) design space]
Particle P_i properties:
- Position: (A_1j, A_2j)
- Velocity vector
- Objective: W(A_1j, A_2j)
- Constraints: S(A_1j, A_2j, P_1, P_2)

How is PSO Implemented
Position update: x_i^{k+1} = x_i^k + v_i^{k+1} Δt
Velocity vector: v_i^{k+1} = w v_i^k + c_1 r_1 (p_i - x_i^k)/Δt + c_2 r_2 (p_k^g - x_i^k)/Δt
The three velocity terms represent inertia, trust in self, and trust in the swarm, respectively.
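The two update equations translate directly into code; the values of w, c_1 and c_2 below are typical choices, not taken from the slides:

```python
import random

def pso_step(x, v, p_best, g_best, w=0.8, c1=2.0, c2=2.0, dt=1.0, rng=random):
    """One PSO update for a single particle (lists of design variables).
    x: current position, v: current velocity, p_best: the particle's own
    best point, g_best: the best point found by the whole swarm."""
    r1, r2 = rng.random(), rng.random()
    v_new = [w * vi
             + c1 * r1 * (pb - xi) / dt      # trust in self
             + c2 * r2 * (gb - xi) / dt      # trust in swarm
             for xi, vi, pb, gb in zip(x, v, p_best, g_best)]
    x_new = [xi + vi * dt for xi, vi in zip(x, v_new)]
    return x_new, v_new

# One step for a hypothetical two-variable particle
x_new, v_new = pso_step([0.0, 0.0], [1.0, 1.0], [2.0, 0.0], [4.0, 0.0])
```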

Enhancements
Convergence criterion:
- Monitor the maximum change in the objective function for a specified number of consecutive design iterations
Integer/discrete problems:
- Unlike genetic algorithms, PSO is inherently a continuous algorithm
- Round the new position to the closest value before performing function evaluations
Craziness:
- Similar to the mutation operator in genetic algorithms
- Prevents premature convergence

Constrained Problems
Combine the constraints with the objective using an exterior penalty function:
F(X, r_p) = F(X) + r_p P(X)
Deal with violated design points (including points outside the bounds) based on the usable-feasible directions idea:
- Create a new velocity vector that points back to the feasible design space by re-setting the velocity vector:
v_i^{k+1} = c_1 r_1 (p_i - x_i^k)/Δt + c_2 r_2 (p_k^g - x_i^k)/Δt
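The exterior penalty function can be sketched as below, assuming P(X) is the sum of squared constraint violations (a common choice; the slides do not spell out the form of P):

```python
def penalized_objective(f, constraints, r_p):
    """Exterior penalty: F(X, r_p) = F(X) + r_p * P(X), with P(X)
    taken here as the sum of squared violations of g_j(X) <= 0."""
    def F(x):
        p = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return f(x) + r_p * p
    return F

# Hypothetical example: minimize x^2 subject to g(x) = 1 - x <= 0
F = penalized_objective(lambda x: x * x, [lambda x: 1 - x], r_p=100.0)
print(F(1.0))   # 1.0   - feasible point, no penalty
print(F(0.0))   # 100.0 - infeasible point, heavily penalized
```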

Simplified Flowchart
Start → Create Initial Swarm → Perform analyses → Converged?
- No: Adjust Parameters → New Velocity Vector → Update Position → Apply Craziness → Perform analyses again
- Yes: End

Example Problem 1
Global optimum: Point = (0, 0), Obj = 0
Local optima: Point = (-1.75, -0.87), Obj = 0.3; Point = (1.75, 0.87), Obj = 0.3

Example Problem 2
Global optimum: Point = (0, 0), Obj = 1000
Many local optima

Example Problem
Multidisciplinary design optimization of a transport aircraft wing structure:
- System level optimization: PSO algorithm
- Structural sub-optimization: GENESIS
- Aerodynamic analysis: simplified analysis

System Level Optimization
Unconstrained problem. Maximize: Range
Design variables:
- Aspect ratio (continuous)
- Depth-to-chord ratio (continuous)
- Number of internal spars (discrete)
- Number of internal ribs (discrete)
- Wing cover construction (discrete)

Structural Sub-Optimization
- Structural analysis using a simplified FE model of the wing box
- Objective is to minimize the weight of the wing
- Both stress and local buckling constraints are applied
[Figure: FE models with 80 and 206 design variables]

Aerodynamic Analysis
- Assumed pressure distribution is converted to nodal loads for the structural sub-optimization
- Simplified drag analysis: drag of the current wing is calculated based on the total drag of the reference wing
- Range is calculated from the Breguet formula
- Constant take-off gross weight: structural weight saving is converted to fuel weight
- Wing is built to a jig shape to compensate for the aerodynamic-structure interaction

System Level Analysis
Update FE Model → Apply Pressure Distribution → Structural Sub-Optimization → Calculate Aerodynamic Drag → Calculate Range

Results
Repeated the optimization ten times, each with a different initial swarm

Results...
Reference Wing: Range = 5000.0 n. mi., A = 6.8571, h/c = 0.12
Optimum Wing (Sandwich): Range = 5375 n. mi., A = 9.2360, h/c = 0.11

Computation Cost
What about using a gradient-based optimizer to solve this problem?
- Look at all possible combinations of the three truly discrete variables
- For each combination, solve a two design variable continuous optimization problem
- Results in 32 optimization problems
- Total of 526 analyses (1500 for the PSO)
- Parallel implementation implications

Gradient-Based Results
Parameter             | Reference Wing | DOT Optimum | PSO Optimum
Aspect Ratio          | 6.8571         | 6.9100      | 9.2360
Depth-to-Chord Ratio  | 0.12           | 0.08        | 0.11
Num. Internal Spars   | 1              | 0           | 0
Num. Internal Ribs    | 7              | 4           | 4
Construction          | -              | Sandwich    | Sandwich
Range (n. mi.)        | 5000           | 5225        | 5375
Number of Analyses    | -              | 526         | 1500

Numerical Noise
Don't forget about NUMERICAL NOISE:
- Severe numerical noise due to the structural sub-optimization
- Noise due to different levels of convergence in the iterative process

Remarks
- The PSO algorithm was able to produce consistent results in the presence of truly discrete design variables and severe numerical noise
- PSO computational cost is high, but using a gradient-based optimizer is not a feasible alternative: numerical noise, implementation issues, larger number of discrete combinations
- PSO has significant potential for massively parallel processing

Simplified Flowchart
Start → Create Initial Swarm → Perform analyses (run in parallel) → Converged?
- No: Adjust Parameters → New Velocity Vector → Update Position → Apply Craziness → Perform analyses again
- Yes: End

Parallel PSO: Synchronous
Wait for all analyses to complete before starting the next design iteration:
- What has been done so far
- Straightforward to implement
- Overall logic of the algorithm is not changed - produces the same results
- Poor parallel speedup: some processors remain idle part of the time!

Parallel PSO: Asynchronous
Goal is to make all processors work all of the time:
- Do not wait for the current design iteration to complete before starting the next
- Update all point information as soon as it becomes available
- Update iteration information as soon as a design iteration is completed
- Overall logic of the algorithm is changed - may produce different results
- Very good parallel speedup

Parallel Environment
- Implemented both the synchronous and the asynchronous algorithm
- Made use of a master-slave implementation
- System X from the Virginia Tech Terascale Computing Facility: 1100 dual-processor 2.3 GHz compute nodes, 4 GB of RAM and 40 GB hard disk per node
- Used up to 32 processors
- Each run was repeated 10 times to account for variability in the results

Parallel Results: Accuracy
All results averaged over 10 runs

Parallel Results: Efficiency
Speedup = Time_1proc / Time_nproc
Efficiency = 100 × Speedup / nproc
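These two formulas in code, with illustrative timings (the numbers below are hypothetical, not measured results from the study):

```python
def speedup(time_1proc, time_nproc):
    """Speedup = Time_1proc / Time_nproc."""
    return time_1proc / time_nproc

def efficiency(time_1proc, time_nproc, nproc):
    """Efficiency (%) = 100 * Speedup / nproc."""
    return 100.0 * speedup(time_1proc, time_nproc) / nproc

# Hypothetical timings: 160 min on one processor, 5 min on 32
print(speedup(160.0, 5.0))          # 32.0
print(efficiency(160.0, 5.0, 32))   # 100.0 (perfect speedup)
```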

Remarks
- The PSO algorithm was able to produce consistent results in the presence of discrete design variables and severe numerical noise
- PSO has significant potential for parallel processing: implementation is important, hardware is important
- The asynchronous implementation provides almost perfect speedup: 95% efficiency with 32 processors
- Average time reduced from 150 min (2.5 hours) to 5 min

Concluding Remarks
- Non-gradient based algorithms have a valuable place in design optimization
- These algorithms are often misused and should be applied with great care
- A rule of thumb is to only apply these algorithms if there is no alternative
- High computational cost can be alleviated by using parallel computing