Blackbox optimization with the MADS algorithm and the NOMAD software
Sébastien Le Digabel, Charles Audet, Viviane Rochon-Montplaisir, Christophe Tribes
IREQ
NOMAD: Blackbox Optimization 1/41
Presentation outline
- Research team
- Blackbox optimization
- The MADS algorithm
- The NOMAD software package
- Conclusion
Research team
Research team
- Professors
- Research associates
- Students
- Recently graduated
Blackbox optimization
Derivative-free & blackbox optimization (DFO / BBO) lies at the intersection of applications, theory, and algorithms:
- Applications: tsunami detection buoys, Fontan surgery, wing design, MDO, GMON positioning, hyperparameter optimization.
- Theory: nonsmooth calculus, generalized gradient, Clarke derivatives, hypertangent cone, optimality conditions.
- Algorithms: hidden constraints, stochasticity, categorical variables, biobjective problems, surrogates.
Blackbox optimization
We consider
    min_{x ∈ Ω} f(x)
where the evaluations of f and of the functions defining Ω are the result of a computer simulation (a blackbox): x ∈ R^n goes in, and f(x) together with the answer to "x ∈ Ω?" comes out.
- Each call to the simulation may be expensive.
- The simulation can fail.
- Sometimes two evaluations at the same x return different values of f(x).
- Derivatives are not available and cannot be approximated.
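To make these features concrete, a toy blackbox can be sketched in Python (a hypothetical simulation, not from the slides): it may crash, it exposes no derivatives, and it reports one relaxable constraint.

```python
import math

def blackbox(x):
    """Toy 'simulation': returns (f(x), c(x)) or None when it crashes.

    A returned None models a hidden constraint: the evaluation failed.
    """
    try:
        f = (x[0] - 1.0) ** 2 + math.log(x[1])  # raises if x[1] <= 0
    except ValueError:
        return None                              # simulation failure
    c = x[0] + x[1] - 5.0                        # relaxable constraint c(x) <= 0
    return f, c

print(blackbox([1.0, 1.0]))   # (0.0, -3.0): a feasible evaluation
print(blackbox([1.0, -1.0]))  # None: the simulation failed
```

An optimizer driving this function sees only input/output pairs; nothing about the interior of `blackbox` is available to it.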
Tsunami detection buoys (2004). Source: ms1.html
Variables
The variables x may be a mix of:
- reals;
- integers, binaries, granular variables;
- categorical variables. Example: the number of heat shields is a variable, and the types of materials are also variables.
Constraints
Domain: Ω = {x ∈ X : c_j(x) ≤ 0, j ∈ J} ⊆ R^n.
- X corresponds to unrelaxable constraints, which cannot be violated. Example: x > 0 when log x is used inside the simulation.
- c_j(x) ≤ 0 are relaxable and quantifiable constraints: they may be violated at intermediate designs, and c_j(x) measures the violation. Example: a cost budget.
- Hidden constraints arise when the simulation fails, even for points in Ω. Examples: "Segmentation fault", "Bus error", "ERROR 42 DIVISION BY ZERO". In one chemical process with 7 variables and 4 constraints, the ASPEN software fails on 43% of the calls.
The MADS algorithm
General framework: the algorithm proposes a point x, and the blackbox returns f(x) and whether x ∈ Ω.
Mesh Adaptive Direct Search (MADS) [Audet and Dennis, Jr., 2006]
An iterative algorithm that evaluates the blackbox at trial points on a spatial discretization called the mesh. One iteration = search and poll.
- The search allows trial points to be generated anywhere on the mesh.
- The poll consists of generating a list of trial points constructed from poll directions. These directions grow dense.
At the end of the iteration, the mesh size is reduced if no new success point is found.
[0] Initializations: x_0, Δ_0^m (initial mesh size)
[1] Iteration k
    [1.1] Search
          select a finite number of mesh points
          evaluate candidates opportunistically
    [1.2] Poll (if the search failed)
          construct the poll set P_k = { x_k + Δ_k^m d : d ∈ D_k }
          sort(P_k)
          evaluate candidates opportunistically
[2] Updates
    if success: x_{k+1} ← success point; increase Δ_k^m
    else: x_{k+1} ← x_k; decrease Δ_k^m
    k ← k + 1; stop or go to [1]
From the original MADS paper [Audet and Dennis, Jr., 2006].
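The poll/update loop can be sketched as a runnable toy. This simplification uses fixed ± coordinate directions, no search step, and no dense direction set, so it is a GPS-style direct search rather than full MADS, but the opportunistic poll and the mesh expand/shrink logic mirror the pseudocode above:

```python
def poll_step(f, x, delta, directions):
    """One opportunistic poll: stop at the first improving trial point."""
    fx = f(x)
    for d in directions:
        trial = [xi + delta * di for xi, di in zip(x, d)]
        if f(trial) < fx:
            return trial, True           # success: move to the better point
    return x, False                      # failed poll: stay put

def coordinate_search(f, x0, delta=1.0, tol=1e-6):
    """Simplified direct search with +/- coordinate poll directions."""
    n = len(x0)
    dirs = [[1 if j == i else 0 for j in range(n)] for i in range(n)]
    dirs += [[-d for d in v] for v in dirs]
    x = list(x0)
    while delta > tol:
        x, success = poll_step(f, x, delta, dirs)
        if success:
            delta *= 2.0                 # increase the mesh after a success
        else:
            delta /= 4.0                 # shrink the mesh after a failed poll
    return x

xmin = coordinate_search(lambda x: (x[0] - 3) ** 2 + (x[1] + 1) ** 2, [0.0, 0.0])
```

On this smooth toy objective the iterates converge to the minimizer (3, -1); the point of MADS proper is that its dense poll directions keep convergence guarantees even for nonsmooth f.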
Poll illustration (successive fails and mesh shrink): with Δ_k^m = 1, the trial points are {p_1, p_2, p_3}; after a failed poll, Δ_{k+1}^m = 1/4 and the trial points are {p_4, p_5, p_6}; after another failure, Δ_{k+2}^m = 1/16 and the trial points are {p_7, p_8, p_9}.
Hierarchical convergence
If the iterates are bounded, a subsequence of mesh local minimizers x_k converges to a limit x̂ while the mesh size goes to 0. The conclusions then depend on the local smoothness of f:
- f lower semicontinuous: f(x̂) ≤ lim f(x_k);
- f directionally Lipschitz [Custódio and Vicente, 2012]: f'(x̂; d) ≥ 0;
- f Lipschitz: f°(x̂; d) ≥ 0 (Clarke generalized derivative);
- f regular: f'(x̂; d) ≥ 0;
- f strictly differentiable: the KKT conditions hold.
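For reference, the Clarke generalized directional derivative appearing in the Lipschitz case is defined (the standard nonsmooth-calculus definition, not spelled out on the slides) as:

```latex
f^{\circ}(\hat{x}; d) \;=\; \limsup_{y \to \hat{x},\; t \downarrow 0} \frac{f(y + t\,d) - f(y)}{t}
```

and the MADS guarantee is that f°(x̂; d) ≥ 0 for every direction d in the hypertangent cone to Ω at x̂, which is where the dense poll directions are needed.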
MADS extensions
- Constraint handling with the Progressive Barrier technique [Audet and Dennis, Jr., 2009].
- Surrogates [Talgorn et al., 2015].
- Categorical variables [Abramson, 2004].
- Granular and discrete variables [Audet et al., 2018].
- Global optimization [Audet et al., 2008a].
- Parallelism [Le Digabel et al., 2010, Audet et al., 2008b].
- Multiobjective optimization [Audet et al., 2008c].
- Sensitivity analysis [Audet et al., 2012].
Surrogates
- Static surrogate: a cheaper model defined a priori by the user and used as a blackbox. Typically a simplified physics model. Variable precision is not yet considered.
- Dynamic surrogate: a model managed by the algorithm and built from past evaluations. It can be periodically updated.
General framework with surrogates: the algorithm proposes x ∈ R^n and can query either the true blackbox (f(x), x ∈ Ω?) or the cheaper surrogate (f_s(x), x ∈ Ω_s?); the surrogate is updated with the true evaluations.
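A dynamic surrogate in this spirit can be sketched as follows. This is a deliberately crude nearest-neighbor model, standing in for the statistical surrogates of [Talgorn et al., 2015]; here it only orders candidate points, and the true blackbox still makes every accept/reject decision:

```python
class DynamicSurrogate:
    """Toy dynamic surrogate: predicts f(x) by the nearest past evaluation."""

    def __init__(self):
        self.cache = []                       # past (x, f) pairs

    def update(self, x, fx):
        self.cache.append((list(x), fx))      # enriched as true evaluations arrive

    def predict(self, x):
        if not self.cache:
            return 0.0
        nearest = min(self.cache,
                      key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))
        return nearest[1]

    def order(self, candidates):
        """Sort candidates by predicted value: promising points first."""
        return sorted(candidates, key=self.predict)

s = DynamicSurrogate()
s.update([0.0, 0.0], 1.0)
s.update([2.0, 2.0], 5.0)
ordered = s.order([[2.1, 2.0], [0.1, 0.0]])   # the point near [0, 0] comes first
```

Combined with opportunistic evaluation, a good ordering lets the expensive blackbox find a success earlier and skip the remaining trial points.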
Current projects funded by HQ and Rio Tinto
- Opportunism and ordering strategies [completed]
- Discrete and granular variables [completed]
- Hierarchical functions [to do]
- Important variables [to do]
- Robust optimization [in progress]
- Parallel decomposition [completed]
- Grey boxes [completed]
- Multi-precision surrogates [in progress]
- Development of NOMAD [in progress]
Example of project: parallel decomposition
- PSD-MADS algorithm (parallel space decomposition of MADS). PhD student: Nadir Amaioua.
- Latest version designed and tested at IREQ. Parallelism with MPI, master-slaves paradigm. Runs on CASIR.
- Motivation: problems with large numbers of variables (up to 4,000 in our latest tests).
- Idea: slaves optimize on subspaces in which a large number of variables are fixed. Machine learning techniques are used to build sensitivity matrices and decide the subspaces.
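The subspace idea can be illustrated with a small helper (an illustrative sketch, not the actual PSD-MADS code): each slave receives an objective in which most variables are frozen at the master's incumbent values.

```python
def subspace_objective(f, incumbent, free_idx):
    """Build the objective a slave optimizes: only the variables in
    free_idx vary; all others stay at their incumbent values."""
    def sub_f(z):
        x = list(incumbent)
        for i, zi in zip(free_idx, z):
            x[i] = zi
        return f(x)
    return sub_f

full_f = lambda x: sum(xi ** 2 for xi in x)              # toy objective
sub_f = subspace_objective(full_f, [1.0, 2.0, 3.0], free_idx=[1])
value = sub_f([5.0])    # evaluates full_f([1.0, 5.0, 3.0])
```

Each slave thus works on a low-dimensional problem, and the master merges the subspace successes back into the full-space incumbent.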
The NOMAD software package
NOMAD (Nonlinear Optimization with MADS)
- C++ implementation of MADS; standard C++, no other package needed.
- Parallel versions with MPI.
- Runs on Linux, Mac OS X, and Windows.
- MATLAB versions; multiple interfaces (Python, Excel, etc.).
- Command-line and library interfaces.
- Distributed under the LGPL license.
- Complete user guide available in the package; Doxygen documentation available online.
- Download at www.gerad.ca/nomad.
NOMAD: history and team
- Under development since the early 2000s. Current version: 3.8.
- Algorithm designers: M. Abramson, C. Audet, J. Dennis, S. Le Digabel, C. Tribes, V. Rochon-Montplaisir.
- Developers: versions 1 and 2: G. Couture; version 3 (2008): S. Le Digabel, C. Tribes; version 4 (2019): C. Tribes, V. Rochon-Montplaisir.
- Support at nomad@gerad.ca.
- Related article in TOMS [Le Digabel, 2011].
More than 9,000 certified downloads (chart of yearly download counts from Softpedia, Sourceforge, and GERAD, plus the total).
Users from Sept. to Sept. (downloads from the GERAD website only)
- Industries: Hydro-Québec (IREQ), Rio Tinto, Bombardier, Airbus, Boeing, United Airlines, Siemens, Schneider Electric, GHD Toronto, Energen, Skyconseil, German Aerospace Center, Moscow Power Engineering Institute, Newmerical Technologies, Disney Research, US Federal Reserve Board, Softree, Aditazz, Essar Steel, Tatura Milk Industries, Westrock, Market Appeal, Corona Insights, The Climate Corporation.
- National labs: Argonne, Sandia, Los Alamos.
- Universities.
Main functionalities (1/2)
- Single or biobjective optimization.
- Variables: continuous, integer, binary, categorical; periodic; fixed; groups of variables.
- Searches: Latin hypercube (LH), Variable Neighborhood Search (VNS), quadratic models, statistical surrogates, user search.
Main functionalities (2/2)
- Constraints treated with 4 different methods: Progressive Barrier (default), Extreme Barrier, Progressive-to-Extreme Barrier, filter method.
- Several direction types: coordinate directions, LT-MADS, OrthoMADS, hybrid combinations.
- Sensitivity analysis.
(All items correspond to published or submitted papers.)
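Among these, the Extreme Barrier is the simplest to state: any point violating a constraint receives an infinite objective value. A minimal sketch (illustrative only; NOMAD's default Progressive Barrier instead tolerates intermediate infeasible points and progressively tightens the allowed violation):

```python
import math

def extreme_barrier(f, constraints):
    """Wrap an objective so the optimizer never accepts an infeasible
    point: f_EB(x) = f(x) if all c_j(x) <= 0, and +infinity otherwise."""
    def fb(x):
        if any(c(x) > 0 for c in constraints):
            return math.inf
        return f(x)
    return fb

fb = extreme_barrier(lambda x: x[0] ** 2,
                     [lambda x: 1.0 - x[0]])  # constraint 1 - x <= 0, i.e. x >= 1
print(fb([2.0]))   # 4.0 (feasible)
print(fb([0.0]))   # inf (infeasible, rejected outright)
```

This is the natural treatment for unrelaxable constraints, where a violated point is meaningless to the simulation.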
NOMAD installation
- Pre-compiled executables are available for Windows and Mac; installation programs copy these executables.
- On Unix/Linux, after download, launch an installation script.
- Two ways to use NOMAD: batch mode or library mode.
Blackbox conception (batch mode)
A command-line program that takes as argument a file containing x, and displays the values of f(x) and of the c_j(x). It can be coded in any language. Typically,
> bb.exe x.txt
displays f c1 c2 (the objective and two constraints).
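A batch-mode blackbox along those lines could look as follows in Python (a sketch: the objective, the two constraints, and the whitespace-separated input file format are illustrative assumptions):

```python
#!/usr/bin/env python3
"""Toy batch-mode blackbox: reads x from a file, prints "f c1 c2"."""
import sys

def evaluate(x):
    f = sum(xi ** 2 for xi in x)     # objective f(x)
    c1 = 1.0 - x[0]                  # constraint c1(x) <= 0
    c2 = x[0] + x[1] - 10.0          # constraint c2(x) <= 0
    return f, c1, c2

# Guarded so the module can also be imported without a file argument.
if __name__ == "__main__" and len(sys.argv) > 1:
    with open(sys.argv[1]) as fh:
        point = [float(v) for v in fh.read().split()]
    print(*evaluate(point))          # one line: objective then constraints
```

Invoked as `python bb.py x.txt`, it prints the objective and the two constraint values on a single line, which is the shape of output the batch mode reads back.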
Important parameters
Necessary parameters: the blackbox characteristics (dimension n, number of constraints, etc.) and a starting point x_0.
All algorithmic parameters have default values. The most important are:
- the maximum number of blackbox evaluations;
- the starting point (more than one can be defined);
- the types of directions (more than one can be defined);
- the initial mesh size;
- the constraint types;
- surrogate searches;
- seeds.
See the user guide for the description of all parameters, or use the nomad -h option.
Run NOMAD:
> nomad parameters.txt
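A minimal parameters.txt in that spirit might look as follows (a sketch based on NOMAD 3 parameter names; the blackbox executable bb.exe and the values are hypothetical):

```text
DIMENSION      2             # number of variables n
BB_EXE         bb.exe        # path to the blackbox executable
BB_OUTPUT_TYPE OBJ PB PB     # one objective, two Progressive-Barrier constraints
X0             ( 0.0 0.0 )   # starting point
MAX_BB_EVAL    100           # maximum number of blackbox evaluations
```

Everything not listed falls back to its default value, as noted above.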
Advanced functionalities (library mode)
- No system calls: the code executes faster.
- The user can program a custom search strategy.
- The user can pre-process all evaluation points before they are evaluated.
- The user can decide the priority in which trial points are evaluated.
- The user can define callback functions that will be called at specific events (new success, new iteration, new MADS run in biobjective optimization).
Examples included in the NOMAD package
Simple examples: how to run NOMAD, usage of parallelism, categorical variables, MATLAB/Python interfaces.
Advanced examples illustrating additional possibilities:
- multi-start from points generated with LH sampling;
- problems used in library mode and coded as a Windows DLL, a GAMS program, a CUTEst problem, a FORTRAN code, or a GUI prototype in Java.
Other MADS distributions
- MATLAB version within the OPTI Toolbox package.
- Available in the MATLAB Optimization Toolbox (old version, not maintained).
- NOMADm by M. Abramson (old version, not maintained).
- Excel, with the OpenSolver tool (GPLv3).
NOMAD 4
- Motivated by the collaboration with Hydro-Québec and Rio Tinto.
- New architecture, with an emphasis on modularity and simplicity. Based on the book [Audet and Hare, 2017].
- Avoids computation bottlenecks (continuous evaluations; processes do not wait on each other).
- Algorithm parameters can be modified while optimizing.
- Independent library of surrogates (SGTELIB).
- Usage of parallelism (detailed on the next slide).
Parallelism in NOMAD 4
- Blackbox level: the blackbox uses parallelism when available.
- Computer level: launch multiple blackbox processes on different CPUs (OpenMP and MPI).
- Cluster level: launch multiple blackbox processes on different machines (MPI).
- Blocks of trial points: the ability to feed a cluster with evaluations.
- Algorithmic improvements to produce more points for evaluation.
- Parallel space decomposition for problems of large dimension.
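The "blocks of trial points" idea can be sketched with standard Python concurrency (illustrative only; NOMAD itself does this in C++ with OpenMP/MPI):

```python
from concurrent.futures import ThreadPoolExecutor

def evaluate_block(blackbox, points, workers=4):
    """Evaluate a block of trial points concurrently; the result order
    matches the input order, as an opportunistic poll expects."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(blackbox, points))

values = evaluate_block(lambda x: x * x, [1.0, 2.0, 3.0, 4.0])  # [1.0, 4.0, 9.0, 16.0]
```

Submitting whole blocks rather than single points is what keeps a cluster saturated when one blackbox call takes minutes.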
Conclusion
Summary
- Blackbox optimization motivated by industrial applications.
- Algorithmic features backed by mathematical convergence analyses and published in optimization journals.
- NOMAD: a software package implementing MADS. Open source, LGPL license.
- Features: constraints, biobjective optimization, global optimization, surrogates, several types of variables, parallelism.
- Fast support at nomad@gerad.ca.
- NOMAD can be customized through collaborations.
References I
- Abramson, M. (2004). Mixed variable optimization of a load-bearing thermal insulation system using a filter pattern search algorithm. Optimization and Engineering, 5(2).
- Audet, C., Béchard, V., and Le Digabel, S. (2008a). Nonsmooth optimization through mesh adaptive direct search and variable neighborhood search. Journal of Global Optimization, 41(2).
- Audet, C. and Dennis, Jr., J. (2006). Mesh adaptive direct search algorithms for constrained optimization. SIAM Journal on Optimization, 17(1).
- Audet, C. and Dennis, Jr., J. (2009). A progressive barrier for derivative-free nonlinear programming. SIAM Journal on Optimization, 20(1).
- Audet, C., Dennis, Jr., J., and Le Digabel, S. (2008b). Parallel space decomposition of the mesh adaptive direct search algorithm. SIAM Journal on Optimization, 19(3).
- Audet, C., Dennis, Jr., J., and Le Digabel, S. (2012). Trade-off studies in blackbox optimization. Optimization Methods and Software, 27(4-5).
References II
- Audet, C., Le Digabel, S., and Tribes, C. (2018). The mesh adaptive direct search algorithm for granular and discrete variables. Technical report G, Les Cahiers du GERAD.
- Audet, C. and Hare, W. (2017). Derivative-Free and Blackbox Optimization. Springer Series in Operations Research and Financial Engineering. Springer International Publishing.
- Audet, C., Savard, G., and Zghal, W. (2008c). Multiobjective optimization through a series of single-objective formulations. SIAM Journal on Optimization, 19(1).
- Le Digabel, S. (2011). Algorithm 909: NOMAD: Nonlinear optimization with the MADS algorithm. ACM Transactions on Mathematical Software, 37(4):44:1-44:15.
- Le Digabel, S., Abramson, M., Audet, C., and Dennis, Jr., J. (2010). Parallel versions of the MADS algorithm for black-box optimization. In Optimization Days, Montreal. GERAD.
- Talgorn, B., Le Digabel, S., and Kokkolaras, M. (2015). Statistical surrogate formulations for simulation-based design optimization. Journal of Mechanical Design, 137(2).
More informationMOSEK Optimization Suite
MOSEK Optimization Suite Release 8.1.0.72 MOSEK ApS 2018 CONTENTS 1 Overview 1 2 Interfaces 5 3 Remote optimization 11 4 Contact Information 13 i ii CHAPTER ONE OVERVIEW The problem minimize 1x 1 + 2x
More informationProgramming, numerics and optimization
Programming, numerics and optimization Lecture C-4: Constrained optimization Łukasz Jankowski ljank@ippt.pan.pl Institute of Fundamental Technological Research Room 4.32, Phone +22.8261281 ext. 428 June
More informationResearch Interests Optimization:
Mitchell: Research interests 1 Research Interests Optimization: looking for the best solution from among a number of candidates. Prototypical optimization problem: min f(x) subject to g(x) 0 x X IR n Here,
More informationPARETO FRONT APPROXIMATION WITH ADAPTIVE WEIGHTED SUM METHOD IN MULTIOBJECTIVE SIMULATION OPTIMIZATION. Jong-hyun Ryu Sujin Kim Hong Wan
Proceedings of the 009 Winter Simulation Conference M. D. Rossetti, R. R. Hill, B. Johansson, A. Dunkin, and R. G. Ingalls, eds. PARETO FRONT APPROXIMATION WITH ADAPTIVE WEIGHTED SUM METHOD IN MULTIOBJECTIVE
More informationModern Benders (in a nutshell)
Modern Benders (in a nutshell) Matteo Fischetti, University of Padova (based on joint work with Ivana Ljubic and Markus Sinnl) Lunteren Conference on the Mathematics of Operations Research, January 17,
More informationBuilding Simulation Software for the Next Decade: Trends and Tools
Building Simulation Software for the Next Decade: Trends and Tools Hans Petter Langtangen Center for Biomedical Computing (CBC) at Simula Research Laboratory Dept. of Informatics, University of Oslo September
More informationAn extended supporting hyperplane algorithm for convex MINLP problems
An extended supporting hyperplane algorithm for convex MINLP problems Jan Kronqvist, Andreas Lundell and Tapio Westerlund Center of Excellence in Optimization and Systems Engineering Åbo Akademi University,
More informationMetaheuristic Optimization with Evolver, Genocop and OptQuest
Metaheuristic Optimization with Evolver, Genocop and OptQuest MANUEL LAGUNA Graduate School of Business Administration University of Colorado, Boulder, CO 80309-0419 Manuel.Laguna@Colorado.EDU Last revision:
More informationStochastic branch & bound applying. target oriented branch & bound method to. optimal scenario tree reduction
Stochastic branch & bound applying target oriented branch & bound method to optimal scenario tree reduction Volker Stix Vienna University of Economics Department of Information Business Augasse 2 6 A-1090
More informationPerceptron: This is convolution!
Perceptron: This is convolution! v v v Shared weights v Filter = local perceptron. Also called kernel. By pooling responses at different locations, we gain robustness to the exact spatial location of image
More informationReduced Order Models for Oxycombustion Boiler Optimization
Reduced Order Models for Oxycombustion Boiler Optimization John Eason Lorenz T. Biegler 9 March 2014 Project Objective Develop an equation oriented framework to optimize a coal oxycombustion flowsheet.
More informationSuperdiffusion and Lévy Flights. A Particle Transport Monte Carlo Simulation Code
Superdiffusion and Lévy Flights A Particle Transport Monte Carlo Simulation Code Eduardo J. Nunes-Pereira Centro de Física Escola de Ciências Universidade do Minho Page 1 of 49 ANOMALOUS TRANSPORT Definitions
More informationA PACKAGE FOR DEVELOPMENT OF ALGORITHMS FOR GLOBAL OPTIMIZATION 1
Mathematical Modelling and Analysis 2005. Pages 185 190 Proceedings of the 10 th International Conference MMA2005&CMAM2, Trakai c 2005 Technika ISBN 9986-05-924-0 A PACKAGE FOR DEVELOPMENT OF ALGORITHMS
More informationTeaching numerical methods : a first experience. Ronojoy Adhikari The Institute of Mathematical Sciences Chennai.
Teaching numerical methods : a first experience Ronojoy Adhikari The Institute of Mathematical Sciences Chennai. Numerical Methods : the first class 14 students, spread across second and third years of
More informationIntroduction to ANSYS DesignXplorer
Lecture 5 Goal Driven Optimization 14. 5 Release Introduction to ANSYS DesignXplorer 1 2013 ANSYS, Inc. September 27, 2013 Goal Driven Optimization (GDO) Goal Driven Optimization (GDO) is a multi objective
More informationPattern Recognition Lecture Sequential Clustering
Pattern Recognition Lecture Prof. Dr. Marcin Grzegorzek Research Group for Pattern Recognition Institute for Vision and Graphics University of Siegen, Germany Pattern Recognition Chain patterns sensor
More informationMultidisciplinary Analysis and Optimization
OptiY Multidisciplinary Analysis and Optimization Process Integration OptiY is an open and multidisciplinary design environment, which provides direct and generic interfaces to many CAD/CAE-systems and
More information4.12 Generalization. In back-propagation learning, as many training examples as possible are typically used.
1 4.12 Generalization In back-propagation learning, as many training examples as possible are typically used. It is hoped that the network so designed generalizes well. A network generalizes well when
More informationComputational Economics and Finance
Computational Economics and Finance Part I: Elementary Concepts of Numerical Analysis Spring 2015 Outline Computer arithmetic Error analysis: Sources of error Error propagation Controlling the error Rates
More informationFrom Theory to Application (Optimization and Optimal Control in Space Applications)
From Theory to Application (Optimization and Optimal Control in Space Applications) Christof Büskens Optimierung & Optimale Steuerung 02.02.2012 The paradox of mathematics If mathematics refer to reality,
More informationCOPT: A C++ Open Optimization Library
COPT: A C++ Open Optimization Library {Zhouwang Yang, Ruimin Wang}@MathU School of Mathematical Science University of Science and Technology of China Zhouwang Yang Ruimin Wang University of Science and
More informationThe Mathematics Behind Neural Networks
The Mathematics Behind Neural Networks Pattern Recognition and Machine Learning by Christopher M. Bishop Student: Shivam Agrawal Mentor: Nathaniel Monson Courtesy of xkcd.com The Black Box Training the
More informationAIR FORCE INSTITUTE OF TECHNOLOGY
QUANTITATIVE OBJECT RECONSTRUCTION USING ABEL TRANSFORM TOMOGRAPHY AND MIXED VARIABLE OPTIMIZATION THESIS Kevin Robert O Reilly, Second Lieutenant, USAF AFIT/GSS/ENC/06-1 DEPARTMENT OF THE AIR FORCE AIR
More informationTopology Optimization and JuMP
Immense Potential and Challenges School of Engineering and Information Technology UNSW Canberra June 28, 2018 Introduction About Me First year PhD student at UNSW Canberra Multidisciplinary design optimization
More informationFundamentals of Integer Programming
Fundamentals of Integer Programming Di Yuan Department of Information Technology, Uppsala University January 2018 Outline Definition of integer programming Formulating some classical problems with integer
More informationAdventures in Load Balancing at Scale: Successes, Fizzles, and Next Steps
Adventures in Load Balancing at Scale: Successes, Fizzles, and Next Steps Rusty Lusk Mathematics and Computer Science Division Argonne National Laboratory Outline Introduction Two abstract programming
More informationCOMP9334: Capacity Planning of Computer Systems and Networks
COMP9334: Capacity Planning of Computer Systems and Networks Week 10: Optimisation (1) A/Prof Chun Tung Chou CSE, UNSW COMP9334, Chun Tung Chou, 2016 Three Weeks of Optimisation The lectures for these
More informationME964 High Performance Computing for Engineering Applications
ME964 High Performance Computing for Engineering Applications Outlining Midterm Projects Topic 3: GPU-based FEA Topic 4: GPU Direct Solver for Sparse Linear Algebra March 01, 2011 Dan Negrut, 2011 ME964
More informationTMA946/MAN280 APPLIED OPTIMIZATION. Exam instructions
Chalmers/GU Mathematics EXAM TMA946/MAN280 APPLIED OPTIMIZATION Date: 03 05 28 Time: House V, morning Aids: Text memory-less calculator Number of questions: 7; passed on one question requires 2 points
More informationMultidisciplinary System Design Optimization (MSDO) Course Summary
Multidisciplinary System Design Optimization (MSDO) Course Summary Lecture 23 Prof. Olivier de Weck Prof. Karen Willcox 1 Outline Summarize course content Present some emerging research directions Interactive
More informationDeveloping Optimization Algorithms for Real-World Applications
Developing Optimization Algorithms for Real-World Applications Gautam Ponnappa PC Training Engineer Viju Ravichandran, PhD Education Technical Evangelist 2015 The MathWorks, Inc. 1 2 For a given system,
More informationNodal Basis Functions for Serendipity Finite Elements
Nodal Basis Functions for Serendipity Finite Elements Andrew Gillette Department of Mathematics University of Arizona joint work with Michael Floater (University of Oslo) Andrew Gillette - U. Arizona Nodal
More informationConvex Optimization. Lijun Zhang Modification of
Convex Optimization Lijun Zhang zlj@nju.edu.cn http://cs.nju.edu.cn/zlj Modification of http://stanford.edu/~boyd/cvxbook/bv_cvxslides.pdf Outline Introduction Convex Sets & Functions Convex Optimization
More informationIBM ILOG CPLEX Optimization Studio Getting Started with CPLEX for MATLAB. Version 12 Release 6
IBM ILOG CPLEX Optimization Studio Getting Started with CPLEX for MATLAB Version 12 Release 6 Copyright notice Describes general use restrictions and trademarks related to this document and the software
More informationStochastic Separable Mixed-Integer Nonlinear Programming via Nonconvex Generalized Benders Decomposition
Stochastic Separable Mixed-Integer Nonlinear Programming via Nonconvex Generalized Benders Decomposition Xiang Li Process Systems Engineering Laboratory Department of Chemical Engineering Massachusetts
More informationLS-OPT Current development: A perspective on multi-level optimization, MOO and classification methods
LS-OPT Current development: A perspective on multi-level optimization, MOO and classification methods Nielen Stander, Anirban Basudhar, Imtiaz Gandikota LSTC, Livermore, CA LS-DYNA Developers Forum, Gothenburg,
More informationToday. Golden section, discussion of error Newton s method. Newton s method, steepest descent, conjugate gradient
Optimization Last time Root finding: definition, motivation Algorithms: Bisection, false position, secant, Newton-Raphson Convergence & tradeoffs Example applications of Newton s method Root finding in
More informationINTERIOR POINT METHOD BASED CONTACT ALGORITHM FOR STRUCTURAL ANALYSIS OF ELECTRONIC DEVICE MODELS
11th World Congress on Computational Mechanics (WCCM XI) 5th European Conference on Computational Mechanics (ECCM V) 6th European Conference on Computational Fluid Dynamics (ECFD VI) E. Oñate, J. Oliver
More informationNAG at Manchester. Michael Croucher (University of Manchester)
NAG at Manchester Michael Croucher (University of Manchester) Michael.Croucher@manchester.ac.uk www.walkingrandomly.com Twitter: @walkingrandomly My background PhD Computational Physics from Sheffield
More informationTopological Machining Fixture Layout Synthesis Using Genetic Algorithms
Topological Machining Fixture Layout Synthesis Using Genetic Algorithms Necmettin Kaya Uludag University, Mechanical Eng. Department, Bursa, Turkey Ferruh Öztürk Uludag University, Mechanical Eng. Department,
More informationWebinar Parameter Identification with optislang. Dynardo GmbH
Webinar Parameter Identification with optislang Dynardo GmbH 1 Outline Theoretical background Process Integration Sensitivity analysis Least squares minimization Example: Identification of material parameters
More informationAn augmented Lagrangian method for equality constrained optimization with fast infeasibility detection
An augmented Lagrangian method for equality constrained optimization with fast infeasibility detection Paul Armand 1 Ngoc Nguyen Tran 2 Institut de Recherche XLIM Université de Limoges Journées annuelles
More informationUsing Adaptive Sparse Grids to Solve High-Dimensional Dynamic Models. J. Brumm & S. Scheidegger BFI, University of Chicago, Nov.
Using Adaptive Sparse Grids to Solve High-Dimensional Dynamic Models J. Brumm & S. Scheidegger, Nov. 1st 2013 Outline I.) From Full (Cartesian) Grids to Sparse Grids II.) Adaptive Sparse Grids III.) Time
More information