A Brief Overview of Optimization Problems Steven G. Johnson MIT course 18.335, Fall 2008

Why optimization?
- In some sense, all engineering design is optimization: choosing design parameters to improve some objective.
- Much of data analysis is also optimization: extracting some model parameters from data while minimizing some error measure (e.g. fitting).
- Most business decisions are optimization: varying some decision parameters to maximize profit (e.g. investment portfolios, supply chains, etc.).

A general optimization problem

    minimize f₀(x) over x ∈ ℝⁿ
    subject to m constraints: fᵢ(x) ≤ 0, i = 1, 2, …, m

- minimize an objective function f₀ with respect to n design parameters x (also called decision parameters, optimization variables, etc.)
- x is a feasible point if it satisfies all the constraints; the feasible region is the set of all feasible x
- note that maximizing g(x) corresponds to minimizing f₀(x) = -g(x)
- note that an equality constraint h(x) = 0 yields two inequality constraints fᵢ(x) = h(x) and fᵢ₊₁(x) = -h(x) (although, in practical algorithms, equality constraints typically require special handling)
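
To make the standard form concrete, here is a minimal sketch (mine, not from the slides) using scipy.optimize.minimize: maximizing a made-up g(x) is recast as minimizing f₀(x) = -g(x), and the equality constraint is passed directly to the solver, which handles it specially as noted above, rather than through the two-inequality trick.

```python
# A minimal standard-form sketch (illustrative objective and constraint):
# maximize g(x) = -(x1 - 1)^2 - x2^2  subject to  h(x) = x1 + x2 - 1 = 0,
# recast as minimizing f0(x) = -g(x).
import numpy as np
from scipy.optimize import minimize

def f0(x):
    return (x[0] - 1.0)**2 + x[1]**2          # f0 = -g

# scipy handles the equality constraint h(x) = 0 directly,
# rather than via the two inequalities f_i = h, f_{i+1} = -h
cons = [{"type": "eq", "fun": lambda x: x[0] + x[1] - 1.0}]

res = minimize(f0, x0=np.zeros(2), constraints=cons)
print(res.x)                                  # optimum near [1, 0]
```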

Important considerations
- Global versus local optimization
- Convex vs. non-convex optimization
- Unconstrained or box-constrained optimization, and other special-case constraints
- Special classes of functions (linear, etc.)
- Differentiable vs. non-differentiable functions
- Gradient-based vs. derivative-free algorithms

There are zillions of different algorithms, usually restricted to various special cases, each with strengths and weaknesses.

Global vs. Local Optimization
- For general nonlinear functions, most algorithms only guarantee a local optimum: a feasible xₒ such that f₀(xₒ) ≤ f₀(x) for all feasible x within some neighborhood ‖x - xₒ‖ < R (for some small R).
- A much harder problem is to find a global optimum: the minimum of f₀ over all feasible x.
- The difficulty grows exponentially with increasing n; it is practically impossible to guarantee that you have found the global minimum without knowing some special property of f₀.
- Many algorithms are available, with problem-dependent efficiencies, and not just genetic algorithms or simulated annealing (which are popular, easy to implement, and thought-provoking, but usually very slow!): for example, non-random systematic search algorithms (e.g. DIRECT), partially randomized searches (e.g. CRS2), and repeated local searches from different starting points ("multistart" algorithms, e.g. MLSL).
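
The multistart idea is easy to sketch (a simplified illustration, not the actual MLSL algorithm): run a local optimizer from many random starting points and keep the best local optimum found. The Rastrigin test function below is a standard multimodal example and is not from the slides.

```python
# Multistart sketch: repeated local searches from random starting points.
import numpy as np
from scipy.optimize import minimize

def rastrigin(x):   # many local minima; global minimum f = 0 at x = 0
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

rng = np.random.default_rng(0)
best = None
for _ in range(200):
    x0 = rng.uniform(-5.12, 5.12, size=2)   # random starting point
    r = minimize(rastrigin, x0)             # local search (BFGS by default)
    if best is None or r.fun < best.fun:
        best = r                            # keep the best local optimum

# More restarts raise the odds (but never guarantee) that the best
# local optimum found is the global one.
print(best.x, best.fun)
```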

Convex Optimization

[good reference: Convex Optimization by Boyd and Vandenberghe, free online at www.stanford.edu/~boyd/cvxbook]

All the functions fᵢ (i = 0…m) are convex:

    fᵢ(αx + βy) ≤ αfᵢ(x) + βfᵢ(y)    where α + β = 1, α, β ∈ [0, 1]

[figure: for a convex f(x), the chord αf(x) + βf(y) lies on or above f(αx + βy); a non-convex f(x) dips below some chord]

For a convex problem (convex objective and constraints), any local optimum must be a global optimum, and efficient, robust solution methods are available.
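
One way to get a feel for the convexity inequality is to probe it numerically. The sketch below is my illustration, not from the slides: it samples random x, y, and α and checks f(αx + βy) ≤ αf(x) + βf(y); finding a violation proves non-convexity, while finding none is merely suggestive, not a proof.

```python
# Numerical probe of the convexity inequality for 1-D functions.
import numpy as np

def violates_convexity(f, trials=10000, seed=0):
    rng = np.random.default_rng(seed)
    for _ in range(trials):
        x, y = rng.uniform(-10, 10, size=2)
        a = rng.uniform()              # alpha in [0, 1), beta = 1 - alpha
        b = 1.0 - a
        if f(a * x + b * y) > a * f(x) + b * f(y) + 1e-12:
            return True                # counterexample found: not convex
    return False                       # none found (suggestive only)

print(violates_convexity(np.square))   # False: x^2 is convex
print(violates_convexity(np.sin))      # True: sin(x) is not convex
```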

Important Convex Problems
- LP (linear programming): the objective and constraints are affine: fᵢ(x) = aᵢᵀx + αᵢ
- QP (quadratic programming): affine constraints + convex-quadratic objective xᵀAx + bᵀx
- SOCP (second-order cone program): LP + cone constraints ‖Ax + b‖₂ ≤ aᵀx + α
- SDP (semidefinite programming): constraints require that the matrix Σₖ Aₖxₖ be positive-semidefinite

All of these have very efficient, specialized solution methods.
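
As a concrete instance of the simplest class, here is a small LP expressed in CVXPY, a Python relative of the CVX package listed at the end of these notes (this assumes the cvxpy package is installed; the data are made up for illustration).

```python
# A tiny LP in CVXPY: minimize c^T x subject to A x <= b, x >= 0.
import numpy as np
import cvxpy as cp

A = np.array([[1.0, 2.0], [3.0, 1.0]])
b = np.array([4.0, 6.0])
c = np.array([-1.0, -1.0])      # minimizing -x1 - x2 = maximizing x1 + x2

x = cp.Variable(2)
prob = cp.Problem(cp.Minimize(c @ x), [A @ x <= b, x >= 0])
prob.solve()                    # dispatched to a specialized LP solver
print(x.value, prob.value)      # optimum at x = (1.6, 1.2), value -2.8
```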

Important special constraints
- The simplest case is the unconstrained optimization problem: m = 0
  - e.g., line-search methods like steepest descent, nonlinear conjugate gradients, Newton methods
- Next-simplest are box constraints (also called bound constraints): xₖᵐⁱⁿ ≤ xₖ ≤ xₖᵐᵃˣ
  - easily incorporated into line-search methods and many other algorithms
  - many algorithms/software only handle box constraints
- Linear equality constraints Ax = b
  - can be explicitly eliminated from the problem by writing x = Ny + ξ, where ξ is a particular solution of Aξ = b and the columns of N are a basis for the nullspace of A
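
A sketch of that nullspace elimination follows (the symbols ξ and N match the description above; the specific constraint is made up for illustration): every choice of the reduced variable y yields a feasible x, so an equality-constrained problem in x becomes an unconstrained problem in y.

```python
# Eliminating a linear equality constraint A x = b via the nullspace of A.
import numpy as np
from scipy.linalg import lstsq, null_space

A = np.array([[1.0, 1.0, 1.0]])     # one constraint: x1 + x2 + x3 = 1
b = np.array([1.0])

xi = lstsq(A, b)[0]                 # particular solution of A xi = b
N = null_space(A)                   # columns span the nullspace of A

y = np.array([0.3, -0.2])           # any y gives a feasible x = N y + xi
x = N @ y + xi
print(A @ x - b)                    # ~ [0.]: constraint satisfied
```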

Derivatives of fᵢ
- The most-efficient algorithms typically require the user to supply the gradients ∇ₓfᵢ of the objective/constraints.
  - You should always compute these analytically; rather than using finite-difference approximations, it is better to just use a derivative-free optimization algorithm.
  - In principle, one can always compute ∇ₓfᵢ with about the same cost as fᵢ, using adjoint methods.
  - Gradient-based methods can find (local) optima of problems with millions of design parameters.
- Derivative-free methods only require fᵢ values.
  - Easier to use, and can work with complicated "black-box" functions where computing gradients is inconvenient.
  - May be the only possibility for nondifferentiable problems.
  - Need more than n function evaluations, which is bad for large n.
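
A minimal comparison (my illustration, not from the slides) of a gradient-based method given an analytic gradient versus a derivative-free method on the same smooth objective:

```python
# Gradient-based (analytic gradient) vs. derivative-free on Rosenbrock.
import numpy as np
from scipy.optimize import minimize

def f(x):                          # 2-D Rosenbrock function
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def grad_f(x):                     # its analytic gradient
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])

x0 = np.array([-1.2, 1.0])
r1 = minimize(f, x0, jac=grad_f, method="BFGS")   # gradient-based
r2 = minimize(f, x0, method="Nelder-Mead")        # derivative-free

# The derivative-free run typically needs several times more function
# evaluations, and the gap grows with the dimension n.
print(r1.nfev, r2.nfev)
```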

Removable non-differentiability

Consider the non-differentiable unconstrained problem: minimize |f₀(x)| over x ∈ ℝⁿ.

[figure: |f₀(x)| has a kink at the optimum, where f₀ crosses zero]

This is equivalent to the minimax problem

    minimize over x ∈ ℝⁿ:  max{ f₀(x), -f₀(x) }

which is still nondifferentiable. That in turn is equivalent to a constrained problem with a temporary variable t:

    minimize t over x ∈ ℝⁿ, t ∈ ℝ
    subject to: t ≥ f₀(x)    (f₁(x) = f₀(x) - t ≤ 0)
                t ≥ -f₀(x)   (f₂(x) = -f₀(x) - t ≤ 0)
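
Here is a sketch of that reformulation for a made-up one-dimensional f₀ (not from the slides), treating z = (x, t) as the combined variable and expressing the two inequalities as generic nonlinear constraints:

```python
# Epigraph trick: minimize |f0(x)| by minimizing t with t >= +/- f0(x).
import numpy as np
from scipy.optimize import minimize

def f0(x):
    return x**2 - 2.0      # |f0| is minimized (to 0) at x = sqrt(2)

cons = [
    {"type": "ineq", "fun": lambda z: z[1] - f0(z[0])},  # t - f0(x) >= 0
    {"type": "ineq", "fun": lambda z: z[1] + f0(z[0])},  # t + f0(x) >= 0
]
res = minimize(lambda z: z[1],                 # objective is just t
               x0=np.array([1.0, 5.0]), constraints=cons)
print(res.x)               # x near sqrt(2) ~ 1.414, t near 0
```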

Example: Chebyshev linear fitting

Fit a line x₁a + x₂ to N points (aᵢ, bᵢ) by minimizing the maximum error:

    minimize over x₁, x₂:  maxᵢ |x₁aᵢ + x₂ - bᵢ|

a nondifferentiable minimax problem. It is equivalent to a linear programming problem (LP):

    minimize t over x₁, x₂, t
    subject to 2N constraints:
        x₁aᵢ + x₂ - bᵢ - t ≤ 0
        bᵢ - x₁aᵢ - x₂ - t ≤ 0
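
A sketch of this LP on synthetic data (the data are mine, not from the slides), with variables (x₁, x₂, t) and the 2N constraints above, using scipy.optimize.linprog:

```python
# Chebyshev (minimax) line fit as an LP with variables v = [x1, x2, t].
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
a = np.linspace(0, 1, 20)
b = 2.0 * a + 0.5 + rng.uniform(-0.1, 0.1, size=a.size)  # noisy line

c = np.array([0.0, 0.0, 1.0])           # minimize t
ones = np.ones((a.size, 1))
#  x1*a_i + x2 - b_i - t <= 0   and   b_i - x1*a_i - x2 - t <= 0
A_ub = np.block([[ a[:, None],  ones, -ones],
                 [-a[:, None], -ones, -ones]])
b_ub = np.concatenate([b, -b])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * 3)
print(res.x)    # [x1, x2, t]: slope ~ 2, intercept ~ 0.5, max error t
```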

Relaxations of Integer Programming

If x is integer-valued rather than real-valued (e.g. x ∈ {0,1}ⁿ), the resulting integer programming or combinatorial optimization problem becomes much harder in general. However, useful results can often be obtained by a continuous relaxation of the problem, e.g. going from x ∈ {0,1}ⁿ to x ∈ [0,1]ⁿ. At the very least, this gives a lower bound on the optimum f₀.
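
For instance (a made-up knapsack-style example, not from the slides), relaxing x ∈ {0,1}ⁿ to x ∈ [0,1]ⁿ turns a 0/1 problem into an LP whose optimum bounds the integer optimum from below:

```python
# LP relaxation of a tiny 0/1 knapsack: min -v^T x s.t. w^T x <= W.
import numpy as np
from scipy.optimize import linprog

v = np.array([6.0, 5.0, 4.0])        # item values (we minimize -value)
w = np.array([[3.0, 2.0, 2.0]])      # item weights
W = np.array([3.0])                  # knapsack capacity

res = linprog(-v, A_ub=w, b_ub=W, bounds=[(0, 1)] * 3)  # x in [0,1]^3
# The fractional optimum is -7 (e.g. x = [1/3, 1, 0]), a lower bound
# on the integer optimum -6 (take item 1 alone).
print(res.x, res.fun)
```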

Example: Topology Optimization

Design a structure to do something, made of material A or B: let every pixel of the discretized structure vary continuously from A to B. For example, design a cantilever to support the maximum weight with a fixed amount of material, with the density of each pixel varying continuously from 0 (air) to some maximum.

[image of the optimized structure, deformed under the applied force, removed due to copyright restrictions]

[Buhl et al., Struct. Multidisc. Optim. 19, 93–104 (2000)]

Some Sources of Software
- Decision tree for optimization software: http://plato.asu.edu/guide.html (lists many packages for many problems)
- CVX: general convex-optimization package: http://www.stanford.edu/~boyd/cvx
- NLopt: implements many nonlinear optimization algorithms (global/local, constrained/unconstrained, derivative/no-derivative): http://ab-initio.mit.edu/nlopt
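
For example, here is a minimal NLopt usage sketch in Python (assuming the nlopt Python bindings are installed; the quadratic objective is just an illustration):

```python
# Minimizing a simple quadratic with NLopt's low-storage BFGS (LD_LBFGS).
import numpy as np
import nlopt

def f(x, grad):                     # NLopt passes grad to be filled in
    if grad.size > 0:
        grad[:] = 2 * (x - 1.0)     # analytic gradient of sum((x-1)^2)
    return float(np.sum((x - 1.0)**2))

opt = nlopt.opt(nlopt.LD_LBFGS, 2)  # algorithm, dimension n = 2
opt.set_min_objective(f)
opt.set_xtol_rel(1e-8)

x = opt.optimize(np.zeros(2))       # starting point
print(x, opt.last_optimum_value())  # optimum near [1, 1], value near 0
```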

MIT OpenCourseWare, http://ocw.mit.edu
18.335J / 6.337J Introduction to Numerical Methods, Fall 2010
For information about citing these materials or our Terms of Use, visit http://ocw.mit.edu/terms.