Constrained and Unconstrained Optimization


Carlos Hurtado, Department of Economics, University of Illinois at Urbana-Champaign. October 10th, 2017.

On the Agenda

1 Numerical Optimization
2 Minimization of a Scalar Function
3 Golden Search
4 Newton's Method
5 Polytope Method
6 Newton's Method Reloaded
7 Quasi-Newton Methods
8 Non-linear Least-Squares
9 Constrained Optimization

Numerical Optimization

In some economic problems we would like to find the value that maximizes or minimizes a function. We are going to focus on minimization problems:

min_x f(x)    or    min_x f(x)  s.t.  x ∈ B

Notice that minimization and maximization are equivalent, because we can maximize f(x) by minimizing −f(x).

5 Numerical Optimization Numerical Optimization We want to solve this problem in a reasonable time Most often, the CPU time is dominated by the cost of evaluating f (x). We will like to keep the number of evaluations of f (x) as small as possible. There are two types of objectives: - Finding global minimum: The lowest possible value of the function over the range. - Finding a local minimum: Smallest value within a bounded neighborhood. C. Hurtado (UIUC - Economics) Numerical Methods 2 / 27

Minimization of a Scalar Function

Bracketing Method

We would like to find the minimum of a scalar function f(x), where f : R → R. The bracketing method is a direct method that does not use curvature or a local approximation.

We start with a bracket (a, b, c) such that a < b < c, f(a) > f(b) and f(c) > f(b). We search for the minimum by selecting a trial point d in one of the two intervals:
- If c − b > b − a, take d = (b + c)/2.
- Else, if c − b ≤ b − a, take d = (a + b)/2.

If f(d) > f(b), there is a new bracket: (d, b, c) or (a, b, d). If f(d) < f(b), the new bracket is (a, d, b) or (b, d, c). Continue until the distance between the extremes of the bracket is small.
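A minimal sketch of this procedure in Python (the function name and tolerance are our own choices, not from the slides):

def bracket_minimize(f, a, b, c, tol=1e-8):
    """Shrink a bracket (a, b, c) with f(b) < f(a) and f(b) < f(c)."""
    while c - a > tol:
        if c - b > b - a:            # trial point in the larger sub-interval
            d = 0.5 * (b + c)
            if f(d) > f(b):
                c = d                # new bracket (a, b, d)
            else:
                a, b = b, d          # new bracket (b, d, c)
        else:
            d = 0.5 * (a + b)
            if f(d) > f(b):
                a = d                # new bracket (d, b, c)
            else:
                b, c = d, b          # new bracket (a, d, b)
    return b

print(bracket_minimize(lambda x: (x - 1)**2, -3.0, 0.0, 5.0))   # ≈ 1.0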

We selected the new point as the midpoint between the extremes, but what is the best location for the new point d (with a < b < d < c)?

One possibility is to minimize the size of the next search interval, which will run either from a to d or from b to c. The proportion of the left interval is w = (b − a)/(c − a), and the proportion of the new interval is z = (d − b)/(c − a).

Golden Search

The proportion of the new segment will be either

1 − w = (c − b)/(c − a)    or    w + z = (d − a)/(c − a).

Requiring the two possible new intervals to have the same size gives z = 1 − 2w. Moreover, if d is the new candidate to minimize the function, ideally d should sit in the same relative position within the new interval as b does in the old one:

z/(1 − w) = (d − b)/(c − b) = w.

The previous equations imply w² − 3w + 1 = 0, so w = (3 − √5)/2 ≈ 0.382. In mathematics, the golden ratio is φ = (1 + √5)/2 ≈ 1.618; this goes back to Pythagoras. Notice that w = 1 − 1/φ.

The Golden Search algorithm uses the golden ratio to set the new point (using a weighted average). This reduces the bracket by about 40% per iteration, and the performance is independent of the function that is being minimized.
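A compact golden-section search along these lines (a sketch; the interface and tolerance are our own):

import math

def golden_search(f, a, c, tol=1e-8):
    """Golden-section search for a minimum of f on [a, c]."""
    invphi = (math.sqrt(5) - 1) / 2      # 1/φ ≈ 0.618
    while c - a > tol:
        b = c - invphi * (c - a)         # interior points at golden proportions
        d = a + invphi * (c - a)
        if f(b) < f(d):
            c = d                        # the minimum lies in [a, d]
        else:
            a = b                        # the minimum lies in [b, c]
    return 0.5 * (a + c)

print(golden_search(lambda x: (x - 2) * x * (x + 2)**2, 1.0, 2.0))   # ≈ 1.2808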

Sometimes the performance can be improved substantially when a local approximation is used. When we use a combination of a local approximation and golden search we get a method called Brent's method.

Let us suppose that we want to minimize y = x(x − 2)(x + 2)².

[Figure: plot of y = x(x − 2)(x + 2)².]

We can use the minimize_scalar function from the scipy.optimize module:

>>> def f(x):
...     return (x - 2) * x * (x + 2)**2
>>> from scipy.optimize import minimize_scalar
>>> opt_res = minimize_scalar(f)
>>> print(opt_res.x)
1.28077640...
>>> opt_res = minimize_scalar(f, method='golden')
>>> print(opt_res.x)
1.28077640...
>>> opt_res = minimize_scalar(f, bounds=(-3, -1), method='bounded')
>>> print(opt_res.x)
-2.0000002...

Newton's Method

Let us assume that the function f : R → R is infinitely differentiable. We would like to find x* such that f(x*) ≤ f(x) for all x ∈ R.

Idea: use a Taylor approximation of the function f(x). The polynomial approximation of order two around a is

p(x) = f(a) + f′(a)(x − a) + (1/2) f″(a)(x − a)².

To find an optimal value for p(x) we use the FOC:

p′(x) = f′(a) + (x − a) f″(a) = 0.

Hence, x = a − f′(a)/f″(a).

Newton's method starts with a given x₁. To compute the next candidate to minimize the function we use

x_{n+1} = x_n − f′(x_n)/f″(x_n).

Do this until |x_{n+1} − x_n| < ε and |f′(x_{n+1})| < ε.

Newton's method is very fast (quadratic convergence). Theorem:

|x_{n+1} − x*| ≤ |f‴(x*) / (2 f″(x*))| · |x_n − x*|².
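A minimal sketch of this iteration (the helper name, tolerances and hand-coded derivatives are our own):

def newton_minimize(fprime, fprime2, x, eps=1e-10, max_iter=100):
    """Newton's method for minimization: iterate x <- x - f'(x)/f''(x)."""
    for _ in range(max_iter):
        step = fprime(x) / fprime2(x)
        x -= step
        if abs(step) < eps and abs(fprime(x)) < eps:
            break
    return x

# Derivatives of f(x) = (x - 2) * x * (x + 2)**2 = x**4 + 2*x**3 - 4*x**2 - 8*x
fp  = lambda x: 4*x**3 + 6*x**2 - 8*x - 8    # f'(x)
fp2 = lambda x: 12*x**2 + 12*x - 8           # f''(x)
print(newton_minimize(fp, fp2, x=1.0))       # ≈ 1.2808, a local minimum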

A Quick Detour: Root Finding

Consider the problem of finding zeros of p(x). Assume that you know a point a where p(a) is positive and a point b where p(b) is negative. If p(x) is continuous between a and b, we can approximate it as

p(x) ≈ p(a) + (x − a) p′(a).

The approximate zero is then x = a − p(a)/p′(a). The idea is the same as before: Newton's method also works for finding roots.
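For instance, with scipy (a usage sketch; the equation is our own example):

from scipy.optimize import newton

root = newton(lambda x: x**3 - 2, x0=1.0)   # solve x**3 - 2 = 0
print(root)                                  # ≈ 1.2599, the cube root of 2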

There are several issues with Newton's method:
- the iteration point can be stationary,
- the starting point can enter a cycle,
- the derivative may not exist,
- the derivative may be discontinuous.

Newton's method finds a local optimum, but not necessarily a global optimum.

Polytope Method

The Polytope (a.k.a. Nelder-Mead) method is a direct method to find the solution of

min_x f(x),

where f : Rⁿ → R. For n = 2 we start with points x₁, x₂ and x₃ such that f(x₁) ≥ f(x₂) ≥ f(x₃). Using the midpoint between x₂ and x₃, we reflect x₁ to the point y₁ and check whether f(y₁) < f(x₁). If true, we have a new polytope; if not, try reflecting x₂; if not, try x₃. If nothing works, shrink the polytope toward x₃. Stop when the size of the polytope is smaller than ε. A simplified sketch follows below.
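A sketch of a simplified polytope step for n = 2 (it only reflects the worst vertex or shrinks, and the quadratic test function is our own):

import numpy as np

def polytope_step(f, simplex):
    """Reflect the worst vertex through the midpoint of the others, else shrink."""
    x1, x2, x3 = sorted(simplex, key=f, reverse=True)    # f(x1) >= f(x2) >= f(x3)
    mid = 0.5 * (x2 + x3)
    y1 = mid + (mid - x1)                                # reflection of the worst vertex
    if f(y1) < f(x1):
        return [y1, x2, x3]                              # accept the reflected point
    return [x1 + 0.5*(x3 - x1), x2 + 0.5*(x3 - x2), x3]  # shrink toward the best vertex

f = lambda x: (x[0] - 1)**2 + (x[1] - 2)**2
simplex = [np.array([0., 0.]), np.array([1., 0.]), np.array([0., 1.])]
for _ in range(200):
    simplex = polytope_step(f, simplex)
print(min(simplex, key=f))   # ≈ [1. 2.]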

Let us consider the following function:

f(x₀, x₁) = (1 − x₀)² + 100 (x₁ − x₀²)²

[Figure: surface and contour plots of f(x₀, x₁).]

In Python we can do:

>>> from scipy.optimize import fmin
>>> def f2(x):
...     return (1 - x[0])**2 + 100*(x[1] - x[0]**2)**2
>>> opt = fmin(func=f2, x0=[0, 0])
>>> print(opt)   # converges to approximately [1. 1.]

Newton's Method Reloaded

What can we do if we want to use Newton's method for a function f : Rⁿ → R?

We can use a quadratic approximation at a = (a₁, …, a_n):

p(x) = f(a) + ∇f(a)(x − a) + (1/2)(x − a)ᵀ H(a)(x − a),

where x = (x₁, …, x_n). The gradient ∇f(x) is a multi-variable generalization of the derivative:

∇f(x) = (∂f(x)/∂x₁, …, ∂f(x)/∂x_n).

The Hessian matrix H(x) is a square matrix of second-order partial derivatives that describes the local curvature of a function of many variables:

H(x) = [∂²f(x)/∂x_i ∂x_j]    for i, j = 1, …, n.

The FOC is

∇p(x) = ∇f(a) + H(a)(x − a) = 0.

We can solve this to get x = a − H(a)⁻¹ ∇f(a).

Following the same logic as in the one-dimensional case:

x_{k+1} = x_k − H(x_k)⁻¹ ∇f(x_k).

How do we compute H(x_k)⁻¹ ∇f(x_k)? We can solve

H(x_k) s = ∇f(x_k).

The search direction, s, is the solution of a system of linear equations (and we know how to solve that!).
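A sketch of the multivariate Newton iteration applied to the function from the previous section, with the gradient and Hessian coded by hand (the helper names are our own):

import numpy as np

def newton_nd(grad, hess, x, eps=1e-10, max_iter=100):
    """Multivariate Newton: solve H(x) s = grad f(x), then step x <- x - s."""
    x = np.asarray(x, dtype=float)
    for _ in range(max_iter):
        s = np.linalg.solve(hess(x), grad(x))   # search direction
        x -= s
        if np.linalg.norm(s) < eps:
            break
    return x

# Gradient and Hessian of f(x0, x1) = (1 - x0)**2 + 100*(x1 - x0**2)**2
def grad(x):
    return np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                     200*(x[1] - x[0]**2)])

def hess(x):
    return np.array([[2 - 400*x[1] + 1200*x[0]**2, -400*x[0]],
                     [-400*x[0],                    200.0]])

print(newton_nd(grad, hess, x=[0.0, 0.0]))   # converges to [1. 1.]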

Quasi-Newton Methods

For Newton's method we need the Hessian of the function; if the Hessian is unavailable, the full Newton's method cannot be used. Any method that replaces the Hessian with an approximation is a quasi-Newton method.

One advantage of quasi-Newton methods is that the Hessian matrix does not need to be inverted. Newton's method requires the Hessian to be inverted, which is typically implemented by solving a system of equations; quasi-Newton methods usually generate an estimate of the inverse directly.

In the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm, the Hessian matrix is approximated using updates specified by gradient evaluations (or approximate gradient evaluations). In Python:

>>> import numpy as np
>>> from scipy.optimize import fmin_bfgs
>>> def f(x):
...     return (1 - x[0])**2 + 100*(x[1] - x[0]**2)**2
>>> opt = fmin_bfgs(f, x0=[0.5, 0.5])

Using the gradient we can improve the approximation:

>>> def gradient(x):
...     return np.array((-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
...                      200*(x[1] - x[0]**2)))
>>> opt2 = fmin_bfgs(f, x0=[10, 10], fprime=gradient)

One of the methods that requires the fewest function calls (and is therefore very fast) is the Newton-Conjugate-Gradient (NCG) method. It uses a conjugate gradient algorithm to (approximately) invert the local Hessian. If the Hessian is positive definite, then the local minimum of this function can be found by setting the gradient of the quadratic form to zero. In Python:

>>> from scipy.optimize import fmin_ncg
>>> opt3 = fmin_ncg(f, x0=[10, 10], fprime=gradient)

Non-linear Least-Squares

Suppose it is desired to fit a set of data {x_i, y_i} to a model, y = f(x; p), where p is a vector of parameters for the model that need to be found. A common method for determining which parameter vector gives the best fit to the data is to minimize the sum of squared errors. (Why?)

The error is usually defined for each observed data point as

e_i(y_i, x_i; p) = y_i − f(x_i; p),

and the sum of the squared errors is

S(p; x, y) = Σ_{i=1}^N e_i²(y_i, x_i; p).

Suppose that we model some population data observed at several times t_i:

y_i = f(t_i; (A, b)) = A e^{b t_i}.

The parameters A and b are unknown to the economist. We would like to minimize the square of the error to approximate the data.

[Figure: observed population data and the fitted exponential model.]
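A sketch of such a fit with scipy.optimize.curve_fit, which minimizes the sum of squared errors; the data below are synthetic and the "true" parameters (A = 2.5, b = 0.3) are our own choices for illustration:

import numpy as np
from scipy.optimize import curve_fit

def model(t, A, b):
    return A * np.exp(b * t)

# Synthetic "population" data generated from made-up parameters plus noise
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 50)
y = model(t, 2.5, 0.3) + rng.normal(scale=0.5, size=t.size)

# curve_fit minimizes the sum of squared errors over (A, b)
(A_hat, b_hat), cov = curve_fit(model, t, y, p0=[1.0, 0.1])
print(A_hat, b_hat)   # close to 2.5 and 0.3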

Constrained Optimization

Let us find the minimum of a scalar function subject to constraints:

min_{x ∈ Rⁿ} f(x)  s.t.  g(x) = a and h(x) ≤ b.

Here we have g : Rⁿ → R^m and h : Rⁿ → R^k. Notice that we can re-write the problem as an unconstrained version:

min_{x ∈ Rⁿ} f(x) + (p/2) [ Σ_{i=1}^m (g_i(x) − a_i)² + Σ_{j=1}^k max{0, h_j(x) − b_j}² ].

For a very large value of p, the constraints need to be satisfied (penalty method).
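A small sketch of the penalty method on a toy problem of our own: minimize x₀² + x₁² subject to x₀ + x₁ = 1, whose true solution is (0.5, 0.5).

from scipy.optimize import fmin

def penalized(x, p=100.0):
    f = x[0]**2 + x[1]**2
    g = x[0] + x[1] - 1.0        # equality constraint g(x) - a = 0
    return f + 0.5 * p * g**2    # quadratic penalty

opt = fmin(penalized, x0=[0.0, 0.0], disp=False)
print(opt)   # ≈ [0.495 0.495]; as p grows, this approaches (0.5, 0.5)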

If the objective function is quadratic, the optimization problem looks like

min_{x ∈ Rⁿ} q(x) = (1/2) xᵀ G x + xᵀ c  s.t.  g(x) = a and h(x) ≤ b.

The structure of this type of problem can be efficiently exploited. This forms the basis for Augmented Lagrangian and Sequential Quadratic Programming methods.
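For the equality-constrained case the structure really is one linear system: the FOC of the Lagrangian stacks G and the constraint matrix into a single KKT system. A sketch, with illustrative problem data of our own:

import numpy as np

# min (1/2) x'Gx + x'c  s.t.  Ax = a   (illustrative data)
G = np.array([[2.0, 0.0], [0.0, 2.0]])
c = np.array([-2.0, -5.0])
A = np.array([[1.0, 1.0]])     # one constraint: x0 + x1 = 1
a = np.array([1.0])

# KKT conditions: Gx + c + A'lam = 0 and Ax = a, one linear system in (x, lam)
n, m = G.shape[0], A.shape[0]
KKT = np.block([[G, A.T], [A, np.zeros((m, m))]])
sol = np.linalg.solve(KKT, np.concatenate([-c, a]))
x, lam = sol[:n], sol[n:]
print(x)   # [-0.25  1.25], the constrained minimizer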

The Augmented Lagrangian methods use a mix of the Lagrangian with the penalty method. The Sequential Quadratic Programming algorithms (SQP) solve the problem by using quadratic approximations of the Lagrangian function; SQP is the analogue of Newton's method for the constrained case.

How does the algorithm solve the problem? It is possible with extensions of the simplex method, which we will not cover. The previous extensions can be solved with the BFGS algorithm.

Let us consider the utility maximization problem of an agent with a constant elasticity of substitution (CES) utility function:

U(x, y) = (α x^ρ + (1 − α) y^ρ)^{1/ρ}.

Denote by p_x and p_y the prices of goods x and y respectively. The constrained optimization problem for the consumer is

max_{x,y} U(x, y; ρ, α)  subject to  x ≥ 0, y ≥ 0 and p_x x + p_y y = M.
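A sketch of how this problem could be solved with scipy's SLSQP method (an SQP implementation); the parameter values below are our own, chosen only for illustration:

from scipy.optimize import minimize

alpha, rho = 0.3, 0.5           # illustrative CES parameters (our choice)
px, py, M = 1.0, 2.0, 10.0      # illustrative prices and income (our choice)

def neg_utility(z):
    x, y = z
    # maximize U by minimizing -U
    return -(alpha * x**rho + (1 - alpha) * y**rho)**(1 / rho)

budget = {'type': 'eq', 'fun': lambda z: px * z[0] + py * z[1] - M}
res = minimize(neg_utility, x0=[1.0, 1.0], method='SLSQP',
               bounds=[(0, None), (0, None)], constraints=[budget])
print(res.x)   # the optimal bundle (x*, y*)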
