Local and Global Minimum


1 Local and Global Minimum
Stationary Point. From elementary calculus, a single-variable function $f(x)$ has a stationary point at $x^*$ if the derivative vanishes there, i.e., $f'(x^*) = 0$. Graphically, the slope of the function is zero at the stationary point, which may represent a minimum, a maximum, or a point of inflexion.
Local Minimum. A multi-variable function $f(\mathbf{x})$, $\mathbf{x} \in \mathbb{R}^n$, has a local minimum at $\mathbf{x}^*$ if $f(\mathbf{x}^*) \le f(\mathbf{x})$ in a small neighborhood around $\mathbf{x}^*$, defined by $\|\mathbf{x} - \mathbf{x}^*\| < \delta$.
Global Minimum. The multi-variable function $f(\mathbf{x})$ has a global minimum at $\mathbf{x}^*$ if $f(\mathbf{x}^*) \le f(\mathbf{x})$ for all $\mathbf{x}$ in a feasible region $\Omega$ defined by the problem.

2 Necessary and Sufficient Conditions
The necessary conditions must be satisfied at the optimum point; however, other points (maxima, points of inflexion) may also satisfy them. If a candidate point satisfies the sufficient conditions, then it is indeed an optimum point. However, failing to satisfy the sufficient conditions does not preclude the existence of an optimum point.

3 Extreme Value Theorem
The Extreme Value Theorem (attributed to Karl Weierstrass) provides sufficient conditions for the existence of the minimum (or maximum) of a function defined over a compact domain. The theorem states: A continuous function $f(\mathbf{x})$ defined over a closed and bounded set $S$ attains its maximum and minimum in $S$. According to this theorem, if the feasible region $\Omega$ of the problem is closed and bounded, a minimum for the problem exists.

4 Optimality Criteria: Unconstrained Problems
Consider a multi-variable function $f(\mathbf{x})$, $\mathbf{x} = [x_1, x_2, \ldots, x_n]^T$, where we wish to investigate the behavior of a candidate point $\mathbf{x}^*$. The point $\mathbf{x}^*$ is a local minimum of $f$ only if $f(\mathbf{x}^*) \le f(\mathbf{x})$ in a small neighborhood of $\mathbf{x}^*$.
Let $\delta\mathbf{x} = \mathbf{x} - \mathbf{x}^*$, and use a first-order Taylor series expansion of $f$ to write:
$$\delta f = f(\mathbf{x}) - f(\mathbf{x}^*) = \nabla f(\mathbf{x}^*)^T \delta\mathbf{x} \ge 0.$$
Since the variations $\delta\mathbf{x}$ are arbitrary (in particular, $\delta\mathbf{x}$ may be replaced by $-\delta\mathbf{x}$), the first-order necessary condition (FONC) for a local minimum is given as:
FONC: If $f(\mathbf{x})$ has a local minimum at $\mathbf{x}^*$, then $\nabla f(\mathbf{x}^*) = \mathbf{0}$; equivalently, $\dfrac{\partial f(\mathbf{x}^*)}{\partial x_i} = 0,\ i = 1, \ldots, n$.
The points that satisfy FONC are called stationary points of $f$. Besides minima, these include maxima and points of inflexion.

5 Polynomial Data Fitting
Problem: Fit an $m$th-degree polynomial $p(x) = \sum_{j=0}^{m} c_j x^j$ to $N$ data points $(x_i, y_i),\ i = 1, \ldots, N$, such that the mean square error (MSE) is minimized:
$$\min_{\mathbf{c}}\ \frac{1}{2}\sum_{i=1}^{N}\left(y_i - p(x_i)\right)^2$$
FONC: $\dfrac{\partial}{\partial c_j}\left[\dfrac{1}{2}\sum_{i=1}^{N}\left(y_i - p(x_i)\right)^2\right] = -\sum_{i=1}^{N}\left(y_i - p(x_i)\right)x_i^{\,j} = 0,\ j = 0, 1, \ldots, m$, which are the normal equations of the least-squares problem.
Finally, since the problem is convex, FONC are both necessary and sufficient for a minimum.
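To make the normal equations concrete, here is a minimal numerical sketch (the sample data, noise level, and degree $m = 2$ are made up for illustration):

```python
import numpy as np

# Hypothetical data: noisy samples from y = 1 + 2x - 0.5x^2
rng = np.random.default_rng(0)
x = np.linspace(0, 4, 20)
y = 1 + 2*x - 0.5*x**2 + 0.05*rng.standard_normal(x.size)

m = 2                                        # polynomial degree
A = np.vander(x, m + 1, increasing=True)     # columns: x^0, x^1, ..., x^m

# FONC (normal equations): A^T A c = A^T y
c = np.linalg.solve(A.T @ A, A.T @ y)
print("coefficients c0..cm:", c)

residual = y - A @ c
print("objective (1/2)||r||^2:", 0.5 * residual @ residual)
```

Because the objective is convex in the coefficients, the solution of the normal equations is the global minimum.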

6 Second Order Conditions
Assume now that FONC are satisfied, i.e., $\nabla f(\mathbf{x}^*) = \mathbf{0}$. Then we may use a second-order Taylor series expansion of $f$ to write the optimality condition as:
$$\delta f = \tfrac{1}{2}\,\delta\mathbf{x}^T \nabla^2 f(\mathbf{x}^*)\,\delta\mathbf{x} \ge 0.$$
The above quadratic form is positive (semi)definite if and only if the Hessian matrix $\nabla^2 f(\mathbf{x}^*)$ is positive (semi)definite. Therefore, the second-order necessary condition (SONC) is stated as:
SONC: If $\mathbf{x}^*$ is a local minimizer of $f$, then $\delta\mathbf{x}^T \nabla^2 f(\mathbf{x}^*)\,\delta\mathbf{x} \ge 0$ for all $\delta\mathbf{x}$, i.e., $\nabla^2 f(\mathbf{x}^*) \succeq 0$.
A stronger second-order sufficient condition is given as:
SOSC: If $\mathbf{x}^*$ satisfies $\nabla f(\mathbf{x}^*) = \mathbf{0}$ and $\nabla^2 f(\mathbf{x}^*) \succ 0$, then $\mathbf{x}^*$ is a local minimizer of $f$.
In the event that $\nabla^2 f(\mathbf{x}^*) = 0$, the lowest nonzero derivative must be even-ordered for stationary points (necessary condition), and be positive for a local minimum (sufficient condition).
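The definiteness test can be automated by examining the Hessian's eigenvalues; a small illustrative sketch (the helper name is ours):

```python
import numpy as np

def classify_stationary_point(hessian: np.ndarray, tol: float = 1e-10) -> str:
    """Classify a stationary point from the eigenvalues of the (symmetric) Hessian."""
    eig = np.linalg.eigvalsh(hessian)
    if np.all(eig > tol):
        return "local minimum (SOSC holds: Hessian positive definite)"
    if np.all(eig < -tol):
        return "local maximum"
    if np.any(eig > tol) and np.any(eig < -tol):
        return "saddle point"
    return "inconclusive (semidefinite); inspect higher-order derivatives"

# f = x1^2 + x2^2 at its stationary point (0,0): Hessian = 2I -> minimum
print(classify_stationary_point(np.array([[2.0, 0.0], [0.0, 2.0]])))
# f = x1^2 - x2^2 at (0,0): indefinite Hessian -> saddle point
print(classify_stationary_point(np.array([[2.0, 0.0], [0.0, -2.0]])))
```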

7 Optimality Criteria: Equality Constrained Problems
Consider an optimization problem with a single equality constraint:
$$\min_{x_1, x_2} f(x_1, x_2), \quad \text{subject to: } h(x_1, x_2) = 0$$
Consider the first-order variations in the objective and constraint functions at a candidate point:
$$\delta f = \frac{\partial f}{\partial x_1}\delta x_1 + \frac{\partial f}{\partial x_2}\delta x_2 = 0, \qquad \delta h = \frac{\partial h}{\partial x_1}\delta x_1 + \frac{\partial h}{\partial x_2}\delta x_2 = 0$$
or $\nabla f + \lambda \nabla h = \mathbf{0}$ for some multiplier $\lambda$. Define a Lagrangian function: $\mathcal{L}(x_1, x_2, \lambda) = f(x_1, x_2) + \lambda\, h(x_1, x_2)$; then
FONC: $\dfrac{\partial\mathcal{L}}{\partial x_i} = 0,\ i = 1, 2; \qquad \dfrac{\partial\mathcal{L}}{\partial\lambda} = h(x_1, x_2) = 0$

8 Equality Constrained Problems
Consider a problem with multiple equality constraints:
$$\min_{\mathbf{x}} f(\mathbf{x}), \quad \text{subject to: } h_j(\mathbf{x}) = 0,\ j = 1, \ldots, l$$
Define a Lagrangian function: $\mathcal{L}(\mathbf{x}, \boldsymbol{\lambda}) = f(\mathbf{x}) + \sum_{j=1}^{l} \lambda_j h_j(\mathbf{x})$
FONC: $\dfrac{\partial\mathcal{L}}{\partial x_i} = \dfrac{\partial f}{\partial x_i} + \sum_{j=1}^{l}\lambda_j \dfrac{\partial h_j}{\partial x_i} = 0,\ i = 1, \ldots, n; \qquad h_j(\mathbf{x}) = 0,\ j = 1, \ldots, l$
Or, equivalently, $\nabla\mathcal{L} = \nabla f + \sum_j \lambda_j \nabla h_j = \mathbf{0},\ h_j(\mathbf{x}) = 0$.
A total of $n + l$ equations need to be simultaneously solved for $x_i,\ i = 1, \ldots, n$, and $\lambda_j,\ j = 1, \ldots, l$. Since an equality constraint can be multiplied by $-1$ without changing the solution, the Lagrange multipliers for equality constraints can take on both positive and negative values.
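As an illustration of solving the $n + l$ FONC equations simultaneously, the following sketch uses sympy on a hypothetical problem (minimize $x_1 + x_2$ on a circle); note that the two stationary points carry multipliers of opposite sign:

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)
lam = sp.Symbol('lam', real=True)

# Illustrative problem: min f = x1 + x2  subject to  h = x1^2 + x2^2 - 2 = 0
f = x1 + x2
h = x1**2 + x2**2 - 2

L = f + lam*h                                    # Lagrangian
fonc = [sp.diff(L, v) for v in (x1, x2, lam)]    # n + l = 3 equations

for sol in sp.solve(fonc, [x1, x2, lam], dict=True):
    print(sol, ' f =', f.subs(sol))
# Two stationary points: (-1,-1) with lam = 1/2 (the minimum, f = -2)
# and (1,1) with lam = -1/2 (the maximum, f = 2)
```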

9 Example: Soda Can Design
Design a cylindrical can of diameter $d$ and height $l$ to hold a volume of 200 cm³, such that its surface area is minimized:
$$\min_{d,\,l} f = \frac{\pi d^2}{2} + \pi d l, \quad \text{subject to: } \frac{\pi d^2 l}{4} - 200 = 0$$

10 Example: Soda Can Design
$$\min_{d,\,l} f = \frac{\pi d^2}{2} + \pi d l, \quad \text{subject to: } \frac{\pi d^2 l}{4} - 200 = 0$$
Define $\mathcal{L}(d, l, \lambda) = \dfrac{\pi d^2}{2} + \pi d l + \lambda\left(\dfrac{\pi d^2 l}{4} - 200\right)$
FONC: $\dfrac{\partial\mathcal{L}}{\partial d} = \pi d + \pi l + \dfrac{\lambda\pi d l}{2} = 0;\quad \dfrac{\partial\mathcal{L}}{\partial l} = \pi d + \dfrac{\lambda\pi d^2}{4} = 0;\quad \dfrac{\pi d^2 l}{4} - 200 = 0$
The FONC are solved as: $d^* = l^* = 6.34\ \text{cm},\ \lambda^* = -0.63$.
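The staged solution of these FONC can be reproduced symbolically; a sketch assuming Python with sympy:

```python
import sympy as sp

d, l = sp.symbols('d l', positive=True)
lam = sp.Symbol('lam', real=True)

f = sp.pi*d**2/2 + sp.pi*d*l          # surface area
h = sp.pi*d**2*l/4 - 200              # volume constraint (200 cm^3)
L = f + lam*h

# Solve dL/dl = 0 for the multiplier: lam = -4/d
lam_star = sp.solve(sp.diff(L, l), lam)[0]
# Substitute into dL/dd = 0: it reduces to pi*(d - l) = 0, i.e., l = d
l_star = sp.solve(sp.diff(L, d).subs(lam, lam_star), l)[0]
# Enforce the volume constraint to find d
d_star = sp.solve(h.subs(l, l_star), d)[0]

print('d* = l* =', sp.N(d_star, 3))                  # 6.34 cm
print('lam* =', sp.N(lam_star.subs(d, d_star), 2))   # -0.63
print('f* =', sp.N(f.subs({d: d_star, l: d_star}), 4))  # ~189.3 cm^2
```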

11 Example: Soda Can Design
Alternatively, use the equality constraint to solve for $l$ as: $l = \dfrac{800}{\pi d^2}$.
Define the unconstrained problem: $\min_{d} f(d) = \dfrac{\pi d^2}{2} + \dfrac{800}{d}$
FONC: $f'(d) = \pi d - \dfrac{800}{d^2} = 0$
The FONC are solved to get: $d^* = \left(\dfrac{800}{\pi}\right)^{1/3} = 6.34\ \text{cm} = l^*$
SONC: $f''(d^*) = \pi + \dfrac{1600}{d^{*3}} = 3\pi > 0$, confirming a minimum.

12 Example
$$\min_{x_1, x_2} f = x_1^2 + x_2^2, \quad \text{subject to: } x_1 + x_2 - 1 = 0$$
Define $\mathcal{L}(x_1, x_2, \lambda) = x_1^2 + x_2^2 + \lambda(x_1 + x_2 - 1)$
FONC: $2x_1 + \lambda = 0,\quad 2x_2 + \lambda = 0,\quad x_1 + x_2 - 1 = 0$
Solution: $x_1^* = x_2^* = \tfrac{1}{2},\ \lambda^* = -1$, with $f^* = \tfrac{1}{2}$
Alternatively, substitute $x_2 = 1 - x_1$ and solve the unconstrained problem: $\min_{x_1} f = x_1^2 + (1 - x_1)^2$
Solution: $x_1^* = x_2^* = \tfrac{1}{2}$, with $f^* = \tfrac{1}{2}$

13 Inequality Constrained Problems
Consider the inequality constrained problem:
$$\min_{\mathbf{x}} f(\mathbf{x}), \quad \text{subject to: } g_i(\mathbf{x}) \le 0,\ i = 1, \ldots, m$$
Introduce slack variables $s_i$ to convert the constraints to equality: $g_i(\mathbf{x}) + s_i^2 = 0,\ i = 1, \ldots, m$
Define the Lagrangian: $\mathcal{L}(\mathbf{x}, \boldsymbol{\lambda}, \mathbf{s}) = f(\mathbf{x}) + \sum_{i=1}^{m} \lambda_i\left(g_i(\mathbf{x}) + s_i^2\right)$
FONC: $\dfrac{\partial\mathcal{L}}{\partial x_j} = 0,\ j = 1, \ldots, n;\quad g_i + s_i^2 = 0,\quad \dfrac{\partial\mathcal{L}}{\partial s_i} = 2\lambda_i s_i = 0,\ i = 1, \ldots, m$
Note: $\lambda_i s_i = 0,\ i = 1, \ldots, m$, define switching conditions of the form: $\lambda_i = 0$ or $s_i = 0$. These require a total of $2^m$ cases to be explored for feasibility and optimality.
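The switching-case enumeration can be mechanized; the following sketch applies it to the single-constraint example worked on the following slides ($m = 1$, so 2 cases):

```python
import sympy as sp

# Illustrative: min f = x1^2 + x2^2  subject to  g = 1 - x1 - x2 <= 0
x1, x2, lam, s = sp.symbols('x1 x2 lam s', real=True)
f = x1**2 + x2**2
g = 1 - x1 - x2
L = f + lam*(g + s**2)

# 2^m switching cases (m = 1 here): fix either lam = 0 or s = 0
for fixed in (lam, s):
    eqs = [sp.diff(L, x1), sp.diff(L, x2), g + s**2, fixed]
    sols = sp.solve(eqs, [x1, x2, lam, s], dict=True)
    # For a minimum, lam >= 0 is additionally required (KKT non-negativity, below)
    sols = [t for t in sols if t[lam] >= 0]
    print(f'case {fixed} = 0:', sols if sols else 'no feasible solution')
```

The case $\lambda = 0$ has no real slack (the constraint would be violated); the case $s = 0$ returns the constrained minimum.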

14 Example: Soda Can Design
$$\min_{d,\,l} f = \frac{\pi d^2}{2} + \pi d l, \quad \text{subject to: } 200 - \frac{\pi d^2 l}{4} \le 0$$
Use a slack variable to write: $200 - \dfrac{\pi d^2 l}{4} + s^2 = 0$
Define $\mathcal{L}(d, l, \lambda, s) = \dfrac{\pi d^2}{2} + \pi d l + \lambda\left(200 - \dfrac{\pi d^2 l}{4} + s^2\right)$
FONC: $\pi d + \pi l - \dfrac{\lambda\pi d l}{2} = 0;\quad \pi d - \dfrac{\lambda\pi d^2}{4} = 0;\quad 2\lambda s = 0;\quad 200 - \dfrac{\pi d^2 l}{4} + s^2 = 0$
For $s = 0$ we obtain: $d^* = l^* = 6.34\ \text{cm},\ \lambda^* = 0.63$. There is no feasible solution for $\lambda = 0$.

15 Example
$$\min_{x_1, x_2} f = x_1^2 + x_2^2, \quad \text{subject to: } x_1 + x_2 \ge 1$$
Use a slack variable to write: $1 - x_1 - x_2 + s^2 = 0$
Define $\mathcal{L}(x_1, x_2, \lambda, s) = x_1^2 + x_2^2 + \lambda(1 - x_1 - x_2 + s^2)$
FONC: $2x_1 - \lambda = 0,\quad 2x_2 - \lambda = 0,\quad 2\lambda s = 0,\quad 1 - x_1 - x_2 + s^2 = 0$
For $\lambda = 0$, we obtain: $(x_1, x_2) = (0, 0)$, which requires $s^2 = -1$: infeasible.
For $s = 0$, we obtain: $x_1^* = x_2^* = \tfrac{1}{2}$, with $f^* = \tfrac{1}{2}$ and $\lambda^* = 1$.

16 Optimality Criteria: General Optimization Problems
Consider the general optimization problem:
$$\min_{\mathbf{x}} f(\mathbf{x}), \quad \text{subject to: } h_j(\mathbf{x}) = 0,\ j = 1, \ldots, l;\quad g_i(\mathbf{x}) \le 0,\ i = 1, \ldots, m$$
Add slack variables and define the Lagrangian function:
$$\mathcal{L}(\mathbf{x}, \mathbf{u}, \mathbf{v}, \mathbf{s}) = f(\mathbf{x}) + \sum_{j=1}^{l} v_j h_j(\mathbf{x}) + \sum_{i=1}^{m} u_i\left(g_i(\mathbf{x}) + s_i^2\right)$$
FONC (KKT):
Gradient: $\dfrac{\partial\mathcal{L}}{\partial x_k} = 0,\ k = 1, \ldots, n$
Feasibility: $h_j(\mathbf{x}) = 0,\ j = 1, \ldots, l;\quad g_i(\mathbf{x}) \le 0,\ i = 1, \ldots, m$
Switching: $u_i s_i = 0$ (equivalently, $u_i g_i = 0$), $i = 1, \ldots, m$
Non-negativity: $u_i \ge 0,\ i = 1, \ldots, m$
Regularity: for those constraints with $u_i > 0$, the gradients $\nabla g_i$ are linearly independent
Note: there are $n + l + 2m$ variables, and the switching conditions generate $2^m$ cases.
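A candidate point can be screened against these conditions numerically; a minimal checker (the helper name and interface are ours, for illustration):

```python
import numpy as np

def kkt_residuals(u, v, grad_f, grads_g, grads_h, g_vals, h_vals, tol=1e-8):
    """Check the KKT conditions at a candidate point given the multipliers (u, v),
    the objective/constraint gradients, and the constraint values there."""
    grad_L = np.asarray(grad_f, float).copy()
    for ui, gg in zip(u, grads_g):
        grad_L += ui * np.asarray(gg, float)
    for vj, gh in zip(v, grads_h):
        grad_L += vj * np.asarray(gh, float)
    return {
        'gradient':       np.linalg.norm(grad_L) < tol,
        'feasibility_g':  all(gi <= tol for gi in g_vals),
        'feasibility_h':  all(abs(hj) < tol for hj in h_vals),
        'switching':      all(abs(ui * gi) < tol for ui, gi in zip(u, g_vals)),
        'non_negativity': all(ui >= -tol for ui in u),
    }

# Candidate from the earlier example: min x1^2+x2^2 s.t. 1-x1-x2 <= 0, at (1/2, 1/2)
x = np.array([0.5, 0.5])
print(kkt_residuals(u=[1.0], v=[], grad_f=2*x,
                    grads_g=[[-1.0, -1.0]], grads_h=[],
                    g_vals=[1 - x.sum()], h_vals=[]))   # all conditions True
```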

17 Example: Soda Can Design
$$\min_{d,\,l} f = \frac{\pi d^2}{2} + \pi d l, \quad \text{subject to: } \frac{\pi d^2 l}{4} - 200 = 0,\quad 2d - l \le 0$$

18 Example: Soda Can Design
$$\min_{d,\,l} f = \frac{\pi d^2}{2} + \pi d l, \quad \text{subject to: } \frac{\pi d^2 l}{4} - 200 = 0,\quad 2d - l \le 0$$
Use a slack variable to write: $2d - l + s^2 = 0$
Define $\mathcal{L}(d, l, v, u, s) = \dfrac{\pi d^2}{2} + \pi d l + v\left(\dfrac{\pi d^2 l}{4} - 200\right) + u\left(2d - l + s^2\right)$
FONC (KKT): $\pi d + \pi l + \dfrac{v\pi d l}{2} + 2u = 0;\quad \pi d + \dfrac{v\pi d^2}{4} - u = 0;\quad 2us = 0$; plus feasibility.
For $s = 0$ we obtain: $l^* = 2d^*,\ d^* = \left(\dfrac{400}{\pi}\right)^{1/3} \approx 5.03\ \text{cm}$, with $v^* \approx -0.66,\ u^* \approx 2.63$. There is no feasible solution for $u = 0$.
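The active-constraint case can be verified numerically; a sketch using scipy's fsolve on the FONC with $s = 0$:

```python
import numpy as np
from scipy.optimize import fsolve

# Active-constraint case (s = 0): solve the FONC for (d, l, v, u)
def fonc(z):
    d, l, v, u = z
    return [np.pi*d + np.pi*l + v*np.pi*d*l/2 + 2*u,   # dL/dd = 0
            np.pi*d + v*np.pi*d**2/4 - u,              # dL/dl = 0
            np.pi*d**2*l/4 - 200,                      # volume constraint
            2*d - l]                                   # aspect constraint active

d, l, v, u = fsolve(fonc, x0=[5.0, 10.0, -1.0, 1.0])
print(f'd = {d:.2f} cm, l = {l:.2f} cm, v = {v:.2f}, u = {u:.2f}')
print(f'f = {np.pi*d**2/2 + np.pi*d*l:.1f} cm^2')      # ~198.8 cm^2
# u > 0 satisfies KKT non-negativity; setting u = 0 instead yields no feasible point.
```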

19 Second Order Conditions for Constrained Problems
Assume that $\mathbf{x}^*$ satisfies the FONC (KKT) conditions. Define the Hessian of the Lagrangian:
$$\nabla^2\mathcal{L} = \nabla^2 f + \sum_{j=1}^{l} v_j \nabla^2 h_j + \sum_{i=1}^{m} u_i \nabla^2 g_i$$
Define the set of active constraints: $\mathcal{I} = \{i : g_i(\mathbf{x}^*) = 0\}$
Define the active constraint tangent hyperplane as: $\{\mathbf{d} : \nabla h_j^T\mathbf{d} = 0,\ j = 1, \ldots, l;\ \nabla g_i^T\mathbf{d} = 0,\ i \in \mathcal{I}\}$
SONC: If $\mathbf{x}^*$ is a local minimizer of $f$, then $\mathbf{d}^T\nabla^2\mathcal{L}(\mathbf{x}^*)\,\mathbf{d} \ge 0$ for all $\mathbf{d}$ in the tangent hyperplane.
SOSC: For all $\mathbf{d} \ne \mathbf{0}$ with $\nabla h_j^T\mathbf{d} = 0$ and $\nabla g_i^T\mathbf{d} = 0$ for active $g_i$ with $u_i > 0$: if $\mathbf{d}^T\nabla^2\mathcal{L}(\mathbf{x}^*)\,\mathbf{d} > 0$, then $\mathbf{x}^*$ is a local minimizer of $f$.
SOSC (stronger): If $\nabla^2\mathcal{L}(\mathbf{x}^*) \succ 0$, then $\mathbf{x}^*$ is a local minimizer of $f$.
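In practice, the SONC/SOSC test is done on a reduced Hessian: project $\nabla^2\mathcal{L}$ onto a null-space basis of the active constraint gradients. A sketch (the helper below is illustrative):

```python
import numpy as np
from scipy.linalg import null_space

def projected_hessian_check(hess_L, active_grads):
    """SONC/SOSC on the tangent space: examine the Hessian of the Lagrangian
    restricted to the null space of the active constraint gradients."""
    A = np.atleast_2d(active_grads)
    Z = null_space(A)                    # basis for the tangent hyperplane
    if Z.size == 0:
        return 'tangent space is trivial; conditions hold vacuously'
    eig = np.linalg.eigvalsh(Z.T @ hess_L @ Z)   # reduced Hessian spectrum
    if np.all(eig > 1e-10):
        return f'SOSC holds (eigenvalues {eig}): local minimizer'
    if np.all(eig >= -1e-10):
        return f'SONC holds but not SOSC (eigenvalues {eig})'
    return f'SONC fails (eigenvalues {eig}): not a minimizer'

# Example below: f = x1^2 + x2^2, active g1 = 1 - x1 - x2 at x* = (1/2, 1/2)
print(projected_hessian_check(2*np.eye(2), [[-1.0, -1.0]]))
```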

20 Consider: $\min_{x_1, x_2} f = x_1^2 + x_2^2$, subject to: $g_1\!: 1 - x_1 - x_2 \le 0;\quad g_2\!: -x_2 \le 0$
Lagrangian: $\mathcal{L}(x_1, x_2, \lambda_1, \lambda_2, s_1, s_2) = x_1^2 + x_2^2 + \lambda_1(1 - x_1 - x_2 + s_1^2) + \lambda_2(-x_2 + s_2^2)$
FONC (KKT): $2x_1 - \lambda_1 = 0,\quad 2x_2 - \lambda_1 - \lambda_2 = 0,\quad \lambda_1 s_1 = 0,\ \lambda_2 s_2 = 0$, plus feasibility
Solution: $x_1^* = x_2^* = \tfrac{1}{2},\ \lambda_1^* = 1,\ \lambda_2^* = 0$ ($g_1$ active, $g_2$ inactive); $f^* = \tfrac{1}{2}$
Active constraint tangent hyperplane: $\{\mathbf{d}: d_1 + d_2 = 0\}$, e.g., $\mathbf{d} = \tfrac{1}{\sqrt{2}}(1, -1)^T$
Hessian of the Lagrangian at $\mathbf{x}^*$: $\nabla^2\mathcal{L} = \begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix}$
SOSC: $\mathbf{d}^T\nabla^2\mathcal{L}\,\mathbf{d} = 2 > 0$, indicating an optimum.

21 Convex Optimization Problems
Example: $\min_{x_1, x_2} f = x_1^2 + x_2^2$, subject to: $g_1\!: 1 - x_1 - x_2 \le 0;\quad g_2\!: -x_2 \le 0$
Note: $\nabla^2 f = 2I \succ 0$ and each $g_i$ is linear; hence, the problem is convex and any KKT point is a global minimum.
Define $\mathcal{L} = x_1^2 + x_2^2 + \lambda_1(1 - x_1 - x_2 + s_1^2) + \lambda_2(-x_2 + s_2^2)$
KKT: $2x_1 - \lambda_1 = 0,\quad 2x_2 - \lambda_1 - \lambda_2 = 0,\quad 1 - x_1 - x_2 + s_1^2 = 0,\quad -x_2 + s_2^2 = 0,\quad \lambda_1 s_1 = \lambda_2 s_2 = 0$
For $s_1 = 0,\ \lambda_2 = 0$: $x_1^* = x_2^* = \tfrac{1}{2},\ \lambda_1^* = 1$; $f^* = \tfrac{1}{2}$. NFS for $\lambda_1 = 0$.
SOSC: $\nabla^2\mathcal{L} = 2I \succ 0$.
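Because the problem is convex, it can also be handed to a convex solver directly; a sketch assuming the cvxpy package, whose constraint dual values are exactly the KKT multipliers:

```python
import cvxpy as cp

# Convex program: min x1^2 + x2^2  s.t.  1 - x1 - x2 <= 0,  -x2 <= 0
x = cp.Variable(2)
constraints = [1 - x[0] - x[1] <= 0, -x[1] <= 0]
prob = cp.Problem(cp.Minimize(cp.sum_squares(x)), constraints)
prob.solve()

print('x* =', x.value)       # ~[0.5, 0.5]
print('f* =', prob.value)    # ~0.5
# KKT multipliers: lam1 = 1 (active constraint), lam2 = 0 (inactive)
print('multipliers =', [c.dual_value for c in constraints])
```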

22 Example: Design of a Rectangular Beam
$$\min_{b,\,d} f = bd$$
Subject to: $g_1\!: \dfrac{2.4\times10^8}{bd^2} - 10 \le 0$ (bending stress); $g_2\!: \dfrac{2.25\times10^5}{bd} - 2 \le 0$ (shear stress); $g_3\!: d - 2b \le 0$; $g_4\!: -b \le 0$; $g_5\!: -d \le 0$
Lagrangian: $\mathcal{L}(b, d, \boldsymbol{\lambda}, \mathbf{s}) = bd + \sum_{i=1}^{5}\lambda_i\left(g_i + s_i^2\right)$
FONC (KKT): $\dfrac{\partial\mathcal{L}}{\partial b} = \dfrac{\partial\mathcal{L}}{\partial d} = 0;\quad g_i + s_i^2 = 0,\ \lambda_i s_i = 0,\ \lambda_i \ge 0,\ i = 1, \ldots, 5$
Note: there is no feasible solution for $b = 0$ or $d = 0$.

23 Example: Design of a Rectangular Beam
Drop $g_4$, $g_5$ to write: $\mathcal{L}(b, d, \boldsymbol{\lambda}, \mathbf{s}) = bd + \lambda_1\!\left(\dfrac{2.4\times10^8}{bd^2} - 10 + s_1^2\right) + \lambda_2\!\left(\dfrac{2.25\times10^5}{bd} - 2 + s_2^2\right) + \lambda_3\!\left(d - 2b + s_3^2\right)$
FONC (KKT): $\dfrac{\partial\mathcal{L}}{\partial b} = d - \dfrac{2.4\times10^8\,\lambda_1}{b^2d^2} - \dfrac{2.25\times10^5\,\lambda_2}{b^2d} - 2\lambda_3 = 0;\quad \dfrac{\partial\mathcal{L}}{\partial d} = b - \dfrac{4.8\times10^8\,\lambda_1}{bd^3} - \dfrac{2.25\times10^5\,\lambda_2}{bd^2} + \lambda_3 = 0;\quad \lambda_i s_i = 0,\ i = 1, 2, 3$ ($2^3 = 8$ cases)
Solution cases: every case with $\lambda_2 = 0$ has no feasible solution (NFS); the case $\lambda_1 = \lambda_3 = 0,\ s_2 = 0$ ($g_2$ active) yields the candidate set $bd = 1.125\times10^5\ \text{mm}^2$ with $\lambda_2 = 5.625\times10^4$; the remaining cases return the endpoints of this set or are NFS.

24 Example: Design of a Rectangular Beam
For the active case, the Hessian of the Lagrangian evaluates as: $\nabla^2\mathcal{L} = \begin{bmatrix} 2d/b & 2 \\ 2 & 2b/d \end{bmatrix}$ (using $\lambda_2 = b^2d^2/2.25\times10^5$).
The constraint tangent hyperplane for the active constraint $g_2$ is defined by: $\dfrac{\delta b}{b} + \dfrac{\delta d}{d} = 0$, or $\delta d = -\dfrac{d}{b}\,\delta b$.
Along this hyperplane the SONC evaluate as: $\delta\mathbf{x}^T\nabla^2\mathcal{L}\,\delta\mathbf{x} = 0$, indicating there is no isolated minimum for the problem.
The case $\lambda_1 = 0,\ s_2 = 0,\ \lambda_3 = 0$ above generates a family of optimal solutions, each with $f^* = bd = 1.125\times10^5\ \text{mm}^2$.
These solutions are bounded as: $213.3 \le d \le 474.3\ \text{mm}$ and $237.2 \le b \le 527.3\ \text{mm}$ (where $g_1$ and $g_3$, respectively, become active), with associated $\lambda_2 = 5.625\times10^4$.
These constitute the global optimum for the problem.
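The non-isolated optimum can be checked numerically by sweeping the family $bd = 1.125\times10^5$ (the constraint coefficients are as assumed above):

```python
import numpy as np

B1, B2 = 2.4e8, 2.25e5   # assumed bending / shear coefficients (see above)

def constraints(b, d):
    return (B1/(b*d**2) - 10, B2/(b*d) - 2, d - 2*b)

# Sweep the candidate family bd = 1.125e5 mm^2 and confirm feasibility
for d in np.linspace(213.4, 474.3, 5):
    b = 1.125e5 / d
    g = constraints(b, d)
    print(f'd={d:6.1f}  b={b:6.1f}  f=bd={b*d:9.1f}  g={np.round(g, 3)}')
# All g <= 0 and f is constant along the family: a non-isolated global optimum,
# with g1 approaching zero at the lower end of d and g3 at the upper end.
```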

25 Post-Optimality Analysis
We are interested in studying the change in the optimal objective value resulting from relaxation of the constraints. Consider the perturbed optimization problem:
$$\min_{\mathbf{x}} f(\mathbf{x}), \quad \text{subject to: } h_j(\mathbf{x}) = e_j,\ j = 1, \ldots, l;\quad g_i(\mathbf{x}) \le e_i,\ i = 1, \ldots, m$$
Let the optimum solution for the perturbed problem be expressed as $\mathbf{x}^*(\mathbf{e})$, with the optimal cost $f^*(\mathbf{e})$; then
$$\frac{\partial f^*(\mathbf{0})}{\partial e_j} = -v_j^*, \qquad \frac{\partial f^*(\mathbf{0})}{\partial e_i} = -u_i^*$$
The nonzero Lagrange multipliers accompanying active constraints determine the cost function's sensitivity to constraint relaxation. Non-active constraints have zero Lagrange multipliers, and hence they do not affect the solution.
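The sensitivity result can be verified by finite differences on a perturbed problem; a sketch using scipy on the earlier example ($u^* = 1$):

```python
import numpy as np
from scipy.optimize import minimize

def f_star(e):
    """Optimal cost of: min x1^2 + x2^2  s.t.  1 - x1 - x2 <= e."""
    res = minimize(lambda x: x @ x, x0=[0.0, 0.0],
                   constraints=[{'type': 'ineq',
                                 'fun': lambda x: e - (1 - x[0] - x[1])}])
    return res.fun

u_star = 1.0                               # multiplier of the active constraint
h = 1e-4
dfde = (f_star(h) - f_star(-h)) / (2*h)    # central finite difference
print('df*/de ~', dfde, ' vs  -u* =', -u_star)   # both ~ -1
```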

26 Example: Soda Can Design
$$\min_{d,\,l} f = \frac{\pi d^2}{2} + \pi d l, \quad \text{subject to: } \frac{\pi d^2 l}{4} - 200 = 0,\quad 2d - l \le 0$$
Optimum solution: $l^* = 2d^*,\ d^* \approx 5.03\ \text{cm}$; $v^* \approx -0.66,\ u^* \approx 2.63$; $f^* \approx 198.8\ \text{cm}^2$
Define the perturbed problem: $\min_{d,\,l} f$, subject to: $\dfrac{\pi d^2 l}{4} - 200 = e_1,\quad 2d - l \le e_2$
Variation in the cost function: $\delta f^* = -v^* e_1 - u^* e_2 \approx 0.66\,e_1 - 2.63\,e_2$
For $e_1 = 0.1$: $\delta f^* \approx 66\times10^{-3}$.

27 Example
Consider: $\min_{x_1, x_2} f(x_1, x_2)$, subject to: $g_1(x_1, x_2) \le 0;\quad g_2(x_1, x_2) \le 0$ (both constraints active at the solution).
A local minimum for the problem exists at: $\mathbf{x}^* = (0.786, 0.618)$, with $\mathbf{u}^* = (0.527, 0.134)$.
Define the perturbed optimization problem: $\min_{x_1, x_2} f(x_1, x_2)$, subject to: $g_1 \le e_1;\quad g_2 \le e_2$.
The variation in the optimal solution is given as: $\delta f^* = -u_1^* e_1 - u_2^* e_2 = -0.527\,e_1 - 0.134\,e_2$.
Then, for $e_1 = 0.1$, the new optimum is lowered by $0.527 \times 0.1 \approx 0.053$. Similarly, for $e_2 = 0.1$, the new optimum is approximately $0.35$.
