Chapter 3 Numerical Methods


1 Chapter 3 Numerical Methods
Part 1: 3.1 Linearization and Optimization of Functions of Vectors

2 Problem Notation

3 Outline
- Linearization
- Optimization of Objective Functions
- Constrained Optimization
- Summary

4 Motivation
A small number of numerical methods occur frequently:
- roots of nonlinear equations
- optimization
- integration of differential equations
You can't use MATLAB to control the robot, so you need to know:
- how to implement these methods
- how to cast a problem in standard form

5 Motivation
These techniques will be used for control, perception, position estimation, and mapping. Specifically, to:
- compute wheel velocities
- invert dynamic models
- generate trajectories
- track features in an image
- construct globally consistent maps
- identify dynamic models
- calibrate cameras

6 Outline
- Linearization
- Optimization of Objective Functions
- Constrained Optimization
- Summary

7 Taylor Series (About Any Point)
Scalar function of a scalar:
$$f(x + \Delta x) = f(x) + \frac{df}{dx}\Delta x + \frac{1}{2!}\frac{d^2 f}{dx^2}\Delta x^2 + \frac{1}{3!}\frac{d^3 f}{dx^3}\Delta x^3 + \cdots$$
Scalar function of a vector:
$$f(\mathbf{x} + \Delta\mathbf{x}) = f(\mathbf{x}) + \frac{\partial f}{\partial \mathbf{x}}\Delta\mathbf{x} + \frac{1}{2!}\Delta\mathbf{x}^T\frac{\partial^2 f}{\partial \mathbf{x}^2}\Delta\mathbf{x} + \cdots$$
Vector function of a vector:
$$\mathbf{f}(\mathbf{x} + \Delta\mathbf{x}) = \mathbf{f}(\mathbf{x}) + \frac{\partial \mathbf{f}}{\partial \mathbf{x}}\Delta\mathbf{x} + \frac{1}{2!}\frac{\partial^2 \mathbf{f}}{\partial \mathbf{x}^2}\Delta\mathbf{x}^2 + \cdots$$
The successive derivatives are arrays of increasing dimension: 1d (gradient), 2d (Hessian), 3d, ... for a scalar function of a vector; 2d (Jacobian), 3d, 4d, ... for a vector function of a vector.

8 Taylor Remainder Theorem
$$f(x + \Delta x) = T_n(\Delta x) + R_n(\Delta x), \qquad R_n(\Delta x) = \int_x^{x + \Delta x} \frac{(x + \Delta x - \zeta)^n}{n!}\, f^{(n+1)}(\zeta)\, d\zeta$$
Provided the (n+1)th derivative is bounded, the error in a truncated Taylor series is of order $(\Delta x)^{n+1}$. When $\Delta x$ is small, $(\Delta x)^{n+1}$ is very small.
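To see the error order concretely, here is a minimal sketch in Python (an assumed example, not from the original slides): truncate the Taylor series of f(x) = e^x after the linear term (n = 1) and confirm that the error scales as (Δx)².

```python
import numpy as np

# Assumed example: first-order Taylor expansion of exp about x = 1.
# The remainder theorem predicts the truncation error is O(dx^2).
x = 1.0
for dx in [1e-1, 1e-2, 1e-3, 1e-4]:
    taylor1 = np.exp(x) + np.exp(x) * dx          # f(x) + f'(x) dx
    err = abs(np.exp(x + dx) - taylor1)
    print(f"dx={dx:.0e}  error={err:.3e}  error/dx^2={err / dx**2:.3f}")
# error/dx^2 stays near exp(1)/2 = 1.359, confirming O(dx^2) behavior.
```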

9 Linearization
$$f(\mathbf{x} + \Delta\mathbf{x}) \approx f(\mathbf{x}) + \frac{\partial f}{\partial \mathbf{x}}\Delta\mathbf{x}$$
This is accurate to first order, meaning the error is second order.
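Linearization is exactly the Jacobian term of the Taylor series. The following sketch (an assumed example) linearizes the polar-to-Cartesian map f(r, t) = [r cos t, r sin t] and shows the first-order prediction is accurate to second order in the step.

```python
import numpy as np

# Assumed example: linearize a vector function of a vector and check that
# the error of the first-order model shrinks quadratically with the step.
def f(x):
    r, t = x
    return np.array([r * np.cos(t), r * np.sin(t)])

def jacobian(x):
    r, t = x
    return np.array([[np.cos(t), -r * np.sin(t)],
                     [np.sin(t),  r * np.cos(t)]])

x = np.array([2.0, np.pi / 6])
for scale in [1e-1, 1e-2, 1e-3]:
    dx = scale * np.array([1.0, -0.5])
    err = np.linalg.norm(f(x + dx) - (f(x) + jacobian(x) @ dx))
    print(f"|dx| ~ {scale:.0e}  linearization error = {err:.3e}")
# The error drops ~100x for each 10x reduction in the step: second order.
```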

10 Gradients and Level Curves and Surfaces
Consider the set of points where a scalar function takes a constant value, g(x) = c. This forms an (n-1)-dimensional subset of $R^n$ called a level surface (a set of level curves). Let the set of solutions be parameterized locally in 1D by the scalar s to form the level curve x(s); each value of the constant c corresponds to a different level curve. You could plot $x_1(s)$ vs. $x_2(s)$, for example. By the chain rule applied along the constraint:
$$\frac{\partial g(\mathbf{x}(s))}{\partial s} = \frac{\partial g}{\partial \mathbf{x}}\frac{\partial \mathbf{x}}{\partial s} = \frac{\partial c}{\partial s} = 0$$
The gradient is therefore normal to all the level curves.

11 Example
Consider the constraint of a circle of radius R (the curve sketched in the original figure, parameterized by the angle s):
$$c(\mathbf{x}) = x_1^2 + x_2^2 - R^2 = 0$$
The constraint gradient is:
$$c_{\mathbf{x}} = [\,2x_1 \quad 2x_2\,]$$
Parameterize the level curve with:
$$\mathbf{x}(s) = [\,R\cos s \quad R\sin s\,]^T$$
The level curve tangent is:
$$\frac{\partial \mathbf{x}}{\partial s} = [\,-R\sin s \quad R\cos s\,]^T$$
which is orthogonal to the gradient:
$$c_{\mathbf{x}}\,\frac{\partial \mathbf{x}}{\partial s} = 2R^2(-\cos s \sin s + \sin s \cos s) = 0$$
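The same orthogonality can be checked numerically. A minimal sketch (assumed example) evaluating the gradient and tangent at several points on the circle:

```python
import numpy as np

# Assumed example: for c(x) = x1^2 + x2^2 - R^2 = 0, the constraint
# gradient should be normal to the level-curve tangent everywhere.
R = 2.0
for s in np.linspace(0.0, 2 * np.pi, 5, endpoint=False):
    grad = 2 * np.array([R * np.cos(s), R * np.sin(s)])      # c_x at x(s)
    tangent = np.array([-R * np.sin(s), R * np.cos(s)])      # dx/ds
    print(f"s = {s:.2f}  grad . tangent = {grad @ tangent:+.2e}")
# Every dot product is zero (to roundoff): the gradient is normal.
```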

12 Jacobians and Level Surfaces
Now let g be vector-valued, so that g(x) = c imposes m conditions. This forms an (n-m)-dimensional subset of $R^n$ called a level surface (a set of level curves). Let the set of solutions be parameterized by the vector s to form x(s); varying one component of s at a time (e.g., $s_1$ or $s_2$) traces out different level curves within the surface. By the chain rule:
$$\frac{\partial \mathbf{g}(\mathbf{x}(\mathbf{s}))}{\partial \mathbf{s}} = \frac{\partial \mathbf{g}}{\partial \mathbf{x}}\frac{\partial \mathbf{x}}{\partial \mathbf{s}} = \frac{\partial \mathbf{c}}{\partial \mathbf{s}} = [\,0\,]$$
The rows of the Jacobian are orthogonal to the tangents to the level curves.

13 Jacobian and Tangent Plane
Consider again:
$$\frac{\partial \mathbf{g}}{\partial \mathbf{x}}\frac{\partial \mathbf{x}}{\partial \mathbf{s}} = [\,0\,]$$
Here $\partial\mathbf{g}/\partial\mathbf{x}$ is the Jacobian and the columns of $\partial\mathbf{x}/\partial\mathbf{s}$ are surface tangents. The set of all linear combinations of these tangents is called the tangent plane. Therefore, the constraint tangent plane lies in the nullspace of the constraint Jacobian.
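In practice a basis for the tangent plane can be extracted from the constraint Jacobian with an SVD. A minimal sketch (assumed example, using a sphere constraint in R^3):

```python
import numpy as np

# Assumed example: the right singular vectors with ~zero singular value
# span N(c_x), i.e. the constraint tangent plane.
def nullspace(J, tol=1e-12):
    U, sv, Vt = np.linalg.svd(J)
    rank = int(np.sum(sv > tol))
    return Vt[rank:].T            # columns are tangent-plane basis vectors

x = np.array([1.0, 2.0, 2.0])     # a point on the sphere |x| = 3
J = 2 * x.reshape(1, 3)           # 1x3 Jacobian of c(x) = x.x - 9
T = nullspace(J)                  # 3x2 basis of the 2D tangent plane
print(J @ T)                      # ~[[0, 0]]: tangents satisfy c_x dx = 0
```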

14 Outline
- Linearization
- Optimization of Objective Functions
- Constrained Optimization
- Summary

15 3.1.2 Optimization of Objective Functions
$$\underset{\mathbf{x}}{\text{optimize}}\; f(\mathbf{x}), \qquad \mathbf{x} \in \mathbb{R}^n$$
f is always a scalar, called the objective, cost, or utility function. The max(f) problem is equivalent to min(-f), so we will treat all problems as minimization.

16 Convexity and Global Minima
A convex (concave) function lies uniformly on or below (above) the chord between any two points on its graph. For such functions, local and global minima coincide.
(Figure: a convex function and a non-convex function.)

17 Local Minima
Suppose we compute a minimum over a specified region. Which one you get often depends on where you start searching, as the sketch below illustrates. Any global minimum (not on a boundary of the set) must also be a local minimum; therefore, finding a local minimum is the more fundamental problem.
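A minimal sketch (assumed example) running plain gradient descent on a 1D function with two local minima:

```python
# Assumed example: f has two local minima; plain gradient descent finds
# whichever one lies in the basin containing the starting point.
f = lambda x: x**4 - 3 * x**2 + x
df = lambda x: 4 * x**3 - 6 * x + 1       # derivative of f

def descend(x, step=0.01, iters=2000):
    for _ in range(iters):
        x -= step * df(x)                 # follow the negative gradient
    return x

for x0 in (-2.0, 2.0):
    xm = descend(x0)
    print(f"start {x0:+.1f} -> minimum at x = {xm:+.4f}, f(x) = {f(xm):+.4f}")
# The two starts converge to different local minima (x ~ -1.30 and x ~ +1.13).
```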

18 Notation
This section will henceforth use two forms of shorthand in one compact form: a subscript denotes differentiation, and the argument denotes the point of evaluation:
$$f_{\mathbf{x}}(\mathbf{x}) \equiv \frac{\partial f}{\partial \mathbf{x}}(\mathbf{x})$$
So $f_{\mathbf{x}}()$ is a Jacobian, or a gradient if f is scalar-valued.

19 First Order (Necessary) Conditions
Consider satisfying the weaker condition of being a local minimum. In a given neighborhood, a Taylor series approximation applies. Consider the change $\Delta f$ after moving a distance $\Delta\mathbf{x}$ from a local minimum $\mathbf{x}^*$:
$$\Delta f = f(\mathbf{x}^* + \Delta\mathbf{x}) - f(\mathbf{x}^*) = f_{\mathbf{x}}(\mathbf{x}^*)\,\Delta\mathbf{x}$$
where $f_{\mathbf{x}}(\mathbf{x}^*)$ is the objective gradient.

20 First Order (Necessary) Conditions
From the last slide, $\Delta f = f_{\mathbf{x}}(\mathbf{x}^*)\,\Delta\mathbf{x}$. If $\mathbf{x}^*$ is a local minimum then $\Delta f \geq 0$ for an arbitrary $\Delta\mathbf{x}$. But if $\Delta f(\Delta\mathbf{x}) > 0$ then $\Delta f(-\Delta\mathbf{x})$ would be $< 0$: a contradiction. Hence we must have equality:
$$\Delta f = f_{\mathbf{x}}(\mathbf{x}^*)\,\Delta\mathbf{x} = 0$$
And since $\Delta\mathbf{x}$ is arbitrary, we must have
$$f_{\mathbf{x}}(\mathbf{x}^*) = \mathbf{0}^T$$
at a local minimum.
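A stationary point can be found by driving the gradient to zero, for example with Newton's method applied to $f_{\mathbf{x}} = 0$. A minimal sketch (assumed example with a simple quadratic objective):

```python
import numpy as np

# Assumed example: f(x) = (x1 - 1)^2 + 2*(x2 + 0.5)^2 has its gradient
# zero at (1, -0.5). Newton's method on f_x = 0 finds it directly.
def grad(x):
    return np.array([2 * (x[0] - 1), 4 * (x[1] + 0.5)])

def hess(x):
    return np.array([[2.0, 0.0],
                     [0.0, 4.0]])

x = np.array([5.0, 5.0])                           # arbitrary start
for _ in range(10):
    x = x - np.linalg.solve(hess(x), grad(x))      # Newton step on f_x = 0
print(x, grad(x))                                  # -> [1, -0.5], gradient ~ 0
```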

21 First Order (Necessary) Conditions
Note: the gradient exists in the $x_1$-$x_2$ plane, not in the $x_1$-$x_2$-$f$ volume.

22 Second Order (Sufficient) Conditions
At a local minimum, a perturbation in any direction satisfies:
$$f(\mathbf{x}^* + \Delta\mathbf{x}) = f(\mathbf{x}^*) + f_{\mathbf{x}}(\mathbf{x}^*)\,\Delta\mathbf{x} = f(\mathbf{x}^*)$$
Therefore, the order-2 Taylor series has no linear term at this point. Instead, it looks like (written here in implicit layout):
$$f(\mathbf{x} + \Delta\mathbf{x}) \approx f(\mathbf{x}) + \frac{1}{2!}\frac{\partial^2 f}{\partial \mathbf{x}^2}\Delta\mathbf{x}^2$$

23 Second Order (Sufficient) Conditions
In explicit vector layout this is:
$$\Delta f = f(\mathbf{x} + \Delta\mathbf{x}) - f(\mathbf{x}) = \frac{1}{2}\,\Delta\mathbf{x}^T F_{\mathbf{xx}}(\mathbf{x})\,\Delta\mathbf{x}$$
So the optimality condition is:
$$\Delta f = \frac{1}{2}\,\Delta\mathbf{x}^T F_{\mathbf{xx}}(\mathbf{x})\,\Delta\mathbf{x} \geq 0 \quad \forall\,\Delta\mathbf{x}$$
In other words, the Hessian must be positive semi-definite. The sufficient condition is:
$$\Delta\mathbf{x}^T F_{\mathbf{xx}}(\mathbf{x})\,\Delta\mathbf{x} > 0 \quad \forall\,\Delta\mathbf{x} \neq \mathbf{0}$$
That is, the Hessian must be positive definite.
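The definiteness test reduces to checking the eigenvalues of the Hessian. A minimal sketch (assumed example):

```python
import numpy as np

# Assumed example: classify a candidate stationary point from the
# eigenvalues of its (symmetric) Hessian.
def classify(H):
    eig = np.linalg.eigvalsh(H)
    if np.all(eig > 0):
        return "positive definite: strict local minimum"
    if np.all(eig >= 0):
        return "positive semi-definite: candidate minimum"
    return "indefinite or negative: not a minimum"

print(classify(np.array([[2.0, 0.0], [0.0, 4.0]])))    # minimum
print(classify(np.array([[2.0, 0.0], [0.0, -1.0]])))   # saddle point
```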

24 Feasible Points
Consider a set of m nonlinear constraints on the n elements of x:
$$\mathbf{c}(\mathbf{x}) = \mathbf{0}, \qquad \mathbf{c} \in \mathbb{R}^m, \quad \mathbf{x} \in \mathbb{R}^n$$
Any point that satisfies these is called a feasible point. Suppose the point $\mathbf{x}'$ is feasible.

25 Feasible Perturbations
Consider a feasible perturbation $\Delta\mathbf{x}$, that is, one which remains feasible to first order:
$$\mathbf{c}(\mathbf{x}' + \Delta\mathbf{x}) = \mathbf{c}(\mathbf{x}') + \mathbf{c}_{\mathbf{x}}(\mathbf{x}')\,\Delta\mathbf{x} = \mathbf{0}$$
The Jacobian $\mathbf{c}_{\mathbf{x}}$ is nonsquare (m by n). Since $\mathbf{x}'$ is feasible, $\mathbf{c}(\mathbf{x}') = \mathbf{0}$. Hence:
$$\mathbf{c}_{\mathbf{x}}(\mathbf{x}')\,\Delta\mathbf{x} = \mathbf{0}$$
for a feasible perturbation from a feasible point.

26 Constraint Nullspace
From the last slide:
$$\mathbf{c}_{\mathbf{x}}(\mathbf{x}')\,\Delta\mathbf{x} = \mathbf{0}$$
The Jacobian is composed of m gradients (its rows):
$$\{\nabla c_1(\mathbf{x}),\ \nabla c_2(\mathbf{x}),\ \ldots,\ \nabla c_m(\mathbf{x})\}$$
The feasibility condition requires the perturbation to lie in the nullspace of the constraints:
$$N(\mathbf{c}_{\mathbf{x}}) = \{\Delta\mathbf{x} : \mathbf{c}_{\mathbf{x}}(\mathbf{x}')\,\Delta\mathbf{x} = \mathbf{0}\}$$
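An arbitrary perturbation can be made feasible (to first order) by projecting it onto this nullspace. A minimal sketch (assumed example with one constraint in R^3):

```python
import numpy as np

# Assumed example: project dx onto N(c_x) with the orthogonal projector
# P = I - J^T (J J^T)^-1 J, where J = c_x.
J = np.array([[1.0, 2.0, -1.0]])       # one constraint gradient (m=1, n=3)
dx = np.array([1.0, 1.0, 1.0])         # arbitrary perturbation

P = np.eye(3) - J.T @ np.linalg.solve(J @ J.T, J)
dx_feasible = P @ dx
print(J @ dx)                          # nonzero: violates c_x dx = 0
print(J @ dx_feasible)                 # ~0: lies in the constraint nullspace
```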

27 Constraint Tangent Plane
The constraint nullspace is also known as the constraint tangent plane.

28 Conditions for Optimality and Feasibility

29 Outline
- Linearization
- Optimization of Objective Functions
- Constrained Optimization
- Summary

30 3.1.3 Constrained Optimization
$$\underset{\mathbf{x}}{\text{optimize}}\; f(\mathbf{x}),\ \mathbf{x} \in \mathbb{R}^n \qquad \text{subject to:}\ \mathbf{c}(\mathbf{x}) = \mathbf{0},\ \mathbf{c} \in \mathbb{R}^m$$
There are n variables in f(), subject to m < n constraints. This is also known as nonlinear programming. If there is no f(), the problem is constraint satisfaction or rootfinding; if there are no constraints, it is unconstrained optimization.
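For orientation, here is this standard form posed to an off-the-shelf solver; a minimal sketch (assumed example) using SciPy's SLSQP:

```python
import numpy as np
from scipy.optimize import minimize

# Assumed example: minimize f(x) = x1 + x2 subject to the single equality
# constraint c(x) = x1^2 + x2^2 - 1 = 0 (the unit circle).
f = lambda x: x[0] + x[1]
c = lambda x: x[0]**2 + x[1]**2 - 1.0

res = minimize(f, x0=np.array([1.0, 0.0]), method="SLSQP",
               constraints=[{"type": "eq", "fun": c}])
print(res.x)   # ~[-0.7071, -0.7071]: the feasible point of lowest cost
```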

31 First Order (Necessary) Conditions
There are n - m degrees of freedom left after the constraints are enforced. We cannot just set the objective gradient to zero, because such a point may not be feasible. Define a local minimum to mean that, for any feasible perturbation from a feasible point, the objective must not decrease; by the same sign argument as before, its first-order change must vanish:
$$\mathbf{c}_{\mathbf{x}}(\mathbf{x}^*)\,\Delta\mathbf{x} = \mathbf{0} \quad \forall\ \Delta\mathbf{x}\ \text{feasible}$$
$$f_{\mathbf{x}}(\mathbf{x}^*)\,\Delta\mathbf{x} = 0 \quad \forall\ \Delta\mathbf{x}\ \text{feasible}$$

32 Lagrange Multipliers
First, a feasible perturbation lies in the constraint tangent plane:
$$\mathbf{c}_{\mathbf{x}}(\mathbf{x}^*)\,\Delta\mathbf{x} = \mathbf{0} \quad \forall\ \Delta\mathbf{x}\ \text{feasible}$$
Second, the objective gradient is orthogonal to an arbitrary vector in the tangent plane:
$$f_{\mathbf{x}}(\mathbf{x}^*)\,\Delta\mathbf{x} = 0 \quad \forall\ \Delta\mathbf{x}\ \text{feasible}$$
The rows of $\mathbf{c}_{\mathbf{x}}(\mathbf{x}^*)$ span the space of all vectors orthogonal to the tangent plane. Hence $f_{\mathbf{x}}()$ must be some linear combination of them:
$$f_{\mathbf{x}}(\mathbf{x}^*) = \lambda^T \mathbf{c}_{\mathbf{x}}(\mathbf{x}^*)$$

33 Constrained Local Minimum
(Figure: level curves of the objective, labeled $f_3$ and $f_4$, crossing the constraint curve c(x) = 0.)
Moving left or right on the constraint increases the objective. The objective gradient is orthogonal to the level curves of both the constraints and the objective. The level curves of the constraints and the objective are mutually tangent.

34 Necessary Conditions
Reverse the sign of the (unknown) multipliers to get (Eqn A):
$$f_{\mathbf{x}}^T(\mathbf{x}^*) + \mathbf{c}_{\mathbf{x}}^T(\mathbf{x}^*)\,\lambda = \mathbf{0}, \qquad \mathbf{c}(\mathbf{x}) = \mathbf{0}$$
It is customary to define the Lagrangian:
$$l(\mathbf{x}, \lambda) = f(\mathbf{x}) + \lambda^T \mathbf{c}(\mathbf{x})$$

35 Necessary Conditions
Define the Lagrangian:
$$l(\mathbf{x}, \lambda) = f(\mathbf{x}) + \lambda^T \mathbf{c}(\mathbf{x})$$
The necessary conditions (Eqn A) can then be written as:
$$\frac{\partial l(\mathbf{x}, \lambda)}{\partial \mathbf{x}}^T = \mathbf{0}, \qquad \frac{\partial l(\mathbf{x}, \lambda)}{\partial \lambda}^T = \mathbf{0}$$
These are the necessary conditions for a constrained local minimum.
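For the circle problem posed to SLSQP earlier, the necessary conditions form three equations in $(x_1, x_2, \lambda)$. A minimal sketch (assumed example) solving them directly with a generic root finder:

```python
import numpy as np
from scipy.optimize import fsolve

# Assumed example: for f(x) = x1 + x2 and c(x) = x1^2 + x2^2 - 1, the
# gradient of the Lagrangian l = f + lambda*c must vanish in x and lambda.
def lagrangian_gradient(z):
    x1, x2, lam = z
    return [1 + 2 * lam * x1,            # dl/dx1 = 0
            1 + 2 * lam * x2,            # dl/dx2 = 0
            x1**2 + x2**2 - 1.0]         # dl/dlambda = c(x) = 0

x1, x2, lam = fsolve(lagrangian_gradient, x0=[-1.0, -1.0, 1.0])
print(x1, x2, lam)   # ~(-0.7071, -0.7071, +0.7071)
```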

36 Solving NLP Problems
Four basic techniques:
- Substitution: substitute the constraints into the objective and solve the resulting n-m dof unconstrained problem.
- Descent: follow the negative gradient of f() while remaining feasible.
- Lagrange multipliers: solve the n+m (linearized) necessary conditions directly.
- Penalty function: convert the constraints to a cost and solve the resulting n+m dof unconstrained problem.

37 Example: Penalty Function
Consider the problem of optimizing f(x) subject to c(x) = 0. Suppose the constraints are not satisfied, and define the residual:
$$\mathbf{r} = \mathbf{c}(\mathbf{x})$$
Define a weight matrix R and form the new (quadratic penalty) cost function:
$$f'(\mathbf{x}) = f(\mathbf{x}) + \frac{1}{2}\,\mathbf{r}^T R\,\mathbf{r}$$

38 Example: Penalty Function
From the last slide, differentiate f'(x) with respect to x and solve:
$$f'_{\mathbf{x}} = f_{\mathbf{x}} + \mathbf{r}^T R\,\mathbf{c}_{\mathbf{x}} = \mathbf{0}$$
Hence the solution depends on the weight matrix R.
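A minimal sketch (assumed example) of the full penalty loop, showing the dependence on the weight:

```python
import numpy as np
from scipy.optimize import minimize

# Assumed example: quadratic penalty for min x1 + x2 s.t. x1^2 + x2^2 = 1.
# As the weight R grows, the unconstrained minimizer approaches the true
# constrained optimum and the constraint violation shrinks.
f = lambda x: x[0] + x[1]
c = lambda x: x[0]**2 + x[1]**2 - 1.0

for R in [1.0, 10.0, 1000.0]:
    penalized = lambda x, R=R: f(x) + 0.5 * R * c(x)**2
    res = minimize(penalized, x0=np.array([-1.0, -1.0]))
    print(f"R = {R:6.0f}  x = {res.x}  c(x) = {c(res.x):+.4f}")
```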

39 Outline
- Linearization
- Optimization of Objective Functions
- Constrained Optimization
- Summary

40 Summary
- Linearization, derived from the Taylor series, is the fundamental basis of many numerical methods.
- The gradient vanishes at a local minimum of an unconstrained objective.
- The gradients of a set of constraints are orthogonal to every vector in the constraint tangent plane.

41 Summary
- A constrained minimum occurs when the objective tangent plane is tangent to the constraint tangent plane; equivalently, when the objective gradient is a linear combination of the constraint gradients.
- The Lagrange multipliers are the unknown projections of the objective gradient onto the constraint gradients (the disallowed directions).
