minimize ½(x₁² + 2x₂² + x₃²) subject to x₁ + x₂ + x₃ = 5

In this Chapter
We will study a new theory for constrained optimization:
- a local optimality condition
- easier to implement
- deeper insight into the geometric structure
- explicit use of the constraint functions
- convexity not required; continuity and differentiability required

The methods I set forth require neither constructions nor geometric or mechanical considerations. They require only algebraic operations subject to a systematic and uniform course.
Lagrange (1736–1813)

Can you help a lady in love? A milkmaid is sent to get milk. She's in a hurry to get back for a date with a handsome young goatherd, so she wants to finish her job as quickly as possible. However, before she can gather the milk, she has to rinse out her bucket in the nearby river. Just when she reaches the point M, she spots the cow at C. She wants to take the shortest path from where she is to the river and then to the cow.

An example problem
minimize f(x₁, x₂) = x₁ + x₂ subject to h(x₁, x₂) = (x₁ − 1)² + x₂² − 1 = 0
What is x*? What is ∇f(x*)? What is ∇h(x*)? What is the observation?
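For readers who want to check their answers: the stationarity system for this example can be solved symbolically. A minimal sketch in Python with SymPy (the code and variable names are ours, added for illustration):

```python
import sympy as sp

x1, x2, lam = sp.symbols('x1 x2 lam', real=True)
f = x1 + x2
h = (x1 - 1)**2 + x2**2 - 1
L = f + lam * h  # Lagrangian

stationary = sp.solve([sp.diff(L, v) for v in (x1, x2, lam)],
                      (x1, x2, lam), dict=True)
for s in stationary:
    print(s, ' f =', sp.simplify(f.subs(s)))
# The minimizer is x* = (1 - 1/sqrt(2), -1/sqrt(2)) with f(x*) = 1 - sqrt(2);
# there, grad f = (1, 1) is a scalar multiple of grad h -- the observation
# the slide is pointing at.
```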

General Cases
Consider an equality-constrained problem: minimize f(x) subject to h(x) = 0.
Consider a local optimal point on the boundary.
Geometric illustration of ∇f(x) and ∇h(x) (demo).

A General Result
If x* is a local minimum of f subject to h(x) = 0, then we can find a vector λ such that
∇f(x*) + Σ_{i=1..m} λᵢ∇hᵢ(x*) = 0

But wait a minute: is it always true? What are the assumptions?

Consider this problem:
minimize x₁ + x₂
subject to (x₁ − 1)² + x₂² − 1 = 0
(x₁ − 2)² + x₂² − 4 = 0
How to visualize it? What are the Lagrange multipliers?
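One can confirm symbolically that no multipliers exist here; the only feasible point is (0, 0), where the two constraint gradients are parallel. A SymPy sketch (our code):

```python
import sympy as sp

x1, x2, l1, l2 = sp.symbols('x1 x2 l1 l2', real=True)
f = x1 + x2
h1 = (x1 - 1)**2 + x2**2 - 1
h2 = (x1 - 2)**2 + x2**2 - 4
L = f + l1*h1 + l2*h2

print(sp.solve([sp.diff(L, v) for v in (x1, x2, l1, l2)], (x1, x2, l1, l2)))
# [] -- no solution: at the only feasible point (0, 0),
# grad h1 = (-2, 0) and grad h2 = (-4, 0) are linearly dependent,
# so grad f = (1, 1) cannot be matched.
```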

What is wrong?
When the gradients ∇hᵢ(x*) are linearly dependent, i.e. Σ_{i=1..m} λᵢ∇hᵢ(x*) = 0 for some λᵢ not all zero, the necessary condition may fail, since we cannot scale Σ_{i=1..m} λᵢ∇hᵢ(x*) to match −∇f(x*).
Regular point: a point x is regular if the gradients ∇hᵢ(x) are linearly independent.

Lagrange Multiplier Theorem
Let x* be a local minimum and a regular point. Then there exist unique scalars λ₁*, ..., λₘ* such that
∇f(x*) + Σ_{i=1..m} λᵢ*∇hᵢ(x*) = 0

The Lagrange Condition
A simple and elegant necessary condition: the counterpart of the first-order condition for unconstrained optimization. There are many different ways to arrive at it.

The Lagrangian Function
Define the Lagrangian function
L(x, λ) = f(x) + Σ_{i=1..m} λᵢhᵢ(x) = f(x) + λᵀh(x)
Then, if x* is a local minimum and is regular, the Lagrange multiplier conditions can be written as
∇_x L(x*, λ*) = 0,  ∇_λ L(x*, λ*) = 0

Let's try it out
minimize ½(x₁² + 2x₂² + x₃²) subject to x₁ + x₂ + x₃ = 5
Can we use the Lagrange Multiplier Theorem to solve the problem?
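Setting ∇_x L = 0 gives x₁ + λ = 0, 2x₂ + λ = 0, x₃ + λ = 0; combined with the constraint this yields λ* = −2 and x* = (2, 1, 2). A minimal SymPy sketch of the same computation (our code, not from the slides):

```python
import sympy as sp

x1, x2, x3, lam = sp.symbols('x1 x2 x3 lam', real=True)

f = sp.Rational(1, 2) * (x1**2 + 2*x2**2 + x3**2)  # objective
h = x1 + x2 + x3 - 5                               # constraint h(x) = 0
L = f + lam * h                                    # Lagrangian

# grad_x L = 0 and grad_lam L = 0 (the latter just recovers h(x) = 0)
eqs = [sp.diff(L, v) for v in (x1, x2, x3, lam)]
print(sp.solve(eqs, (x1, x2, x3, lam), dict=True))
# [{x1: 2, x2: 1, x3: 2, lam: -2}] -> x* = (2, 1, 2), f(x*) = 5
```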

Let's try it out
minimize −½(x₁² + 2x₂² + x₃²) subject to x₁ + x₂ + x₃ = 5
Can we use the Lagrange Multiplier Theorem to solve the problem?

A jewel box is to be constructed of material that costs $1 per square inch for the bottom, $2 per square inch for the sides, and $5 per square inch for the top. If the total volume is to be 96 in³, what dimensions will minimize the total cost of construction?
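One way to set this up (our modeling; the slide leaves it to the reader): let x and y be the base edges and z the height, so the cost is xy + 5xy + 2(2xz + 2yz) = 6xy + 4xz + 4yz subject to xyz = 96. A SymPy sketch of the Lagrange computation:

```python
import sympy as sp

x, y, z = sp.symbols('x y z', positive=True)
lam = sp.symbols('lam', real=True)

cost = 1*x*y + 5*x*y + 2*(2*x*z + 2*y*z)   # bottom + top + four sides
h = x*y*z - 96                             # volume constraint h = 0
L = cost + lam * h

sol = sp.solve([sp.diff(L, v) for v in (x, y, z, lam)],
               (x, y, z, lam), dict=True)
print(sol)                                 # [{x: 4, y: 4, z: 6, lam: -2}]
print(cost.subs(sol[0]))                   # 288 -> minimum cost is $288
```

So a 4 × 4 base with height 6 gives the $288 minimum.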

An editor has been allotted $60,000 to spend on the development and promotion of a new book. It is estimated that if x thousand dollars is spent on development and y thousand on promotion, approximately f(x, y) = 20x^(3/2)y copies of the book will be sold. How much money should the editor allocate to development and how much to promotion in order to maximize sales?
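Worked the same way (budget x + y = 60, in thousands of dollars): maximizing sales is minimizing −sales, and stationarity gives 30√x·y = 20x^(3/2), i.e. 3y = 2x, so x = 36, y = 24. A SymPy sketch (our code):

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
lam = sp.symbols('lam', real=True)

sales = 20 * x**sp.Rational(3, 2) * y      # copies sold; x, y in $1000s
L = -sales + lam * (x + y - 60)            # maximize sales = minimize -sales

sol = sp.solve([sp.diff(L, v) for v in (x, y, lam)], (x, y, lam), dict=True)
print(sol)                                 # [{x: 36, y: 24, lam: 4320}]
print(sales.subs(sol[0]))                  # 103680 copies
```

Note the multiplier λ* = 4320; it is exactly what the sensitivity question a few slides below asks about: each extra thousand dollars of budget buys roughly 4,320 additional copies.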

Second Order Sufficient Condition
Let x* and λ* satisfy
∇_x L(x*, λ*) = 0,  ∇_λ L(x*, λ*) = 0,
yᵀ∇²_xx L(x*, λ*) y > 0 for all y ≠ 0 with ∇h(x*)ᵀy = 0.
Then x* is a strict local minimum.

Let's try it out
minimize −(x₁x₂ + x₂x₃ + x₃x₁) subject to x₁ + x₂ + x₃ = 3
Use the Lagrange multiplier theorem to answer the following: What are x* and λ*? Is x* a local minimum or maximum? (Sufficient condition for a local minimum: yᵀ∇²_xx L(x*, λ*) y > 0 for all y ≠ 0 with ∇h(x*)ᵀy = 0.)
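A sketch that solves the stationarity system and then tests the sufficient condition on the tangent space {y : ∇h(x*)ᵀy = 0} (our code; the null-space basis Z below is one arbitrary choice):

```python
import numpy as np
import sympy as sp

x1, x2, x3, lam = sp.symbols('x1 x2 x3 lam', real=True)
f = -(x1*x2 + x2*x3 + x3*x1)
h = x1 + x2 + x3 - 3
L = f + lam * h

sol = sp.solve([sp.diff(L, v) for v in (x1, x2, x3, lam)],
               (x1, x2, x3, lam), dict=True)[0]
print(sol)                                  # {x1: 1, x2: 1, x3: 1, lam: 2}

# Hessian of L in x, evaluated at (x*, lam*)
H = np.array(sp.hessian(L, (x1, x2, x3)).subs(sol), dtype=float)
# Columns span the null space of grad h = (1, 1, 1)
Z = np.array([[1.0, -1.0, 0.0], [1.0, 0.0, -1.0]]).T
print(np.linalg.eigvalsh(Z.T @ H @ Z))      # [1. 3.] > 0 -> strict local min
```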

Suppose that in the editor example the editor is allotted $61,000 instead of $60,000 to spend on the development and promotion of the new book. Compute how the additional $1,000 will affect the maximum sales level.

Interpretation of λ
Lagrange multipliers have an interesting interpretation; in economics, they can be viewed as prices. Consider the following problem: minimize f(x) subject to aᵀx = b. Suppose (x*, λ*) is a solution-multiplier pair, and suppose b is changed to b + Δb; the minimizer will change to x* + Δx. What will f(x* + Δx) − f(x*) be, and how is the change related to λ*?

Interpretation of λ (cont'd)
We have aᵀΔx = Δb and, from the Lagrange condition, ∇f(x*) = −λ*a. Hence
Δcost = f(x* + Δx) − f(x*) = ∇f(x*)ᵀΔx + o(‖Δx‖) = −λ*aᵀΔx + o(‖Δx‖)
Thus Δcost = −λ*Δb + o(‖Δx‖), so up to first order λ* = −Δcost/Δb.
For multiple constraints aᵢᵀx = bᵢ, i = 1, ..., m, we have, to first order, Δcost = −Σ_{i=1..m} λᵢ*Δbᵢ.
Conclusion: λ* represents the rate of change of the optimal cost as the level of constraint changes.
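A quick numerical check of this on the earlier quadratic example, where the optimal cost as a function of the right-hand side b works out to b²/5 with λ* = −2b/5, so the cost should grow at rate −λ* = 2b/5. A sketch using scipy.optimize.minimize (our code; SLSQP handles the equality constraint):

```python
import numpy as np
from scipy.optimize import minimize

def optimal_cost(b):
    """Minimize 0.5*(x1^2 + 2*x2^2 + x3^2) s.t. x1 + x2 + x3 = b."""
    obj = lambda x: 0.5 * (x[0]**2 + 2*x[1]**2 + x[2]**2)
    con = {'type': 'eq', 'fun': lambda x: x.sum() - b}
    return minimize(obj, np.zeros(3), constraints=[con]).fun

b, db = 5.0, 1e-3
lam_star = -2 * b / 5                      # lam* = -2 at b = 5
rate = (optimal_cost(b + db) - optimal_cost(b)) / db
print(rate, -lam_star)                     # both ~ 2.0: dcost/db = -lam*
```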