Introduction to Constrained Optimization

Duality and KKT Conditions
Pratik Shah {pratik.shah [at] lnmiit.ac.in}
The LNM Institute of Information Technology
February 13, 2013

Geometry of the Problem

Let us try to solve the optimization problem

    min_{x ∈ R^2} f(x) = min_{x ∈ R^2} x_1^2 + x_2^2    (1)

subject to

    g(x) = 2 - x_1 - x_2 ≤ 0    (2)

Constrained optimization demands more than stationarity at the optimal point, i.e. ∇_x f(x) = 0 is not sufficient. Why?
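
To see what a solver returns before deriving anything by hand, here is a minimal numerical sketch of this problem. It is not part of the original slides; it assumes NumPy and SciPy are available and uses SciPy's general-purpose constrained minimizer.

```python
# Minimal numerical sketch (not from the slides): minimize x1^2 + x2^2
# subject to 2 - x1 - x2 <= 0, using SciPy's constrained solver.
import numpy as np
from scipy.optimize import minimize

f = lambda x: x[0] ** 2 + x[1] ** 2

# SciPy expects inequality constraints as c(x) >= 0, so the constraint
# g(x) = 2 - x1 - x2 <= 0 is rewritten as x1 + x2 - 2 >= 0.
constraints = [{"type": "ineq", "fun": lambda x: x[0] + x[1] - 2.0}]

result = minimize(f, x0=np.zeros(2), constraints=constraints)
print(result.x)    # approximately [1. 1.]
print(result.fun)  # approximately 2.0
```

The minimizer sits on the constraint boundary, which is exactly why stationarity of f alone is not enough here.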

How do we treat the Constraint?

Solve this constrained optimization problem with a Lagrange multiplier:

    L(x, α) = f(x) + α g(x)    (3)
            = (x_1^2 + x_2^2) + α(2 - x_1 - x_2)

and the solution is

    x* = (x_1*, x_2*) = (1, 1)    (4)

Before proceeding further, let us first fix some notation and concepts.
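
As a quick check of (4): the stationarity conditions of the Lagrangian together with the active constraint form a small linear system in (x_1, x_2, α). The sketch below is my own addition (assuming NumPy), not part of the slides.

```python
# Sketch (my own check, not from the slides): solve the stationarity equations
# dL/dx1 = 2*x1 - alpha = 0 and dL/dx2 = 2*x2 - alpha = 0 together with the
# active constraint x1 + x2 = 2 as a 3x3 linear system in (x1, x2, alpha).
import numpy as np

A = np.array([[2.0, 0.0, -1.0],   # 2*x1 - alpha = 0
              [0.0, 2.0, -1.0],   # 2*x2 - alpha = 0
              [1.0, 1.0,  0.0]])  # x1 + x2      = 2
b = np.array([0.0, 0.0, 2.0])

x1, x2, alpha = np.linalg.solve(A, b)
print(x1, x2, alpha)  # 1.0 1.0 2.0
```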

Convex Optimization [1]

Study of mathematical optimization problems of the form

    minimize_{x ∈ R^n} f(x)
    subject to x ∈ C    (5)

x ∈ R^n is a vector known as the optimization variable, f : R^n → R is a convex function that we want to minimize, and C ⊆ R^n is a convex set describing the set of feasible solutions.

Recall the definition of a convex set and a convex function: a function f : S → R is convex if S is a convex set and, for any x, y ∈ S and λ ∈ [0, 1], we have f(λx + (1 - λ)y) ≤ λ f(x) + (1 - λ) f(y). A function f is concave if -f is convex.
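
Purely as an illustration of this definition (my own addition, not in the slides), the snippet below spot-checks the convexity inequality for the running objective f(x) = x_1^2 + x_2^2 at randomly sampled points.

```python
# Quick numerical illustration (my own addition): spot-check the inequality
# f(l*x + (1-l)*y) <= l*f(x) + (1-l)*f(y) for f(x) = x1^2 + x2^2.
import numpy as np

f = lambda x: np.sum(x ** 2)
rng = np.random.default_rng(0)
for _ in range(1000):
    x, y = rng.normal(size=2), rng.normal(size=2)
    lam = rng.uniform()
    assert f(lam * x + (1 - lam) * y) <= lam * f(x) + (1 - lam) * f(y) + 1e-12
print("convexity inequality held on all sampled points")
```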

Lagrange Duality

The theory of Lagrange duality is the study of optimal solutions to convex optimization problems.

In this lecture we will discuss differentiable convex optimization problems of the form

    minimize_{x ∈ R^n} f(x)
    subject to g_i(x) ≤ 0,  i = 1, ..., m
               h_i(x) = 0,  i = 1, ..., p    (6)

where f and the g_i are differentiable convex functions and the h_i are affine functions.
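
Problems in exactly the form (6) can be handed to an off-the-shelf convex solver. The sketch below is my own addition (it assumes the CVXPY package is installed); it solves the running example and also reports the dual variable attached to the inequality constraint, anticipating the multipliers introduced next.

```python
# Sketch (my own addition, assuming CVXPY is installed): the running example
# written in the form (6) and solved with a generic convex solver.
import cvxpy as cp

x = cp.Variable(2)
objective = cp.Minimize(cp.sum_squares(x))   # f(x) = x1^2 + x2^2
constraints = [2 - cp.sum(x) <= 0]           # g(x) = 2 - x1 - x2 <= 0

problem = cp.Problem(objective, constraints)
problem.solve()

print(x.value)                    # approximately [1. 1.]
print(problem.value)              # approximately 2.0
print(constraints[0].dual_value)  # approximately 2.0 (the multiplier alpha)
```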

The Lagrangian

Intuitively, the Lagrangian can be thought of as a modified version of the original objective function. The Lagrangian L : R^n × R^m × R^p → R is defined as

    L(x, α, β) = f(x) + Σ_{i=1}^m α_i g_i(x) + Σ_{i=1}^p β_i h_i(x).    (7)

Primal variables: x ∈ R^n
Dual variables: α ∈ R^m and β ∈ R^p

The Lagrange multipliers α_i and β_i can be thought of as costs associated with violating the corresponding constraints.
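
A direct transcription of equation (7) for the running example, which has one inequality constraint and no equality constraints, so β is absent. This is my own sketch, not from the slides.

```python
# Sketch of equation (7) for the running example (my own illustration):
# one inequality constraint g, no equality constraints, so beta is absent.
import numpy as np

def f(x):
    return x[0] ** 2 + x[1] ** 2

def g(x):
    return 2.0 - x[0] - x[1]

def lagrangian(x, alpha):
    # L(x, alpha) = f(x) + alpha * g(x)
    return f(x) + alpha * g(x)

print(lagrangian(np.array([1.0, 1.0]), 2.0))  # 2.0, since g(1, 1) = 0
```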

Intuition behind Lagrange Duality

For any convex optimization problem, there always exist settings of the dual variables such that the unconstrained minimum of the Lagrangian with respect to the primal variables (keeping the dual variables fixed) coincides with the solution of the original constrained optimization problem [2].

Primal Problem

Consider the optimization problem

    min_x  max_{α, β : α_i ≥ 0 ∀i} L(x, α, β)  =  min_x θ_P(x)    (8)

where the inner maximum is called θ_P(x). The function θ_P : R^n → R is called the primal objective, and the unconstrained optimization problem on the right-hand side is known as the primal problem.

A point x ∈ R^n is primal feasible if g_i(x) ≤ 0, i = 1, ..., m and h_i(x) = 0, i = 1, ..., p.

The optimal value of the primal objective is denoted by p* = θ_P(x*), and x* ∈ R^n is called the solution of the primal problem.

Interpretation of Primal Problem

    θ_P(x) = max_{α, β : α_i ≥ 0 ∀i} L(x, α, β)    (9)
           = max_{α, β : α_i ≥ 0 ∀i} [ f(x) + Σ_{i=1}^m α_i g_i(x) + Σ_{i=1}^p β_i h_i(x) ]
           = f(x) + max_{α, β : α_i ≥ 0 ∀i} [ Σ_{i=1}^m α_i g_i(x) + Σ_{i=1}^p β_i h_i(x) ]

so that

    θ_P(x) = f(x) + { 0 if x is primal feasible; +∞ if x is primal infeasible }    (10)

i.e. the original objective plus a barrier function for carving away infeasible solutions.

Observe that the primal objective, θ_P(x), is a convex function of x.
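
Equation (10) is easy to probe numerically for the running example. The sketch below is my own addition (assuming NumPy) and evaluates θ_P at one feasible and one infeasible point.

```python
# Illustration of equation (10) (my own sketch): for fixed x, maximizing
# alpha * g(x) over alpha >= 0 gives 0 when g(x) <= 0 and +infinity otherwise,
# so theta_P(x) equals f(x) on the feasible set and +infinity off it.
import numpy as np

def f(x): return x[0] ** 2 + x[1] ** 2
def g(x): return 2.0 - x[0] - x[1]

def theta_P(x):
    return f(x) if g(x) <= 0 else np.inf

print(theta_P(np.array([1.0, 1.0])))  # 2.0  (feasible)
print(theta_P(np.array([0.0, 0.0])))  # inf  (infeasible: g = 2 > 0)
```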

Dual Problem

Switching the order of min and max, we obtain an entirely different optimization problem:

    max_{α, β : α_i ≥ 0 ∀i}  min_x L(x, α, β)  =  max_{α, β : α_i ≥ 0 ∀i} θ_D(α, β)    (11)

where the inner minimum is called θ_D(α, β). The function θ_D : R^m × R^p → R is called the dual objective, and the optimization problem on the right-hand side is known as the dual problem.

Generally, we say that (α, β) are dual feasible if α_i ≥ 0, i = 1, ..., m.

The optimal value of the dual objective is denoted by d* = θ_D(α*, β*), and (α*, β*) ∈ R^m × R^p is called the solution of the dual problem.

Recall the Example

Let us revisit the example that we solved in the beginning:

    min_{x ∈ R^2} f(x) = min_{x ∈ R^2} x_1^2 + x_2^2    (12)

subject to

    g(x) = 2 - x_1 - x_2 ≤ 0    (13)

Do you see the primal and dual formulations?
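
One way to answer, worked out here as my own sketch (it is not spelled out on the slides): for fixed α ≥ 0 the Lagrangian L(x, α) = x_1^2 + x_2^2 + α(2 - x_1 - x_2) is minimized at x_1 = x_2 = α/2, giving θ_D(α) = 2α - α^2/2; maximizing over α ≥ 0 gives α* = 2 and d* = 2, which equals p* = f(1, 1). The snippet below checks this numerically.

```python
# Sketch (my own addition): the dual objective of the example,
# theta_D(alpha) = min_x L(x, alpha) = 2*alpha - alpha**2 / 2, and a crude
# grid search over alpha >= 0 to locate its maximum.
import numpy as np

def theta_D(alpha):
    x = np.full(2, alpha / 2.0)                        # argmin_x L(x, alpha)
    return np.sum(x ** 2) + alpha * (2.0 - np.sum(x))  # = 2*alpha - alpha^2/2

alphas = np.linspace(0.0, 4.0, 17)
values = [theta_D(a) for a in alphas]
best = alphas[int(np.argmax(values))]
print(best, theta_D(best))  # 2.0 2.0  (so d* = p* = 2 here)
```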

Interpretation of Dual Problem

Lemma

1. If (α, β) are dual feasible, then θ_D(α, β) ≤ p*.
2. (Weak duality) For any pair of primal and dual problems, d* ≤ p*.
3. (Strong duality) For any pair of primal and dual problems that satisfies certain technical conditions, called constraint qualifications, d* = p*.
4. (Complementary slackness) If strong duality holds, then α_i* g_i(x*) = 0 for each i = 1, ..., m.
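
For the running example, items 1 and 2 can be spot-checked directly using the closed-form dual objective derived in the sketch above; again this is my own addition, not part of the lecture.

```python
# Spot-check of weak duality on the running example (my own sketch):
# every dual-feasible value theta_D(alpha), alpha >= 0, lower-bounds p* = 2.
import numpy as np

p_star = 2.0  # f(1, 1) for the example

def theta_D(alpha):
    return 2.0 * alpha - alpha ** 2 / 2.0  # closed form from the sketch above

for alpha in np.linspace(0.0, 10.0, 101):
    assert theta_D(alpha) <= p_star + 1e-12
print("theta_D(alpha) <= p* held for every sampled alpha >= 0")
```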

The KKT Conditions [3]

Theorem. Suppose that x* ∈ R^n, α* ∈ R^m and β* ∈ R^p satisfy the following conditions:

1. (Primal feasibility) g_i(x*) ≤ 0, i = 1, ..., m and h_i(x*) = 0, i = 1, ..., p,
2. (Dual feasibility) α_i* ≥ 0, i = 1, ..., m,
3. (Complementary slackness) α_i* g_i(x*) = 0 for each i = 1, ..., m, and
4. (Lagrangian stationarity) ∇_x L(x*, α*, β*) = 0.

Then x* is primal optimal and (α*, β*) are dual optimal. Furthermore, if strong duality holds, then any primal optimal x* and dual optimal (α*, β*) must satisfy conditions 1 through 4.

These conditions are known as the Karush-Kuhn-Tucker (KKT) conditions.
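
Applied to the running example with x* = (1, 1) and α* = 2, all four conditions can be checked mechanically. The snippet below is my own sketch (assuming NumPy); there are no equality constraints, so β plays no role.

```python
# Checking the four KKT conditions at the candidate point from the example
# (my own sketch): x* = (1, 1), alpha* = 2, no equality constraints.
import numpy as np

x_star, alpha_star = np.array([1.0, 1.0]), 2.0

g = 2.0 - x_star[0] - x_star[1]                               # g(x*) for the inequality constraint
grad_L = 2.0 * x_star + alpha_star * np.array([-1.0, -1.0])   # grad_x of f(x) + alpha*g(x)

print("primal feasibility:", g <= 0)                    # True  (g = 0)
print("dual feasibility:", alpha_star >= 0)             # True
print("complementary slackness:", alpha_star * g == 0)  # True
print("stationarity:", np.allclose(grad_L, 0))          # True  (2*x_i - alpha = 0)
```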

References

[1] S. Boyd and L. Vandenberghe. Convex Optimization. Cambridge University Press, 2004.
[2] Chuong B. Do. Convex Optimization Overview (cnt'd).
[3] J. Nocedal and S. J. Wright. Numerical Optimization. Springer Series in Operations Research. Springer.
