Convexity and Optimization

Convexity and Optimization Richard Lusby Department of Management Engineering Technical University of Denmark

Today's Material Extrema, Convex Functions, Convex Sets, Other Convexity Concepts, Unconstrained Optimization R Lusby (42111) Convexity & Optimization 2/38

Extrema Problem: max f(x) s.t. x ∈ S, i.e. max {f(x) : x ∈ S}. Global maximum x*: f(x*) ≥ f(x) ∀x ∈ S. Local maximum x^o: f(x^o) ≥ f(x) ∀x in a neighborhood around x^o. Strict maximum/minimum defined similarly. R Lusby (42111) Convexity & Optimization 3/38

Weierstrass theorem: Theorem A continuous function achieves its maximum and minimum on a non-empty, closed and bounded (compact) set. R Lusby (42111) Convexity & Optimization 4/38

Supremum and Infimum Supremum: The supremum of a set S having a partial order is the least upper bound of S (if it exists) and is denoted sup S. Infimum: The infimum of a set S having a partial order is the greatest lower bound of S (if it exists) and is denoted inf S. If the extrema are not achieved: max → sup, min → inf. Examples: sup{2, 3, 4, 5}? sup{x ∈ Q : x² < 2}? inf{1/x : x > 0}? R Lusby (42111) Convexity & Optimization 5/38

Finding Optimal Solutions Necessary and Sufficient Conditions Every method for finding and characterizing optimal solutions is based on optimality conditions, either necessary or sufficient. Necessary Condition: A condition C₁(x) is necessary if C₁(x*) is satisfied by every optimal solution x* (and possibly some other solutions as well). Sufficient Condition: A condition C₂(x) is sufficient if C₂(x*) ensures that x* is optimal (but some optimal solutions may not satisfy C₂(x*)). Mathematically: {x : C₂(x)} ⊆ {x : x is an optimal solution} ⊆ {x : C₁(x)}. R Lusby (42111) Convexity & Optimization 6/38

Finding Optimal Solutions An example of a necessary condition in the case S is well-behaved: no improving feasible direction. Feasible Direction: Consider x^o ∈ S. A direction s ∈ Rⁿ is called a feasible direction if there exists ε̄(s) > 0 such that x^o + εs ∈ S ∀ε : 0 < ε ≤ ε̄(s). We denote the cone of feasible directions from x^o in S as S(x^o). Improving Direction: s ∈ Rⁿ is called an improving direction if there exists ε̄(s) > 0 such that f(x^o + εs) < f(x^o) ∀ε : 0 < ε ≤ ε̄(s). The cone of improving directions from x^o in S is denoted F(x^o). R Lusby (42111) Convexity & Optimization 7/38

Finding Optimal Solutions Local Optima If x^o is a local minimum, then there exists no s ∈ S(x^o) for which f(·) decreases along s, i.e. for which f(x^o + ε₂s) < f(x^o + ε₁s) for 0 ≤ ε₁ < ε₂ ≤ ε̄(s). Stated otherwise: a necessary condition for local optimality is F(x^o) ∩ S(x^o) = ∅. R Lusby (42111) Convexity & Optimization 8/38

Improving Feasible Directions If for a given direction s it holds that ∇f(x^o)s < 0, then s is an improving direction. Well-known necessary condition for local optimality of x^o for a differentiable function: ∇f(x^o) = 0. In other words, x^o is stationary with respect to f(·). R Lusby (42111) Convexity & Optimization 9/38
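
A minimal numerical sketch of the gradient test above; the function f(x, y) = x² + 2y², the point and the direction are assumed for illustration and are not from the slides:

    import numpy as np

    def f(x):
        return x[0]**2 + 2 * x[1]**2

    def grad_f(x):
        return np.array([2 * x[0], 4 * x[1]])

    x0 = np.array([1.0, 1.0])
    s = np.array([-1.0, -1.0])           # candidate direction

    # Gradient test: grad f(x0) . s < 0 means s is an improving direction
    print(grad_f(x0) @ s)                # -6.0 < 0, so s improves f
    # Verify with a small step along s
    eps = 1e-3
    print(f(x0 + eps * s) < f(x0))       # True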

What if Stationarity is not Enough? Suppose f is twice continuously differentiable. Analyse the Hessian matrix of f at x^o: ∇²f(x^o) = { ∂²f(x^o)/(∂x_i ∂x_j) }, i, j = 1, ..., n. Sufficient Condition: If ∇f(x^o) = 0 and ∇²f(x^o) is positive definite, i.e. xᵀ∇²f(x^o)x > 0 ∀x ∈ Rⁿ\{0}, then x^o is a local minimum. R Lusby (42111) Convexity & Optimization 10/38
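
A small sketch of checking the sufficient condition at a stationary point; the quadratic f(x, y) = x² + xy + y² is an assumed example, not from the slides:

    import numpy as np

    # f(x, y) = x^2 + x*y + y^2 has gradient (2x + y, x + 2y) and constant Hessian
    H = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    x0 = np.array([0.0, 0.0])            # the gradient vanishes here
    grad = np.array([2 * x0[0] + x0[1], x0[0] + 2 * x0[1]])

    eigvals = np.linalg.eigvalsh(H)      # eigenvalues 1 and 3
    print(np.allclose(grad, 0))          # stationary: True
    print(np.all(eigvals > 0))           # positive definite: True -> local minimum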

What if Stationarity is not Enough? (continued) A necessary condition for local optimality is stationarity + positive semidefiniteness of ∇²f(x^o). Note that positive definiteness is not a necessary condition; e.g. look at f(x) = x⁴ at x^o = 0. Similar statements hold for maximization problems; the key concept here is negative definiteness. R Lusby (42111) Convexity & Optimization 11/38

Definiteness of a Matrix A number of criteria regarding the definiteness of a matrix exist. A symmetric n × n matrix A is positive definite if and only if xᵀAx > 0 ∀x ∈ Rⁿ\{0}. Positive semidefinite is defined likewise with ≥ instead of >. Negative (semi)definite is defined by reversing the inequality signs to < and ≤, respectively. Necessary conditions for positive definiteness: A is regular with det(A) > 0; A⁻¹ is positive definite. R Lusby (42111) Convexity & Optimization 12/38

Definiteness of a Matrix Necessary + sufficient conditions for positive definiteness: Sylvester's Criterion: all leading principal submatrices have positive determinants, i.e. det(a₁₁) > 0, det of the upper-left 2 × 2 block (a₁₁ a₁₂; a₂₁ a₂₂) > 0, ..., up to det(A) > 0. Equivalently, all eigenvalues of A are positive. R Lusby (42111) Convexity & Optimization 13/38
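
Both tests are straightforward to run numerically; a hedged sketch with an assumed symmetric example matrix:

    import numpy as np

    A = np.array([[4.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])      # symmetric example matrix

    # Sylvester's criterion: all leading principal minors must be positive
    minors = [np.linalg.det(A[:k, :k]) for k in range(1, A.shape[0] + 1)]
    print(minors, all(m > 0 for m in minors))

    # Equivalent test: all eigenvalues positive (eigvalsh is for symmetric matrices)
    print(np.all(np.linalg.eigvalsh(A) > 0))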

Necessary and Sufficient Conditions Differentiable f(·) Theorem: Suppose that f(·) is differentiable at a local minimum x^o. Then ∇f(x^o)s ≥ 0 for all s ∈ S(x^o). If f(·) is twice differentiable at x^o and ∇f(x^o) = 0, then sᵀ∇²f(x^o)s ≥ 0 ∀s ∈ S(x^o). Theorem: Suppose that S is convex and non-empty, f(·) differentiable, and that x^o ∈ S. Suppose furthermore that f(·) is convex. Then: x^o is a local minimum if and only if x^o is a global minimum; x^o is a local (and hence global) minimum if and only if ∇f(x^o)(x − x^o) ≥ 0 ∀x ∈ S. R Lusby (42111) Convexity & Optimization 14/38
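
As a tiny illustration of the last characterisation (an assumed one-dimensional example, not from the slides): for f(x) = x² on the convex set S = [1, 2] the minimiser is x^o = 1, and ∇f(1)(x − 1) = 2(x − 1) ≥ 0 for every x ∈ S:

    import numpy as np

    f = lambda x: x**2
    grad_f = lambda x: 2 * x

    x_opt = 1.0                          # minimiser of x^2 over S = [1, 2]
    S = np.linspace(1.0, 2.0, 101)       # sample of the feasible set

    # First-order condition for the convex problem: grad f(x_opt) * (x - x_opt) >= 0 on S
    print(np.all(grad_f(x_opt) * (S - x_opt) >= 0))   # True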

Convex Combination A convex combination of two points is a point α₁x₁ + α₂x₂ with α₁, α₂ ≥ 0 and α₁ + α₂ = 1; the set of all such combinations is the line segment between the two points. R Lusby (42111) Convexity & Optimization 15/38

Convex Functions A convex function lies below its chords: f(α₁x₁ + α₂x₂) ≤ α₁f(x₁) + α₂f(x₂). A strictly convex function has no more than one minimum. Examples: y = x², y = x⁴, y = x. The sum of convex functions is also convex. A differentiable convex function lies above its tangents. A twice-differentiable function is convex if its Hessian is positive semidefinite (strictly convex is not analogous!). A function f is concave iff −f is convex. EOQ objective function: T(Q) = dK/Q + cd + hQ/2? R Lusby (42111) Convexity & Optimization 16/38
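
The chord inequality is easy to spot-check numerically; a sketch with assumed sample functions, not part of the slides:

    import numpy as np

    def is_convex_on_samples(f, lo=-3.0, hi=3.0, trials=10_000, seed=0):
        """Randomly test f(a*x1 + (1-a)*x2) <= a*f(x1) + (1-a)*f(x2)."""
        rng = np.random.default_rng(seed)
        x1, x2 = rng.uniform(lo, hi, (2, trials))
        a = rng.uniform(0.0, 1.0, trials)
        lhs = f(a * x1 + (1 - a) * x2)
        rhs = a * f(x1) + (1 - a) * f(x2)
        return np.all(lhs <= rhs + 1e-12)

    print(is_convex_on_samples(lambda x: x**2))        # True
    print(is_convex_on_samples(lambda x: x**4))        # True
    print(is_convex_on_samples(lambda x: np.sin(x)))   # False (almost surely): sin is not convex on [-3, 3]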

Economic Order Quantity Model The problem: the Economic Order Quantity Model is an inventory model that helps manufacturers, retailers, and wholesalers determine how they should optimally replenish their stock levels. Costs: K = setup cost for ordering one batch, c = unit cost for producing/purchasing, h = holding cost per unit per unit of time in inventory. Assumptions: d = a known constant demand rate, Q = the order quantity (arrives all at once), planned shortages are not allowed. R Lusby (42111) Convexity & Optimization 17/38
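
A short sketch of the resulting cost function T(Q) = dK/Q + cd + hQ/2 together with the classical EOQ order quantity Q* = √(2dK/h); the numerical values of d, K, c, h are assumed for illustration:

    import numpy as np

    d, K, c, h = 1000.0, 50.0, 2.0, 4.0   # assumed demand rate, setup, unit and holding costs

    def total_cost(Q):
        """T(Q) = d*K/Q + c*d + h*Q/2 from the slide."""
        return d * K / Q + c * d + h * Q / 2

    Q_star = np.sqrt(2 * d * K / h)       # classical EOQ formula, Q* = sqrt(2dK/h)
    Q_grid = np.linspace(1.0, 500.0, 5000)
    print(Q_star)                         # about 158.11
    print(np.isclose(Q_grid[np.argmin(total_cost(Q_grid))], Q_star, atol=0.1))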

Convex Sets Definition: A convex set contains all convex combinations of its elements: α₁x₁ + α₂x₂ ∈ S ∀x₁, x₂ ∈ S (α₁, α₂ ≥ 0, α₁ + α₂ = 1). Some examples: (1, 2], x² + y² < 4. Level curve (2 dimensions): {(x, y) : f(x, y) = β}. Level set: {x : f(x) ≤ β}. R Lusby (42111) Convexity & Optimization 18/38

Lower Level Set Example: [surface plot of f(x, y) = 2x² + 4y²] R Lusby (42111) Convexity & Optimization 19/38

Lower Level Set Example: [contour plot of f(x, y) = 2x² + 4y²] R Lusby (42111) Convexity & Optimization 20/38

Upper Level Set Example: [surface plot of f(x, y) = 54x − 9x² + 78y − 13y²] R Lusby (42111) Convexity & Optimization 21/38

Upper Level Set Example: [contour plot of f(x, y) = 54x − 9x² + 78y − 13y²] R Lusby (42111) Convexity & Optimization 22/38

Example: [surface plot of Z = 7xy·exp(−(x² + y²))] R Lusby (42111) Convexity & Optimization 23/38

Example: [contour plot of Z = 7xy·exp(−(x² + y²))] R Lusby (42111) Convexity & Optimization 24/38

Convexity, Concavity, and Optima Constrained optimization problems Theorem: Suppose that S is convex and that f(x) is convex on S for the problem min_{x∈S} f(x). Then: if x* is locally minimal, then x* is globally minimal; the set X* of globally optimal solutions is convex; if f is strictly convex, then x* is unique. R Lusby (42111) Convexity & Optimization 25/38

Examples Problem 1: Minimize x₂·ln(x₁) + x₁ − 9 + x₂² subject to 1.0 ≤ x₁ ≤ 5.0, 0.6 ≤ x₂ ≤ 3.6. R Lusby (42111) Convexity & Optimization 26/38

What does the function look like? [surface plot of x₂·ln(x₁) + x₁ − 9 + x₂² over the feasible box] R Lusby (42111) Convexity & Optimization 27/38

Problem 2: Minimize Σᵢ₌₁³ i·ln(xᵢ) subject to Σᵢ₌₁³ xᵢ = 6, xᵢ ≤ 3.5, i = 1, 2, 3, xᵢ ≥ 1.5, i = 1, 2, 3. R Lusby (42111) Convexity & Optimization 28/38
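
A hedged solver sketch for Problem 2 as stated above; the use of scipy's SLSQP method and the starting point are my own choices, not from the slides:

    import numpy as np
    from scipy.optimize import minimize

    # Problem 2 as written above; an illustrative sketch rather than a definitive model
    def obj(x):
        return sum((i + 1) * np.log(x[i]) for i in range(3))    # 1*ln x1 + 2*ln x2 + 3*ln x3

    cons = ({'type': 'eq', 'fun': lambda x: np.sum(x) - 6.0},)  # sum of x_i equals 6
    bounds = [(1.5, 3.5)] * 3                                   # 1.5 <= x_i <= 3.5
    res = minimize(obj, x0=np.array([2.0, 2.0, 2.0]),
                   bounds=bounds, constraints=cons, method='SLSQP')
    print(res.x, res.fun)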

Class exercises Show that f(x) = ‖x‖ = √(Σᵢ xᵢ²) is convex. Prove that any level set of a convex function is a convex set. R Lusby (42111) Convexity & Optimization 29/38

Other Types of Convexity The idea of pseudoconvexity of a function is to extend the class of functions for which stationarity is a sufficient condition for global optimality. If f is defined on an open set X and is differentiable, we define the concept of pseudoconvexity. A differentiable function f is pseudoconvex if ∇f(x)(x′ − x) ≥ 0 ⇒ f(x′) ≥ f(x) ∀x, x′ ∈ X, or alternatively f(x′) < f(x) ⇒ ∇f(x)(x′ − x) < 0 ∀x, x′ ∈ X. A function f is pseudoconcave iff −f is pseudoconvex. Note that if f is convex and differentiable, and X is open, then f is also pseudoconvex. R Lusby (42111) Convexity & Optimization 30/38

Other Types of Convexity A function is quasiconvex if all lower level sets are convex, i.e. the sets S_β = {x : f(x) ≤ β} are convex. A function is quasiconcave if all upper level sets are convex, i.e. the sets S^β = {x : f(x) ≥ β} are convex. Note that if f is convex and differentiable, and X is open, then f is also quasiconvex. Convexity properties: convex ⇒ pseudoconvex ⇒ quasiconvex. R Lusby (42111) Convexity & Optimization 31/38
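
A numerical illustration of the level-set definition with an assumed example, not from the slides: f(x) = −exp(−x²) is quasiconvex but not convex, and on a sample grid each of its lower level sets is a single contiguous interval:

    import numpy as np

    f = lambda x: -np.exp(-x**2)          # quasiconvex on R, but not convex
    xs = np.linspace(-4.0, 4.0, 801)

    def lower_level_set_is_interval(beta):
        """On the grid, {x : f(x) <= beta} should be one contiguous block."""
        mask = f(xs) <= beta
        idx = np.flatnonzero(mask)
        return idx.size == 0 or np.all(np.diff(idx) == 1)

    print(all(lower_level_set_is_interval(b) for b in np.linspace(-1.0, 0.0, 21)))  # True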

Exercises Show the following: f(x) = x + x³ is pseudoconvex but not convex; f(x) = x³ is quasiconvex but not pseudoconvex. R Lusby (42111) Convexity & Optimization 32/38

Graphically: [plot of x³ + x and x³] R Lusby (42111) Convexity & Optimization 33/38

Exercises Convexity Questions Can a function be both convex and concave? Is a convex function of a convex function convex? Is a convex combination of convex functions convex? Is the intersection of convex sets convex? R Lusby (42111) Convexity & Optimization 34/38

Unconstrained problem min f(x) s.t. x ∈ Rⁿ. Necessary optimality condition for x^o to be a local minimum: ∇f(x^o) = 0 and H(x^o) is positive semidefinite. Sufficient optimality condition for x^o to be a local minimum: ∇f(x^o) = 0 and H(x^o) is positive definite. Necessary and sufficient: suppose f is pseudoconvex; then x* is a global minimum iff ∇f(x*) = 0. R Lusby (42111) Convexity & Optimization 35/38

Unconstrained example min f(x) = (x² − 1)³. f′(x) = 6x(x² − 1)² = 0 for x = 0, ±1. H(x) = 24x²(x² − 1) + 6(x² − 1)². H(0) = 6 and H(±1) = 0. Therefore x = 0 is a local minimum (actually the global minimum); x = ±1 are saddle points. R Lusby (42111) Convexity & Optimization 36/38
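
A quick numerical check of this example, using the derivative and Hessian expressions written on the slide (the script itself is an added sketch):

    f  = lambda x: (x**2 - 1)**3
    df = lambda x: 6 * x * (x**2 - 1)**2
    H  = lambda x: 24 * x**2 * (x**2 - 1) + 6 * (x**2 - 1)**2

    for x0 in (-1.0, 0.0, 1.0):
        print(x0, df(x0), H(x0))
    # x = 0:  f'(0) = 0, H(0) = 6 > 0 -> local (in fact global) minimum, f(0) = -1
    # x = ±1: f'(±1) = 0, H(±1) = 0  -> second-order test is inconclusive here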

What does the function look like? [plot of f(x) = (x² − 1)³] R Lusby (42111) Convexity & Optimization 37/38

Class Exercise Problem: Suppose A is an m × n matrix and b is a given m-vector; find min ‖Ax − b‖². R Lusby (42111) Convexity & Optimization 38/38
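
One standard way to sanity-check this exercise numerically (an added sketch, not from the slides): the objective is convex, its gradient is 2Aᵀ(Ax − b), so stationarity gives the normal equations AᵀAx = Aᵀb, which can be compared against numpy's least-squares routine on assumed random data:

    import numpy as np

    rng = np.random.default_rng(1)
    m, n = 6, 3
    A = rng.standard_normal((m, n))       # assumed example data
    b = rng.standard_normal(m)

    # Stationary point of the convex objective ||Ax - b||^2: normal equations A^T A x = A^T b
    x_normal = np.linalg.solve(A.T @ A, A.T @ b)
    x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

    print(np.allclose(x_normal, x_lstsq))  # True (A has full column rank here)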