Duality
Optimization, Lecture 8. T. D. Hansen (Aarhus), February 18, 2011.

Primal program P:
Maximize ∑_{j=1}^n c_j x_j
subject to
∑_{j=1}^n a_{ij} x_j ≤ b_i, i = 1, 2, ..., m
x_j ≥ 0, j = 1, 2, ..., n

Dual program D:
Minimize ∑_{i=1}^m b_i y_i
subject to
∑_{i=1}^m a_{ij} y_i ≥ c_j, j = 1, 2, ..., n
y_i ≥ 0, i = 1, 2, ..., m
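
As a sanity check on the definition, here is a tiny NumPy sketch (my own helper, not part of the lecture) that maps the data (c, A, b) of a primal in the form above to the data of its dual, rewritten in the same "maximize subject to ≤" shape. Applying it twice returns the original program, the usual "dual of the dual is the primal" fact.

```python
import numpy as np

def dual_of(c, A, b):
    """Read (c, A, b) as P: max c^T x  s.t.  Ax <= b, x >= 0  (A is m x n).
    Its dual D: min b^T y  s.t.  A^T y >= c, y >= 0  has the same shape when
    rewritten as max (-b)^T y  s.t.  (-A^T) y <= -c, y >= 0, so return that."""
    c, A, b = np.asarray(c, float), np.asarray(A, float), np.asarray(b, float)
    return -b, -A.T, -c

# Taking the dual twice gives back the original data.
c, A, b = np.array([1.0, 2.0]), np.array([[1.0, 1.0], [3.0, 1.0]]), np.array([4.0, 6.0])
cc, AA, bb = dual_of(*dual_of(c, A, b))
assert np.allclose(cc, c) and np.allclose(AA, A) and np.allclose(bb, b)
```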

The Duality Theorem

Weak Duality Theorem: If x is a feasible solution to P and y is a feasible solution to D, then c^T x ≤ b^T y.

(Strong) Duality Theorem: If P has an optimal solution x*, then D has an optimal solution y* and c^T x* = b^T y*.
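
Both statements are easy to observe numerically. A minimal sketch assuming SciPy's linprog is available (solver choice and tolerance are mine); the data is the example LP used on the following slides.

```python
import numpy as np
from scipy.optimize import linprog

c = np.array([5.0, 4.0, 3.0])
A = np.array([[2.0, 3.0, 1.0],
              [4.0, 1.0, 2.0],
              [3.0, 4.0, 2.0]])
b = np.array([5.0, 11.0, 8.0])

# Primal P: max c^T x, Ax <= b, x >= 0.  linprog minimizes, so negate c.
primal = linprog(-c, A_ub=A, b_ub=b, bounds=(0, None), method="highs")
# Dual D: min b^T y, A^T y >= c, y >= 0.  Flip the >= rows into <= rows.
dual = linprog(b, A_ub=-A.T, b_ub=-c, bounds=(0, None), method="highs")

print("primal optimum c^T x* =", -primal.fun)   # 13 for this data
print("dual   optimum b^T y* =", dual.fun)      # also 13 (strong duality)
assert -primal.fun <= dual.fun + 1e-9           # weak duality: c^T x <= b^T y
```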

Finding an Optimal Dual Solution from the Primal Dictionary

Primal program P: Maximize c^T x under Ax ≤ b, x ≥ 0.
Solve P using the two-phase simplex method, obtaining an optimal solution x*.

Last row of the last dictionary:
z = z* + ∑_{k=1}^{n+m} c̄_k x_k.

Let y*_i = −c̄_{n+i} for i = 1, 2, ..., m. Then:
1. y* is a feasible solution to D: Minimize b^T y under A^T y ≥ c, y ≥ 0.
2. b^T y* = z*.
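
In practice one rarely reads the dictionary by hand: LP solvers report the optimal dual solution as constraint marginals. A sketch of recovering y* this way; the attribute names and the sign flip below assume SciPy's HiGHS backend, so treat them as an assumption rather than a general rule.

```python
import numpy as np
from scipy.optimize import linprog

c = np.array([5.0, 4.0, 3.0])
A = np.array([[2.0, 3.0, 1.0], [4.0, 1.0, 2.0], [3.0, 4.0, 2.0]])
b = np.array([5.0, 11.0, 8.0])

res = linprog(-c, A_ub=A, b_ub=b, bounds=(0, None), method="highs")
# HiGHS reports d(objective)/d(b_i); we minimized -c^T x, so these are -y_i.
y = -res.ineqlin.marginals
print("x* =", res.x)        # expected (2, 0, 1)
print("y* =", y)            # expected (1, 0, 1), matching y*_i = -c̄_{n+i}
print("b^T y* =", b @ y)    # expected 13 = z*
```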

Explaining the magic...

Primal P:
Maximize 5x_1 + 4x_2 + 3x_3
subject to
2x_1 + 3x_2 + x_3 ≤ 5
4x_1 + x_2 + 2x_3 ≤ 11
3x_1 + 4x_2 + 2x_3 ≤ 8
x_1, x_2, x_3 ≥ 0

Dual D:
Minimize 5y_1 + 11y_2 + 8y_3
subject to
2y_1 + 4y_2 + 3y_3 ≥ 5
3y_1 + y_2 + 4y_3 ≥ 4
y_1 + 2y_2 + 2y_3 ≥ 3
y_1, y_2, y_3 ≥ 0

Pivoting both the primal and the dual dictionary

Primal:
x_4 = 5 − 2x_1 − 3x_2 − x_3
x_5 = 11 − 4x_1 − x_2 − 2x_3
x_6 = 8 − 3x_1 − 4x_2 − 2x_3
z = 5x_1 + 4x_2 + 3x_3

Dual: (Not feasible!)
y_4 = −5 + 2y_1 + 4y_2 + 3y_3
y_5 = −4 + 3y_1 + y_2 + 4y_3
y_6 = −3 + y_1 + 2y_2 + 2y_3
w = 5y_1 + 11y_2 + 8y_3
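
To replay the pivots mechanically, here is a minimal dictionary-pivot sketch (representation and helper name are mine, not the lecture's). Starting from the primal dictionary above and pivoting x_1 against x_4 reproduces the numbers on the next slide.

```python
def pivot(rows, obj, entering, leaving):
    """rows maps each basic variable to (constant, {nonbasic var: coefficient}),
    i.e. x_basic = constant + sum coeff * x_nonbasic; obj is the same for z.
    Make `entering` basic and `leaving` nonbasic, substituting everywhere."""
    const, coeffs = rows.pop(leaving)
    a = coeffs.pop(entering)                       # coefficient of entering var
    new_const = -const / a                         # solve the leaving row for it
    new_coeffs = {leaving: 1.0 / a}
    new_coeffs.update({v: -cv / a for v, cv in coeffs.items()})
    rows[entering] = (new_const, new_coeffs)

    def substitute(row):
        rc, rcoeffs = row
        ae = rcoeffs.pop(entering, 0.0)            # occurrence of entering var
        rc += ae * new_const
        for v, cv in new_coeffs.items():
            rcoeffs[v] = rcoeffs.get(v, 0.0) + ae * cv
        return rc, rcoeffs

    for var in list(rows):
        if var != entering:
            rows[var] = substitute(rows[var])
    return rows, substitute(obj)

# The primal dictionary from this slide:
rows = {"x4": (5.0,  {"x1": -2.0, "x2": -3.0, "x3": -1.0}),
        "x5": (11.0, {"x1": -4.0, "x2": -1.0, "x3": -2.0}),
        "x6": (8.0,  {"x1": -3.0, "x2": -4.0, "x3": -2.0})}
obj = (0.0, {"x1": 5.0, "x2": 4.0, "x3": 3.0})

rows, obj = pivot(rows, obj, "x1", "x4")
print(rows["x1"])   # (2.5, {'x4': -0.5, 'x2': -1.5, 'x3': -0.5})
print(obj)          # (12.5, {'x2': -3.5, 'x3': 0.5, 'x4': -2.5})
```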

Pivoting x_1 and x_4 (and y_4 and y_1)

Primal:
x_1 = 5/2 − 1/2 x_4 − 3/2 x_2 − 1/2 x_3
x_5 = 1 + 2x_4 + 5x_2
x_6 = 1/2 + 3/2 x_4 + 1/2 x_2 − 1/2 x_3
z = 25/2 − 5/2 x_4 − 7/2 x_2 + 1/2 x_3

Dual: (Still not feasible!)
y_1 = 5/2 + 1/2 y_4 − 2y_2 − 3/2 y_3
y_5 = 7/2 + 3/2 y_4 − 5y_2 − 1/2 y_3
y_6 = −1/2 + 1/2 y_4 + 1/2 y_3
w = 25/2 + 5/2 y_4 + y_2 + 1/2 y_3

Pivoting x_3 and x_6 (and y_6 and y_3)

Primal: Optimal!
x_1 = 2 − 2x_4 − 2x_2 + x_6
x_5 = 1 + 2x_4 + 5x_2
x_3 = 1 + 3x_4 + x_2 − 2x_6
z = 13 − x_4 − 3x_2 − x_6

Dual: Feasible! (and optimal)
y_1 = 1 + 2y_4 − 2y_2 − 3y_6
y_5 = 3 + 2y_4 − 5y_2 − y_6
y_3 = 1 − y_4 + 2y_6
w = 13 + 2y_4 + y_2 + y_6
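
The conclusion can also be checked directly: by weak duality, feasible x* and y* with equal objective values are both optimal. A short NumPy check, with the values read off the two final dictionaries:

```python
import numpy as np

c = np.array([5.0, 4.0, 3.0])
A = np.array([[2.0, 3.0, 1.0], [4.0, 1.0, 2.0], [3.0, 4.0, 2.0]])
b = np.array([5.0, 11.0, 8.0])

x_star = np.array([2.0, 0.0, 1.0])   # from the optimal primal dictionary
y_star = np.array([1.0, 0.0, 1.0])   # from the optimal dual dictionary

assert np.all(A @ x_star <= b) and np.all(x_star >= 0)     # x* feasible for P
assert np.all(A.T @ y_star >= c) and np.all(y_star >= 0)   # y* feasible for D
assert c @ x_star == b @ y_star == 13.0                    # equal values: both optimal
```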

Generalized Duality Theorem

Primal program P:
Maximize c_1 x_1 + c_2 x_2 + c_3 x_3
subject to
y_1 : a_11 x_1 + a_12 x_2 + a_13 x_3 ≤ b_1
y_2 : a_21 x_1 + a_22 x_2 + a_23 x_3 ≥ b_2
y_3 : a_31 x_1 + a_32 x_2 + a_33 x_3 = b_3
x_1 ≥ 0, x_2 ≤ 0, x_3 UIS

Dual program D:
Minimize b_1 y_1 + b_2 y_2 + b_3 y_3
subject to
x_1 : a_11 y_1 + a_21 y_2 + a_31 y_3 ≥ c_1
x_2 : a_12 y_1 + a_22 y_2 + a_32 y_3 ≤ c_2
x_3 : a_13 y_1 + a_23 y_2 + a_33 y_3 = c_3
y_1 ≥ 0, y_2 ≤ 0, y_3 UIS

Rules for Taking the Dual

Constraints in the primal correspond to variables in the dual (and vice versa). Coefficients of the objective function in the primal correspond to the constants in the constraints of the dual (and vice versa).

Primal (Maximization)      Dual (Minimization)
≤ for constraint           ≥ 0 for variable
≥ for constraint           ≤ 0 for variable
= for constraint           UIS for variable
≥ 0 for variable           ≥ for constraint
≤ 0 for variable           ≤ for constraint
UIS for variable           = for constraint
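
Because these rules are purely mechanical, they are easy to encode. A sketch (helper and label names are mine) that turns the data and the sense/sign annotations of a maximization problem into those of its dual:

```python
import numpy as np

# Primal constraint sense -> dual variable sign, and primal variable sign ->
# dual constraint sense; these two maps are exactly the six rules above.
CON_TO_VAR = {"<=": ">=0", ">=": "<=0", "=": "UIS"}
VAR_TO_CON = {">=0": ">=", "<=0": "<=", "UIS": "="}

def take_dual(c, A, b, con_senses, var_signs):
    """Primal: max c^T x with row i of A against b_i in sense con_senses[i]
    and x_j restricted by var_signs[j].  Returns the data of the dual:
    min b^T y with column j of A against c_j in the returned senses,
    and y_i restricted by the returned signs."""
    A = np.asarray(A, float)
    dual_con_senses = [VAR_TO_CON[s] for s in var_signs]
    dual_var_signs = [CON_TO_VAR[s] for s in con_senses]
    return np.asarray(b, float), A.T, np.asarray(c, float), dual_con_senses, dual_var_signs

# The sense/sign pattern of the generalized primal above (coefficients are dummies):
_, _, _, senses, signs = take_dual(
    c=[1, 1, 1], A=np.ones((3, 3)), b=[1, 1, 1],
    con_senses=["<=", ">=", "="], var_signs=[">=0", "<=0", "UIS"])
print(senses)   # ['>=', '<=', '=']    (dual constraint senses)
print(signs)    # ['>=0', '<=0', 'UIS'] (dual variable restrictions)
```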

Interpreting Duality: A Diet Problem

Buy food items in order to satisfy the daily intake of energy, protein and calcium, minimizing the cost.

Food             Serving size  Energy (kcal)  Protein (g)  Calcium (mg)  Price
Oatmeal                28           110            4             2          3
Chicken               100           205           32            12         24
Eggs                    2           160           13            54         13
Whole milk            237           160            8           285          9
Cherry pie            170           420            4            22         20
Pork with beans       260           260           14            80         19

Linear Programming Formulation

Minimize 3x_1 + 24x_2 + 13x_3 + 9x_4 + 20x_5 + 19x_6
subject to
110x_1 + 205x_2 + 160x_3 + 160x_4 + 420x_5 + 260x_6 ≥ 2000
4x_1 + 32x_2 + 13x_3 + 8x_4 + 4x_5 + 14x_6 ≥ 55
2x_1 + 12x_2 + 54x_3 + 285x_4 + 22x_5 + 80x_6 ≥ 800
x_1 ≥ 0, ..., x_6 ≥ 0

Take the dual...

Maximize 2000y_1 + 55y_2 + 800y_3
subject to
110y_1 + 4y_2 + 2y_3 ≤ 3
205y_1 + 32y_2 + 12y_3 ≤ 24
160y_1 + 13y_2 + 54y_3 ≤ 13
160y_1 + 8y_2 + 285y_3 ≤ 9
420y_1 + 4y_2 + 22y_3 ≤ 20
260y_1 + 14y_2 + 80y_3 ≤ 19
y_1 ≥ 0, y_2 ≥ 0, y_3 ≥ 0

Interpreting the Dual

Maximize 2000y_1 + 55y_2 + 800y_3
subject to
110y_1 + 4y_2 + 2y_3 ≤ 3
205y_1 + 32y_2 + 12y_3 ≤ 24
160y_1 + 13y_2 + 54y_3 ≤ 13
160y_1 + 8y_2 + 285y_3 ≤ 9
420y_1 + 4y_2 + 22y_3 ≤ 20
260y_1 + 14y_2 + 80y_3 ≤ 19
y_1 ≥ 0, y_2 ≥ 0, y_3 ≥ 0

How high can we price energy, protein and calcium? (Shadow prices)
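
To see the shadow prices concretely, both programs can be solved numerically. A sketch assuming SciPy's linprog; the ≥ rows are negated because linprog expects upper-bound (≤) constraints.

```python
import numpy as np
from scipy.optimize import linprog

cost = np.array([3.0, 24.0, 13.0, 9.0, 20.0, 19.0])      # price per serving
N = np.array([[110, 205, 160, 160, 420, 260],             # energy per serving
              [  4,  32,  13,   8,   4,  14],             # protein
              [  2,  12,  54, 285,  22,  80]], float)     # calcium
req = np.array([2000.0, 55.0, 800.0])                      # daily requirements

# Primal: min cost^T x  s.t.  N x >= req, x >= 0  (written as -N x <= -req).
diet = linprog(cost, A_ub=-N, b_ub=-req, bounds=(0, None), method="highs")
# Dual: max req^T y  s.t.  N^T y <= cost, y >= 0  (negate objective to minimize).
prices = linprog(-req, A_ub=N.T, b_ub=cost, bounds=(0, None), method="highs")

print("cheapest diet, cost:", round(diet.fun, 2))
print("servings bought:", np.round(diet.x, 3))
print("shadow prices (energy, protein, calcium):", np.round(prices.x, 5))
print("value of priced requirements:", round(-prices.fun, 2))  # = diet.fun by strong duality
```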

Max Flow - Min Cut

Let A be the vertex-arc incidence matrix, i.e.,

A_{u,e} =  1 if e = (u, v)
          −1 if e = (v, u)
           0 otherwise

Let d be the vector defined by d_s = −1, d_t = 1, and d_i = 0 otherwise. Then we can express the max flow problem with the following linear program:

Maximize v
subject to
Af + dv = 0
f ≤ c
f ≥ 0

Max Flow - Min Cut (II)

Maximize v
subject to
Af + dv = 0
f ≤ c
f ≥ 0

Take the dual...

Minimize ∑_{uv ∈ E} g_uv c(u, v)
subject to
p_u − p_v + g_uv ≥ 0 for all uv ∈ E
−p_s + p_t ≥ 1
g_uv ≥ 0
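
For concreteness, here is the max-flow LP on a small four-node example, solved with SciPy's linprog (the graph and its capacities are my own toy data, not from the lecture):

```python
import numpy as np
from scipy.optimize import linprog

# Nodes s=0, a=1, b=2, t=3; directed edges (tail, head, capacity).
edges = [(0, 1, 3.0), (0, 2, 2.0), (1, 2, 1.0), (1, 3, 2.0), (2, 3, 3.0)]
n_nodes, n_edges, s, t = 4, len(edges), 0, 3

# Vertex-arc incidence matrix: +1 if the edge leaves the vertex, -1 if it enters.
A = np.zeros((n_nodes, n_edges))
for e, (u, w, _) in enumerate(edges):
    A[u, e], A[w, e] = 1.0, -1.0
d = np.zeros(n_nodes)
d[s], d[t] = -1.0, 1.0

# Variables (f_1, ..., f_m, v); maximize v  <=>  minimize -v.
obj = np.zeros(n_edges + 1)
obj[-1] = -1.0
A_eq = np.hstack([A, d[:, None]])                                # A f + d v = 0
bounds = [(0.0, cap) for (_, _, cap) in edges] + [(None, None)]  # 0 <= f <= c, v free

res = linprog(obj, A_eq=A_eq, b_eq=np.zeros(n_nodes), bounds=bounds, method="highs")
print("max flow value v* =", -res.fun)        # 5.0 for this graph (= min cut)
print("edge flows:", np.round(res.x[:-1], 3))
```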

Certifying Optimality

Suppose we wish to prove to someone that a solution x to a linear program P is optimal. If we supply both the optimal solution x and an optimal solution y to the dual D, then one may easily verify that the solution x is in fact optimal!

What if the linear program P is infeasible? How can we easily convince someone that this is the case?

Farkas' Lemma: Exactly one of the following is true:
(1) There exists x such that Ax ≤ b.
(2) There exists y ≥ 0 such that A^T y = 0 but b^T y < 0.
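
Checking such an infeasibility certificate is a few lines of linear algebra. A minimal sketch (the helper name and the toy system are mine):

```python
import numpy as np

def is_farkas_certificate(A, b, y, tol=1e-9):
    """True iff y >= 0, A^T y = 0 and b^T y < 0, which proves Ax <= b is infeasible."""
    A, b, y = np.asarray(A, float), np.asarray(b, float), np.asarray(y, float)
    return bool(np.all(y >= -tol) and np.allclose(A.T @ y, 0.0, atol=tol) and b @ y < -tol)

# The system {x <= 1, -x <= -2} (i.e. x >= 2) is infeasible; y = (1, 1) certifies
# it, because adding the two rows with these weights gives 0 <= -1.
print(is_farkas_certificate([[1.0], [-1.0]], [1.0, -2.0], [1.0, 1.0]))   # True
```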

Proof of Farkas' Lemma

Farkas' Lemma: Exactly one of the following is true:
(1) There exists x such that Ax ≤ b.
(2) There exists y ≥ 0 such that A^T y = 0 but b^T y < 0.

Both cannot be true: if x satisfies (1) and y satisfies (2), then

0 = 0^T x = (A^T y)^T x = y^T Ax ≤ y^T b = b^T y < 0,

a contradiction. We must also show that both cannot be false. We do that by showing that if (1) is false then (2) is true.

Proof of Farkas' Lemma (II)

Consider the program P and its dual D:

P: Maximize 0 subject to Ax ≤ b
D: Minimize b^T y subject to A^T y = 0, y ≥ 0

If P is infeasible then D is either infeasible or unbounded.
D is always feasible (take y = 0).
Hence, if P is infeasible, then D is unbounded.
Thus we may find a feasible solution y such that b^T y < 0.
Farkas' Lemma follows from the (Strong) Duality Theorem.
We can also prove the Strong Duality Theorem from Farkas' Lemma!
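
The same reasoning suggests how to compute a certificate in practice: search for y ≥ 0 with A^T y = 0 and b^T y as negative as possible. A sketch assuming SciPy's linprog; the box bound 0 ≤ y ≤ 1 is my own device to keep the auxiliary LP bounded (any nonzero certificate can be rescaled to fit in it):

```python
import numpy as np
from scipy.optimize import linprog

def farkas_certificate(A, b):
    """If Ax <= b is infeasible, return y >= 0 with A^T y = 0 and b^T y < 0, else None."""
    A, b = np.asarray(A, float), np.asarray(b, float)
    res = linprog(b, A_eq=A.T, b_eq=np.zeros(A.shape[1]),
                  bounds=[(0.0, 1.0)] * A.shape[0], method="highs")
    return res.x if res.fun < -1e-9 else None

# The infeasible toy system from before: x <= 1 and -x <= -2.
print(farkas_certificate([[1.0], [-1.0]], [1.0, -2.0]))   # [1. 1.]
```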

Proving Strong Duality from Farkas' Lemma

P: Maximize c^T x subject to Ax ≤ b, x ≥ 0
D: Minimize b^T y subject to A^T y ≥ c, y ≥ 0

(Strong) Duality Theorem: If P has an optimal solution x* then D has an optimal solution y* and c^T x* = b^T y*.

By weak duality we just have to provide a feasible y such that b^T y ≤ c^T x*. If no such y existed, then the following system in y would be infeasible:

[ −A^T ]       [ −c     ]
[  b^T ]  y ≤  [ c^T x* ]
[ −I   ]       [ 0      ]

Proving Strong Duality from Farkas' Lemma (II)

If the system

[ −A^T ]       [ −c     ]
[  b^T ]  y ≤  [ c^T x* ]
[ −I   ]       [ 0      ]

is infeasible, then by Farkas' Lemma there exists (x, λ, s) ≥ 0, with x ∈ R^n, λ ∈ R, s ∈ R^m, such that

[ −A   b   −I ] [ x  λ  s ]^T = 0

and

[ −c^T   c^T x*   0 ] [ x  λ  s ]^T < 0.

Proving Strong Duality from Farkas' Lemma (III)

Rewriting

[ −A   b   −I ] [ x  λ  s ]^T = 0   and   [ −c^T   c^T x*   0 ] [ x  λ  s ]^T < 0,

we have x ≥ 0, λ ≥ 0, and s ≥ 0 such that

−Ax + λb − s = 0   and   −c^T x + λ c^T x* < 0.

This implies:
λ c^T x* < c^T x
Ax ≤ λb

Proving Strong Duality from Farkas' Lemma (IV)

We have x ≥ 0 and λ ≥ 0 such that
λ c^T x* < c^T x
Ax ≤ λb

Case 1: λ ≠ 0. Normalize x by λ, i.e., consider x/λ. Then x/λ is feasible for P and c^T (x/λ) > c^T x*, so it is a better solution than x*; contradiction.

Case 2: λ = 0. Then there exists x ≥ 0 such that Ax ≤ 0 and c^T x > 0. Then we can improve x* in the direction of x and obtain a better solution than x*; contradiction.