TMA946/MAN280 APPLIED OPTIMIZATION. Exam instructions


Chalmers/GU Mathematics
EXAM TMA946/MAN280 APPLIED OPTIMIZATION
Date: 03 05 28
Time: morning
Place: House V
Aids: calculator without text memory
Number of questions: 7; to pass a question requires 2 of its 3 points. The questions are not ordered by difficulty.
To pass the exam requires 10 points and three passed questions.
Examiner: Michael Patriksson
Teacher on duty: Richards Grzibovskis (0740-459022)
Results announced: 03 06 12. Short answers are also given at the end of the exam on the notice board for optimization in the MD building.

Exam instructions

When you answer the questions:
- State your methodology carefully.
- Use generally valid methods and theory.
- Only write on one side of each sheet.
- Do not use a red pen.
- Do not answer more than one question per page.

At the end of the exam:
- Sort your solutions in the order of the questions.
- Mark on the cover the questions you have answered.
- Count the number of sheets you hand in and fill in the number on the cover.

Question 1 (The simplex method in linear programming)

Consider the following LP problem:

    minimize   z = x_1 - 2x_2 - 4x_3 + 4x_4,
    subject to x_2 + 2x_3 + x_4 ≥ 4,
               2x_1 + x_2 + x_3 - 4x_4 ≥ 5,
               x_1 - x_2 + 2x_4 ≥ 3,
               x_1, x_2, x_3, x_4 ≥ 0.

a) (2p) Find an optimal solution, or an extreme half-line along which the objective function diverges to -∞, by using the simplex method. (A set of basis-matrix inverses was supplied with the original exam as a computational aid.)

b) (1p) Find a feasible solution which has the objective value z = -418.

Question 2 (Modelling)

Suppose that we are interested in describing a feasible region as the union of a finite number of convex sets X_i ⊆ R^n, that is, X := X_1 ∪ ... ∪ X_m is the feasible set.

a) (1p) Establish through a counter-example that even though the sets X_i are all convex, their union X need not be.

b) (2p) Suppose that each set X_i is a polyhedron, and that the objective function f : R^n → R is linear. Describe how we can formulate the problem of minimizing f(x) over x ∈ X

as a type of mixed-integer linear program, where the variables are of two types, binary variables and continuous variables, and where the problem reduces to an LP whenever the binary variables are fixed.

Question 3 (Modelling in linear programming) (3p)

Nilsson & Nilsson is a small company which manufactures furniture of its own, as well as acting as a sub-contractor to other firms. Its main business is to manufacture simple bookshelves made from laminated particle-board (spånskiva). In order to manufacture bookshelves, the firm buys unlaminated particle-board for b Skr per m^2 and laminates it using its own machines. Pre-laminated boards are available at c Skr per m^2. The boards are cut, and the resulting pieces are assembled into bookshelves. After packaging, the shelves are sold to local stores under the name KALLE at a price of p_1 Skr per item. Due to limited demand, the company may sell at most k shelves this way.

Impressed by their capabilities, IKEA has asked the company to become a supplier of its wildly famous BILLY bookshelf, paying p_2 Skr per shelf. Each KALLE bookshelf requires a_1 m^2 of particle-board, whereas a BILLY requires a_2 m^2. Since the KALLE bookshelf is well adapted to the firm's equipment, it takes only e_1 minutes to cut the material into the required pieces for one shelf, whereas each BILLY requires e_2 minutes of cutting time. Since the idea behind IKEA furniture is that the customers assemble the products themselves, a BILLY shelf requires no assembly, whereas one KALLE requires f minutes of assembly time. Finally, it takes g_1 minutes to package one KALLE shelf, whereas it takes g_2 minutes to package one BILLY shelf.

The company's machines may laminate at most L m^2 of particle-board per month. Two cutting stations are available to the company, and together they provide 280 h of cutting time each month.
The company has 16 employees who may split their time between packaging and assembly, and each employee spends 120 h per month working (the rest of the time is spent on sick leave, vacations and so forth). Formulate the problem of maximizing the firm's profit under the given constraints.
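As a concrete check of one possible reading of this model, the decision variables can be taken as the number of m^2 laminated in-house (s), the number of m^2 of pre-laminated board bought (t), and the shelf counts k1 (KALLE) and k2 (BILLY). The sketch below evaluates the profit and feasibility of one candidate plan; every numeric parameter value is an illustrative assumption, not data from the exam.

```python
# Hedged sketch of one possible LP reading of the Nilsson & Nilsson model.
# All parameter values below are illustrative assumptions, not exam data.
b, c = 20, 30          # Skr per m^2: unlaminated board, pre-laminated board
p1, p2 = 300, 200      # Skr per shelf: KALLE, BILLY
a1, a2 = 2.0, 1.5      # m^2 of board per shelf: KALLE, BILLY
e1, e2 = 10, 15        # cutting minutes per shelf: KALLE, BILLY
f_, g1, g2 = 20, 5, 5  # assembly (KALLE) and packaging minutes per shelf
L, k = 1000, 400       # m^2 lamination capacity per month; KALLE demand cap
CUT_CAP = 280 * 60           # two cutting stations: 280 h/month, in minutes
LABOUR_CAP = 16 * 120 * 60   # 16 employees at 120 h/month each, in minutes

def feasible(k1, k2, s, t):
    """Check a plan: k1 KALLE, k2 BILLY, s m^2 laminated, t m^2 bought."""
    return (min(k1, k2, s, t) >= 0
            and a1 * k1 + a2 * k2 <= s + t               # board balance
            and s <= L                                   # lamination capacity
            and e1 * k1 + e2 * k2 <= CUT_CAP             # cutting time
            and (f_ + g1) * k1 + g2 * k2 <= LABOUR_CAP   # assembly + packaging
            and k1 <= k)                                 # KALLE demand limit

def profit(k1, k2, s, t):
    return p1 * k1 + p2 * k2 - b * s - c * t

plan = (100, 200, 500, 0)   # a sample production plan
print(feasible(*plan), profit(*plan))
```

A solver would then maximize this profit over (k1, k2, s, t); the sketch only verifies that one candidate plan respects every stated capacity.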

Question 4 (The Frank-Wolfe algorithm)

a) (2p) Consider the following constrained nonlinear minimization problem:

    minimize   f(x, y) := x^2 + y^2,
    subject to 1 ≤ x ≤ 2,
               -1 ≤ y ≤ 1.

Starting at the point (x_0, y_0) = (2, 1), perform one step of the Frank-Wolfe algorithm. Feel free to plot the problem and perform the algorithmic steps graphically, as long as you also describe all the steps in detail in mathematical notation. Is the point obtained an optimal solution? Why/why not?

b) (1p) Consider the nonlinear optimization problem with linear constraints:

    (P)    f* := minimum_{x ∈ R^n} f(x),  subject to Ax ≤ b,

where f : R^n → R is convex and differentiable, A ∈ R^{m×n}, and b ∈ R^m. The value f* is the optimal value of f over the feasible set of the problem (P). Let further L_k denote the optimal value of the Frank-Wolfe subproblem at the feasible point x_k (where Ax_k ≤ b):

    (P_L)  L_k := minimum_{z ∈ R^n} ∇f(x_k)^T (z - x_k),  subject to Az ≤ b.

Show that f(x_k) - f* ≤ -L_k always holds. (That is, the optimal value of the Frank-Wolfe subproblem allows us to estimate the distance to the optimum in terms of the objective function, by providing a lower bound on f*.)
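The single Frank-Wolfe step requested in part a) can also be sketched numerically. This is a sketch only: the LP subproblem over a box is solved coordinate-wise by the sign of the gradient, and the exact line search uses the closed form available for this particular quadratic objective.

```python
# One Frank-Wolfe step for: minimize x^2 + y^2 over the box [1,2] x [-1,1],
# starting at (2, 1), as in Question 4a.
lo = (1.0, -1.0)   # lower bounds on (x, y)
hi = (2.0, 1.0)    # upper bounds on (x, y)
x = (2.0, 1.0)     # starting point

grad = (2 * x[0], 2 * x[1])   # gradient of f(x, y) = x^2 + y^2

# LP subproblem: minimize grad . z over the box -> pick one bound per coordinate.
z = tuple(lo[i] if grad[i] > 0 else hi[i] for i in range(2))

# Exact line search along d = z - x: for f(x) = |x|^2 the one-dimensional
# minimizer of f(x + t d) is t* = -(x . d) / |d|^2, clamped to [0, 1].
d = (z[0] - x[0], z[1] - x[1])
t = -(x[0] * d[0] + x[1] * d[1]) / (d[0] ** 2 + d[1] ** 2)
t = max(0.0, min(1.0, t))

x_new = (x[0] + t * d[0], x[1] + t * d[1])
print(z, t, x_new)   # subproblem solution, step length, next iterate
```

Whether the resulting iterate is optimal can then be checked against the first-order conditions, as part a) asks.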

Question 5 (Interior penalty methods in nonlinear programming)

Consider the problem to

    minimize   f(x) := x_1 + x_2,
    subject to g_1(x) := -x_1^2 + x_2 ≥ 0,
               g_2(x) := x_1 ≥ 0.

a) (1p) What is the globally optimal solution to this problem? Feel free to plot the problem and determine the solution graphically, as long as you can motivate your solution.

b) (2p) Suppose that we attack this problem with an interior penalty (or barrier) method, where the barrier function is chosen as φ(g(x)) := -log(g(x)). Describe the steps of the algorithm. Further, suppose that we are able to find the globally optimal solution to each unconstrained penalty problem. Describe these optimal solutions, and prove that their sequence converges to the unique optimal solution of the problem.

Question 6 (Optimality conditions in linear programming)

The semi-assignment problem arises in some relaxation methods for problems in integer programming. Its statement is as follows:

    minimize   Σ_{i=1}^n Σ_{j=1}^n c_ij x_ij,
    subject to Σ_{i=1}^n x_ij = 1,   j = 1, ..., n,
               x_ij ≥ 0,   i = 1, ..., n,  j = 1, ..., n.

Its interpretation as describing a semi-assignment becomes clear if one adds x_ij ∈ {0, 1} for all i and j to the constraints: then it is clear that we can interpret each pair (i, j) as a job to be assigned to a particular machine (or member of staff), where i denotes the machine and j the job. A semi-assignment is one which assigns precisely one machine to each job. (An assignment is one in which we also require that each machine is assigned exactly one job; a semi-assignment may assign many jobs to the same machine, and no jobs at all to some others.)
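For Question 5b, the unconstrained barrier problems happen to admit closed-form minimizers. As a sketch (assuming the constraints read g_1(x) = -x_1^2 + x_2 ≥ 0 and g_2(x) = x_1 ≥ 0, as above), setting the gradient of the barrier objective x_1 + x_2 - ν log(-x_1^2 + x_2) - ν log(x_1) to zero gives x_2 - x_1^2 = ν and 2x_1^2 + x_1 = ν, which the code below solves for a decreasing sequence of ν:

```python
import math

# Minimizers of the log-barrier problems for Question 5:
#   B_nu(x) = x1 + x2 - nu*log(-x1^2 + x2) - nu*log(x1),  nu > 0.
# Stationarity gives x2 - x1^2 = nu and 2*x1^2 + x1 = nu (a sketch under the
# reconstructed constraints g1 = -x1^2 + x2 >= 0 and g2 = x1 >= 0).
def barrier_minimizer(nu):
    x1 = (-1.0 + math.sqrt(1.0 + 8.0 * nu)) / 4.0   # positive root of 2*x1^2 + x1 = nu
    x2 = x1 ** 2 + nu
    return x1, x2

for nu in (1.0, 1e-2, 1e-4, 1e-6):
    x1, x2 = barrier_minimizer(nu)
    assert x1 > 0 and -x1 ** 2 + x2 > 0   # every iterate stays strictly interior
    print(nu, x1, x2)
# As nu -> 0 the minimizers converge to the problem's optimal solution.
```

Tracing this sequence is exactly the "describe these optimal solutions" part of the question; the convergence proof itself is left to the exam.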

a) (1p) Prove that the constraints x_ij ∈ {0, 1} are redundant. That is, establish that by solving the above LP problem, we can automatically make sure that the resulting optimal solution is binary.

b) (2p) The semi-assignment problem is a very simple LP problem, which can be solved by the following, very simple, greedy algorithm. For each j = 1, ..., n, do the following:

(1) Find the smallest value of c_ij over all i ∈ {1, ..., n}. Let i_j be the index corresponding to that least value. (This corresponds to finding the cheapest machine for the job.)

(2) Assign machine i_j to job j: let x_ij = 1 for i = i_j, and x_ij = 0 for i ≠ i_j.

The resulting solution is clearly feasible, since it assigns each job to exactly one machine, and the solution x is also non-negative. Prove, by using linear programming duality, in particular the primal-dual optimality conditions, that this algorithm is correct, that is, that it does solve the semi-assignment problem.

Hint: Write down the LP dual problem, state the primal-dual optimality conditions, and show that the solution given by the above algorithm satisfies these conditions.
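The greedy rule of part b) is short enough to state in code. The sketch below runs it on a small instance (the cost matrix is an illustrative assumption) and brute-forces all semi-assignments to confirm that the column-minimum choice is optimal on this instance; the duality argument the question asks for is what makes this work in general.

```python
from itertools import product

def greedy_semi_assignment(c):
    """For each job j, pick the machine i minimizing c[i][j] (Question 6b)."""
    n = len(c)
    x = [[0] * n for _ in range(n)]
    for j in range(n):
        i_j = min(range(n), key=lambda i: c[i][j])  # cheapest machine for job j
        x[i_j][j] = 1
    return x

def cost(c, x):
    n = len(c)
    return sum(c[i][j] * x[i][j] for i in range(n) for j in range(n))

# Illustrative 3x3 cost matrix (rows = machines i, columns = jobs j).
c = [[4, 2, 3],
     [1, 5, 3],
     [2, 2, 1]]
x = greedy_semi_assignment(c)
assert all(sum(x[i][j] for i in range(3)) == 1 for j in range(3))  # feasibility

# Brute force: every map job -> machine is a feasible semi-assignment.
best = min(sum(c[i][j] for j, i in enumerate(choice))
           for choice in product(range(3), repeat=3))
print(cost(c, x), best)   # the greedy value matches the exact minimum
```

Note that, unlike a full assignment, nothing stops two jobs from landing on the same machine, which is why each job can be treated independently.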

Question 7 (Convexity and optimality) (3p)

Consider the convex optimization problem

    (P)  minimize f(x), subject to x ∈ X,

where f : R^n → R is a continuously differentiable function which is convex on the set X, and the set X ⊆ R^n is convex and non-empty. We suppose that the set X is compact (that is, closed and bounded), so that the problem is guaranteed to have a non-empty set of optimal solutions, denoted by X*.

Establish the following interesting result: for all optimal solutions x*, the value of the gradient of f is the same! In other words, if x* and x̂ are both optimal solutions (that is, both lie in the set X*), then ∇f(x*) = ∇f(x̂) holds.

Hint: Utilize the convexity of the problem. In particular, utilize that a convex function f satisfies

    f(y) ≥ f(x) + ∇f(x)^T (y - x),  for all x, y ∈ R^n;

this inequality can in fact be used as a definition of the gradient of the differentiable, convex function f at x. Also utilize the following characterization of an optimal solution to the problem (P): a feasible solution x* to the convex problem (P) is globally optimal in (P) if and only if

    ∇f(x*)^T (y - x*) ≥ 0,  for all y ∈ X.
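The result can be illustrated numerically (an illustration, not a proof). Take f(x, y) = exp(x + y) on the compact convex set X = {(x, y) ∈ [0, 1]^2 : x + y ≥ 1}, an example chosen for this sketch: the optimal set is the whole segment x + y = 1, and the gradient takes one and the same value at every point of it.

```python
import math

# Illustration of Question 7 (not a proof): f(x, y) = exp(x + y) on
# X = {(x, y) in [0,1]^2 : x + y >= 1}. The optimal set X* is the segment
# x + y = 1, and nabla f = exp(x + y) * (1, 1) is identical along it.
def f(x, y):
    return math.exp(x + y)

def grad_f(x, y):
    e = math.exp(x + y)
    return (e, e)

# Several optimal solutions (all exactly on the segment x + y = 1).
optima = [(0.0, 1.0), (0.25, 0.75), (0.5, 0.5), (1.0, 0.0)]
grads = [grad_f(x, y) for x, y in optima]
print(grads[0])
assert all(g == grads[0] for g in grads)                       # equal gradients
assert all(abs(f(x, y) - math.e) < 1e-12 for x, y in optima)   # equal values
assert f(0.6, 0.7) > math.e   # a feasible point off the segment is worse
```

Note that the gradient is far from constant over X as a whole; it is only on the optimal set that the theorem forces agreement.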