Preface

This book is for all those interested in using the GAMS technology for modeling and solving complex, large-scale, continuous nonlinear optimization problems or applications. It is mainly a continuation of the book Nonlinear Optimization Applications Using the GAMS Technology,¹ in which 82 nonlinear optimization applications are presented in the GAMS language. The purpose of this book is to present the theoretical details and computational performances of the algorithms integrated in GAMS for solving continuous nonlinear optimization applications: MINOS, KNITRO, CONOPT, SNOPT, and IPOPT. The idea is to give a comprehensive description of the theoretical aspects of these algorithms and of the computational linear algebra details of their operation under GAMS. The motivation and significance of this enterprise are to stimulate the reader to understand and to use the most important capabilities and options of the optimization algorithms working in GAMS for solving complex nonlinear optimization applications. We consider only the local solutions of large-scale, complex, continuous nonlinear optimization applications. Other types of optimization problems, such as mixed-integer programming, mixed complementarity, equilibrium constraints, network optimization, nonsmooth optimization, stochastic programming, and the global solution of nonlinear optimization applications, although operational in the GAMS technology, are beyond the scope of this book.

The content of the book is organized into 21 chapters. Chapter 1 has an introductory character, presenting the variety of constrained nonlinear optimization methods, their theoretical support, and the structure of the book. In Chapter 2 we point out some key aspects of the mathematical modeling process in the context of mathematical modeling technologies based on algebraically oriented modeling languages.

¹ N. Andrei, Nonlinear Optimization Applications Using the GAMS Technology, Springer Optimization and Its Applications 81, ISBN 978-1-4614-6796-0, doi:10.1007/978-1-4614-6797-7, Springer Science+Business Media, New York, 2013.
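To convey the flavor of such an algebraically oriented representation, here is a minimal, hypothetical GAMS model (for illustration only; it is not one of the applications treated in the book):

* Minimize (x1-1)^2 + (x2-2)^2 subject to x1^2 + x2^2 <= 4.
Variables x1, x2, obj;

Equations defobj "objective definition"
          circle "nonlinear inequality constraint";

defobj.. obj =e= sqr(x1 - 1) + sqr(x2 - 2);
circle.. sqr(x1) + sqr(x2) =l= 4;

Model tiny / all /;
Solve tiny using nlp minimizing obj;

The model states only the algebra of the problem; the numerical work is delegated to the NLP solver selected by the user, for instance one of the five discussed in this book.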

Chapter 3 is a gentle introduction to the main aspects of the GAMS technology for modeling and solving continuous nonlinear optimization applications. GAMS consists of an algebraically oriented language that allows a high-level algebraic representation of mathematical optimization models, a compiler responsible for user interactions, which compiles and executes the commands given in a GAMS source file, and a set of solvers. In Chapter 4, 18 real nonlinear optimization applications are detailed, in both algebraic and GAMS representations. These optimization applications serve as a test bed for assessing the performance of the algorithms described in this book.

The rest of the chapters present, on the one hand, the relevant theoretical and computational results on the methods and techniques that have proved most effective and robust for solving nonlinear optimization problems and, on the other hand, the details of the algorithms MINOS, KNITRO, CONOPT, SNOPT, and IPOPT, which work in the GAMS technology. Accordingly, in a professional and unique manner, the book emphasizes large-scale optimization techniques, such as penalty and augmented Lagrangian functions, linearly constrained augmented Lagrangian methods, sequential linear and sequential quadratic programming, and interior point methods, all equipped with line-search and trust-region globalization schemes.

It is worth mentioning that this book describes not only the algorithms working under the GAMS specifications (MINOS, KNITRO, CONOPT, SNOPT, and IPOPT). To illustrate the performances of the methods integrated in the GAMS technology for solving nonlinear optimization problems, and for making comparisons, we give some details on other optimization algorithms that are not integrated in GAMS: a penalty-barrier method (SPENBAR), an SQP method using only equality-constrained sub-problems (DONLP), a sequential quadratic programming algorithm with successive error restoration (NLPQLP), and filter methods with sequential linear programming (filterSD) or with sequential quadratic programming (filterSQP). The purpose of including these algorithms is to emphasize the importance of combining or integrating different optimization techniques, globalization strategies and mechanisms, as well as advanced linear algebra procedures, in order to obtain efficient and robust nonlinear optimization algorithms.

Special attention is given to the Karush-Kuhn-Tucker (KKT) optimality conditions, which characterize the local optimal solutions of nonlinear optimization problems. These necessary and sufficient optimality conditions are described in Chapter 5. Since simple-bound optimization problems play a very important role in the economy of nonlinear optimization, Chapter 6 presents some of the main algorithms for solving this class of problems. In Chapter 7 the theory behind two very important concepts in nonlinear optimization, the penalty function and the augmented Lagrangian, is introduced. These concepts are implemented in different forms and combinations within line-search or trust-region frameworks, leading to very efficient and robust algorithms. Chapter 8 is dedicated to the SPENBAR algorithm, based on a combination of the augmented Lagrangian with a log-barrier function. The penalty parameters included in this augmented Lagrangian function are updated in such a way as to obtain a KKT point for the optimization problem.
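To fix ideas in generic notation (a standard sketch, not necessarily the book's exact formulation), for the equality-constrained problem of minimizing $f(x)$ subject to $c(x) = 0$, the quadratic penalty and augmented Lagrangian functions of Chapter 7 take the classical forms

\[
P_{\mu}(x) = f(x) + \frac{\mu}{2}\,\|c(x)\|^{2}, \qquad
L_{A}(x,\lambda;\mu) = f(x) - \lambda^{\top} c(x) + \frac{\mu}{2}\,\|c(x)\|^{2},
\]

while for inequality constraints $g(x) \ge 0$ the log-barrier term $-\mu \sum_{i} \ln g_i(x)$ keeps the iterates strictly feasible; SPENBAR combines an augmented Lagrangian of this type with such a log-barrier.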

A different and more sophisticated modification of the augmented Lagrangian is described in Chapter 9, where MINOS, one of the most respected nonlinear programming algorithms, is presented. This algorithm, integrated in GAMS, is based on the linearization of the constraints. In MINOS the modified augmented Lagrangian includes a penalty term based on the departure of the constraints from linearity. The modified augmented Lagrangian sub-problems are solved by a reduced-gradient algorithm along with a quasi-Newton algorithm.

Chapter 10 is dedicated to quadratic programming. Both equality-constrained and inequality-constrained quadratic programming are presented, emphasizing the primal-dual active-set method of Goldfarb and Idnani. In Chapter 11 both equality-constrained and inequality-constrained sequential quadratic programming, with line search or trust region, are introduced. Special attention is given to the sequential linear-quadratic programming method, which is one of the main ingredients in some nonlinear-constrained optimization algorithms. A sequential quadratic programming method using only equality-constrained sub-problems, DONLP, is described in Chapter 12. This is an active-set sequential quadratic programming algorithm. Its idea is to consider only the near-active or violated inequalities in the quadratic programming sub-problem and to treat them as equalities. Another sequential quadratic programming algorithm, with successive error restoration, NLPQLP, is presented in Chapter 13. This is also an active-set method, as it provides an estimate of the active set at every iteration. It uses a quasi-Newton approximation to the Hessian of the Lagrangian, updated with the BFGS formula. To calculate the stepsize that minimizes an augmented Lagrangian merit function, a nonmonotone line search is used. Both DONLP and NLPQLP show the diversity of active-set sequential quadratic programming methods. These are pure sequential quadratic programming methods.

In the rest of the chapters, we present only the algorithms integrated in the GAMS technology. In Chapter 14 the active-set sequential linear-quadratic programming algorithm KNITRO/ACTIVE is described. The interior point sequential linear-quadratic programming algorithm KNITRO/INTERIOR is presented in Chapter 19. Both approaches are integrated in KNITRO, one of the most elaborate algorithms for solving general large-scale nonlinear optimization applications. KNITRO/ACTIVE is an active-set method built on sequential linear-quadratic programming with a projected conjugate gradient iteration. KNITRO/INTERIOR, on the other hand, is based on interior point methods in two implementations: KNITRO/INTERIOR-CG, in which the algorithmic step is computed by an iterative conjugate gradient method, and KNITRO/INTERIOR-DIRECT, in which the step is computed by a direct factorization of the linear system associated with the interior point method. The two approaches communicate via the crossover technique, first used in linear programming by Megiddo.
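A common thread through Chapters 10-14 is the quadratic programming sub-problem at the core of sequential quadratic programming; in generic notation (a standard sketch, not necessarily the book's exact formulation), at the current iterate $x_k$, with equality constraints $c(x) = 0$, one solves

\[
\min_{d} \; \nabla f(x_k)^{\top} d + \frac{1}{2}\, d^{\top} B_k\, d
\quad \text{subject to} \quad c(x_k) + J(x_k)\, d = 0,
\]

where $J(x_k)$ is the constraint Jacobian and $B_k$ is an approximation (often quasi-Newton, e.g., BFGS) to the Hessian of the Lagrangian. The step $d$ is then safeguarded by a line search on a merit function or by a trust region, and inequality constraints enter either through their linearizations or, as in DONLP, by treating the near-active ones as equalities.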

Chapter 15 presents a pure sequential quadratic programming algorithm for large-scale nonlinear-constrained optimization, SNOPT. SNOPT implements a particular sequential quadratic programming method that exploits sparsity in the constraint Jacobian and maintains a limited-memory quasi-Newton approximation to the Hessian of the Lagrangian function. Even though this algorithm contains many very sophisticated ingredients, some of them taken from MINOS, its computational performance is modest, being surpassed by other algorithms integrated in the GAMS technology, such as KNITRO, CONOPT, and IPOPT.

A generalized reduced-gradient algorithm with sequential linearization, CONOPT, is explained in Chapter 16. This line-search algorithm illustrates the importance of combining the generalized reduced-gradient method with sequential linear programming or with sequential quadratic programming. CONOPT integrates three active-set methods. The first is a gradient projection method, in the frame of the generalized reduced-gradient method, that projects the gradient of the objective function onto a linearization of the constraints. The second is a sequential linear programming algorithm, while the third is a sequential quadratic programming algorithm. CONOPT includes algorithmic switches that automatically detect which method is best.

Interior point and filter methods are the content of Chapters 17 and 18, respectively. For the interior point method, a prototype is presented. Both the line-search and the trust-region interior point algorithms are discussed. At the same time, a variant of the line-search interior point algorithm is described in detail, and a methodology emphasizing the development and the convergence of interior point algorithms is given. Filter methods, a technique for globalizing nonlinear optimization algorithms, aim at avoiding the need to choose penalty parameters in penalty or augmented Lagrangian functions. Two filter methods are described: the sequential linear programming filter algorithm and the sequential quadratic programming one. These chapters provide the fundamentals for some highly elaborate nonlinear optimization algorithms that combine different strategies, based on sequential linear or sequential quadratic programming or on filter line search, in the frame of interior point methods. All of these are presented as KNITRO/INTERIOR in Chapter 19 and as IPOPT in Chapter 20. The last chapter contains some numerical studies and comparisons among the algorithms integrated in the GAMS technology.
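As a generic illustration of the interior point prototype of Chapter 17 (standard notation, not necessarily the book's), for the problem of minimizing $f(x)$ subject to $c(x) = 0$ and $x \ge 0$, the barrier sub-problem and the associated perturbed KKT system are

\[
\min_{x} \; f(x) - \mu \sum_{i} \ln x_{i} \quad \text{subject to} \quad c(x) = 0,
\]
\[
\nabla f(x) - J(x)^{\top} \lambda - z = 0, \qquad c(x) = 0, \qquad XZe = \mu e,
\]

with $X = \mathrm{diag}(x)$, $Z = \mathrm{diag}(z)$, and $x, z > 0$. The algorithm drives $\mu \to 0$, globalizing each step by a line search or a trust region or, in the filter methods of Chapter 18, by acceptability to a filter of (objective, constraint-violation) pairs.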

The conclusion is that there is no algorithm able to solve every nonlinear-constrained optimization problem or application. The most powerful nonlinear optimization algorithms combine and integrate different optimization techniques (modified augmented Lagrangian, sequential linear or quadratic programming, interior point methods with filter line search or trust region) and include advanced computational linear algebra techniques.

The book is of great interest to all those using the GAMS technology for modeling and solving complex, large-scale, continuous nonlinear optimization problems or applications from different areas of activity. Mathematical programming researchers, theoreticians and practitioners in operations research, management consultants, practitioners in engineering, and researchers in basic science and industry will find plenty of information and practical hints for using, in an efficient and robust way, the continuous nonlinear optimization software integrated in the GAMS technology. The book is also suitable for scientists working in optimization theory, optimization algorithm design, and computational developments, as well as for college and graduate students in mathematics, science, operations research, and business who are interested in advanced optimization algorithms and computer optimization software.

I am grateful to the Alexander von Humboldt Foundation for its appreciation and financial support during the more than two years I spent at different universities in Germany. My acknowledgments are also due to Elizabeth Loew and Razia Amzad and their colleagues at Springer for their encouragement and superb assistance with the preparation of this book. Finally, fondest thanks go to my wife, Mihaela, who has constantly understood, helped, and supported me over the years.

Bucharest and Tohăniţa/Bran Resort
Bucharest, Romania
November 2016

Neculai Andrei
