PACBB: A Projected Adaptive Cyclic Barzilai-Borwein Method for Box Constrained Optimization*
Hongchao Zhang and William W. Hager

Department of Mathematics, University of Florida, Gainesville, FL 32611, USA. {hzhang,hager}@ufl.edu

Summary. The adaptive cyclic Barzilai-Borwein (BB) method [DZ05] for unconstrained optimization is extended to bound constrained optimization. Using test problems from the CUTE library [BCGT95], performance is compared with SPG2 (a BB method), GENCAN (a BB/conjugate gradient scheme), and L-BFGS-B (a limited-memory BFGS method for bound constrained problems).

Key words: box constrained optimization, cyclic Barzilai-Borwein stepsize method, nonmonotone line search

1 Introduction

Recently, we developed an adaptive cyclic Barzilai-Borwein (ACBB) method [DZ05] for solving unconstrained optimization problems. In this paper, we explain how the line search can be modified so as to solve bound constrained optimization problems of the form

$$\min \; f(x) \quad \text{subject to} \quad x \in B, \tag{1}$$

where $f$ is a smooth function, $B = \{ x \in \mathbb{R}^n : L \le x \le U \}$, and $L$ and $U$ are lower and upper bounds, possibly infinite. For the bound constrained problem, the ACBB search directions are projected onto the feasible set. Hence, the new algorithm is denoted PACBB (projected adaptive cyclic Barzilai-Borwein method). A step in the BB method [BB88] is given by

$$x_{k+1} = x_k - \alpha_k g_k, \qquad \alpha_k = \frac{s_{k-1}^{\mathsf T} s_{k-1}}{s_{k-1}^{\mathsf T} y_{k-1}}, \tag{2}$$

where $g_k = \nabla f(x_k)$ is the gradient, viewed as a column vector, $s_{k-1} = x_k - x_{k-1}$, and $y_{k-1} = g_k - g_{k-1}$.

* This material is based upon work supported by the National Science Foundation under Grant No.
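To make the iteration concrete, here is a minimal Python sketch of the plain (unconstrained, non-cyclic) BB iteration (2). It is not part of the paper; the function names and the diagonal quadratic test problem are illustrative assumptions.

```python
import numpy as np

def bb_gradient_method(grad, x0, alpha0=1.0, tol=1e-8, max_iter=1000):
    """Basic Barzilai-Borwein iteration (2): x_{k+1} = x_k - alpha_k * g_k,
    with alpha_k = (s^T s) / (s^T y) computed from the previous step."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = alpha0                      # initial stepsize (no history yet)
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:
            break
        x_new = x - alpha * g           # gradient step with the BB stepsize
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g     # s_{k-1} and y_{k-1} from (2)
        sy = s @ y
        if sy > 0:                      # safeguard: keep the stepsize positive
            alpha = (s @ s) / sy
        x, g = x_new, g_new
    return x

# Example: strictly convex quadratic f(x) = 0.5 x^T D x with D = diag(1,...,5)
d = np.arange(1.0, 6.0)
x_star = bb_gradient_method(lambda x: d * x, x0=np.ones(5))
```

The cyclic and adaptive variants of [DZ05] additionally reuse the stepsize for several iterations and adjust the cycle length; that logic is omitted here.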
Advantages of BB-type methods are their low memory requirements and their simplicity: in a neighborhood of a local minimizer, no line search is needed, since the convergence is linear (see [DZ05]) when the Hessian is positive definite at the solution. In the cyclic BB method, the same stepsize $\alpha_k$ is reused for several iterations; we observe in [DZ05] that reusing a step for several iterations can accelerate convergence. In the adaptive cyclic BB method, we adaptively adjust the cycle length as the iterations progress. In the projected cyclic Barzilai-Borwein method, we project the ACBB iterates onto the feasible set $B$ and perform a nonmonotone line search between the current iterate and the projection point, using the scheme in [DZ05].

2 Algorithm

Fig. 1. The projected line search.

The line search is illustrated in Figure 1. We first take a step along the negative gradient to the point $\bar{x}_k = x_k - \alpha_k g_k$, where the initial stepsize $\alpha_k$ is a safeguarded version of either the previous stepsize or the newly computed stepsize if the cycle length has been reached. If the point $\bar{x}_k$ lies outside $B$, we compute the projection $P_B(\bar{x}_k)$ of $\bar{x}_k$ onto $B$. The search direction $d_k = P_B(\bar{x}_k) - x_k$ is a descent direction since $B$ is convex. A nonmonotone line search is performed along the line segment connecting $x_k$ and $P_B(\bar{x}_k)$. If possible, we accept the point $P_B(\bar{x}_k)$; otherwise, we backtrack towards $x_k$. When $\bar{x}_k$ lies outside of $B$, the next initial stepsize $\alpha_{k+1}$ is given by the BB formula in (2).
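The projected step and the backtracking just described can be sketched as follows. This is a simplified illustration, not the authors' implementation: the nonmonotone reference value `f_ref` (for example, the maximum of recent objective values, as in [GLL86]) stands in for the line search scheme of [DZ05], and the adaptive cycle-length logic and stepsize safeguards are omitted.

```python
import numpy as np

def project_box(x, L, U):
    """Projection P_B onto the box B = {x : L <= x <= U}."""
    return np.clip(x, L, U)

def pacbb_step(f, grad, x, g, alpha, L, U, f_ref, sigma=1e-4):
    """One projected BB step with a nonmonotone backtracking line search.

    The trial point x - alpha*g is projected onto B, giving the search
    direction d = P_B(x - alpha*g) - x.  The full step (t = 1, i.e. the
    projection point) is accepted when it satisfies a nonmonotone Armijo
    condition relative to f_ref; otherwise we backtrack towards x.
    """
    d = project_box(x - alpha * g, L, U) - x       # feasible descent direction
    if not d.any():                                # x is stationary for the box problem
        return x, g, alpha
    t = 1.0
    while f(x + t * d) > f_ref + sigma * t * (g @ d):
        t *= 0.5                                   # backtrack towards x
    x_new = x + t * d
    g_new = grad(x_new)
    s, y = x_new - x, g_new - g
    sy = s @ y
    alpha_new = (s @ s) / sy if sy > 0 else alpha  # BB formula (2), kept positive
    return x_new, g_new, alpha_new
```

In PACBB the stepsize from (2) is additionally reused over an adaptively chosen cycle before being recomputed.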
In a forthcoming paper, we prove that when $B$ is replaced by a closed, convex set $\Omega$, we have

$$\liminf_{k \to \infty} \; \| P_{\Omega}(x_k - g_k) - x_k \|_\infty = 0$$

for a gradient projection/nonmonotone line search scheme with the structure depicted in Figure 1.

3 The numerical results

In this section, we compare the performance of the projected adaptive cyclic BB algorithm (PACBB) to the SPG2 algorithm developed in [BMR00, BMR01], to the GENCAN algorithm developed in [BM02], and to the L-BFGS-B version of the limited-memory BFGS method for box constrained optimization developed in [BLN95, ZBN97]. All codes were written in Fortran and compiled with f77 (default compiler settings) on a Sun workstation. The GENCAN codes were obtained from José Martínez's web page, and the L-BFGS-B codes were obtained from Jorge Nocedal's web page. For each code, we stopped the iterations if either

$$\| P_B(x_k - g_k) - x_k \|_\infty < 10^{-6} \tag{3}$$

or

$$|f(x_k) - f(x_{k-1})| \,/\, (1 + |f(x_k)|) < 10^{-12}. \tag{4}$$

We also terminated a code if the number of function evaluations exceeded $10^6$. The test set consisted of all bound constrained problems from the (2002) CUTE library [BCGT95] with more than 50 variables. For all problems where more than one choice of the dimension is given, we use the largest dimension. The numerical results are posted online.
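A direct transcription of the stopping tests is given below, assuming the tolerances stated in (3) and (4); this helper is illustrative and not taken from the authors' Fortran codes.

```python
import numpy as np

def converged(x, g, f_val, f_prev, L, U, tol_g=1e-6, tol_f=1e-12):
    """Stopping tests (3) and (4): sup-norm of the projected gradient step
    and relative change in the objective value."""
    crit3 = np.linalg.norm(np.clip(x - g, L, U) - x, np.inf) < tol_g   # test (3)
    crit4 = abs(f_val - f_prev) / (1.0 + abs(f_val)) < tol_f           # test (4)
    return crit3 or crit4
```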
Relative to the CPU time, the numerical comparison of PACBB with the other three routines can be summarized as follows:

- PACBB is faster than SPG2 in 36 problems, while SPG2 is faster in 6 problems.
- PACBB is faster than GENCAN in 33 problems, while GENCAN is faster in 9 problems.
- PACBB is faster than L-BFGS-B in 34 problems, while L-BFGS-B is faster in 11 problems.

Excluding the problems where the difference in CPU time was less than 10%, the numerical results can be summarized as follows:

- PACBB is faster than SPG2 in 33 problems, while SPG2 is faster in 4 problems.
- PACBB is faster than GENCAN in 30 problems, while GENCAN is faster in 8 problems.
- PACBB is faster than L-BFGS-B in 32 problems, while L-BFGS-B is faster in 9 problems.

Figure 2 shows the performance profiles, proposed by Dolan and Moré [DM02], for the four codes. That is, for the methods analyzed, we plot the fraction $P$ of problems for which any given method is within a factor $\tau$ of the best time. In a performance profile plot, the top curve corresponds to the method that solved the most problems in a time that was within a factor $\tau$ of the best time. The percentage of the test problems for which a method is the fastest is given on the left axis of the plot. The right side of the plot gives the percentage of the test problems that were successfully solved by each of the methods. In essence, the right side is a measure of an algorithm's robustness.

Fig. 2. Performance profiles.
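For reference, a performance profile in the sense of [DM02] can be computed from a raw timing table as sketched below. This generic helper is not from the paper; the array layout (one row of CPU times per solver, with failures marked as infinity) is an assumption.

```python
import numpy as np
import matplotlib.pyplot as plt

def performance_profile(times, labels):
    """Dolan-More performance profile: for each solver, plot the fraction of
    problems it solved within a factor tau of the best time.

    `times` is an (n_solvers, n_problems) array of CPU times, with np.inf
    marking a failure; every problem is assumed solved by at least one code.
    """
    best = times.min(axis=0)                  # best time on each problem
    ratios = times / best                     # performance ratios r_{p,s}
    tau_max = ratios[np.isfinite(ratios)].max() * 1.05
    taus = np.linspace(1.0, tau_max, 200)
    for solver_ratios, label in zip(ratios, labels):
        frac = [(solver_ratios <= t).mean() for t in taus]   # P(r_{p,s} <= tau)
        plt.step(taus, frac, where="post", label=label)
    plt.xlabel("tau")
    plt.ylabel("fraction of problems")
    plt.legend()
    plt.show()
```

At $\tau = 1$ the curves give the fraction of problems on which each code is fastest (the left side of the plot), and for large $\tau$ they approach the fraction of problems each code solved at all (the right side).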
Since the top curve in Figure 2 corresponds to PACBB, this method yielded the best CPU time performance for this set of 48 test problems, whose dimensions range from 50 to 15,625. Like SPG2, the PACBB algorithm is suitable for large-dimensional problems due to its low memory requirements. It is pointed out in [BMR00] that SPG2 may converge slowly on ill-conditioned problems. Although PACBB seems to deal with ill-conditioning better than SPG2, both GENCAN and L-BFGS-B are more efficient on the very ill-conditioned problems. Finally, the PACBB algorithm is very easy to implement, and it has many promising applications (see [BCM99, GHR93]).

References

[BB88] J. Barzilai and J. M. Borwein, Two point step size gradient methods, IMA J. Numer. Anal., 8 (1988).
[BCM99] E. G. Birgin, I. Chambouleyron, and J. M. Martínez, Estimation of the optical constants and the thickness of thin films using unconstrained optimization, J. Comput. Phys., 151 (1999).
[BM02] E. G. Birgin and J. M. Martínez, Large-scale active-set box-constrained optimization method with spectral projected gradients, Comput. Optim. Appl., 23 (2002).
[BMR00] E. G. Birgin, J. M. Martínez, and M. Raydan, Nonmonotone spectral projected gradient methods on convex sets, SIAM J. Optim., 10 (2000).
[BMR01] E. G. Birgin, J. M. Martínez, and M. Raydan, Algorithm 813: SPG - software for convex-constrained optimization, ACM Trans. Math. Software, 27 (2001).
[BLN95] R. H. Byrd, P. Lu, and J. Nocedal, A limited memory algorithm for bound constrained optimization, SIAM J. Sci. Comput., 16 (1995).
[BCGT95] I. Bongartz, A. R. Conn, N. I. M. Gould, and P. L. Toint, CUTE: constrained and unconstrained testing environment, ACM Trans. Math. Software, 21 (1995).
[DZ01] Y. H. Dai and H. Zhang, An adaptive two-point stepsize gradient algorithm, Numer. Algorithms, 27 (2001).
[DZ05] Y. H. Dai, W. W. Hager, K. Schittkowski, and H. Zhang, Cyclic Barzilai-Borwein stepsize method for unconstrained optimization, March 2005.
[DM02] E. D. Dolan and J. J. Moré, Benchmarking optimization software with performance profiles, Math. Prog., 91 (2002).
[GHR93] W. Glunt, T. L. Hayden, and M. Raydan, Molecular conformations from distance matrices, J. Comput. Chem., 14 (1993).
[GLL86] L. Grippo, F. Lampariello, and S. Lucidi, A nonmonotone line search technique for Newton's method, SIAM J. Numer. Anal., 23 (1986).
[Ray01] M. Raydan, Nonmonotone spectral methods for large-scale nonlinear systems, report at the International Workshop on "Optimization and Control with Applications", Erice, Italy, July 9-17, 2001.
[Toi97] Ph. L. Toint, A non-monotone trust region algorithm for nonlinear optimization subject to convex constraints, Math. Prog., 77 (1997).
[ZBN97] C. Zhu, R. H. Byrd, and J. Nocedal, Algorithm 778: L-BFGS-B, FORTRAN routines for large scale bound constrained optimization, ACM Trans. Math. Software, 23 (1997).