Solving basis pursuit: infeasible-point subgradient algorithm, computational comparison, and improvements
1 Solving basis pursuit: infeasible-point subgradient algorithm, computational comparison, and improvements. Andreas Tillmann (joint work with D. Lorenz and M. Pfetsch), Technische Universität Braunschweig. SIAM Conference on Applied Linear Algebra 2012. A. Tillmann 1 / 22
2 Outline
  1 Motivation
  2 Infeasible-Point Subgradient Algorithm ISAL1
  3 Comparison of l1-Solvers: Testset Construction and Computational Results; Improvements with Heuristic Optimality Check
  4 Possible Future Research
4 Motivation: Sparse Recovery via l1-Minimization
  Seek the sparsest solution of an underdetermined linear system:
    min ‖x‖_0  s.t.  Ax = b    (A ∈ R^{m×n}, m < n)
  Finding a minimum-support solution is NP-hard.
  Convex relaxation: l1-minimization / Basis Pursuit:
    min ‖x‖_1  s.t.  Ax = b    (L1)
  Several conditions (RIP, Nullspace Property, etc.) ensure l0-l1-equivalence.
5 Motivation: Solving the Basis Pursuit Problem
  (L1) can be recast as a linear program.
  There is a broad variety of specialized algorithms for (L1).
  Is a classic algorithm from nonsmooth optimization, the (projected) subgradient method, competitive?
  Which algorithm is the best?
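The LP recast mentioned on this slide can be sketched as follows (my own illustration, not the authors' code; the tiny matrix and the use of `scipy.optimize.linprog` are assumptions): splitting x = u − v with u, v ≥ 0 turns min ‖x‖_1 s.t. Ax = b into the LP min 1ᵀ(u + v) s.t. Au − Av = b, u, v ≥ 0.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit_lp(A, b):
    """Solve min ||x||_1 s.t. Ax = b via the standard LP split x = u - v, u, v >= 0."""
    m, n = A.shape
    c = np.ones(2 * n)                         # objective: 1^T u + 1^T v = ||x||_1
    A_eq = np.hstack([A, -A])                  # equality constraints: A u - A v = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
    u, v = res.x[:n], res.x[n:]
    return u - v

# tiny instance with unique l1-minimal solution x* = (1, 0, 0)
A = np.array([[1.0, 0.0, 0.6],
              [0.0, 1.0, 0.8]])
b = A @ np.array([1.0, 0.0, 0.0])
x = basis_pursuit_lp(A, b)
```

Any LP solver can then be applied; the slide's later comparison uses CPLEX's dual simplex on this reformulation.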
7-8 Infeasible-Point Subgradient Algorithm: Projected Subgradient Methods
  Problem: min f(x) s.t. x ∈ F   (f, F convex)
  Standard projected subgradient iteration:
    x^{k+1} = P_F(x^k − α_k h^k),  α_k > 0,  h^k ∈ ∂f(x^k)
  Applicability: only reasonable if the projection is easy.
  Idea: replace the exact projection by an approximation.
  Infeasible subgradient iteration:
    x^{k+1} = P_F^{ε_k}(x^k − α_k h^k),  where ‖P_F^{ε_k}(y) − P_F(y)‖_2 ≤ ε_k
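A minimal runnable sketch of the exact projected subgradient iteration, specialized to min ‖x‖_1 s.t. Ax = b (my own illustration, not the authors' code; for speed it uses the dynamic Polyak-type stepsize from the next slide with a known target value, and the closed-form projection assumes AA^T is invertible and small):

```python
import numpy as np

def projected_subgradient_l1(A, b, target, iters=300, lam=1.0):
    """Projected subgradient for min ||x||_1 s.t. Ax = b with
    exact projection P_F(y) = y - A^T (A A^T)^{-1} (A y - b) and
    dynamic stepsize alpha_k = lam * (f(x^k) - target) / ||h^k||^2."""
    AAT_inv = np.linalg.inv(A @ A.T)                     # small m assumed; in practice solve iteratively
    proj = lambda y: y - A.T @ (AAT_inv @ (A @ y - b))
    x = proj(np.zeros(A.shape[1]))                       # feasible starting point
    best = x.copy()
    for _ in range(iters):
        h = np.sign(x)                                   # subgradient of ||.||_1 at x
        gap = np.abs(x).sum() - target
        if gap <= 0 or not h.any():                      # at (or below) target value: stop
            break
        x = proj(x - lam * gap / (h @ h) * h)
        if np.abs(x).sum() < np.abs(best).sum():         # track best iterate
            best = x.copy()
    return best

A = np.array([[1.0, 0.0, 0.6],
              [0.0, 1.0, 0.8]])
b = A @ np.array([1.0, 0.0, 0.0])
x = projected_subgradient_l1(A, b, target=1.0)           # target = known optimal value
```

With exact projections each iterate stays feasible; ISAL1's point is that the projection may instead be computed only approximately.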
9 Infeasible-Point Subgradient Algorithm ISA
  ISA = Infeasible-Point Subgradient Algorithm
  ... works for arbitrary convex objectives and constraint sets
  ... incorporates adaptive approximate projections P_F^ε such that ‖P_F^ε(y) − P_F(y)‖_2 ≤ ε for every ε ≥ 0
  ... converges to optimality (under reasonable assumptions) whenever the projection accuracies (ε_k) are sufficiently small,
      for stepsizes α_k > 0 with Σ_{k=0}^∞ α_k = ∞, Σ_{k=0}^∞ α_k² < ∞,
      or for dynamic stepsizes α_k = λ_k (f(x^k) − ϕ̃) / ‖h^k‖_2² with target value ϕ̃ ≤ ϕ*
  ... converges to ϕ̃ with dynamic stepsizes using ϕ̃ ≥ ϕ*
10-11 ISAL1 = Specialization of ISA to l1-Minimization
  f(x) = ‖x‖_1,  F = { x : Ax = b },  sign(x) ∈ ∂‖x‖_1
  Exact projected subgradient step for (L1):
    x^{k+1} = P_F(x^k − α_k h^k) = (x^k − α_k h^k) − A^T (AA^T)^{-1} (A(x^k − α_k h^k) − b)
  Equivalently, with y^k := x^k − α_k h^k:
    solve AA^T z = A y^k − b for z^k, then set x^{k+1} := y^k − A^T z^k = P_F(y^k)
  AA^T is s.p.d. ⇒ may employ CG to solve the equation system.
12 ISAL1: Inexact Projected Subgradient Step for (L1)
  With y^k := x^k − α_k h^k, solve AA^T z ≈ A y^k − b approximately for z^k and set
    x^{k+1} := y^k − A^T z^k = P_F^{ε_k}(y^k)
  AA^T is s.p.d. ⇒ may employ CG to solve the equation system.
  Approximation: stop after a few CG iterations
  (CG residual norm ≤ σ_min(A) · ε_k ⇒ P_F^{ε_k} fits the ISA framework).
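The truncated-CG approximate projection can be sketched like this (my own illustration using scipy's CG; the choice of 3 inner iterations is an arbitrary stand-in for "a few"):

```python
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

def approx_project(A, y, b, cg_iters=3):
    """Approximate projection onto F = {x : Ax = b}: run only a few CG
    iterations on A A^T z = A y - b, then return y - A^T z.
    The CG residual controls the projection error, as the ISA framework requires."""
    m = A.shape[0]
    AAT = LinearOperator((m, m), matvec=lambda z: A @ (A.T @ z), dtype=float)
    z, _ = cg(AAT, A @ y - b, maxiter=cg_iters)      # truncated CG: inexact solve
    return y - A.T @ z

A = np.array([[1.0, 0.0, 0.6],
              [0.0, 1.0, 0.8]])
b = np.array([1.0, 0.0])
y = np.array([0.5, -0.3, 0.2])
x_approx = approx_project(A, y, b, cg_iters=2)       # here m = 2, so 2 CG steps are already exact
```

Using a `LinearOperator` avoids forming AA^T explicitly, which matters for the large sparse matrices in the testset.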
14 Comparison of l1-Solvers: Our Testset (Testset Construction and Computational Results)
  100 matrices A (74 dense, 26 sparse); dimensions m × n:
    dense:  512 × {1024, 1536, 2048, 4096},  1024 × {2048, 3072, 4096, 8192}
    sparse: 2048 × {4096, 6144, 8192, 12288},  8192 × {16384, 24576, 32768, 49152}
  random matrices (e.g., partial Hadamard, random signs, ...) and concatenations of dictionaries (e.g., [Haar, ID, RST], ...); columns normalized
  2 or 3 vectors x* per matrix such that each resulting (L1) instance (with b := Ax*) has the unique optimum x*
15 Comparison of l1-Solvers: Constructing Unique Solutions
  274 instances with known, unique solution vectors x*. For each matrix A, choose a support S which obeys the Exact Recovery Condition
    ERC(A, S) := max_{j ∉ S} ‖A_S^† a_j‖_1 < 1.
  1 Pick S at random.
  2 Try increasing some S by repeatedly adding the respective arg max.
  3 For dense A's, use L1TestPack to construct another unique solution support (via the optimality condition for (L1)).
  Entries of x*_S random with high dynamic range.
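The ERC criterion above can be evaluated directly; here is a small self-contained sketch (the toy matrix is an assumption for illustration):

```python
import numpy as np

def erc(A, S):
    """Exact Recovery Coefficient: ERC(A, S) = max_{j not in S} ||A_S^+ a_j||_1,
    where A_S^+ is the pseudoinverse of the column submatrix A_S.
    ERC(A, S) < 1 certifies that solutions supported on S are recovered by (L1)."""
    pinv_AS = np.linalg.pinv(A[:, S])
    off_support = [j for j in range(A.shape[1]) if j not in S]
    return max(np.abs(pinv_AS @ A[:, j]).sum() for j in off_support)

A = np.array([[1.0, 0.0, 0.6],
              [0.0, 1.0, 0.8]])
val = erc(A, [0])   # S = {0}: reduces to max correlation of a_0 with the other columns
```

Step 2 on the slide corresponds to adding the maximizing index j to S and re-checking whether ERC stays below 1.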
16 Comparison of l1-Solvers: Comparison Setup
  Only exact solvers for (L1): min ‖x‖_1 s.t. Ax = b
  Tested algorithms: ISAL1, SPGL1, YALL1, l1-Magic, SolveBP (SparseLab), l1-Homotopy, CPLEX (Dual Simplex)
  Use default settings (black-box usage).
  A solution x is called optimal if ‖x − x*‖_2 is below a (small) tolerance, and acceptable if it is below a (larger) tolerance.
17 Comparison of l1-Solvers: Running Time vs. Distance from Unique Optimum
  [Figure: scatter plots of ‖x − x*‖_2 vs. running times [sec] for ISAL1 (w/o HOC), SPGL1, YALL1, CPLEX, l1-Magic, SparseLab/PDCO, and Homotopy]
18 Comparison of l1-Solvers: Results: Average Performances
  CPLEX (Dual Simplex): most reliable solver
  SPGL1: apparently very fast, with usually acceptable solutions
  ISAL1: mostly very accurate, but rather slow
  SolveBP: often produces acceptable solutions, but slow
  l1-Magic: fast for sparse A, but results often unacceptable
  YALL1: very fast, but results largely unacceptable
  l1-Homotopy: usually pretty accurate (not always acceptable), but on the slow side
  Can we achieve better performance without changing the default (tolerance) settings?
19-20 Comparison of l1-Solvers: New Optimality Check for l1-Minimization
  Optimality criterion for (L1): x* ∈ arg min_{x: Ax=b} ‖x‖_1  ⇔  ∂‖x*‖_1 ∩ Im(A^T) ≠ ∅
  Exact evaluation is too expensive.
  Heuristic Optimality Check (HOC): estimate the support S of a given x and (approximately) solve A_S^T w = sign(x_S).
  If w is dual-feasible (‖A^T w‖_∞ ≤ 1), compute x̂ from A_S x̂_S = b.
  If x̂ is primal-feasible, it is optimal if b^T w = ‖x̂‖_1.
  Allows safe jumping to the optimum (from infeasible points).
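The HOC steps above can be sketched as follows (my own illustration; the relative support threshold 0.1 and the least-squares solves are assumptions, not the authors' exact implementation):

```python
import numpy as np

def heuristic_optimality_check(A, b, x, tol=1e-8):
    """Heuristic Optimality Check for min ||x||_1 s.t. Ax = b:
    estimate the support S of x, solve A_S^T w = sign(x_S) (least squares),
    and verify the primal-dual certificate. Returns the exact optimum on
    success, None otherwise."""
    S = np.abs(x) > 0.1 * np.abs(x).max()        # assumed support-estimation rule
    w, *_ = np.linalg.lstsq(A[:, S].T, np.sign(x[S]), rcond=None)
    if np.abs(A.T @ w).max() > 1 + tol:          # dual feasibility: ||A^T w||_inf <= 1
        return None
    xhat = np.zeros_like(x)
    xhat[S], *_ = np.linalg.lstsq(A[:, S], b, rcond=None)
    if np.linalg.norm(A @ xhat - b) > tol:       # primal feasibility: A xhat = b
        return None
    if abs(b @ w - np.abs(xhat).sum()) > tol:    # strong duality: b^T w = ||xhat||_1
        return None
    return xhat                                  # certified optimal: safe jump

A = np.array([[1.0, 0.0, 0.6],
              [0.0, 1.0, 0.8]])
b = np.array([1.0, 0.0])
x_iterate = np.array([0.98, 0.01, -0.005])       # inexact, infeasible iterate
x_opt = heuristic_optimality_check(A, b, x_iterate)
```

Note that the input iterate need not be feasible: only its (estimated) support matters, which is what allows the "safe jump" from infeasible points.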
21 Comparison of l1-Solvers: Impact of the Heuristic Optimality Check (Example)
  [Figure: HOC in ISAL1, HOC in SPGL1, HOC in l1-Magic, plotted over iteration k; red curves: distance to the known optimum, blue curves: feasibility violation]
22-23 Comparison of l1-Solvers: Improvements with HOC (numbers)
                           ISAL1     SPGL1   YALL1   l1-Mag.  l1-Hom.
  solved faster w/ HOC     184/212   /0      /0      /0       161/181
  solved only w/ HOC
  improved (ERC-based)     99.5%     88.0%   6.5%    78.0%    91.0%
  improved (other part)    36.5%     20.3%   0.0%    18.9%    74.3%
  Explanation for the higher HOC success rate on the ERC-based testset:
    ERC ⇒ w = (A_S^T)^† sign(x*_S) satisfies the dual-feasibility condition ‖A^T w‖_∞ ≤ 1.
24 Comparison of l1-Solvers: Improvements with HOC (pictures)
  [Figure: scatter plots of ‖x − x*‖_2 vs. running times [sec] for ISAL1, SPGL1, YALL1, l1-Magic, and l1-Homotopy, without HOC (top row) and with HOC (bottom row)]
25-26 Comparison of l1-Solvers: Rehabilitation of l1-Homotopy
  The Homotopy method provably solves (L1) via a sequence of problems
    min (1/2) ‖Ax − b‖_2² + λ ‖x‖_1
  for parameters λ ≥ 0 decreasing to zero. Also: provably fast for sufficiently sparse solutions.
  Not fast, inaccurate?!
  [Figure: ‖x − x*‖_2 vs. running times [sec] with final λ = 0 (numerical issues?) and with final λ = 10^{-9} (winner!)]
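To illustrate the λ ↓ 0 idea, here is a crude continuation sketch (my own illustration: it approximates each subproblem with ISTA/proximal-gradient steps rather than tracking the exact piecewise-linear Homotopy path, and the λ schedule is an arbitrary assumption):

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def l1_continuation(A, b, lambdas=(1.0, 0.1, 0.01, 1e-4), inner=300):
    """Approximately solve min 0.5 ||Ax - b||^2 + lam ||x||_1 by ISTA,
    warm-starting along a decreasing sequence of lam values; as lam -> 0
    the solution approaches the basis pursuit optimum."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of grad 0.5||Ax - b||^2
    x = np.zeros(A.shape[1])
    for lam in lambdas:
        for _ in range(inner):
            x = soft_threshold(x - A.T @ (A @ x - b) / L, lam / L)
    return x

A = np.array([[1.0, 0.0, 0.6],
              [0.0, 1.0, 0.8]])
b = np.array([1.0, 0.0])
x = l1_continuation(A, b)                    # small final lam: close to x* = (1, 0, 0)
```

The slide's point carries over even to this crude scheme: stopping at a small positive final λ (here 1e-4) gives a slightly biased but numerically stable solution, whereas the exact λ = 0 limit is where the actual Homotopy implementation ran into numerical issues.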
27 Possible Future Research
  ISAL1 with variable target values?
  HOC helpful in the Approximate Homotopy Path algorithm?
  Extensions to denoising problems? P_F^ε for, e.g., F = { x : ‖Ax − b‖_2 ≤ δ }? HOC schemes?
  Testsets, solver comparisons (also for implicit matrices), ...?
  Really hard instances? ... e.g., the construction of Mairal & Yu, for which the (exact) Homotopy path has O(3^n) kinks?
28 SPEAR Project
  ISA theory: Lorenz, Pfetsch & Tillmann: An Infeasible-Point Subgradient Method Using Adaptive Approximate Projections, 2011.
  ISAL1 theory, HOC & numerical results: Lorenz, Pfetsch & Tillmann: Solving Basis Pursuit: Subgradient Algorithm, Heuristic Optimality Check, and Solver Comparison, 2011/2012.
  Papers, Matlab codes (ISAL1, HOC, L1TestPack), testset, slides, posters etc. available at:
Approximation Algorithms Group Members: 1. Geng Xue (A0095628R) 2. Cai Jingli (A0095623B) 3. Xing Zhe (A0095644W) 4. Zhu Xiaolu (A0109657W) 5. Wang Zixiao (A0095670X) 6. Jiao Qing (A0095637R) 7. Zhang
More informationSymbolic Subdifferentiation in Python
Symbolic Subdifferentiation in Python Maurizio Caló and Jaehyun Park EE 364B Project Final Report Stanford University, Spring 2010-11 June 2, 2011 1 Introduction 1.1 Subgradient-PY We implemented a Python
More informationSupport Vector Machines.
Support Vector Machines srihari@buffalo.edu SVM Discussion Overview 1. Overview of SVMs 2. Margin Geometry 3. SVM Optimization 4. Overlapping Distributions 5. Relationship to Logistic Regression 6. Dealing
More informationComparison of Interior Point Filter Line Search Strategies for Constrained Optimization by Performance Profiles
INTERNATIONAL JOURNAL OF MATHEMATICS MODELS AND METHODS IN APPLIED SCIENCES Comparison of Interior Point Filter Line Search Strategies for Constrained Optimization by Performance Profiles M. Fernanda P.
More information25. NLP algorithms. ˆ Overview. ˆ Local methods. ˆ Constrained optimization. ˆ Global methods. ˆ Black-box methods.
CS/ECE/ISyE 524 Introduction to Optimization Spring 2017 18 25. NLP algorithms ˆ Overview ˆ Local methods ˆ Constrained optimization ˆ Global methods ˆ Black-box methods ˆ Course wrap-up Laurent Lessard
More informationParallel and Distributed Sparse Optimization Algorithms
Parallel and Distributed Sparse Optimization Algorithms Part I Ruoyu Li 1 1 Department of Computer Science and Engineering University of Texas at Arlington March 19, 2015 Ruoyu Li (UTA) Parallel and Distributed
More informationOn the regularizing power of multigrid-type algorithms
On the regularizing power of multigrid-type algorithms Marco Donatelli Dipartimento di Fisica e Matematica Università dell Insubria www.mat.unimi.it/user/donatel Outline 1 Restoration of blurred and noisy
More informationCombinatorial meets Convex Leveraging combinatorial structure to obtain faster algorithms, provably
Combinatorial meets Convex Leveraging combinatorial structure to obtain faster algorithms, provably Lorenzo Orecchia Department of Computer Science 6/30/15 Challenges in Optimization for Data Science 1
More informationData-driven modeling: A low-rank approximation problem
1 / 34 Data-driven modeling: A low-rank approximation problem Ivan Markovsky Vrije Universiteit Brussel 2 / 34 Outline Setup: data-driven modeling Problems: system identification, machine learning,...
More informationFundamentals of Integer Programming
Fundamentals of Integer Programming Di Yuan Department of Information Technology, Uppsala University January 2018 Outline Definition of integer programming Formulating some classical problems with integer
More informationRECONSTRUCTION ALGORITHMS FOR COMPRESSIVE VIDEO SENSING USING BASIS PURSUIT
RECONSTRUCTION ALGORITHMS FOR COMPRESSIVE VIDEO SENSING USING BASIS PURSUIT Ida Wahidah 1, Andriyan Bayu Suksmono 1 1 School of Electrical Engineering and Informatics, Institut Teknologi Bandung Jl. Ganesa
More information(Sparse) Linear Solvers
(Sparse) Linear Solvers Ax = B Why? Many geometry processing applications boil down to: solve one or more linear systems Parameterization Editing Reconstruction Fairing Morphing 2 Don t you just invert
More informationA parallel direct/iterative solver based on a Schur complement approach
A parallel direct/iterative solver based on a Schur complement approach Gene around the world at CERFACS Jérémie Gaidamour LaBRI and INRIA Bordeaux - Sud-Ouest (ScAlApplix project) February 29th, 2008
More informationPenalty Alternating Direction Methods for Mixed- Integer Optimization: A New View on Feasibility Pumps
Penalty Alternating Direction Methods for Mixed- Integer Optimization: A New View on Feasibility Pumps Björn Geißler, Antonio Morsi, Lars Schewe, Martin Schmidt FAU Erlangen-Nürnberg, Discrete Optimization
More informationRecent Developments in Model-based Derivative-free Optimization
Recent Developments in Model-based Derivative-free Optimization Seppo Pulkkinen April 23, 2010 Introduction Problem definition The problem we are considering is a nonlinear optimization problem with constraints:
More informationBenders Decomposition
Benders Decomposition Using projections to solve problems thst@man.dtu.dk DTU-Management Technical University of Denmark 1 Outline Introduction Using projections Benders decomposition Simple plant location
More informationCyber Security Analysis of State Estimators in Electric Power Systems
Cyber Security Analysis of State Estimators in Electric Power Systems H. Sandberg, G. Dán, A. Teixeira, K. C. Sou, O. Vukovic, K. H. Johansson ACCESS Linnaeus Center KTH Royal Institute of Technology,
More information