PIPA: A New Proximal Interior Point Algorithm for Large-Scale Convex Optimization


Slide 1: PIPA: A New Proximal Interior Point Algorithm for Large-Scale Convex Optimization
Marie-Caroline Corbineau (1), Emilie Chouzenoux (1,2), Jean-Christophe Pesquet (1)
(1) CVN, CentraleSupélec, Université Paris-Saclay, France
(2) LIGM, UMR CNRS 8049, Université Paris-Est Marne-la-Vallée, France
19 April 2018, Calgary, ICASSP 2018

Slide 2: In collaboration with E. Chouzenoux and J.-C. Pesquet.

Slides 3-6: About IPMs - Interior Point Methods

Many problems in signal/image processing (image restoration, enhancement, denoising/deblurring, spectral unmixing) can be formulated as constrained minimization problems, so efficient methods are needed for solving them.

Constrained Problem $\mathcal{P}_0$:
$$\underset{x \in \mathbb{R}^n}{\text{minimize}}\ f(x) \quad \text{s.t.} \quad (\forall i \in \{1,\dots,p\})\ c_i(x) \leq 0$$
where $f : \mathbb{R}^n \to\ ]-\infty,+\infty]$ is convex, and each $c_i : \mathbb{R}^n \to\ ]-\infty,+\infty]$ is convex and smooth.

How to minimize f while ensuring that every iterate is feasible? Add a barrier function.

Slides 7-9: About IPMs - Logarithmic Barrier

The constrained problem $\mathcal{P}_0$ is replaced by the unconstrained subproblem $\mathcal{P}_\mu$:
$$\underset{x \in \mathbb{R}^n}{\text{minimize}}\ f(x) - \mu \sum_{i=1}^{p} \ln(-c_i(x))$$
where $\mu > 0$ is the barrier parameter; the barrier term tends to $+\infty$ as $c_i(x) \to 0^-$.

$\mathcal{P}_0$ is replaced by a sequence of subproblems $(\mathcal{P}_{\mu_j})_{j \in \mathbb{N}}$, solved approximately for a sequence $\mu_j \to 0$.
Main advantage: every iterate is feasible.
Primal-dual algorithms achieve superlinear convergence for nonlinear programming [Gould et al., 2001].
However, they require the inversion of an $n \times n$ matrix at each step, which restricts them to medium-size applications, and first- or second-order methods are limited to smooth functions [Armand et al., 2000].
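To make the barrier mechanism concrete, here is a minimal sketch (not the authors' code) of evaluating the objective of $\mathcal{P}_\mu$; the callables `f` and `constraints` are illustrative stand-ins:

```python
import numpy as np

def barrier_objective(x, f, constraints, mu):
    """Evaluate the objective of P_mu: f(x) - mu * sum_i ln(-c_i(x)).

    Returns +inf outside the strict interior of the feasible set, so a
    descent-based inner solver never leaves it."""
    c = np.array([c_i(x) for c_i in constraints])
    if np.any(c >= 0):  # on or beyond the boundary: barrier is +inf
        return np.inf
    return f(x) - mu * np.sum(np.log(-c))

# Example: minimize x^2 subject to 1 - x <= 0 (i.e. x >= 1)
val = barrier_objective(np.array([2.0]),
                        lambda x: float(x @ x),
                        [lambda x: 1.0 - x[0]],
                        mu=0.1)
```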

Slides 10-12: About IPMs - Problem of Interest

The quality of the solution and its robustness against noise can be improved by adding a non-differentiable term ($\ell_1$, TV, ...).

Composite Constrained Problem $\mathcal{P}_0$:
$$\underset{x \in \mathbb{R}^n}{\text{minimize}}\ f(x) + g(x) \quad \text{s.t.} \quad (\forall i \in \{1,\dots,p\})\ c_i(x) \leq 0$$
where $f : \mathbb{R}^n \to\ ]-\infty,+\infty]$ is convex and non-differentiable, $g : \mathbb{R}^n \to\ ]-\infty,+\infty]$ is convex and smooth, and each $c_i : \mathbb{R}^n \to\ ]-\infty,+\infty]$ is convex and smooth.

How to address the non-smooth term while ensuring that every iterate is feasible? Combine the logarithmic barrier method with proximal tools.

Slide 13: Notation and Definitions

Let $\mathcal{S}_+(\mathbb{R}^N)$ be the set of symmetric positive definite matrices of $\mathbb{R}^{N \times N}$. The weighted norm induced by $U \in \mathcal{S}_+(\mathbb{R}^N)$ is $\|x\|_U = \sqrt{x^\top U x}$. Let $\Gamma_0(\mathbb{R}^N)$ denote the set of proper lower semicontinuous convex functions from $\mathbb{R}^N$ to $]-\infty,+\infty]$.

Proximity Operator
The proximity operator (a) $\operatorname{prox}_f^U(x)$ of $f \in \Gamma_0(\mathbb{R}^N)$ at $x \in \mathbb{R}^N$, relative to the metric induced by $U \in \mathcal{S}_+(\mathbb{R}^N)$, is the unique vector $\hat{y} \in \mathbb{R}^N$ such that
$$f(\hat{y}) + \tfrac{1}{2}\|\hat{y} - x\|_U^2 = \inf_{y \in \mathbb{R}^N} f(y) + \tfrac{1}{2}\langle y - x, U(y - x)\rangle.$$
(a) http://proximity-operator.net/

Examples: for an indicator function, the proximity operator is a projection; for the $\ell_1$ norm, it is soft-thresholding.
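Both examples above admit closed forms. A minimal sketch in the Euclidean metric ($U = \mathrm{Id}$), with illustrative function names:

```python
import numpy as np

def prox_l1(x, gamma):
    """Proximity operator of gamma * ||.||_1 for U = Id:
    componentwise soft-thresholding."""
    return np.sign(x) * np.maximum(np.abs(x) - gamma, 0.0)

def prox_indicator_nonneg(x):
    """Proximity operator of the indicator of the nonnegative orthant:
    the projection onto it."""
    return np.maximum(x, 0.0)
```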

Slide 14: Proposed Approach

$\mathcal{P}_0$ is replaced by a sequence of subproblems $(\mathcal{P}_{\mu_j})_{j \in \mathbb{N}}$:
$$\underset{x \in \mathbb{R}^n}{\text{minimize}}\ f(x) + \underbrace{g(x) - \mu_j \sum_{i=1}^{p} \ln(-c_i(x))}_{\text{smooth}}$$

Our algorithm comprises two interlocked loops:
Given $\mu_j > 0$, $(x_{j,k})_k$ is produced via several forward-backward (proximal gradient) steps.
Once $x_{j,k}$ is close enough to the solution of $\mathcal{P}_{\mu_j}$, the barrier parameter $\mu_j$ is updated.

Slides 15-19: Algorithm - Iteration Scheme

Forward-Backward Step. For fixed $j$,
$$x_{j,k+1} = \operatorname{prox}^{A_{j,k}}_{\gamma_{j,k} f}\big(x_{j,k} - \gamma_{j,k} A_{j,k}^{-1} \nabla\varphi_{\mu_j}(x_{j,k})\big)$$
where $\varphi_{\mu_j}(x) = g(x) - \mu_j \sum_{i=1}^{p} \ln(-c_i(x))$.

Gradient step on the smooth term;
Proximal step on the non-differentiable function f;
Preconditioner $A_{j,k}$ for acceleration [Chouzenoux et al., 2016];
Stepsize $\gamma_{j,k} > 0$ found using a backtracking strategy [Salzo, 2017], since $\varphi_{\mu_j}$ is not Lipschitz-differentiable.
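A minimal sketch of one such step with a diagonal preconditioner, so that $A^{-1}$ is cheap and, for a separable f, the prox in the metric A stays componentwise; all names are illustrative, not the reference implementation:

```python
import numpy as np

def fb_step(x, grad_phi, prox_f_metric, A_diag, gamma):
    """One preconditioned forward-backward step for fixed j:
    x+ = prox^{A}_{gamma f}( x - gamma * A^{-1} grad(phi_mu)(x) )."""
    forward = x - gamma * grad_phi(x) / A_diag    # gradient step on the smooth term
    return prox_f_metric(forward, gamma, A_diag)  # proximal step on f

def prox_l1_metric(y, gamma, A_diag, kappa=1.0):
    """Prox of gamma*kappa*||.||_1 in a diagonal metric A:
    soft-thresholding with componentwise thresholds gamma*kappa/A_ii."""
    t = gamma * kappa / A_diag
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)
```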

Slide 20: Proximal Interior Point Algorithm (PIPA)

Initialization: let $\gamma > 0$, $(\delta, \theta) \in\ ]0,1[^2$, $\mu_0 > 0$; initialize $x_{0,0}$ such that $(\forall i \in \{1,\dots,p\})\ c_i(x_{0,0}) < 0$.
For $j = 0, 1, \dots$
    For $k = 0, 1, \dots$
        Choose $A_{j,k}$ satisfying a boundedness condition;
        For $l = 0, 1, \dots$
            $\tilde{x}^l_{j,k} = \operatorname{prox}^{A_{j,k}}_{\gamma\theta^l f}\big(x_{j,k} - \gamma\theta^l A_{j,k}^{-1} \nabla\varphi_{\mu_j}(x_{j,k})\big)$;
            stop if the backtracking condition is met;
        $x_{j,k+1} = \tilde{x}^l_{j,k}$; $\gamma_{j,k} = \gamma\theta^l$;
        stop if precision conditions are met;
    $x_{j+1,0} = x_{j,k+1}$; update $\mu_j$.
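The following Python skeleton mirrors the two interlocked loops with a diagonal metric. It is a sketch under simplifying assumptions: the inner stopping test and the tolerance schedule are placeholders for the paper's four precision criteria, and all callables are hypothetical.

```python
import numpy as np

def pipa_sketch(x0, phi, grad_phi, prox_f_metric, choose_metric,
                mu0=1.0, rho=0.5, gamma_bar=1.0, delta=0.99, theta=0.5,
                n_outer=10, n_inner=50):
    """Skeleton of PIPA's nested loops (illustrative only)."""
    x, mu = x0, mu0
    for j in range(n_outer):
        eps = 0.1 * mu**2                       # assumed schedule: eps_j/mu_j -> 0
        for k in range(n_inner):
            A = choose_metric(x, mu)            # bounded diagonal preconditioner
            g = grad_phi(x, mu)
            for l in range(50):                 # backtracking on the stepsize
                gamma = gamma_bar * theta**l
                xt = prox_f_metric(x - gamma * g / A, gamma, A)
                d = xt - x
                if phi(xt, mu) - phi(x, mu) - g @ d <= (delta / gamma) * (A * d) @ d:
                    break                       # backtracking condition met
            done = np.linalg.norm(xt - x) <= eps  # crude stand-in for the criteria
            x = xt
            if done:
                break
        mu *= rho                               # geometric decrease of mu_j
    return x
```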

Slide 21: Theoretical Results

Assumptions:
The set of solutions to $\mathcal{P}_0$ is nonempty and bounded;
$f$, $g$ and the constraints are convex, $g$ is Lipschitz-differentiable, and the constraints are twice continuously differentiable;
The strict interior of the feasible set is nonempty;
$(\forall j \in \mathbb{N})$ $f + \varphi_{\mu_j}$ is a Kurdyka-Lojasiewicz (KL) function;
$(\forall j \in \mathbb{N})$ the metrics $(A_{j,k})_k$ are bounded from above and from below;
$\lim_{j\to\infty} \mu_j = 0$ and $(\forall i \in \{1,\dots,4\})\ \lim_{j\to\infty} \varepsilon_{i,j}/\mu_j = 0$.

Convergence. Under some mild technical assumptions:
for all $j \in \mathbb{N}$, $(x_{j,k})_{k \in \mathbb{N}}$ converges to a solution to $\mathcal{P}_{\mu_j}$;
$(x_{j,0})_{j \in \mathbb{N}}$ is bounded and every cluster point of it is a solution to $\mathcal{P}_0$;
if in addition strict complementarity holds, and if there exists $i \in \{1,\dots,p\}$ such that $c_i$ is strictly convex (or, alternatively, for linear constraints, if some full-rank property is satisfied), then $(x_{j,0})_{j \in \mathbb{N}}$ converges to a solution to $\mathcal{P}_0$.

Slide 22: Hyperspectral Unmixing Problem

Slide 23: Hyperspectral Unmixing Model

Optimization Problem:
$$\underset{X \in \mathbb{R}^{p \times n}}{\text{minimize}}\ \frac{1}{2}\|Y - SX\|_F^2 + \kappa \sum_{i=1}^{p} \|(W X_i)_{\mathrm{d}}\|_1$$
$$\text{s.t.}\ (\forall j \in \{1,\dots,n\})\ \sum_{i=1}^{p} X_{i,j} \leq 1, \qquad (\forall i \in \{1,\dots,p\})(\forall j \in \{1,\dots,n\})\ X_{i,j} \geq 0$$

p, n, s: number of endmembers, pixels, spectral bands
$Y \in \mathbb{R}^{s \times n}$: observation
$S \in \mathbb{R}^{s \times p}$: library
$X \in \mathbb{R}^{p \times n}$: abundance matrix
$W \in \mathbb{R}^{n \times n}$: orthogonal wavelet basis
$\|(\cdot)_{\mathrm{d}}\|_1$: $\ell_1$ norm of the detail wavelet coefficients
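A sketch of evaluating this objective and checking feasibility; `W` and `d_mask` stand in for the orthogonal wavelet matrix and the detail-coefficient selector, and all names are illustrative:

```python
import numpy as np

def unmixing_objective(X, Y, S, W, d_mask, kappa):
    """Least-squares data term plus l1 penalty on the detail wavelet
    coefficients of each abundance map (row of X)."""
    data_term = 0.5 * np.linalg.norm(Y - S @ X, 'fro')**2
    coeffs = X @ W.T                 # wavelet coefficients of each row X_i
    reg = kappa * np.abs(coeffs[:, d_mask]).sum()
    return data_term + reg

def feasible(X):
    """Nonnegative abundances summing to at most 1 at each pixel."""
    return bool(np.all(X >= 0) and np.all(X.sum(axis=0) <= 1))
```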

Slides 24-28: Experimental Setting

Realistic Data Simulation:
Urban (1) data set: p = 6 endmembers (known spectral signatures), s = 162 spectral bands, n = … pixels; a large-scale problem with more than … variables.
Gaussian noise with variance σ² = …
Endmembers: Asphalt Road, Grass, Tree, Roof, Metal, Dirt.

Reconstruction Model:
Regularization weight: κ = …
W: orthogonal Daubechies 4 wavelet decomposition over 2 resolution levels.

Algorithm Parameters:
Variable metric: $A_{j,k} := \nabla^2 \varphi_{\mu_j}(x_{j,k})$ [Becker et al., 2012].
The barrier parameter $(\mu_j)_{j \in \mathbb{N}}$ and the stopping criteria $\{\varepsilon_{i,j}/\mu_j\}_{i \in \{1,\dots,4\}}$ follow a geometric decrease.

Implementation:
Matlab R2016b, Intel Xeon 3.2 GHz processor and 16 GB of RAM. Code will be made available online soon.

(1) http://…
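One hedged way to realize such geometric schedules; the constants below are placeholders, not the paper's values:

```python
# mu_j decreases geometrically; eps_{i,j}/mu_j itself also decreases
# geometrically, which guarantees eps_{i,j}/mu_j -> 0 as required.
mu0, rho_mu, rho_ratio = 1.0, 0.5, 0.9   # illustrative constants
mu = [mu0 * rho_mu**j for j in range(20)]
eps = [[mu[j] * 0.1 * rho_ratio**j for j in range(20)] for _ in range(4)]
```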

Slide 29: Comparison with State-of-the-Art Algorithms

No reg: interior-point least-squares algorithm without regularization [Chouzenoux et al., 2014]
ADMM: alternating direction method of multipliers [Setzer et al., 2010]
PDS: primal-dual splitting algorithm [Combettes et al., 2014]
GFBS: generalized forward-backward splitting algorithm [Raguet et al., 2013]

Slide 30: Evaluation Metrics

Signal-to-Noise Ratio:
$$\mathrm{SNR} = 10 \log_{10}\left(\frac{\sum_{i=1}^{p} \|\overline{X}_i\|_2^2}{\sum_{i=1}^{p} \|\overline{X}_i - X_i\|_2^2}\right), \qquad \mathrm{SNR}_i = 10 \log_{10}\left(\frac{\|\overline{X}_i\|_2^2}{\|\overline{X}_i - X_i\|_2^2}\right)$$
where $\overline{X}_i$ is the true abundance map of the $i$-th endmember.

Distance from the Iterates to the Solution:
$$\frac{\|x_{j,k} - x_\infty\|_2}{\|x_\infty\|_2}$$
where $x$ is the vectorization of $X$ and $x_\infty$ is obtained after a very large number of iterations.
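Assuming the formulas as reconstructed above, the metrics compute as follows (sketch):

```python
import numpy as np

def global_snr_db(X_true, X_est):
    """Global SNR (dB) over all abundance maps (rows of X)."""
    return 10.0 * np.log10(np.sum(X_true**2) / np.sum((X_true - X_est)**2))

def endmember_snr_db(X_true, X_est):
    """Per-endmember SNR_i (dB), one value per row of X."""
    num = np.sum(X_true**2, axis=1)
    den = np.sum((X_true - X_est)**2, axis=1)
    return 10.0 * np.log10(num / den)

def relative_distance(x_jk, x_inf):
    """Relative distance of an iterate to the numerically obtained solution."""
    return np.linalg.norm(x_jk - x_inf) / np.linalg.norm(x_inf)
```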

Slide 31: Quantitative Results

No reg: SNR = 1.96 dB / with regularization: SNR = 3.65 dB.
[Figure. Left: global SNR versus time. Right: distance from the iterates to the solution versus time.]
[Table: SNR (dB) for all endmembers (Asphalt Road, Grass, Tree, Roof, Metal, Dirt) after 13 sec, for No reg, ADMM, PDS, GFBS and PIPA.]

Slide 32: Visual Results - Asphalt Road

Table: SNR (dB) after 13 sec. ADMM: 6.75, PDS: 2.06, GFBS: 2.17 (No reg and PIPA values missing).
[Figure: abundance maps of asphalt road after 13 sec: ground truth, No reg, ADMM, PDS, GFBS, PIPA.]

Slide 33: Visual Results - Grass

Table: SNR (dB) after 13 sec. PDS: 3.33, GFBS: 3.57 (No reg, ADMM and PIPA values missing).
[Figure: abundance maps of grass after 13 sec: ground truth, No reg, ADMM, PDS, GFBS, PIPA.]

Slide 34: Visual Results - Dirt

Table: SNR (dB) after 13 sec (values missing).
[Figure: abundance maps of dirt after 13 sec: ground truth, No reg, ADMM, PDS, GFBS, PIPA.]

Slide 35: Conclusion

Application of a new proximal interior point algorithm to hyperspectral unmixing with a non-differentiable regularization.
Convergence guaranteed under mild assumptions.
Possibility to include an arbitrary preconditioner.
Good performance in the context of a large-scale image recovery application.

Future work: extension of the convergence proof to inexact proximity operators; other applications.

Slide 36: References

N. I. M. Gould, D. Orban, A. Sartenaer and P. L. Toint. Superlinear convergence of primal-dual interior point algorithms for nonlinear programming. SIAM Journal on Optimization, Vol. 11, No. 4, 2001.
P. Armand, J. C. Gilbert and S. Jan-Jégou. A feasible BFGS interior point algorithm for solving convex minimization problems. SIAM Journal on Optimization, Vol. 11, No. 1, 2000.
E. Chouzenoux, J.-C. Pesquet and A. Repetti. A block coordinate variable metric forward-backward algorithm. Journal of Global Optimization, Vol. 66, No. 3, 2016.
S. Salzo. The variable metric forward-backward splitting algorithm under mild differentiability assumptions. SIAM Journal on Optimization, Vol. 27, No. 4, 2017.
S. Setzer, G. Steidl and T. Teuber. Deblurring Poissonian images by split Bregman techniques. Journal of Visual Communication and Image Representation, Vol. 21, No. 3, 2010.
E. Chouzenoux, M. Legendre, S. Moussaoui and J. Idier. Fast constrained least squares spectral unmixing using primal-dual interior-point optimization. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, Vol. 7, No. 1, pp. 59-69, 2014.
P. L. Combettes, L. Condat, J.-C. Pesquet and B. C. Vũ. A forward-backward view of some primal-dual optimization methods in image recovery. Proceedings of the IEEE International Conference on Image Processing (ICIP 2014), 2014.
H. Raguet, J. Fadili and G. Peyré. A generalized forward-backward splitting. SIAM Journal on Imaging Sciences, Vol. 6, No. 3, 2013.
S. Becker and J. Fadili. A quasi-Newton proximal splitting method. Proceedings of the 25th Advances in Neural Information Processing Systems Conference (NIPS 2012), 2012.

Slide 37: Thank you!

Slides 38-39: Stopping Criteria

Backtracking [Salzo, 2017]. For fixed $j$ and $k$, the backtracking procedure stops as soon as
$$\varphi_{\mu_j}(\tilde{x}^l_{j,k}) \leq \varphi_{\mu_j}(x_{j,k}) + \langle \tilde{x}^l_{j,k} - x_{j,k}, \nabla\varphi_{\mu_j}(x_{j,k}) \rangle + \frac{\delta}{\gamma\theta^l}\|\tilde{x}^l_{j,k} - x_{j,k}\|_{A_{j,k}}^2.$$
If $f := 0$, this reduces to an Armijo linesearch along the steepest-descent direction.

Accuracy for Solving $\mathcal{P}_{\mu_j}$. The barrier parameter is decreased as soon as the following criteria are met:
$$\|x_{j,k} - x_{j,k+1}\| \leq \varepsilon_{1,j}, \qquad \frac{1}{\gamma_{j,k}}\left\|A_{j,k}(x_{j,k} - x_{j,k+1})\right\| \leq \varepsilon_{2,j},$$
$$\sum_{i=1}^{p} \frac{c_i(x_{j,k+1})}{c_i(x_{j,k})} \leq p + \frac{\varepsilon_{3,j}}{\mu_j}, \qquad \sum_{i=1}^{p} \frac{c_i(x_{j,k})}{c_i(x_{j,k+1})} \leq p + \frac{\varepsilon_{4,j}}{\mu_j},$$
where $\{(\varepsilon_{i,j})_{j \in \mathbb{N}}\}_{i \in \{1,\dots,4\}}$ and $(\mu_j)_{j \in \mathbb{N}}$ are strictly positive sequences converging to 0 such that $(\forall i \in \{1,\dots,4\})\ \lim_{j\to\infty} \varepsilon_{i,j}/\mu_j = 0$.
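A sketch of testing these criteria for a diagonal metric; the shapes of the third and fourth conditions follow the reconstruction above and should be treated as assumptions:

```python
import numpy as np

def subproblem_solved(x, x_next, A_diag, gamma, c, c_next, eps, mu):
    """Check the four accuracy criteria for P_mu. `c` and `c_next` hold
    the (strictly negative) constraint values at x and x_next; `eps` is
    the length-4 tolerance vector for the current outer index j."""
    p = len(c)
    ok1 = np.linalg.norm(x - x_next) <= eps[0]
    ok2 = np.linalg.norm(A_diag * (x - x_next)) / gamma <= eps[1]
    ok3 = np.sum(c_next / c) <= p + eps[2] / mu   # assumed form
    ok4 = np.sum(c / c_next) <= p + eps[3] / mu   # assumed form
    return ok1 and ok2 and ok3 and ok4
```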

Slide 40: Visual Results - Tree

Table: SNR (dB) after 13 sec. PDS: 4.73, GFBS: 4.76 (No reg, ADMM and PIPA values missing).
[Figure: abundance maps of tree after 13 sec: ground truth, No reg, ADMM, PDS, GFBS, PIPA.]

Slide 41: Visual Results - Roof

Table: SNR (dB) after 13 sec. PDS: 6.63, GFBS: 7.66 (No reg, ADMM and PIPA values missing).
[Figure: abundance maps of roof after 13 sec: ground truth, No reg, ADMM, PDS, GFBS, PIPA.]

Slide 42: Visual Results - Metal

Table: SNR (dB) after 13 sec. No reg: 4.90, ADMM: 7.57, GFBS: 0.05, PIPA: 7.06 (PDS value missing).
[Figure: abundance maps of metal after 13 sec: ground truth, No reg, ADMM, PDS, GFBS, PIPA.]
