2. (a) See attached code for npls2_sps.m.

(b) See attached code and plot. Your MSE should be about 7.57 (for NPLS) and 5.11 (for NPLS2); the numbers may differ a little, but they should be close to these values.

(c) See the subplots. MSE_NPLS = [22.4768 7.5772 8.8435 10.1979 10.6519] and MSE_NPLS2 = [10.4248 5.1109 4.5064 4.3954 4.3702] for beta = [4 16 64 256 1024]. The MSE values may vary a little from these numbers. For NPLS, the minimum MSE occurs at beta = 2^4, while for NPLS2 the minimum MSE occurs at beta = 2^10.

(d) NPLS2 (the modified approach) uses the 2nd-order difference as the penalty function. The noiseless signal itself is a straight line, which incurs no penalty under the 2nd-order difference, whereas it incurs some penalty under the 1st-order difference. For this class of signals, the 2nd-order penalty models the signal better, so NPLS2 performs better than the NPLS method.

function xx = npls2_sps(yy, niter, beta, delta)
%function xx = npls2_sps(yy, niter, beta, delta)
% Nonquadratic penalized least-squares de-noising of an image yy
% using the separable paraboloidal surrogates (SPS) algorithm.
%   yy      image to be "de-noised"
%   niter   # of iterations
%   beta    roughness penalty regularization parameter
%   delta   roughness penalty parameter
if nargin < 2, niter = 80; end
if nargin < 3, beta = 2^4; end   % defaults appropriate for HW problem
if nargin < 4, delta = 0.5; end

[nx,ny] = size(yy);
C = buildc(nx,ny);   % create penalty matrix for 2nd-order differences

% \omega "curvature" function for Lange3 penalty
wt = inline(sprintf('1 ./ sqrt(1 + abs(x / %19.18e).^2)', delta));

denom = 1 + beta * abs(C)' * abs(C) * ones(nx*ny,1);
xx = yy(:);   % initial guess: the noisy image, as a vector
for ii = 1:niter
    Cx = C * xx;
    xx = xx + (yy(:) - xx - beta * (C' * (wt(Cx) .* Cx))) ./ denom;
end
xx = reshape(xx, size(yy));   % turn vector back into an image
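As a quick numerical illustration of the argument in part (d) (not part of the required solution), the sketch below compares the 1st- and 2nd-order difference roughness of a noiseless linear ramp, using a simple quadratic roughness measure in place of the Lange3 penalty; the conclusion (zero vs. nonzero) is the same because both penalties vanish only where the differences are zero. The variable names x, d1, d2 are illustrative only.

% Sketch: a linear ramp has zero 2nd-order difference roughness but a
% nonzero 1st-order difference roughness (quadratic measure used here).
x = linspace(0, 200, 64)';   % noiseless ramp profile (illustrative)
d1 = diff(x);                % 1st-order differences x[n+1] - x[n]
d2 = diff(x, 2);             % 2nd-order differences x[n+2] - 2x[n+1] + x[n]
fprintf('1st-order roughness = %g, 2nd-order roughness = %g\n', ...
    sum(d1.^2), sum(d2.^2))  % second value is zero up to round-off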

%
% Build a sparse matrix that computes second-order differences
% among horizontal and vertical neighboring pixels.
%
function C = buildc(nx, ny)
i = 1:(nx-2); j = 2:(nx-1);       % row and column indices
i = [i i i]; j = [j-1 j j+1];
s = ones(nx-2,1) * [-1 2 -1];     % make nonzero entries
Cx = sparse(i, j, s);             % matrix rows are [0 ... 0 -1 2 -1 0 ... 0]

i = 1:(ny-2); j = 2:(ny-1);
i = [i i i]; j = [j-1 j j+1];
s = ones(ny-2,1) * [-1 2 -1];     % make nonzero entries
Cy = sparse(i, j, s);             % matrix rows are [0 ... 0 -1 2 -1 0 ... 0]

% apply Cx down each column of the (vectorized) image and Cy across each row,
% and combine the vertical and horizontal penalties
C = [kron(speye(ny), Cx); kron(Cy, speye(nx))];

% hw9 prob2: apply two versions of NPLS denoising
nx = 64; ny = 50;
xtrue = zeros(nx,ny);
ix = -(nx-1)/2:(nx-1)/2;
iy = -(ny-1)/2:(ny-1)/2;
[ix, iy] = ndgrid(ix, iy);
xtrue = (1 - min(abs(ix/(nx/3)), 1)) * 200;
xtrue = (1 - min(abs(iy/(ny/3)), 1)) .* xtrue;

randn('seed', 0)   % OMITTED FROM TEMPLATE :-(
yy = xtrue + 10 * randn(size(xtrue));   % add Gaussian noise

clf, pl = 330; colormap(gray(256))
subplot(pl+1), imagesc(xtrue'), axis xy, axis image
title 'x_{true}(n,m)', colorbar horiz
subplot(pl+2), imagesc(yy'), axis xy, axis image
title 'Noisy: y(n,m)', colorbar horiz

niter = 200; delta = 1;
beta1 = 2^4; beta2 = 2^4;

xhat1 = npls_sps(yy, niter, beta1, delta);
xhat2 = npls2_sps(yy, niter, beta2, delta);

beta_list = [2^2 2^4 2^6 2^8 2^10];
mse_npls1 = zeros(1, length(beta_list));
mse_npls2 = mse_npls1;
for i = 1:length(beta_list)
    xhat_temp = npls_sps(yy, niter, beta_list(i), delta);
    mse_npls1(i) = mean2((xhat_temp - xtrue).^2);
    xhat_temp = npls2_sps(yy, niter, beta_list(i), delta);
    mse_npls2(i) = mean2((xhat_temp - xtrue).^2);
end

subplot(pl+4), imagesc(xhat1'), axis xy, axis image
xlabel n, ylabel m, title 'NPLS1', colorbar horiz
subplot(pl+5), imagesc((xhat1-xtrue)'), axis xy, axis image
xlabel n, ylabel m, colorbar horiz
title(sprintf('NPLS1 error, MSE=%g', mean2((xhat1-xtrue).^2)))
subplot(pl+7), imagesc(xhat2'), axis xy, axis image
xlabel n, ylabel m, title 'NPLS2', colorbar horiz
subplot(pl+8), imagesc((xhat2-xtrue)'), axis xy, axis image
xlabel n, ylabel m, colorbar horiz
title(sprintf('NPLS2 error, MSE=%g', mean2((xhat2-xtrue).^2)))

subplot(pl+3)
plot(1:nx, xtrue(:,ny/2), 'r:', 1:nx, xhat1(:,ny/2), 'c--', 1:nx, xhat2(:,ny/2), 'y-')
axis tight, legend('xtrue', 'npls1', 'npls2', 3), title 'Profile plot'

subplot(pl+6)
plot(1:5, mse_npls1, 'r:', 1:5, mse_npls2, 'c--')
axis tight, xlabel '0.5log_2\beta', ylabel 'MSE', legend('npls1', 'npls2', 2)
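As a sanity check on the buildc construction (not part of the required solution), the sketch below verifies that the first block of C, kron(speye(ny), Cx), applies the [-1 2 -1] operator down each column of the image. It assumes buildc is callable as its own function (or pasted into the same file); the names nxt, nyt, Ct, xt are illustrative only.

% Sketch: verify the kron structure of the penalty matrix on a small test image.
nxt = 6; nyt = 5;                        % small illustrative sizes
Ct = buildc(nxt, nyt);                   % penalty matrix from buildc above
xt = rand(nxt, nyt);                     % arbitrary test image
top = Ct(1:(nxt-2)*nyt, :) * xt(:);      % first block acting on the vectorized image
ref = -xt(1:end-2,:) + 2*xt(2:end-1,:) - xt(3:end,:);   % [-1 2 -1] down each column
max(abs(top - ref(:)))                   % should be (numerically) zero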

3. See attached code. From the estimated R_y[n,m] we see that h[n,m] must be a 3x7 rectangle, so N = 1 and M = 3. Since Var{y[n,m]} = R_y[0,0] = c^2 (h[n,m] ⋆ h[n,m])[0,0], from the peak of the estimated R_y[n,m] we compute that c ≈ 13.2 or c ≈ -13.2, since (h[n,m] ⋆ h[n,m])[0,0] = ||h[n,m]||^2 = 21. (-5 pts for a trial-and-error approach.)

% acorr_find.m
K = 128;
h = ones(2*1+1, 2*3+1);   % N = 1 and M = 3
c = 13;
randn('state', 9)
y = c * conv2(randn(K,K), h, 'same');

subplot(221), imagesc(y), axis xy, axis image
colorbar, colormap gray, title 'y[n,m]'

r = xcorr2(y) / K^2;
ii = [-(K-1):(K-1)];
subplot(222), imagesc(ii, ii, r), axis xy, axis image
colorbar, title('R_y[n,m]')
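% (Illustrative check, not part of the original solution:) with the c and h
% used to synthesize y above, the estimated autocorrelation peak r(K,K)
% should be close to the theoretical value Var{y} = c^2 * ||h||^2.
fprintf('estimated R_y[0,0] = %.1f, theoretical c^2*||h||^2 = %.1f\n', ...
    r(K,K), c^2 * sum(h(:).^2))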

ii = [-15:15];
subplot(223)
plot(ii, r(K+ii, K), '-o'), xlabel 'n', axis tight
subplot(224)
plot(ii, r(K, K+ii), '-o'), xlabel 'm', axis tight

c_hat = sqrt(r(K,K) / sum(h(:).^2))
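As an aside (not part of the graded solution), the rectangle size and |c| can also be read off programmatically from the estimated autocorrelation instead of by inspection: along the central row and column, R_y has support 2(2N+1)-1 and 2(2M+1)-1 lags. The sketch below reuses r and K from the script above; the names ii2, N_hat, M_hat, c_hat2 and the 10% threshold are illustrative choices for this noise level, not from the original solution.

% Sketch: estimate N, M, and |c| from the autocorrelation support (heuristic).
ii2 = -15:15;                      % lag window, as in the profile plots above
rn = r(K+ii2, K);                  % lags along n at m = 0
rm = r(K, K+ii2);                  % lags along m at n = 0
wn = sum(rn > 0.1 * r(K,K));       % support width along n: ideally 2(2N+1)-1
wm = sum(rm > 0.1 * r(K,K));       % support width along m: ideally 2(2M+1)-1
N_hat = (wn - 1) / 4               % expect N_hat = 1
M_hat = (wm - 1) / 4               % expect M_hat = 3
c_hat2 = sqrt(r(K,K) / ((2*N_hat+1)*(2*M_hat+1)))   % |c|, since h is all ones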