CS1114 Section: SIFT April 3, 2013


Object recognition has three basic parts: feature extraction, feature matching, and fitting a transformation. In this lab, you'll learn about SIFT feature extraction and feature matching; the transformation fitting part will be done for you. To test your functions, we provide two test images, template.png and search.jpg, as well as our own reference implementation. The function stubs and images are in the directory:

    /courses/cs1114/section/sift/

1 Feature detection

Figure 1: SIFT features detected in an image.

In this section you will be using SIFT (Scale-Invariant Feature Transform) to detect features in a template image, and match it to a search image. First, you can try this out in the provided GUI, in the directory above:

>> sift_gui

You should see an interface like this pop up:

In this interface, you load in a template image, which contains the thing you are searching for, and the search image, the image you are searching for the object in. To try this out, click "load template" and choose template.png, then click "load search" and choose search.png. Next, click "look for template in search". After a moment, a red box should outline the book in the search image. What happened behind the scenes? Our code computed SIFT features for both images, matched them, then found a transformation that maps the template image to the search image, drawing a red box accordingly. Next, try this with a slightly harder example by loading in template2.png (the Blackadder DVD cover) as the template image. SIFT is still able to do a good job on this example (SIFT is incredibly powerful).

We provide you with a function in Matlab called sift (courtesy of Andrea Vedaldi). The sift function takes in a grayscale image (in double format), and returns two matrices, a set of feature coordinate frames and a set of feature descriptors:

>> template = imread('template.png');
>> [frames, descriptors] = sift(im2double(template));

If SIFT detects n features, then frames is a 4 × n matrix, and descriptors is a 128 × n matrix. Each coordinate frame (column of the frames matrix) describes the 2D position, scale, and orientation of a feature. Each feature descriptor (column of the descriptors matrix) is a 128-dimensional vector describing the local appearance of the corresponding feature. Local features corresponding to the same scene point in different images should have similar (though not identical) feature descriptors, as we talked about in class. Figure 1 shows an example of detected SIFT features in an image. Try out the two lines above to compute SIFT features for the template.png image.
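If you want to poke at these matrices directly, the snippet below shows one way to pull out a single feature. It assumes (as described above) that each column of frames stacks the position, scale, and orientation of one feature; check the provided sift code if you rely on a particular row ordering.

>> f = frames(:, 1)        % frame of feature 1: position, scale, and orientation
>> d = descriptors(:, 1);  % its 128-dimensional descriptor
>> size(frames)            % ans should be 4-by-n for n detected features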

Next, you can plot the positions of the detected features using the plotsiftframe function. This will just show the SIFT features, so you want to use it as follows:

>> imshow(template);
>> hold on;
>> plotsiftframe(frames);

You should now see a bunch of circles showing where SIFT features were detected.

2 Feature matching

Figure 2: A pair of images we might wish to match.

Your real work begins with the second step in our object recognition pipeline: feature matching. Suppose we are given a reference template image of an object, and a second image in which we want to find the same object; an example pair of images is shown in Figure 2. To do so, we need to find pairs of features in the two images that correspond to the same point on the object; these should be features with similar descriptors. Thus, feature matching involves finding features with similar descriptors. We'll use nearest neighbor matching for this task: for each feature in image 1 (the template image), we'll find the most similar feature in image 2 (the search image), that is, the feature with the most similar descriptor.

We'll define the distance between two descriptors a and b (two 128-dimensional vectors) using the usual Euclidean distance:

    distance(a, b) = sqrt( sum_{i=1}^{128} (a_i - b_i)^2 )

Assuming a and b are column vectors, this formula can be written using matrix operations as:

    distance(a, b) = sqrt( (a - b)^T (a - b) )

where ^T denotes the transpose operator.

Your first task is to write a function match_nn_student to find nearest neighbors. This function will take in two sets of descriptors, descs1 and descs2, and will return a 3 × (number of columns in descs1) matrix. The first two rows of this matrix will be pairs of indices of matching features, and the third row will be the distances between the matching feature descriptors. For instance, the first column of this matrix might be [ 1; 27; 1.5 ]; this means that feature 1 in image 1 is closest (in terms of its descriptor) to feature 27 in image 2, and the distance between the descriptors is 1.5. To test, you can use our provided function match_nn.

You've seen the max function before, but you may find it useful to use some of its more advanced features. For example, max(A,[],2) will find the max of each row of a 2-D matrix A, and [C, I] = max(...) will return the indices of the maximum values as well as the values themselves. See the documentation for max for details.

=> Write the function match_nn_student.
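For reference, here is a minimal sketch of one way a nearest-neighbor matcher with this interface could be structured. It is not the reference implementation, just one straightforward loop-based approach; your own version may be organized differently (for example, vectorized further).

function matches = match_nn_student(descs1, descs2)
    % For each descriptor in descs1, find the closest descriptor in descs2.
    n1 = size(descs1, 2);
    matches = zeros(3, n1);
    for i = 1:n1
        % Euclidean distance from descriptor i to every column of descs2.
        diffs = descs2 - repmat(descs1(:, i), 1, size(descs2, 2));
        dists = sqrt(sum(diffs .^ 2, 1));
        [best_dist, best_idx] = min(dists);
        matches(:, i) = [i; best_idx; best_dist];
    end
end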

To visualize a set of matches, you can use a function we provide to you called plotmatches. This function takes in two images, two sets of frames, and your matches, and draws a figure where you can see the matches. If you provide 'interactive', 1 as the last two arguments to this function, you will get an interactive window where you can click on features to see their matches. Here's an example call to plotmatches, assuming that our images are stored in variables template and search, and our frames in frames1 and frames2:

>> template = imread('template.png');
>> [frames1, descriptors1] = sift(im2double(template));
>> search = imread('search.png');
>> [frames2, descriptors2] = sift(im2double(search));
>> matches = match_nn_student(descriptors1, descriptors2);
>> plotmatches(template, search, frames1, frames2, matches, 'interactive', 1);

This interface allows you to click on a feature in one image and see the corresponding feature in the other image. As you can see, this matching procedure will undoubtedly return many false matches; SIFT is not perfect. One way to reduce the number of false matches is to threshold the matches by distance between descriptors. You will do this by writing a function threshold_matches_student, which takes in a set of matches (the output of the function match_nn_student or match_nn) and a threshold, and returns a new matrix of matches whose distances are less than the threshold.

=> Write the function threshold_matches_student. This function can be written in one line of Matlab code (not including comments).
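As a hint at the shape of that one-liner, here is a minimal sketch (again, one possible version rather than the reference solution); it simply keeps the columns whose third row is below the threshold.

function good = threshold_matches_student(matches, thresh)
    % Keep only the matches whose descriptor distance (row 3) is below thresh.
    good = matches(:, matches(3, :) < thresh);
end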

Try visualizing the remaining matches using plotmatches.

Figure 3: Example of images with repetitive features.

It turns out that even this thresholding doesn't work that well. Consider the example image pair shown in Figure 3. There are a lot of repetitive features in these images, and all of their descriptors will look similar. To find unique, distinctive feature matches, consider the following strategy: for each descriptor a in image 1, find the two nearest neighbors in image 2. Call these b and c, and let their distances from a be distance(a, b) and distance(a, c). If distance(a, b) is much smaller than distance(a, c), then b is a much better match than even the second closest feature. Thresholding using this test will tend to get rid of features with multiple possible matches. To make this concrete, we'll define our new distance function between a and b as the ratio of the distances to the two nearest neighbors:

    distance(a, b) / distance(a, c)

We'll call this the ratio distance. You'll now write a function match_nn_ratio_student that does the exact same thing as match_nn, except that it returns the ratio distance in the third row of the output matrix.

=> Write the function match_nn_ratio_student. You'll need to find the top two nearest neighbors in this function. Your function must spend at most O(n) time finding the top two neighbors for each feature in the first image, where n is the number of features in the second image.

You can now use your threshold_matches_student function on the output of this function. Note that the distances are now ratios rather than pure distances, so the units are different (for instance, the ratio distance will always be between 0 and 1). For the ratio distance, a threshold of 0.6 usually works pretty well. Try visualizing the remaining matches using plotmatches.
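To make the O(n) requirement concrete, here is a minimal sketch of one way the top-two search could be done in a single pass over the distances. It is one possible structure, not the reference implementation, and it reuses the same distance computation as the earlier sketch.

function matches = match_nn_ratio_student(descs1, descs2)
    % Like match_nn_student, but the third row holds the ratio distance:
    % (distance to nearest neighbor) / (distance to second-nearest neighbor).
    n1 = size(descs1, 2);
    matches = zeros(3, n1);
    for i = 1:n1
        diffs = descs2 - repmat(descs1(:, i), 1, size(descs2, 2));
        dists = sqrt(sum(diffs .^ 2, 1));
        % One pass keeps the two smallest distances, so this is O(n) per feature.
        best = Inf; second = Inf; best_idx = 0;
        for j = 1:numel(dists)
            if dists(j) < best
                second = best;
                best = dists(j);
                best_idx = j;
            elseif dists(j) < second
                second = dists(j);
            end
        end
        matches(:, i) = [i; best_idx; best / second];
    end
end

With something like this in place, thresholding works exactly as before, e.g. threshold_matches_student(match_nn_ratio_student(descriptors1, descriptors2), 0.6).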