Fitting & Matching. Lecture 4 Prof. Bregler. Slides from: S. Lazebnik, S. Seitz, M. Pollefeys, A. Efros.


How do we build a panorama? We need to match (align) images

Matching with Features: Detect feature points in both images

Matching with Features: Detect feature points in both images; Find corresponding pairs

Matching with Features: Detect feature points in both images; Find corresponding pairs; Use these pairs to align images

Matching with Features: Detect feature points in both images; Find corresponding pairs; Use these pairs to align images (Previous lecture)

Overview: Fitting techniques (Least Squares, Total Least Squares, RANSAC, Hough Voting); Alignment as a fitting problem

Fitting: Choose a parametric model to represent a set of features. Simple model: lines. Simple model: circles. Complicated model: car. Source: K. Grauman

Fitting: Issues. Case study: Line detection. Noise in the measured feature locations. Extraneous data: clutter (outliers), multiple lines. Missing data: occlusions. Slide: S. Lazebnik

Fitting: Issues. If we know which points belong to the line, how do we find the optimal line parameters? Least squares. What if there are outliers? Robust fitting, RANSAC. What if there are many lines? Voting methods: RANSAC, Hough transform. What if we're not even sure it's a line? Model selection. Slide: S. Lazebnik

Overview: Fitting techniques (Least Squares, Total Least Squares, RANSAC, Hough Voting); Alignment as a fitting problem

Least squares line fitting. Data: (x_1, y_1), …, (x_n, y_n). Line equation: y = mx + b. Find (m, b) to minimize E = Σ_i (y_i − m x_i − b)². Slide: S. Lazebnik

Least squares line fitting. Data: (x_1, y_1), …, (x_n, y_n). Line equation: y = mx + b. Find (m, b) to minimize E = Σ_i (y_i − m x_i − b)². In matrix form, with Y = [y_1 … y_n]^T, X the n×2 matrix with rows [x_i 1], and B = [m b]^T: E = ‖Y − XB‖² = Y^T Y − 2(XB)^T Y + (XB)^T (XB). Setting dE/dB = 2X^T XB − 2X^T Y = 0 gives the normal equations X^T XB = X^T Y (the least squares solution to XB = Y). Slide: S. Lazebnik
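The normal-equations derivation above can be sketched in a few lines of NumPy; the data here are a made-up example (points on y = 2x + 1):

```python
import numpy as np

# Fit y = m*x + b by solving the normal equations (X^T X) B = X^T Y,
# exactly as derived on the slide.
def fit_line_ls(x, y):
    X = np.column_stack([x, np.ones_like(x)])  # rows [x_i, 1]
    B = np.linalg.solve(X.T @ X, X.T @ y)      # normal equations
    return B                                   # B = (m, b)

x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0                              # exact line y = 2x + 1
m, b = fit_line_ls(x, y)
```

With noise-free data the recovered parameters match the generating line exactly.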

Problem with "vertical" least squares: Not rotation-invariant. Fails completely for vertical lines. Slide: S. Lazebnik

Overview: Fitting techniques (Least Squares, Total Least Squares, RANSAC, Hough Voting); Alignment as a fitting problem

Total least squares. Distance between point (x_i, y_i) and line ax + by = d (a² + b² = 1): |a x_i + b y_i − d|. Unit normal: N = (a, b). Slide: S. Lazebnik

Total least squares. Distance between point (x_i, y_i) and line ax + by = d (a² + b² = 1): |a x_i + b y_i − d|. Find (a, b, d) to minimize the sum of squared perpendicular distances E = Σ_i (a x_i + b y_i − d)². Unit normal: N = (a, b)

Total least squares. Find (a, b, d) to minimize E = Σ_i (a x_i + b y_i − d)². Setting dE/dd = Σ_i −2(a x_i + b y_i − d) = 0 gives d = (a/n) Σ_i x_i + (b/n) Σ_i y_i = a x̄ + b ȳ. Substituting back: E = Σ_i (a(x_i − x̄) + b(y_i − ȳ))² = (UN)^T (UN), where U is the n×2 matrix with rows [x_i − x̄, y_i − ȳ]. Then dE/dN = 2(U^T U)N = 0. Solution to (U^T U)N = 0, subject to ‖N‖² = 1: the eigenvector of U^T U associated with the smallest eigenvalue (least squares solution to the homogeneous linear system UN = 0). Slide: S. Lazebnik

Total least squares. With U the n×2 matrix with rows [x_i − x̄, y_i − ȳ], U^T U = [Σ(x_i − x̄)², Σ(x_i − x̄)(y_i − ȳ); Σ(x_i − x̄)(y_i − ȳ), Σ(y_i − ȳ)²] is the second moment matrix of the points. Slide: S. Lazebnik

Total least squares. The fitted line passes through the centroid (x̄, ȳ), with unit normal N = (a, b) given by the smallest eigenvector of the second moment matrix U^T U. Slide: S. Lazebnik
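A minimal sketch of total least squares via the smallest eigenvector of the second moment matrix, tried on a vertical line (the case where ordinary least squares fails):

```python
import numpy as np

# Total least squares line fit: the unit normal N = (a, b) is the
# eigenvector of U^T U (second moment matrix) with smallest eigenvalue,
# and d = a*xbar + b*ybar, giving the line a*x + b*y = d.
def fit_line_tls(x, y):
    xbar, ybar = x.mean(), y.mean()
    U = np.column_stack([x - xbar, y - ybar])
    w, V = np.linalg.eigh(U.T @ U)   # eigenvalues in ascending order
    a, b = V[:, 0]                   # smallest-eigenvalue eigenvector
    d = a * xbar + b * ybar
    return a, b, d

# The vertical line x = 2:
x = np.array([2.0, 2.0, 2.0])
y = np.array([0.0, 1.0, 2.0])
a, b, d = fit_line_tls(x, y)
```

The recovered line is a·x + b·y = d with (a, b) = ±(1, 0) and d/a = 2, i.e. x = 2, which vertical least squares cannot represent.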

Least squares: Robustness to noise. Least squares fit to the red points: Slide: S. Lazebnik

Least squares: Robustness to noise. Least squares fit with an outlier: Problem: squared error heavily penalizes outliers. Slide: S. Lazebnik

Robust estimators. General approach: minimize Σ_i ρ(r_i(x_i, θ); σ), where r_i(x_i, θ) is the residual of the i-th point w.r.t. model parameters θ, and ρ is a robust function with scale parameter σ. The robust function ρ behaves like squared distance for small values of the residual u but saturates for larger values of u. Slide: S. Lazebnik
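The saturation behavior can be illustrated with one common robust function, ρ(u; σ) = u² / (σ² + u²); this particular choice of ρ is an assumption for illustration, since the text above does not name one:

```python
# rho(u; sigma) = u^2 / (sigma^2 + u^2): approximately quadratic for
# |u| << sigma, saturating toward 1 for |u| >> sigma, so a single gross
# outlier contributes a bounded amount to the total cost.
def rho(u, sigma):
    return u**2 / (sigma**2 + u**2)

small = rho(0.1, 1.0)    # ~ u^2 in the small-residual regime
large = rho(100.0, 1.0)  # saturates: bounded influence of outliers
```

Compare with plain squared error, where the outlier's contribution would be 100² = 10000 rather than just under 1.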

Choosing the scale: Just right. The effect of the outlier is minimized. Slide: S. Lazebnik

Choosing the scale: Too small. The error value is almost the same for every point and the fit is very poor. Slide: S. Lazebnik

Choosing the scale: Too large. Behaves much the same as least squares

Overview: Fitting techniques (Least Squares, Total Least Squares, RANSAC, Hough Voting); Alignment as a fitting problem

RANSAC. Robust fitting can deal with a few outliers; what if we have very many? Random sample consensus (RANSAC): a very general framework for model fitting in the presence of outliers. Outline: Choose a small subset of points uniformly at random; Fit a model to that subset; Find all remaining points that are close to the model and reject the rest as outliers; Do this many times and choose the best model. M. A. Fischler, R. C. Bolles. Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography. Comm. of the ACM, Vol 24, pp 381-395, 1981. Slide: S. Lazebnik

RANSAC for line fitting. Repeat N times: Draw s points uniformly at random; Fit a line to these s points; Find inliers to this line among the remaining points (i.e., points whose distance from the line is less than t); If there are d or more inliers, accept the line and refit using all inliers. Source: M. Pollefeys
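A sketch of this loop for lines, on made-up data (20 collinear points on y = x plus two gross outliers); the iteration count and threshold are illustrative choices:

```python
import numpy as np

# RANSAC line fitting following the slide: sample s=2 points, count
# inliers within distance t, keep the best, refit on all inliers.
def ransac_line(pts, n_iters=100, t=0.05, seed=0):
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(n_iters):
        i, j = rng.choice(len(pts), size=2, replace=False)
        p, q = pts[i], pts[j]
        dvec = q - p
        norm = np.hypot(dvec[0], dvec[1])
        if norm == 0:
            continue
        n = np.array([-dvec[1], dvec[0]]) / norm   # unit normal of sample line
        dist = np.abs((pts - p) @ n)               # point-to-line distances
        inliers = dist < t
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refit with total least squares on all inliers
    sel = pts[best_inliers]
    xbar = sel.mean(axis=0)
    U = sel - xbar
    _, V = np.linalg.eigh(U.T @ U)
    a, b = V[:, 0]
    d = np.array([a, b]) @ xbar
    return (a, b, d), best_inliers

x = np.linspace(0, 1, 20)
pts = np.vstack([np.column_stack([x, x]),      # 20 points on y = x
                 [[0.1, 5.0], [0.9, -4.0]]])   # two outliers
(a, b, d), inl = ransac_line(pts)
```

The two outliers are rejected, and the refit line a·x + b·y = d is y = x (so a = −b and d = 0).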

Choosing the parameters. Initial number of points s: typically the minimum number needed to fit the model. Distance threshold t: choose t so that the probability for an inlier is p (e.g. 0.95); for zero-mean Gaussian noise with std. dev. σ: t² = 3.84σ². Number of samples N: choose N so that, with probability p, at least one random sample is free from outliers (e.g. p = 0.99) (outlier ratio: e). Source: M. Pollefeys

Choosing the parameters. Initial number of points s: typically the minimum number needed to fit the model. Distance threshold t: choose t so that the probability for an inlier is p (e.g. 0.95); for zero-mean Gaussian noise with std. dev. σ: t² = 3.84σ². Number of samples N: choose N so that, with probability p, at least one random sample is free from outliers (e.g. p = 0.99) (outlier ratio: e): (1 − (1 − e)^s)^N = 1 − p, so N = log(1 − p) / log(1 − (1 − e)^s).

    proportion of outliers e
s   5%   10%  20%  25%  30%  40%  50%
2   2    3    5    6    7    11   17
3   3    4    7    9    11   19   35
4   3    5    9    13   17   34   72
5   4    6    12   17   26   57   146
6   4    7    16   24   37   97   293
7   4    8    20   33   54   163  588
8   5    9    26   44   78   272  1177

Source: M. Pollefeys
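The formula for N is a one-liner, and it reproduces the table entries:

```python
import math

# N = log(1-p) / log(1 - (1-e)^s): number of samples needed so that,
# with probability p, at least one sample of size s is outlier-free
# given outlier ratio e.
def num_samples(p, e, s):
    return math.ceil(math.log(1 - p) / math.log(1 - (1 - e) ** s))

n = num_samples(0.99, 0.5, 2)   # table row s=2, column e=50%: N=17
```

For instance, with p = 0.99, s = 2 and half the data outliers, 17 samples suffice; with s = 4 and 30% outliers, 17 again (both match the table).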

Choosing the parameters. Initial number of points s: typically the minimum number needed to fit the model. Distance threshold t: choose t so that the probability for an inlier is p (e.g. 0.95); for zero-mean Gaussian noise with std. dev. σ: t² = 3.84σ². Number of samples N: choose N so that, with probability p, at least one random sample is free from outliers (e.g. p = 0.99) (outlier ratio: e): (1 − (1 − e)^s)^N = 1 − p, so N = log(1 − p) / log(1 − (1 − e)^s). Source: M. Pollefeys

Choosing the parameters. Initial number of points s: typically the minimum number needed to fit the model. Distance threshold t: choose t so that the probability for an inlier is p (e.g. 0.95); for zero-mean Gaussian noise with std. dev. σ: t² = 3.84σ². Number of samples N: choose N so that, with probability p, at least one random sample is free from outliers (e.g. p = 0.99) (outlier ratio: e). Consensus set size d: should match the expected inlier ratio. Source: M. Pollefeys

Adaptively determining the number of samples. The inlier ratio e is often unknown a priori, so pick the worst case, e.g. 50%, and adapt if more inliers are found; e.g. 80% inliers would yield e = 0.2. Adaptive procedure: N = ∞, sample_count = 0. While N > sample_count: choose a sample and count the number of inliers; set e = 1 − (number of inliers)/(total number of points); recompute N from e: N = log(1 − p) / log(1 − (1 − e)^s); increment sample_count by 1. Source: M. Pollefeys
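The adaptive procedure above can be sketched directly; here `count_inliers_of_sample` is a hypothetical black box standing in for "choose a sample and count the number of inliers":

```python
import math

# Adaptive RANSAC sample count: start with N = infinity and shrink N
# as samples with more inliers are found, stopping once sample_count
# reaches the current estimate of N.
def adaptive_ransac_count(total_points, count_inliers_of_sample,
                          s=2, p=0.99):
    N = float('inf')
    sample_count = 0
    while N > sample_count:
        n_inliers = count_inliers_of_sample()
        e = 1 - n_inliers / total_points
        if 0 < e < 1:
            N = math.log(1 - p) / math.log(1 - (1 - e) ** s)
        elif e <= 0:          # every point is an inlier: one sample suffices
            N = 1
        sample_count += 1
    return sample_count

# If every sample reports 80 inliers out of 100 points (e = 0.2),
# N converges to about 4.5, so the loop stops after 5 samples.
n_used = adaptive_ransac_count(100, lambda: 80)
```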

RANSAC pros and cons. Pros: Simple and general; Applicable to many different problems; Often works well in practice. Cons: Lots of parameters to tune; Can't always get a good initialization of the model based on the minimum number of samples; Sometimes too many iterations are required; Can fail for extremely low inlier ratios; We can often do better than brute-force sampling. Source: M. Pollefeys

Voting schemes. Let each feature vote for all the models that are compatible with it. Hopefully the noise features will not vote consistently for any single model. Missing data doesn't matter as long as there are enough features remaining to agree on a good model

Overview: Fitting techniques (Least Squares, Total Least Squares, RANSAC, Hough Voting); Alignment as a fitting problem

Hough transform. An early type of voting scheme. General outline: Discretize parameter space into bins; For each feature point in the image, put a vote in every bin in the parameter space that could have generated this point; Find bins that have the most votes. Image space → Hough parameter space. P.V.C. Hough, Machine Analysis of Bubble Chamber Pictures, Proc. Int. Conf. High Energy Accelerators and Instrumentation, 1959

Parameter space representation. A line in the image corresponds to a point in Hough space. Image space → Hough parameter space. Source: S. Seitz

Parameter space representation. What does a point (x_0, y_0) in the image space map to in the Hough space? Source: S. Seitz

Parameter space representation. What does a point (x_0, y_0) in the image space map to in the Hough space? Answer: the solutions of b = −x_0 m + y_0. This is a line in Hough space. Source: S. Seitz

Parameter space representation. Where is the line that contains both (x_0, y_0) and (x_1, y_1)? Source: S. Seitz

Parameter space representation. Where is the line that contains both (x_0, y_0) and (x_1, y_1)? It is the intersection of the lines b = −x_0 m + y_0 and b = −x_1 m + y_1. Source: S. Seitz

Parameter space representation. Problems with the (m, b) space: Unbounded parameter domain; Vertical lines require infinite m

Parameter space representation. Problems with the (m, b) space: Unbounded parameter domain; Vertical lines require infinite m. Alternative: polar representation x cos θ + y sin θ = ρ. Each point will add a sinusoid in the (θ, ρ) parameter space

Algorithm outline:
Initialize accumulator H to all zeros
For each edge point (x, y) in the image
  For θ = 0 to 180
    ρ = x cos θ + y sin θ
    H(θ, ρ) = H(θ, ρ) + 1
  end
end
Find the value(s) of (θ, ρ) where H(θ, ρ) is a local maximum
The detected line in the image is given by ρ = x cos θ + y sin θ
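The outline above maps almost line-for-line onto an accumulator implementation; the bin resolutions and the test points (a vertical line x = 10, i.e. θ = 0, ρ = 10) are illustrative choices:

```python
import numpy as np

# Hough transform for lines in (theta, rho) form, following the outline:
# each point votes once per theta row, peak of H gives the line.
def hough_lines(points, rho_res=1.0, n_theta=180, rho_max=200.0):
    thetas = np.deg2rad(np.arange(n_theta))        # theta = 0..179 degrees
    n_rho = int(2 * rho_max / rho_res) + 1
    H = np.zeros((n_theta, n_rho), dtype=int)
    for x, y in points:
        rhos = x * np.cos(thetas) + y * np.sin(thetas)
        bins = np.round((rhos + rho_max) / rho_res).astype(int)
        H[np.arange(n_theta), bins] += 1           # one vote per theta bin
    ti, ri = np.unravel_index(H.argmax(), H.shape)
    return np.rad2deg(thetas[ti]), ri * rho_res - rho_max, H

pts = [(10.0, float(y)) for y in range(20)]        # vertical line x = 10
theta, rho, H = hough_lines(pts)
```

All 20 points vote into the bin (θ = 0°, ρ = 10), so the accumulator peak has 20 votes there.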

Basic illustration: features → votes

Other shapes: square, circle

Several lines

A more complicated image (http://ostatic.com/files/images/ss_hough.jpg)

Effect of noise: features → votes

Effect of noise: the peak gets fuzzy and hard to locate

Effect of noise: number of votes for a line of 20 points with increasing noise:

Random points: features → votes. Uniform noise can lead to spurious peaks in the array

Random points: as the level of uniform noise increases, the maximum number of votes increases too

Dealing with noise. Choose a good grid / discretization. Too coarse: large votes obtained when too many different lines correspond to a single bucket. Too fine: miss lines because some points that are not exactly collinear cast votes for different buckets. Increment neighboring bins (smoothing in the accumulator array). Try to get rid of irrelevant features: take only edge points with significant gradient magnitude

Hough transform for circles. How many dimensions will the parameter space have? Given an oriented edge point, what are all possible bins that it can vote for?

Hough transform for circles. Image space: an edge point (x, y) with gradient direction î(x, y). Hough parameter space: for a known radius r, the point votes for the candidate centers (x, y) ± r î(x, y)

Generalized Hough transform. We want to find a shape defined by its boundary points and a reference point a. D. Ballard, Generalizing the Hough Transform to Detect Arbitrary Shapes, Pattern Recognition 13(2), 1981, pp. 111-122.

Generalized Hough transform. We want to find a shape defined by its boundary points and a reference point a. For every boundary point p, we can compute the displacement vector r = a − p as a function of gradient orientation θ. D. Ballard, Generalizing the Hough Transform to Detect Arbitrary Shapes, Pattern Recognition 13(2), 1981, pp. 111-122.

Generalized Hough transform. For the model shape: construct a table indexed by θ, storing displacement vectors r as a function of gradient direction. Detection: for each edge point p with gradient orientation θ: retrieve all r indexed with θ; for each r(θ), put a vote in the Hough space at p + r(θ). The peak in this Hough space is the reference point with the most supporting edges. Assumption: translation is the only transformation here, i.e., orientation and scale are fixed. Source: K. Grauman
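The r-table construction and voting step can be sketched as follows; the three boundary points and their gradient orientations are a made-up toy shape, not from the slides:

```python
import numpy as np
from collections import defaultdict

# Generalized Hough transform, translation only: index displacement
# vectors r = a - p by (quantized) gradient orientation theta, then
# vote at p + r for each test edge point.
def build_r_table(boundary_pts, orientations, ref_point):
    table = defaultdict(list)
    for p, theta in zip(boundary_pts, orientations):
        table[theta].append(np.asarray(ref_point) - np.asarray(p))
    return table

def vote(edge_pts, orientations, table):
    votes = defaultdict(int)
    for p, theta in zip(edge_pts, orientations):
        for r in table.get(theta, []):
            votes[tuple(np.asarray(p) + r)] += 1
    return max(votes, key=votes.get)   # peak = detected reference point

# Toy model: three boundary points with assumed orientations 0, 90, 180
model_pts = [(0, 0), (2, 0), (1, 2)]
orients = [0, 90, 180]
table = build_r_table(model_pts, orients, ref_point=(1, 1))
# The same shape translated by (5, 5); all votes land on one center:
test_pts = [(5, 5), (7, 5), (6, 7)]
peak = vote(test_pts, orients, table)
```

All three displaced edge points vote for the translated reference point (6, 6), which is the accumulator peak.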

Example: model shape

Example: displacement vectors for model points

Example: range of voting locations for test point

Example: range of voting locations for test point

Example: votes for points with θ =

Example: displacement vectors for model points

Example: range of voting locations for test point

Example: votes for points with θ =

Application in recognition. Instead of indexing displacements by gradient orientation, index by visual codeword. Training image: visual codeword with displacement vectors. B. Leibe, A. Leonardis, and B. Schiele, Combined Object Categorization and Segmentation with an Implicit Shape Model, ECCV Workshop on Statistical Learning in Computer Vision 2004

Application in recognition. Instead of indexing displacements by gradient orientation, index by visual codeword. Test image. B. Leibe, A. Leonardis, and B. Schiele, Combined Object Categorization and Segmentation with an Implicit Shape Model, ECCV Workshop on Statistical Learning in Computer Vision 2004

Overview: Fitting techniques (Least Squares, Total Least Squares, RANSAC, Hough Voting); Alignment as a fitting problem

Image alignment. Two broad approaches: Direct (pixel-based) alignment: search for the alignment where most pixels agree. Feature-based alignment: search for the alignment where extracted features agree; can be verified using pixel-based alignment. Source: S. Lazebnik

Alignment as fitting. Previously: fitting a model to features in one image: find the model M that minimizes Σ_i residual(x_i, M). Source: S. Lazebnik

Alignment as fitting. Previously: fitting a model to features in one image: find the model M that minimizes Σ_i residual(x_i, M). Alignment: fitting a model to a transformation between pairs of features (matches) x_i ↔ x_i′ in two images: find the transformation T that minimizes Σ_i residual(T(x_i), x_i′). Source: S. Lazebnik

2D transformation models. Similarity (translation, scale, rotation). Affine. Projective (homography). Source: S. Lazebnik

Let's start with affine transformations. Simple fitting procedure (linear least squares). Approximates viewpoint changes for roughly planar objects and roughly orthographic cameras. Can be used to initialize fitting for more complex models. Source: S. Lazebnik

Fitting an affine transformation. Assume we know the correspondences (x_i, y_i) → (x_i′, y_i′); how do we get the transformation?

[x_i′; y_i′] = [m_1 m_2; m_3 m_4][x_i; y_i] + [t_1; t_2]

which can be rewritten, one pair of rows per match, as

[x_i y_i 0 0 1 0; 0 0 x_i y_i 0 1] · [m_1 m_2 m_3 m_4 t_1 t_2]^T = [x_i′; y_i′]

Source: S. Lazebnik

Fitting an affine transformation. A linear system with six unknowns. Each match gives us two linearly independent equations: we need at least three matches to solve for the transformation parameters. Source: S. Lazebnik
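Stacking two rows per match and solving by least squares looks like this; the three matches are generated from an assumed transform (scale 2 plus shift (1, −1)) for illustration:

```python
import numpy as np

# Solve for the six affine parameters (m1..m4, t1, t2) from point
# matches by stacking two equations per match, as on the slide.
def fit_affine(src, dst):
    rows, rhs = [], []
    for (x, y), (xp, yp) in zip(src, dst):
        rows.append([x, y, 0, 0, 1, 0]); rhs.append(xp)
        rows.append([0, 0, x, y, 0, 1]); rhs.append(yp)
    params, *_ = np.linalg.lstsq(np.array(rows, float),
                                 np.array(rhs, float), rcond=None)
    m1, m2, m3, m4, t1, t2 = params
    return np.array([[m1, m2], [m3, m4]]), np.array([t1, t2])

# Exactly three matches (the minimum), generated by scale 2 + shift (1, -1):
src = [(0, 0), (1, 0), (0, 1)]
dst = [(1, -1), (3, -1), (1, 1)]
M, t = fit_affine(src, dst)
```

Three non-collinear matches give six equations for six unknowns, so the generating transform is recovered exactly.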

Feature-based alignment outline

Feature-based alignment outline. Extract features

Feature-based alignment outline. Extract features. Compute putative matches

Feature-based alignment outline. Extract features. Compute putative matches. Loop: Hypothesize transformation T

Feature-based alignment outline. Extract features. Compute putative matches. Loop: Hypothesize transformation T; Verify transformation (search for other matches consistent with T)

Feature-based alignment outline. Extract features. Compute putative matches. Loop: Hypothesize transformation T; Verify transformation (search for other matches consistent with T)

Dealing with outliers. The set of putative matches contains a very high percentage of outliers. Geometric fitting strategies: RANSAC; Hough transform

RANSAC. RANSAC loop: 1. Randomly select a seed group of matches. 2. Compute the transformation from the seed group. 3. Find inliers to this transformation. 4. If the number of inliers is sufficiently large, re-compute the least-squares estimate of the transformation on all of the inliers. Keep the transformation with the largest number of inliers.
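For a translation model the loop is especially simple, since a single match is enough to hypothesize T; the matches below are made-up data (five consistent with T = (3, 4) plus one outlier):

```python
import numpy as np

# RANSAC loop for a translation model: seed from one match, count
# inliers, keep the best, then refit T as the mean inlier displacement.
def ransac_translation(matches, n_iters=50, t=0.1, seed=0):
    rng = np.random.default_rng(seed)
    src = np.array([m[0] for m in matches], float)
    dst = np.array([m[1] for m in matches], float)
    best = None
    for _ in range(n_iters):
        i = rng.integers(len(matches))
        T = dst[i] - src[i]                      # hypothesis from seed match
        err = np.linalg.norm(src + T - dst, axis=1)
        inliers = err < t
        if best is None or inliers.sum() > best.sum():
            best = inliers
    T = (dst[best] - src[best]).mean(axis=0)     # least-squares refit
    return T, best

matches = [((0, 0), (3, 4)), ((1, 1), (4, 5)), ((2, 0), (5, 4)),
           ((0, 2), (3, 6)), ((1, 0), (4, 4)), ((5, 5), (0, 0))]
T, inl = ransac_translation(matches)
```

The outlier match is rejected and the recovered translation is (3, 4), exactly the "select one match, count inliers" procedure in the example slides that follow.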

RANSAC example: Translation. Putative matches. Source: A. Efros

RANSAC example: Translation. Select one match, count inliers. Source: A. Efros

RANSAC example: Translation. Select one match, count inliers. Source: A. Efros

RANSAC example: Translation. Select the translation with the most inliers. Source: A. Efros