LEAST SQUARES. RANSAC. HOUGH TRANSFORM.


The slides are from several sources, through James Hays (Brown); Srinivasa Narasimhan (CMU); Silvio Savarese (U. of Michigan); Bill Freeman and Antonio Torralba (MIT), including their own slides.

Homogeneous coordinates: conversion (see also lecture 2). Converting to homogeneous coordinates: an image point (x, y) becomes [x, y, 1]^T; a scene point (x, y, z) becomes [x, y, z, 1]^T. Converting from homogeneous coordinates back to Cartesian coordinates: [x, y, w]^T maps to (x/w, y/w).

Homogeneous coordinates are invariant to scaling: [x, y, w]^T and [kx, ky, kw]^T represent the same Cartesian point (x/w, y/w). A point in Cartesian coordinates is a ray in homogeneous coordinates.

Lines in a 2D plane. A line ax + by + c = 0 has slope -a/b and intercept -c/b in nonhomogeneous (Cartesian) coordinates, which is what you measure. In homogeneous coordinates the line is l = [a, b, c]^T, and a point x = [x1, x2, 1]^T lies on the line iff l^T x = 0.

The point x is the cross product of two intersecting lines: x = l × l'. Proof: l^T (l × l') = 0 and l'^T (l × l') = 0, so x satisfies both line equations; it is the intersection point in homogeneous coordinates. Dually, the line through two points is l = x_1 × x_2.
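The cross-product identity can be checked numerically; a minimal sketch with NumPy (the two example lines are my own, not from the slides):

```python
import numpy as np

# Two lines in homogeneous form l = [a, b, c], i.e. a*x + b*y + c = 0.
l1 = np.array([1.0, -1.0, 0.0])   # y = x
l2 = np.array([1.0, 1.0, -2.0])   # y = -x + 2

# Their intersection is the cross product, a point in homogeneous coordinates.
p = np.cross(l1, l2)

# Convert back to Cartesian coordinates by dividing by the third component.
x, y = p[0] / p[2], p[1] / p[2]
print(x, y)  # the lines meet at (1, 1)
```

For two parallel lines the same cross product returns a vector with zero third component, i.e. an ideal point, matching the next slide.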

Points at infinity (ideal points). The intersection of two parallel lines l = [a, b, c]^T and l' = [a, b, c']^T is a point at infinity: l × l' = (c' - c) [b, -a, 0]^T. An ideal point has zero third coordinate, [x1, x2, 0]^T; its direction [b, -a] matches the line slope -a/b.

Line at infinity. The set of ideal points lies on a line called the line at infinity, l_inf = [0, 0, 1]^T; only its third element is nonzero. This follows since l_inf = x_1inf × x_2inf for any two ideal points, e.g. [1, 0, 0]^T × [0, 1, 0]^T = [0, 0, 1]^T.

Fitting. Critical issues: noisy data; outliers; missing data, etc.

Critical issues: noisy data.

Critical issues: intra-class variability.

Critical issues: outliers.

Critical issues: missing data (occlusions).

Ordinary least squares line fitting. Data: (x_1, y_1), ..., (x_n, y_n). Line equation: y_i = m x_i + b. Find (m, b) to minimize the sum of squared vertical residuals only:

E = sum_{i=1}^{n} (y_i - m x_i - b)^2

In matrix form, with Y = [y_1; ...; y_n], X = [x_1 1; ...; x_n 1] and B = [m; b]:

E = ||Y - XB||^2 = Y^T Y - 2 (XB)^T Y + (XB)^T (XB)

dE/dB = 2 X^T X B - 2 X^T Y = 0  =>  X^T X B = X^T Y

These are the normal equations: the least squares solution to XB = Y.
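The normal equations can be solved directly; a minimal sketch with NumPy (the synthetic line y = 2x + 1 and the noise level are my own choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of y = 2x + 1 (noise only in y, matching the OLS model).
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + 0.1 * rng.standard_normal(50)

# Build X = [x_i, 1] and solve the normal equations X^T X B = X^T Y.
X = np.column_stack([x, np.ones_like(x)])
B = np.linalg.solve(X.T @ X, X.T @ y)
m, b = B
print(m, b)  # close to (2, 1)
```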

Ordinary least squares line fitting: E = sum_i (y_i - m x_i - b)^2, so B = (X^T X)^{-1} X^T Y. Limitations: the noise in x is neglected, since only vertical residuals are minimized. It fails completely for ideal vertical lines: say x_i = 1 for all i; then X^T X is singular and B is not defined (all samples share the single abscissa x = 1).

Total least squares. Distance between point (x_i, y_i) and line ax + by = d (with a^2 + b^2 = 1): |a x_i + b y_i - d| (the signed residual is a x_i + b y_i - d). The unit normal is N = (a, b); the residuals are perpendicular to the line. Both x and y are corrupted with noise.

Total least squares. Find (a, b, d) to minimize the sum of squared perpendicular distances:

E = sum_{i=1}^{n} (a x_i + b y_i - d)^2

Setting dE/dd = -2 sum_i (a x_i + b y_i - d) = 0 gives

d = a x_bar + b y_bar,  where x_bar = (1/n) sum_i x_i and y_bar = (1/n) sum_i y_i.

Substituting back:

E = sum_{i=1}^{n} (a (x_i - x_bar) + b (y_i - y_bar))^2 = ||U N||^2,  with U = [x_1 - x_bar  y_1 - y_bar; ...; x_n - x_bar  y_n - y_bar] and N = [a; b]

dE/dN = 2 (U^T U) N = 0

Solution to (U^T U) N = 0, subject to ||N||^2 = 1: the eigenvector of U^T U associated with the smallest eigenvalue (the least squares solution to the homogeneous linear system UN = 0).

The matrix U^T U is a covariance matrix. Therefore the eigenvectors of U^T U are also the right singular vectors V of the singular value decomposition U = U' D V^T (be aware that the two U's are different; this happens when some of the slides cannot be modified; see lecture 2 too). In the case of a 2D line, the solution is v_2, the singular vector of the smallest singular value, and the parameter d is (x_bar, y_bar)^T v_2: the fit goes through the centroid of the data.
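Putting the derivation together, a total least squares line fit can be sketched as follows (NumPy; the helper name fit_line_tls is my own):

```python
import numpy as np

def fit_line_tls(x, y):
    """Total least squares line fit: a*x + b*y = d with a^2 + b^2 = 1.

    N = (a, b) is the eigenvector of U^T U (the second moment matrix of
    the centered data) with the smallest eigenvalue; d = (x_bar, y_bar).N,
    so the fit passes through the centroid.
    """
    xm, ym = x.mean(), y.mean()
    U = np.column_stack([x - xm, y - ym])
    # eigh returns eigenvalues of the symmetric U^T U in ascending order.
    _, vecs = np.linalg.eigh(U.T @ U)
    a, b = vecs[:, 0]                 # eigenvector of the smallest eigenvalue
    d = a * xm + b * ym
    return a, b, d
```

Unlike ordinary least squares, this handles the ideal vertical line: points with x_i = 1 yield a normal along the x axis (up to sign), recovering the line x = 1.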

Total least squares: the second moment matrix.

U = [x_1 - x_bar  y_1 - y_bar; ...; x_n - x_bar  y_n - y_bar]

U^T U = [ sum_i (x_i - x_bar)^2              sum_i (x_i - x_bar)(y_i - y_bar)
          sum_i (x_i - x_bar)(y_i - y_bar)   sum_i (y_i - y_bar)^2          ]

This is the second moment matrix of the centered data; its eigenvector N = (a, b) for the smallest eigenvalue gives the line normal, anchored at the centroid (x_bar, y_bar).

Total least squares is robust to inlier noise...

...but not to outlier noise. With a bad scale parameter sigma (too large!) even a single outlier pulls the fit away: the squared error always takes into account all points, inliers and outliers alike. Least squares is not robust to outliers.

M. A. Fischler, R. C. Bolles. Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography. Comm. of the ACM, Vol 24, pp 381-395, 1981. RANdom SAmple Consensus: select one match and count inliers; repeat many times; keep the match with the largest set of inliers, based on a standard deviation given by the user.

Basic philosophy: a voting scheme for almost any elemental-subset-based estimation. An elemental subset (the minimum number of points) is randomly picked for each hypothesis. The standard deviation of the inlier noise has to be given beforehand by the user. Assumption 1: outlier features will not vote consistently for any single model. Assumption 2: there are enough features to agree on a good model.

RANSAC. Sample set = set of points in 2D; sigma is given. Algorithm: 1. Select a random sample of the minimum required size to fit the model. 2. Compute a putative model from the sample set. 3. Compute the set of inliers to this model from the whole data set. Repeat 1-3 until the model with the most inliers over all samples is found.
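The algorithm above can be sketched for the 2D line case (NumPy; the function name, the 2-sigma inlier threshold and the fixed iteration count are my own choices; the slides leave these to the user):

```python
import numpy as np

def ransac_line(pts, sigma, n_iters=500, rng=None):
    """RANSAC line fit. pts: (n, 2) array; sigma: inlier noise scale.

    Returns the boolean inlier mask of the best hypothesis found.
    """
    rng = np.random.default_rng(rng)
    best_inliers = np.zeros(len(pts), dtype=bool)
    for _ in range(n_iters):
        # 1. Minimal sample: two distinct points define a line.
        i, j = rng.choice(len(pts), size=2, replace=False)
        p, q = pts[i], pts[j]
        # 2. Putative model: unit normal (a, b) and offset d of line pq.
        t = q - p
        norm = np.hypot(t[0], t[1])
        if norm == 0:
            continue          # degenerate sample (coincident points)
        a, b = -t[1] / norm, t[0] / norm
        d = a * p[0] + b * p[1]
        # 3. Inliers = points within a sigma-based distance of the line.
        dist = np.abs(pts @ np.array([a, b]) - d)
        inliers = dist < 2.0 * sigma
        # Keep the hypothesis with the most inliers over all samples.
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers
```

A final least squares fit on the returned inliers gives the refined line, as in the example slides below.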

RANSAC for line fitting example (Source: R. Raguram). The data contains inliers and outliers; sigma has to be given at the beginning. A plain least squares fit is pulled off the line by the outliers. The RANSAC loop: 1. Randomly select a minimal subset of points (= 2 for a line). 2. Hypothesize a model. 3. Compute the error function. 4. Select the points consistent with the model (within the sigma-based threshold). 5. Repeat the hypothesize-and-verify loop. The best inlier structure is the one with the largest number of inliers; finally, do a least squares fit on all the inliers.

RANSAC (RANdom SAmple Consensus) in general; the standard deviation of the inlier noise has to be given. Algorithm: 1. Select a random sample of the minimum required size to fit the model. 2. Compute a putative model from the sample set. 3. Compute the set of inliers to this model from the whole data set. Repeat 1-3 until the model with the most inliers over all samples is found. In the illustrated run, one hypothesis collects O = 6 inliers, another O = 14; the latter wins.

Example: Image 1 and Image 2 with matches (red: good matches; green: bad matches). RANSAC fits a homography (later lecture) mapping SIFT features from image 1 to image 2. The majority of the bad matches will be labeled as outliers.

This is a robust fit, as we also said in the lecture on SIFT.

RANSAC - conclusions. Better robust estimators already exist. Good: robust to outliers if there are not too many; if the number of hypotheses N is taken sufficiently large (hundreds to thousands), RANSAC gives very similar results every time. Bad: computational time grows quickly with the fraction of outliers and the number of parameters; not good for recovering multiple inlier structures. Some applications: computing a homography (e.g., image stitching); estimating the fundamental matrix (relating two views), etc.

Hough transform. P.V.C. Hough, Machine Analysis of Bubble Chamber Pictures, Proc. Int. Conf. High Energy Accelerators and Instrumentation, 1959. Given a set of points, find the line or curve that explains the data points best. Applied to lines, circles and sometimes ellipses.

Hough transform - for lines. Use a polar representation for the parameter space: rho = x cos(theta) + y sin(theta), with theta in [0, 2pi). Each point adds a sinusoid in the (theta, rho) Hough space. The two dimensions have different thresholds, and mistakes can give nonexisting features.
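The voting scheme can be sketched as follows (NumPy; the accumulator resolution is my own choice, and theta is restricted to [0, pi) since (theta, rho) and (theta + pi, -rho) describe the same line):

```python
import numpy as np

def hough_lines(points, n_theta=180, n_rho=200, rho_max=None):
    """Accumulate votes in the (theta, rho) parameter space.

    Each point (x, y) votes along the sinusoid rho = x*cos(theta) + y*sin(theta);
    collinear points intersect in one (theta, rho) bin.
    """
    pts = np.asarray(points, dtype=float)
    if rho_max is None:
        rho_max = np.abs(pts).max() * np.sqrt(2) + 1e-9
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((n_theta, n_rho), dtype=int)
    for x, y in pts:
        rho = x * np.cos(thetas) + y * np.sin(thetas)   # one sinusoid per point
        bins = np.round((rho + rho_max) / (2 * rho_max) * (n_rho - 1)).astype(int)
        acc[np.arange(n_theta), bins] += 1
    return acc, thetas, rho_max
```

For points on the horizontal line y = 2, all sinusoids cross at (theta = pi/2, rho = 2), so that accumulator bin receives one vote per point.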

In theory: each feature votes along its sinusoid, and the votes of collinear features accumulate in a single sharp peak.

...but the effect of noise is very important: the peak gets fuzzy and hard to locate.

Spurious peaks appear due to uniform noise.

A nice example; in general it is not so, because mistakes in labeling appear. Lines: 5-- and 5. http://ostatic.com/files/images/ss_hough.jpg

Hough transform - conclusions. Good: all points are processed independently, so it can cope with occlusion and outliers; some robustness to noise, since noise points are unlikely to contribute consistently to any single bin. Bad: spurious peaks due to uniform noise; a trade-off between noise and grid size; hard to find the sweet spot for each threshold when multiple features are detected. It is used in industry for repeated processing only.