Fitting & Matching. Lecture 4. Prof. Bregler. Slides from: S. Lazebnik, S. Seitz, M. Pollefeys, A. Efros.
How do we build a panorama? We need to match (align) images.
Matching with Features: Detect feature points in both images. Find corresponding pairs. Use these pairs to align images. (Previous lecture.)
Overview. Fitting techniques: Least Squares, Total Least Squares, RANSAC, Hough Voting. Alignment as a fitting problem.
Fitting. Choose a parametric model to represent a set of features. Simple models: lines, circles. Complicated model: car. Source: K. Grauman
Fitting: Issues. Case study: line detection. Noise in the measured feature locations. Extraneous data: clutter (outliers), multiple lines. Missing data: occlusions. Slide: S. Lazebnik
Fitting: Issues. If we know which points belong to the line, how do we find the optimal line parameters? Least squares. What if there are outliers? Robust fitting, RANSAC. What if there are many lines? Voting methods: RANSAC, Hough transform. What if we're not even sure it's a line? Model selection. Slide: S. Lazebnik
Overview. Fitting techniques: Least Squares, Total Least Squares, RANSAC, Hough Voting. Alignment as a fitting problem.
Least squares line fitting. Data: (x_1, y_1), ..., (x_n, y_n). Line equation: y = mx + b. Find (m, b) to minimize E = Σ_{i=1..n} (y_i − m x_i − b)^2. Slide: S. Lazebnik
Least squares line fitting. Data: (x_1, y_1), ..., (x_n, y_n). Line equation: y = mx + b. Find (m, b) to minimize E = Σ_{i=1..n} (y_i − m x_i − b)^2. In matrix form, with Y = [y_1; ...; y_n], X = [x_1 1; ...; x_n 1], B = [m; b]:
E = ||Y − XB||^2 = (Y − XB)^T (Y − XB) = Y^T Y − 2 (XB)^T Y + (XB)^T (XB)
dE/dB = 2 X^T X B − 2 X^T Y = 0
Normal equations: X^T X B = X^T Y (the least squares solution to XB = Y). Slide: S. Lazebnik
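As a minimal NumPy sketch of the normal-equations solution above (the function name fit_line_lsq is made up here, not from the slides):

```python
import numpy as np

def fit_line_lsq(x, y):
    """Fit y = m*x + b by ordinary least squares via the normal equations."""
    X = np.column_stack([x, np.ones_like(x)])   # design matrix rows [x_i, 1]
    # Solve (X^T X) B = X^T Y for B = [m, b]
    m, b = np.linalg.solve(X.T @ X, X.T @ y)
    return m, b

# Points on y = 2x + 1 (no noise), so the fit recovers m = 2, b = 1
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0
m, b = fit_line_lsq(x, y)
```

In practice `np.linalg.lstsq` is preferred over forming X^T X explicitly, but the code above mirrors the derivation on the slide.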
Problem with "vertical" least squares: not rotation-invariant, and fails completely for vertical lines. Slide: S. Lazebnik
Overview. Fitting techniques: Least Squares, Total Least Squares, RANSAC, Hough Voting. Alignment as a fitting problem.
Total least squares. Distance between point (x_i, y_i) and line ax + by = d (with a^2 + b^2 = 1): |a x_i + b y_i − d|. Unit normal: N = (a, b). Slide: S. Lazebnik
Total least squares. Distance between point (x_i, y_i) and line ax + by = d (a^2 + b^2 = 1): |a x_i + b y_i − d|. Find (a, b, d) to minimize the sum of squared perpendicular distances E = Σ_{i=1..n} (a x_i + b y_i − d)^2. Unit normal: N = (a, b).
Total least squares. Distance between point (x_i, y_i) and line ax + by = d (a^2 + b^2 = 1): |a x_i + b y_i − d|. Find (a, b, d) to minimize E = Σ_{i=1..n} (a x_i + b y_i − d)^2.
∂E/∂d = Σ_{i=1..n} −2(a x_i + b y_i − d) = 0, so d = (a/n) Σ x_i + (b/n) Σ y_i = a x̄ + b ȳ.
Substituting back: E = Σ_{i=1..n} (a(x_i − x̄) + b(y_i − ȳ))^2 = (UN)^T (UN), where the rows of U are (x_i − x̄, y_i − ȳ).
dE/dN = 2 (U^T U) N = 0.
Solution to (U^T U) N = 0, subject to ||N||^2 = 1: the eigenvector of U^T U associated with the smallest eigenvalue (least squares solution to the homogeneous linear system UN = 0). Slide: S. Lazebnik
Total least squares. With U = [x_1 − x̄, y_1 − ȳ; ...; x_n − x̄, y_n − ȳ],
U^T U = [Σ(x_i − x̄)^2, Σ(x_i − x̄)(y_i − ȳ); Σ(x_i − x̄)(y_i − ȳ), Σ(y_i − ȳ)^2]
is the second moment matrix of the points. The normal N = (a, b) and the centroid (x̄, ȳ) define the fitted line. Slide: S. Lazebnik
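The eigenvector recipe above can be sketched in NumPy as follows (fit_line_tls is a name invented for this illustration):

```python
import numpy as np

def fit_line_tls(x, y):
    """Total least squares line fit: a*x + b*y = d with a^2 + b^2 = 1.

    Minimizes the sum of squared perpendicular distances: N = (a, b) is the
    eigenvector of the second moment matrix U^T U with smallest eigenvalue.
    """
    xm, ym = x.mean(), y.mean()
    U = np.column_stack([x - xm, y - ym])
    evals, evecs = np.linalg.eigh(U.T @ U)   # eigh returns ascending eigenvalues
    a, b = evecs[:, 0]                       # smallest-eigenvalue eigenvector
    d = a * xm + b * ym                      # d = a*xbar + b*ybar
    return a, b, d

# A vertical line x = 3, which defeats "vertical" least squares but not TLS
x = np.array([3.0, 3.0, 3.0, 3.0])
y = np.array([0.0, 1.0, 2.0, 3.0])
a, b, d = fit_line_tls(x, y)
```

Note the sign of N is arbitrary: (a, b, d) and (−a, −b, −d) describe the same line.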
Least squares: Robustness to noise. Least squares fit to the red points: Slide: S. Lazebnik
Least squares: Robustness to noise. Least squares fit with an outlier: the problem is that squared error heavily penalizes outliers. Slide: S. Lazebnik
Robust estimators. General approach: minimize Σ_i ρ(r_i(x_i, θ); σ), where r_i is the residual of the i-th point w.r.t. the model parameters θ, and ρ is a robust function with scale parameter σ. The robust function ρ behaves like squared distance for small values of the residual u but saturates for larger values of u. Slide: S. Lazebnik
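One common choice (an assumption here, since the slides do not fix a particular ρ) is the Geman-McClure style function ρ(u; σ) = u^2 / (σ^2 + u^2), which is quadratic for small residuals and saturates toward 1 for large ones:

```python
def rho(u, sigma):
    """Robust function: behaves like u^2/sigma^2 near 0, saturates toward 1."""
    return u * u / (sigma * sigma + u * u)

small = rho(0.1, 1.0)    # close to u^2 for a small residual
large = rho(100.0, 1.0)  # an outlier contributes only a bounded cost
```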
Choosing the scale: just right. The effect of the outlier is minimized. Slide: S. Lazebnik
Choosing the scale: too small. The error value is almost the same for every point and the fit is very poor. Slide: S. Lazebnik
Choosing the scale: too large. Behaves much the same as least squares.
Overview. Fitting techniques: Least Squares, Total Least Squares, RANSAC, Hough Voting. Alignment as a fitting problem.
RANSAC. Robust fitting can deal with a few outliers, but what if we have very many? Random sample consensus (RANSAC): a very general framework for model fitting in the presence of outliers. Outline: choose a small subset of points uniformly at random; fit a model to that subset; find all remaining points that are close to the model and reject the rest as outliers; do this many times and choose the best model. M. A. Fischler, R. C. Bolles. Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography. Comm. of the ACM, Vol. 24, pp. 381-395, 1981. Slide: S. Lazebnik
RANSAC for line fitting. Repeat N times: draw s points uniformly at random; fit a line to these s points; find inliers to this line among the remaining points (i.e., points whose distance from the line is less than t); if there are d or more inliers, accept the line and refit using all inliers. Source: M. Pollefeys
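The loop above can be sketched as follows (a NumPy illustration, not from the slides; ransac_line and its defaults are made up for this example, and the final refit uses the total least squares step from earlier):

```python
import numpy as np

def ransac_line(pts, n_iters=200, t=0.1, d=10, rng=None):
    """RANSAC line fit: sample s=2 points, fit a line, count inliers
    (perpendicular distance < t); keep the line with the most inliers."""
    rng = np.random.default_rng(rng)
    best_inliers = None
    for _ in range(n_iters):
        i, j = rng.choice(len(pts), size=2, replace=False)
        p, q = pts[i], pts[j]
        n = np.array([q[1] - p[1], p[0] - q[0]])   # normal of line through p, q
        norm = np.linalg.norm(n)
        if norm < 1e-12:
            continue                               # degenerate sample
        n = n / norm
        dist = np.abs((pts - p) @ n)               # perpendicular distances
        inliers = dist < t
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refit with total least squares on all inliers if the consensus is large enough
    if best_inliers is not None and best_inliers.sum() >= d:
        x, y = pts[best_inliers, 0], pts[best_inliers, 1]
        U = np.column_stack([x - x.mean(), y - y.mean()])
        a, b = np.linalg.eigh(U.T @ U)[1][:, 0]
        return a, b, a * x.mean() + b * y.mean(), best_inliers
    return None

# 30 points on y = x plus a few gross outliers far above the line
rng = np.random.default_rng(0)
line = np.column_stack([np.linspace(0, 1, 30), np.linspace(0, 1, 30)])
outliers = rng.uniform(0, 1, size=(5, 2)) + np.array([0.0, 3.0])
result = ransac_line(np.vstack([line, outliers]), rng=0)
```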
Choosing the parameters. Initial number of points s: typically the minimum number needed to fit the model. Distance threshold t: choose t so the probability for an inlier is p (e.g. 0.95); for zero-mean Gaussian noise with std. dev. σ, t^2 = 3.84 σ^2. Number of samples N: choose N so that, with probability p, at least one random sample is free from outliers (e.g. p = 0.99), given outlier ratio e:
(1 − (1 − e)^s)^N = 1 − p, so N = log(1 − p) / log(1 − (1 − e)^s)

N as a function of sample size s and proportion of outliers e (p = 0.99):
s \ e:   5%   10%   20%   25%   30%   40%   50%
2:        2     3     5     6     7    11    17
3:        3     4     7     9    11    19    35
4:        3     5     9    13    17    34    72
5:        4     6    12    17    26    57   146
6:        4     7    16    24    37    97   293
7:        4     8    20    33    54   163   588
8:        5     9    26    44    78   272  1177

Consensus set size d: should match the expected inlier ratio. Source: M. Pollefeys
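The formula for N can be checked directly (num_samples is a name invented for this sketch):

```python
import math

def num_samples(p, e, s):
    """Number of RANSAC samples N so that, with probability p, at least one
    sample of s points is outlier-free, given outlier ratio e:
        N = log(1 - p) / log(1 - (1 - e)^s)
    """
    return math.ceil(math.log(1 - p) / math.log(1 - (1 - e) ** s))

n = num_samples(p=0.99, e=0.5, s=2)   # matches the table entry for s=2, e=50%
```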
Adaptively determining the number of samples. The inlier ratio e is often unknown a priori, so pick the worst case, e.g. 50%, and adapt if more inliers are found (e.g. 80% inliers would yield e = 0.2). Adaptive procedure: N = ∞, sample_count = 0. While N > sample_count: choose a sample and count the number of inliers; set e = 1 − (number of inliers)/(total number of points); recompute N from e as N = log(1 − p) / log(1 − (1 − e)^s); increment sample_count by 1. Source: M. Pollefeys
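A small sketch of this adaptive loop (Python, not from the slides; the per-sample inlier counts are fed in as a list to stand in for actually drawing samples and fitting models):

```python
import math

def adaptive_num_samples(counts, n_points, p=0.99, s=2):
    """Adaptive RANSAC sample count: start from the worst case N = infinity
    and shrink N whenever a sample with more inliers is observed."""
    N, sample_count = math.inf, 0
    for inliers in counts:
        if sample_count >= N:
            break                          # enough samples have been drawn
        e = 1 - inliers / n_points         # observed outlier ratio
        w = (1 - e) ** s                   # prob. a sample is outlier-free
        if w >= 1.0:
            N = 0                          # every point is an inlier: done
        elif w > 0.0:
            N = math.log(1 - p) / math.log(1 - w)
        sample_count += 1
    return sample_count

# A poor first sample (10% inliers), then good ones (80% inliers):
# N drops from about 458 to about 4.5, so sampling stops after 5 draws.
n = adaptive_num_samples([10, 80, 80, 80, 80, 80, 80, 80], n_points=100)
```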
RANSAC pros and cons. Pros: simple and general; applicable to many different problems; often works well in practice. Cons: lots of parameters to tune; can't always get a good initialization of the model based on the minimum number of samples; sometimes too many iterations are required; can fail for extremely low inlier ratios; we can often do better than brute-force sampling. Source: M. Pollefeys
Voting schemes. Let each feature vote for all the models that are compatible with it. Hopefully the noise features will not vote consistently for any single model. Missing data doesn't matter, as long as there are enough features remaining to agree on a good model.
Overview. Fitting techniques: Least Squares, Total Least Squares, RANSAC, Hough Voting. Alignment as a fitting problem.
Hough transform. An early type of voting scheme. General outline: discretize parameter space into bins; for each feature point in the image, put a vote in every bin in the parameter space that could have generated this point; find bins that have the most votes. P.V.C. Hough, Machine Analysis of Bubble Chamber Pictures, Proc. Int. Conf. High Energy Accelerators and Instrumentation, 1959
Parameter space representation. A line in the image corresponds to a point in Hough space. Source: S. Seitz
Parameter space representation. What does a point (x_0, y_0) in the image space map to in the Hough space? Answer: the solutions of b = −x_0 m + y_0. This is a line in Hough space. Source: S. Seitz
Parameter space representation. Where is the line that contains both (x_0, y_0) and (x_1, y_1)? It is the intersection of the lines b = −x_0 m + y_0 and b = −x_1 m + y_1. Source: S. Seitz
Parameter space representation. Problems with the (m, b) space: unbounded parameter domain; vertical lines require infinite m. Alternative: polar representation x cos θ + y sin θ = ρ. Each point (x, y) adds a sinusoid in the (θ, ρ) parameter space.
Algorithm outline:
Initialize accumulator H to all zeros
For each edge point (x, y) in the image:
  For θ = 0 to 180:
    ρ = x cos θ + y sin θ
    H(θ, ρ) = H(θ, ρ) + 1
Find the value(s) of (θ, ρ) where H(θ, ρ) is a local maximum
The detected line in the image is given by ρ = x cos θ + y sin θ
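The accumulator loop above can be sketched in NumPy (hough_lines and its discretization choices are assumptions of this example, not from the slides):

```python
import numpy as np

def hough_lines(points, width, height, n_theta=180):
    """Accumulate votes in (theta, rho) space for a set of edge points."""
    diag = int(np.ceil(np.hypot(width, height)))       # max possible |rho|
    thetas = np.deg2rad(np.arange(n_theta))            # theta in [0, 180) deg
    H = np.zeros((n_theta, 2 * diag), dtype=int)       # rho offset by +diag
    for x, y in points:
        rhos = x * np.cos(thetas) + y * np.sin(thetas)
        H[np.arange(n_theta), np.round(rhos).astype(int) + diag] += 1
    return H, thetas

# Collinear points on the vertical line x = 5 (no problem for the polar form)
pts = [(5, y) for y in range(10)]
H, thetas = hough_lines(pts, width=20, height=20)
theta_i, rho_i = np.unravel_index(H.argmax(), H.shape)
# Peak at theta = 0, rho = 5 (plus the +diag index offset): x cos 0 + y sin 0 = 5
```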
Basic illustration: features (left), votes in the accumulator (right).
Other shapes: square, circle.
Several lines.
A more complicated image. http://ostatic.com/files/images/ss_hough.jpg
Effect of noise: features and votes. The peak gets fuzzy and hard to locate.
Effect of noise. Number of votes for a line of 20 points with increasing noise:
Random points: features and votes. Uniform noise can lead to spurious peaks in the array.
Random points. As the level of uniform noise increases, the maximum number of votes increases too:
Dealing with noise. Choose a good grid / discretization: too coarse, and large vote counts are obtained when too many different lines fall into a single bucket; too fine, and lines are missed because points that are not exactly collinear cast votes for different buckets. Increment neighboring bins (smoothing in the accumulator array). Try to get rid of irrelevant features: take only edge points with significant gradient magnitude.
Hough transform for circles. How many dimensions will the parameter space have? Given an oriented edge point, what are all possible bins that it can vote for?
Hough transform for circles. Image space: an edge point (x, y). Hough parameter space: (center, radius). An oriented edge point (x, y) with unit gradient direction ∇I(x, y) votes for the candidate centers (x, y) ± r_i ∇I(x, y) for each radius r_i.
Generalized Hough transform. We want to find a shape defined by its boundary points and a reference point a. D. Ballard, Generalizing the Hough Transform to Detect Arbitrary Shapes, Pattern Recognition 13(2), 1981, pp. 111-122.
Generalized Hough transform. We want to find a shape defined by its boundary points and a reference point a. For every boundary point p, we can compute the displacement vector r = a − p as a function of gradient orientation θ. D. Ballard, Generalizing the Hough Transform to Detect Arbitrary Shapes, Pattern Recognition 13(2), 1981, pp. 111-122.
Generalized Hough transform. For the model shape: construct a table indexed by θ, storing displacement vectors r as a function of gradient direction. Detection: for each edge point p with gradient orientation θ, retrieve all r indexed with θ; for each r(θ), put a vote in the Hough space at p + r(θ). The peak in this Hough space is the reference point with the most supporting edges. Assumption: translation is the only transformation here, i.e., orientation and scale are fixed. Source: K. Grauman
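The table-then-vote procedure can be sketched as follows (a toy Python illustration with made-up helper names; gradient orientations are quantized to exact degree values for simplicity):

```python
from collections import defaultdict

def build_r_table(boundary, reference):
    """Model phase: index displacement vectors r = a - p by gradient orientation."""
    table = defaultdict(list)
    for (px, py), theta in boundary:          # boundary: [((x, y), theta), ...]
        table[theta].append((reference[0] - px, reference[1] - py))
    return table

def ght_votes(edges, table):
    """Detection phase: each edge point votes at p + r(theta) for every stored r."""
    votes = defaultdict(int)
    for (px, py), theta in edges:
        for rx, ry in table.get(theta, []):
            votes[(px + rx, py + ry)] += 1
    return votes

# Toy diamond-shaped model with reference point at its center (1, 1)
model = [((0, 1), 0), ((1, 2), 90), ((2, 1), 180), ((1, 0), 270)]
table = build_r_table(model, reference=(1, 1))
# The same shape translated by (+5, +3): the vote peak lands at the new reference
shifted = [((x + 5, y + 3), t) for (x, y), t in model]
votes = ght_votes(shifted, table)
peak = max(votes, key=votes.get)
```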
Example: model shape.
Example: displacement vectors for model points.
Example: range of voting locations for a test point.
Example: range of voting locations for a test point.
Example: votes for points with θ =
Example: displacement vectors for model points.
Example: range of voting locations for a test point.
Example: votes for points with θ =
Application in recognition. Instead of indexing displacements by gradient orientation, index by visual codeword. Training image; visual codeword with displacement vectors. B. Leibe, A. Leonardis, and B. Schiele, Combined Object Categorization and Segmentation with an Implicit Shape Model, ECCV Workshop on Statistical Learning in Computer Vision, 2004
Application in recognition. Instead of indexing displacements by gradient orientation, index by visual codeword. Test image.
Overview. Fitting techniques: Least Squares, Total Least Squares, RANSAC, Hough Voting. Alignment as a fitting problem.
Image alignment. Two broad approaches. Direct (pixel-based) alignment: search for the alignment where most pixels agree. Feature-based alignment: search for the alignment where extracted features agree; can be verified using pixel-based alignment. Source: S. Lazebnik
Alignment as fitting. Previously: fitting a model to features in one image. Find the model M that minimizes Σ_i residual(x_i, M). Source: S. Lazebnik
Alignment as fitting. Previously: fitting a model to features in one image, i.e., find the model M that minimizes Σ_i residual(x_i, M). Alignment: fitting a model to a transformation between pairs of features (matches) in two images. Find the transformation T that minimizes Σ_i residual(T(x_i), x_i'). Source: S. Lazebnik
2D transformation models. Similarity (translation, scale, rotation); affine; projective (homography). Source: S. Lazebnik
Let's start with affine transformations. Simple fitting procedure (linear least squares). Approximates viewpoint changes for roughly planar objects and roughly orthographic cameras. Can be used to initialize fitting for more complex models. Source: S. Lazebnik
Fitting an affine transformation. Assume we know the correspondences; how do we get the transformation?
[x'; y'] = [m_1 m_2; m_3 m_4] [x; y] + [t_1; t_2]
Equivalently, each match (x_i, y_i) -> (x_i', y_i') gives
[x_i y_i 0 0 1 0; 0 0 x_i y_i 0 1] [m_1; m_2; m_3; m_4; t_1; t_2] = [x_i'; y_i']
Source: S. Lazebnik
Fitting an affine transformation. This is a linear system with six unknowns. Each match gives us two linearly independent equations, so we need at least three matches to solve for the transformation parameters. Source: S. Lazebnik
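The stacked linear system can be solved with least squares as a short sketch (fit_affine is a name invented here):

```python
import numpy as np

def fit_affine(src, dst):
    """Fit x' = M x + t from point correspondences by linear least squares.

    Each match (x, y) -> (x', y') contributes the two rows
    [x y 0 0 1 0] and [0 0 x y 0 1] for the unknowns (m1, m2, m3, m4, t1, t2).
    """
    A, b = [], []
    for (x, y), (xp, yp) in zip(src, dst):
        A.append([x, y, 0, 0, 1, 0])
        A.append([0, 0, x, y, 0, 1])
        b.extend([xp, yp])
    params, *_ = np.linalg.lstsq(np.array(A, float), np.array(b, float), rcond=None)
    m1, m2, m3, m4, t1, t2 = params
    return np.array([[m1, m2], [m3, m4]]), np.array([t1, t2])

# Three non-collinear matches generated from a known affine map
M_true = np.array([[2.0, 0.5], [0.0, 1.5]])
t_true = np.array([1.0, -2.0])
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
dst = [tuple(M_true @ np.array(p) + t_true) for p in src]
M, t = fit_affine(src, dst)
```

With exactly three non-collinear matches the system is square and the least squares solution is exact; with more matches it averages out noise.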
Feature-based alignment outline: extract features; compute putative matches; loop: hypothesize a transformation T, then verify the transformation (search for other matches consistent with T).
Dealing with outliers. The set of putative matches contains a very high percentage of outliers. Geometric fitting strategies: RANSAC; Hough transform.
RANSAC. RANSAC loop: 1. Randomly select a seed group of matches. 2. Compute the transformation from the seed group. 3. Find inliers to this transformation. 4. If the number of inliers is sufficiently large, re-compute the least-squares estimate of the transformation on all of the inliers. Keep the transformation with the largest number of inliers.
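For the translation example that follows, the loop reduces to something very small (a sketch with invented names; since a single match determines a translation, trying every match as the seed replaces random sampling):

```python
import numpy as np

def ransac_translation(matches, t=0.5):
    """RANSAC for a pure translation: one match hypothesizes (dx, dy);
    keep the hypothesis with the most inliers and refit on all of them.

    `matches` is a sequence of rows (x1, y1, x2, y2).
    """
    matches = np.asarray(matches, float)
    disps = matches[:, 2:] - matches[:, :2]          # per-match displacements
    best = None
    for d in disps:                                  # each match as seed group
        inliers = np.linalg.norm(disps - d, axis=1) < t
        if best is None or inliers.sum() > best.sum():
            best = inliers
    return disps[best].mean(axis=0), best            # least-squares refit = mean

# Four consistent matches shifted by (10, 5), plus one bad match
matches = [(0, 0, 10, 5), (1, 0, 11, 5), (0, 1, 10, 6), (2, 2, 12, 7), (0, 0, 3, 9)]
(dx, dy), inl = ransac_translation(matches)
```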
RANSAC example: translation. Putative matches. Source: A. Efros
RANSAC example: translation. Select one match, count inliers. Source: A. Efros
RANSAC example: translation. Select one match, count inliers. Source: A. Efros
RANSAC example: translation. Select the translation with the most inliers. Source: A. Efros