Lecture 9 Fitting and Matching

In this lecture, we're going to talk about a number of problems related to fitting and matching. We will formulate these problems formally, and our discussion will involve Least Squares methods, RANSAC and Hough voting. We will conclude the lecture with a few remarks on how fitting can often be used to solve a matching problem.

Lecture 9 Fitting and Matching: Problem formulation; Least square methods; RANSAC; Hough transforms; Multi-model fitting; Fitting helps matching! Reading: [HZ] Chapter 4 (Estimation: 2D projective transformation) and Chapter 11 (Computation of the fundamental matrix F); [FP] Chapter 10 (Grouping and model fitting). Some slides of this lecture are courtesy of Profs. S. Lazebnik & K. Grauman. Silvio Savarese, 4-Feb-15.

Fitting. Goals: choose a parametric model to fit a certain quantity from data; estimate the model parameters. Fitting is about trying to fit observed data to a parametric model that we assume holds. The process of fitting such a model to the data involves estimating the parameters that describe the model such that the fitting error is minimized. A classic example is line fitting: given a set of points in 2D, the goal is to find the parameters that describe the line so as to best fit the set of 2D points. Similar problems can be defined for other geometrical quantities such as curves, homographic transformations, fundamental matrices or even object shapes (lines, curves, homographic transformation, fundamental matrix, shape model).

Example: fitting lines (for computing vanishing points). The next couple of slides show some examples of models we could be trying to fit our data to. In this example, the goal is to fit a line given a number of point observations (red dots). Fitting lines in an image can be useful, for instance, to estimate the scene vanishing points.

Example: estimating a homographic transformation. Here is another example, where we have points (red dots) that lie on the same planar region (the façade of the building) and that are observed from two different views. Because they lie on the same planar region, we expect them to be related by a homographic transformation H. So the goal here would be to fit (and estimate) such a homographic transformation, given the set of corresponding points.

Example: estimating F. In this example, the goal is to fit a fundamental matrix F that relates (at least) 7 point correspondences.

Example: fitting a 2D shape template. One more example is fitting a shape template (the letter A on the top-right) to an observation (image) of the letter A.

Example: fitting a 3D object model. A final example is fitting a 3D shape template of a car to an observation (image) of the car. Notice that in this case the fitting problem can also be interpreted as a recognition or matching problem. In fact, fitting, matching and recognition are interconnected problems.

Fitting becomes problematic in the presence of three critical issues: noisy data, outliers, and missing data. We discuss these in detail in the next few slides.

Critical issues: noisy data (intra-class variability). Data may not always fit the model exactly because of measurement or detection noise. Also, the model may not fit the data exactly because of intra-class variability: the shape of the template may be different from the observation. This is still true even if there is no measurement noise.

Critical issues: outliers. Another class of problems stems from the presence of outliers. An outlier is a data point that is not explained by the model but still appears among the data points we wish to fit. In this example, the outlier is the pair of corresponding points shown in blue. This pair of corresponding points is clearly incorrect, but nevertheless it may be produced by mistake by the algorithm that generates point correspondences. The fitting procedure should somehow be able to account for it.

Critical issues: missing data (occlusions). A third class of problems is due to occlusions. This takes place when some of the points we wish to fit are not visible. Occlusions become a critical issue in shape fitting (or matching) problems, whereby the observation of the object we want to match against is not fully visible.

Fitting. Goal: choose a parametric model to fit a certain quantity from data. There are several techniques that we can use to fit a parametric model to the data: least square methods, RANSAC, the Hough transform, and EM (Expectation Maximization). In this lecture we will explain the details of the first 3 methods; we don't cover the 4th one in this course. Let's start with the least squares methods.

Least squares methods - fitting a line. Data: (x_1, y_1), ..., (x_n, y_n). We illustrate the least squares technique using the simple example of line fitting. In this problem, our goal is to fit a parametric model that describes the equation of a line to a series of points labeled (x_1, y_1), (x_2, y_2), ..., (x_n, y_n). If we model the line as y = mx + b, our goal is to estimate the parameters m, b so as to minimize the value of E given in Eq. 2. E is the sum of squared residuals, i.e., how far off our estimate is for each point. We can interpret this as choosing the line that minimizes the distance between the line and the observed data points.

Line equation: y_i − m x_i − b = 0 [Eq. 1]

Find (m, b) to minimize E = Σ_{i=1}^{n} (y_i − m x_i − b)² [Eq. 2]

Least squares methods - fitting a line. We can formulate this easily as a least squares problem. Eq. 2 describes the objective function we are trying to minimize. We can rewrite this equation in matrix form as

E = || Y − X B ||², where Y = [y_1 ... y_n]^T, X is the n×2 matrix with rows [x_i 1], and B = [m b]^T [Eq. 3]

By expanding the square of the norm of Y − XB, we obtain the expression in Eq. 4:

E = (Y − XB)^T (Y − XB) = Y^T Y − 2 (XB)^T Y + (XB)^T (XB) [Eq. 4]

Our goal is to find the parameters B = [m, b]^T that minimize E. These can be found by taking the derivative of E with respect to B and setting it equal to zero (Eq. 5):

dE/dB = −2 X^T Y + 2 X^T X B = 0 [Eq. 5]

which gives the normal equation (Eq. 7):

X^T X B = X^T Y [Eq. 7]

Solving this equation yields B, as expressed in Eq. 6:

B = (X^T X)^{−1} X^T Y [Eq. 6]

This solution can be interpreted as the optimal (in the least squares sense) parameters m, b that minimize the residuals, or, equivalently, as choosing the line that comes closest to containing all of the points.

There is a major limitation to this simple method: it fails completely for vertical lines. In this case m would be infinity (or some very large number), which may introduce numerical error in our solutions.

Least squares methods - fitting a line with ax + by = d. To remedy this issue, we can use a very similar setup with a small tweak. Instead of the slope-intercept form y = mx + b, we can parameterize our line as ax + by = d, and find (a, b, d) that minimize the sum of squared perpendicular distances between the points (x_i, y_i) and the line. This new parameterization leads to a new expression for E:

E = Σ_{i=1}^{n} (a x_i + b y_i − d)² [Eq. 8]

This is very similar to Eq. 2, but parameterized to avoid an infinite slope. It can be shown that minimizing E with respect to a, b and d is equivalent to solving the homogeneous system in Eq. 9, where h = [a b d]^T collects the model parameters and the matrix A collects the coordinates of the data points (we leave the details of this derivation as an exercise):

A h = 0 [Eq. 9]

We can solve Eq. 9 using SVD.
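Before moving on, here is a minimal sketch (mine, not part of the lecture) of the slope-intercept fit via the normal equation (Eqs. 3-7); the function name and the synthetic data are purely illustrative.

```python
import numpy as np

def fit_line_least_squares(x, y):
    """Fit y = m*x + b in the least squares sense (Eqs. 3-7).

    x, y: 1-D arrays of equal length n >= 2. np.linalg.lstsq solves the
    normal equation X^T X B = X^T Y in a numerically stable way, which is
    equivalent to B = (X^T X)^-1 X^T Y (Eq. 6).
    """
    X = np.column_stack([x, np.ones_like(x)])  # n x 2 matrix with rows [x_i, 1]
    B, *_ = np.linalg.lstsq(X, y, rcond=None)  # B = [m, b]
    return B[0], B[1]

# Illustrative usage on noisy samples of y = 2x + 1
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + 0.1 * rng.standard_normal(x.size)
print(fit_line_least_squares(x, y))  # approximately (2.0, 1.0)
```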

Least squares methods - fitting a line. A h = 0: minimize ||A h|| subject to ||h|| = 1, with A = U D V^T (A is rank deficient). As we have seen in previous lectures, the solution h of the system in Eq. 9 can be computed as the last column of V, where A = U D V^T is the SVD of A.

A few more details kindly offered by Phil: when we take the SVD of a matrix, we are separating the matrix into three pieces, U (directions of output), D (a diagonal matrix with scaling amounts for different directions), and V (directions of input). In this case, input to the matrix means being multiplied by a vector on the right. We can interpret the SVD as follows: when the input to A is in the direction of the n-th column of V, it gets scaled by the n-th value of D and comes out in the direction of the n-th column of U. Thus, when the matrix of scaling factors D is sorted in decreasing order (as is customary, and what MATLAB does), the last column of V dictates the input direction of A that produces the smallest output. Therefore, that is the direction of input to A that gets the output closest to zero, and it is the vector that minimizes ||Ah||: h = last column of V.

Least squares methods - fitting a homography. Another popular problem is fitting the homographic transformation H to pairs of corresponding points x ↔ x' (see figure). It can be shown that finding the H that best fits all the data is equivalent to solving the homogeneous system in Eq. 10, where A collects all the data points (i.e., the coordinates of corresponding points in the two images) and h collects the 9 unknown parameters that describe the homographic transformation H (i.e., the coefficients of H):

A h = 0 [Eq. 10]

Remember that H has 8 degrees of freedom in that it is known up to scale. As we did before, h can be found in the least squares sense through SVD.

Least squares: robustness to noise. Least squares methods can work well when the data are noisy. The figure shows the result of a simulation where we used least squares to fit a line to all the red points.
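The SVD recipe above (take the last column of V) can be written compactly. The following sketch is mine, not the lecture's; it solves any homogeneous system A h = 0 in the least squares sense and applies it to the line parameterization of Eqs. 8-9, with made-up example points.

```python
import numpy as np

def solve_homogeneous(A):
    """min ||A h|| subject to ||h|| = 1 (Eqs. 9 and 10).

    With A = U D V^T, the minimizer is the right singular vector associated
    with the smallest singular value, i.e. the last column of V.
    """
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1]  # last row of V^T == last column of V (unit norm)

# Illustrative line fit a*x + b*y - d = 0 for points lying on y = x
pts = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
A = np.column_stack([pts[:, 0], pts[:, 1], -np.ones(len(pts))])  # rows [x_i, y_i, -1]
print(solve_homogeneous(A))  # proportional to [1, -1, 0], i.e. the line x - y = 0
```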

Least squares: robustness to noise. However, least squares is in general not very robust to the presence of outliers, as the result shown here demonstrates. The outlier marked in blue significantly alters the quality of the fitting process.

Critical issues: outliers. Another example is shown here for the case of fitting homographic transformations. CONCLUSION: least squares is not robust w.r.t. outliers.

Least squares: robust estimators. To make our estimator less sensitive to the presence of outliers, we can use a slightly modified cost function, called a robust estimator. Instead of minimizing E = Σ_{i=1}^{n} (a x_i + b y_i − d)² [Eq. 8], we minimize

E = Σ_{i=1}^{n} ρ(u_i; σ) [Eq. 11]

where u_i = a x_i + b y_i − d [Eq. 12] is the error (residual) of the i-th point w.r.t. the model parameters h = (a, b, d), and ρ is a robust function of u_i with scale parameter σ. In the lower part of the slide, as an example, we see a plot of ρ(u_i; σ) as a function of u_i and σ. This function is more robust in that it penalizes large residuals and prefers very small residuals. Indeed: when u_i is large, ρ saturates to 1; when u_i is small, ρ is a function of u_i². In conclusion, it favors a configuration with small residuals and penalizes large residuals.

Least squares: robust estimators. The scale parameter σ regulates how much weight we give to potential outliers. A small value of sigma highly penalizes large residuals (which means we end up giving less weight to the outliers). A large value of sigma mildly penalizes large residuals and acts similarly to Eq. 8 (i.e., as in a standard least squares solution, outliers will affect the result more). The plot in the figure shows different profiles of ρ(u_i; σ) for different values of σ (e.g., σ = 0.1, σ = 1 and σ = 10). Small sigma → highly penalize large residuals; large sigma → mildly penalize large residuals (like standard least squares).

A downside to using the robust estimator is that now we have one more parameter to choose: what value of sigma to use. Here we have an example of a good choice of sigma for the earlier data (good scale parameter σ: the effect of the outlier is eliminated).

Here is an example of a bad choice of sigma. σ is too small, and we end up favoring a solution for the line that only fits points locally (i.e., most of the points are considered outliers). This in general produces very unstable results (bad scale parameter σ, too small: fits only locally).
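As a concrete illustration of Eqs. 11-12, here is a small sketch of a robust line fit via iteratively reweighted least squares. The lecture never specifies ρ, so the weights below correspond to the Geman-McClure function ρ(u; σ) = u² / (u² + σ²), which has exactly the stated behavior (≈ u²/σ² for small residuals, saturating to 1 for large ones); that choice, the IRLS scheme, and all parameter names are my own assumptions, not the course code.

```python
import numpy as np

def robust_line_fit(x, y, sigma, n_iters=20):
    """Iteratively reweighted fit of the line a*x + b*y = d, approximately
    minimizing E = sum_i rho(u_i; sigma) (Eq. 11) with u_i = a*x_i + b*y_i - d (Eq. 12).

    Assumed rho: Geman-McClure, rho(u) = u^2 / (u^2 + sigma^2). The first
    iteration (all weights equal) is an ordinary least squares fit, which
    serves as the initial condition; later iterations downweight points
    with large residuals.
    """
    pts = np.column_stack([x, y])
    w = np.ones(len(pts))
    for _ in range(n_iters):
        mean = (w[:, None] * pts).sum(axis=0) / w.sum()
        diff = pts - mean
        scatter = (w[:, None, None] * diff[:, :, None] * diff[:, None, :]).sum(axis=0)
        _, evecs = np.linalg.eigh(scatter)
        a, b = evecs[:, 0]                    # line normal, ||(a, b)|| = 1
        d = a * mean[0] + b * mean[1]
        u = a * x + b * y - d                 # residuals u_i (Eq. 12)
        w = sigma**2 / (u**2 + sigma**2)**2   # IRLS weights for the assumed rho
    return a, b, d

# Illustrative data: points near y = x plus one gross outlier
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([0.1, 0.9, 2.0, 3.1, 3.9, -6.0])
print(robust_line_fit(x, y, sigma=0.5))  # the outlier gets strongly downweighted
```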

Least squares: robust estimators. Here is another example of a bad choice of sigma. σ is too large, and we end up with a result similar to the original least squares solution (bad scale parameter σ, too large: same as standard LSQ). In conclusion, the robust estimator is useful when we have prior knowledge about what σ should be, or some way to find it out. Generally, in practice, people use least squares as a good starting point and use an iterative approach to solve the robust fitting problem (which is a non-linear problem). CONCLUSION: the robust estimator is useful if prior information about the distribution of points is known; robust fitting is a nonlinear optimization problem (iterative solution); the least squares solution provides a good initial condition.

Fitting. Next, we will talk about RANSAC, which stands for RANdom SAmple Consensus. RANSAC was introduced by Fischler & Bolles in 1981. Basic philosophy (voting scheme): data elements are used to vote for one (or multiple) models; this makes the method robust to outliers and missing data. The basic idea is that we set up a voting procedure where each data point counts as a vote for one or more models. Ideally, points that are noisy will not vote for the same model, and points that are relevant will vote for consistent models. This makes the system robust to outliers and missing data (two of the challenges we mentioned earlier). This basic idea is shared by RANSAC and Hough voting (as we shall see next). RANSAC is designed to be robust to outliers and missing data and rests on two major assumptions. Assumption 1: noisy data points will not vote consistently for any single model (few outliers). Assumption 2: there are enough data points to agree on a good model (few missing data).

RANSAC (RANdom SAmple Consensus): Fischler & Bolles, 1981. We start with a very high level overview of RANSAC. Suppose we have the standard line fitting problem in the presence of outliers. We can formulate this problem as follows: we want to find the best partition of the points into an inlier set P and an outlier set O such that we keep as many inlier points as possible, but such that they also produce small residuals (that is, a residual smaller than δ for every inlier point):

find the partition π : I → {P, O} that minimizes |O|, such that r(p, h) < δ for every p ∈ P,

where r(p, h) is the residual associated with the inlier point p and the model parameters h.

RANSAC. In practice, RANSAC can be formulated as follows. As before, we have a set of 2D points that we want to fit a line to (sample set = set of points in 2D). There are three major steps to RANSAC. Algorithm: 1. Select a random sample of the minimum required size to fit the model. 2. Compute a putative model from the sample set. 3. Compute the set of inliers to this model from the whole data set. Repeat 1-3 until the model with the most inliers over all samples is found.

First, we select a random sample of the minimum required size to fit the model. In this case, a line is determined by (at least) two points. Thus, to fit the model of a line, we select two points at random, which constitute the random sample. In this example, the random sample is made up of the two green points.

RANSAC. Next, we compute a model from this sample set. Using the two green points that we have already randomly sampled, we compute the line that these two points determine.

From here, we see how much of our entire set of 2D points agrees with this model up to a tolerance δ. In this case, we see that all of the points that are now blue agree with the model hypothesized by the two green points. Thus, we have an inlier set P of size 6 (2 green + 4 blue) and an outlier set (denoted by O on the slide) of size 14: |O| = 14, |P| = 6.

We repeat these steps until the size of the inlier set is maximized or we meet some other criterion (to be discussed in the following slides). In this example, we've chosen the two points in green. The model that these two points imply includes all of the blue points. This gives us 14 inliers (|P| = 14) and 6 outliers (|O| = 6), which looks to be a reasonable model. Indeed, this is the model that maximizes the inlier set.
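The three steps above translate almost directly into code. Below is a minimal sketch (mine, not the lecture's) of RANSAC line fitting; the parameter names n_iters and delta are illustrative, and in practice the number of iterations would be chosen with the sample-count formula discussed on the next slides (Eq. 13).

```python
import numpy as np

def ransac_line(points, n_iters=100, delta=0.05, rng=None):
    """Minimal RANSAC sketch for line fitting (points: (n, 2) array of 2D points).

    Follows the three steps on the slide: sample two points, build a putative
    line a*x + b*y = d with ||(a, b)|| = 1, count inliers within tolerance delta.
    """
    rng = np.random.default_rng(rng)
    best_inliers, best_line = np.array([], dtype=int), None
    for _ in range(n_iters):
        # 1. random sample of minimum size (a line needs s = 2 points)
        i, j = rng.choice(len(points), size=2, replace=False)
        p, q = points[i], points[j]
        # 2. putative model: line through p and q
        a, b = q[1] - p[1], p[0] - q[0]       # normal to the direction q - p
        norm = np.hypot(a, b)
        if norm < 1e-12:
            continue                           # degenerate sample, resample
        a, b = a / norm, b / norm
        d = a * p[0] + b * p[1]
        # 3. inliers = points whose perpendicular residual is below delta
        residuals = np.abs(points @ np.array([a, b]) - d)
        inliers = np.flatnonzero(residuals < delta)
        if len(inliers) > len(best_inliers):
            best_inliers, best_line = inliers, (a, b, d)
    return best_line, best_inliers
```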

How many samples? N is the number of samples required to ensure, with a probability p, that at least one random sample produces an inlier set that is free from real outliers. Usually, p = 0.99. It is often computationally unnecessary (and infeasible) to explore the entire sample space. A typical assumption is that only N samples are drawn, where N is the number of samples required to ensure, with probability p, that at least one random sample produces an inlier set that is free from real outliers.

Example. Here we see an example of a random sample that produces an inlier set that contains real outliers. The real outlier ratio is 6/20 = 30%. Here the random sample is given by the two green points, and the estimated inlier set is given by the green + blue points. How many real outliers do we have here?

Example. Here we see an example of a random sample that produces an inlier set that does not contain any real outliers (it is free from real outliers). The real outlier ratio is 6/20 = 30%. The random sample is given by the two green points, and the estimated inlier set is given by the green + blue points. How many real outliers do we have here? 0.

How many samples? N is the number of samples required to ensure, with a probability p, that at least one random sample produces an inlier set that is free from real outliers, for a given s and e. E.g., p = 0.99. N can be estimated using Eq. 13:

N = log(1 − p) / log(1 − (1 − e)^s) [Eq. 13]

where e denotes the outlier ratio (6/20 = 30% in the previous example), s is the minimum number of points needed to fit the model (2 in the previous example of a line), and p is the probability that at least one random sample produces an inlier set that is free from real outliers for the given s and e. The chart shows the number of samples N needed for a 99% chance that at least one random sample is free from real outliers: rows correspond to s, the minimum number of points needed to fit the model, and columns to the proportion of outliers e (5%, 10%, 20%, 25%, 30%, 40%, 50%); the table assumes negligible measurement noise. The circled example shows that for a model that requires 2 points and data with 30% outliers, we need 7 samples (of two points each) to have a 99% chance that at least one random sample produces an inlier set that is free from real outliers.

A justification of Eq. 13 (kindly provided by Phillip Lee): the probability that a single sample of s points consists entirely of inliers is (1 − e)^s. Hence 1 − (1 − e)^s is the probability that a single sample contains at least one outlier, and (1 − (1 − e)^s)^N is the probability that every one of N samples contains at least one outlier. Requiring this failure probability to equal 1 − p and taking logarithms gives log(1 − p) = N · log(1 − (1 − e)^s), i.e., N = log(1 − p) / log(1 − (1 − e)^s).

Estimating H by RANSAC. Here are some other examples of when RANSAC can be useful. In this example the goal is to fit the homographic transformation H given a set of pairs of corresponding points (correspondences) in the presence of outliers. The blue dashed lines indicate correct correspondences, whereas the red ones are outlier matches. The steps of the algorithm are the same as before (sample set = set of matches between images); the only difference is the size of the random sample, which is 4 in this case. Why? H has 8 degrees of freedom, and each correspondence provides two constraints (one per coordinate), so we need 4 correspondences to compute a minimal model. Thus, in this case s = 4.

Estimating F by RANSAC. In this example the goal is to fit the fundamental matrix F given a set of pairs of corresponding points (correspondences) in the presence of outliers. The outliers in this case are the dark red and cyan points, whereas all the other points are in correct correspondence. In this case the random sample size is 7 (or 8). Why? F has 7 degrees of freedom, so we need 7 correspondences (in fact, 8 if we use the 8-point algorithm) to compute a minimal model. Thus, in this case s = 7 (8).
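Eq. 13 is a one-liner in code; the sketch below (not from the lecture) evaluates it for the circled table entry and for the homography case.

```python
import math

def num_ransac_samples(p=0.99, e=0.30, s=2):
    """Eq. 13: number of samples N needed so that, with probability p, at least
    one sample of size s is free of outliers, given the outlier ratio e."""
    return math.ceil(math.log(1 - p) / math.log(1 - (1 - e) ** s))

print(num_ransac_samples(p=0.99, e=0.30, s=2))  # 7, matching the circled table entry
print(num_ransac_samples(p=0.99, e=0.30, s=4))  # homography case, s = 4
```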

RANSAC - conclusions. In conclusion, RANSAC is an easily implementable method for estimating a model that often works well in practice. However, there are many parameters to tune; it may take a long time to reach the accuracy that you need; and it requires the inlier-to-outlier ratio to be reasonable. Empirical studies show that RANSAC doesn't work well if the outlier ratio is greater than about 50%. Good: simple and easily implementable; successful in different contexts. Bad: many parameters to tune; trade-off between accuracy and time; cannot be used if the ratio of inliers to outliers is too small.

Fitting. The Hough transform is the last method that we'll discuss today.

Hough transform (P.V.C. Hough, Machine Analysis of Bubble Chamber Pictures, Proc. Int. Conf. High Energy Accelerators and Instrumentation, 1959). Given a set of points, find the curve or line that explains the data points best. Hough voting is a voting procedure as well, where each data point counts as a vote for one or more models that we want to fit to our data. Let's consider the example of line fitting again. Our goal is to estimate the parameters m and n of the line y = mx + n (in dashed red) that fits our data points (in red). Hough voting uses the concept of a parameter (or dual) space. If the model we want to fit (i.e., the line) is represented parametrically by, for instance, m and n, then we can establish a relationship between the original space, where the data points lie, and the dual parameter space, which is defined by the parameters (i.e., m and n) that describe the model we want to fit (each axis corresponds to a model parameter).

Hough transform. In this slide, on the left we have the original image x, y (Cartesian) coordinates, and on the right we have the parameter (Hough) space with axes m and n. We want to show that the parameters m₀ and n₀ that describe the line y = m₀x + n₀ can be computed by intersecting lines in the Hough space. Let's pick a point (x_1, y_1) that belongs to the line y = m₀x + n₀ in the original image. This point corresponds to the line y_1 = m x_1 + n (in the variables m and n) in the Hough space. Let's pick a second point (x_2, y_2) that belongs to y = m₀x + n₀ in the original image. This corresponds to the line y_2 = m x_2 + n in the Hough space. Thus, it is easy to see that the point of intersection of y_1 = m x_1 + n and y_2 = m x_2 + n in the Hough space returns the coordinates m₀ and n₀ of the line y = m₀x + n₀ in the original space. This is true for any arbitrary point we pick from the original space, as long as it belongs to the line y = m₀x + n₀. Thus, this mechanism can be used for fitting lines to points: given a set of data points we want to fit a line to, we associate with each of these points a line in the Hough space. Then we intersect these lines in the Hough space. The point of intersection returns the parameters of the line fitting the points in the original space.

Any issue? As we already discussed for the least squares case, the parameter space [m, n] is unbounded; that is, m and n can range from minus infinity to infinity (think of a vertical line).

Use a polar representation for the parameter space. Thus, it is common to use the polar parameterization of the line shown in Eq. 14 instead of y = mx + n:

x cos θ + y sin θ = ρ [Eq. 14]

Now every possible line in Cartesian space corresponds to a sinusoidal profile in the Hough space (θ, ρ). The Hough voting procedure is still the same, in that the parameters ρ and θ of the line (fitting the data points in the original space) are estimated as the point of intersection of all the sinusoidal profiles in the Hough space.

Hough transform - experiments. Here is an example of the Hough transform in action (left: original space; right: Hough space with axes θ and ρ).

Hough transform - experiments: noisy data. When the data points are noisy, the corresponding sinusoidal profiles don't intersect at exactly the same point. So how do we compute the parameters of the line in this case? Let's call a point of intersection of two sinusoids a vote. The idea is to divide the Hough space into a grid and count how many votes we have in each cell of the grid. The cell that contains the largest number of votes (the yellow cell in the figure) returns the parameters of the line we want to fit. For instance, such parameters can be estimated as the coordinates of the center of the cell that contains the largest number of votes. The name Hough voting comes from this concept of counting votes in the Hough space. Issue: the grid size needs to be adjusted. While the idea of using a grid helps in dealing with noisy data, the grid size is an additional parameter that we need to tune when running Hough voting. A small grid size makes it harder to find the cell that contains the largest number of votes. A large grid size decreases the accuracy in estimating the parameters of the model (all of the values of θ and ρ within the most-voted cell are equally good candidates for describing the line).

Hough transform - experiments. An interesting case is the one where we have a more or less uniform distribution of points in the image (uniform noise). In the presence of uniform noise, we don't find a clear consensus for an appropriate model in the Hough space. However, random aggregations of points along a line in the original space may produce consistent voting in certain cells, which lowers the overall signal-to-noise ratio. Issue: spurious peaks due to uniform noise.
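The voting procedure described above (polar parameterization plus an accumulator grid) can be sketched as follows; this is my own illustration, and the grid resolutions n_theta and n_rho are assumed values, not from the lecture.

```python
import numpy as np

def hough_lines(points, n_theta=180, n_rho=200):
    """Minimal Hough voting sketch for lines, using x*cos(theta) + y*sin(theta) = rho (Eq. 14).

    points: (n, 2) array of 2D points. Returns the accumulator grid and the
    (theta, rho) of the most-voted cell (its center along rho).
    """
    x, y = points[:, 0], points[:, 1]
    rho_max = np.max(np.hypot(x, y))
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rho_edges = np.linspace(-rho_max, rho_max, n_rho + 1)
    acc = np.zeros((n_theta, n_rho), dtype=int)
    for t, theta in enumerate(thetas):
        # each point votes for one rho cell at this theta (a sample of its sinusoid)
        rho = x * np.cos(theta) + y * np.sin(theta)
        bins = np.clip(np.digitize(rho, rho_edges) - 1, 0, n_rho - 1)
        np.add.at(acc, (t, bins), 1)
    t_best, r_best = np.unravel_index(np.argmax(acc), acc.shape)
    rho_center = 0.5 * (rho_edges[r_best] + rho_edges[r_best + 1])
    return acc, thetas[t_best], rho_center
```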

Hough transform - conclusions. Of course, the Hough transform has pros and cons. Good: all points are processed independently, so it can cope with occlusion/outliers; some robustness to noise, since noise points are unlikely to contribute consistently to any single cell. An important pro is that it can successfully handle some degree of outliers and noise: an outlier just produces an additional sinusoid that is very unlikely to matter as we count the number of votes in each cell. This is true as long as the outliers don't aggregate in a way that makes them consistently vote for another model. Bad: spurious peaks due to uniform noise; a trade-off between noise and grid size (hard to find the sweet spot). The cons include uniform noise producing spurious peaks that may decrease the signal-to-noise ratio. Also, it is not easy to find the right grid size unless we have some prior knowledge about how the data are distributed.

Applications: lane detection. A typical application of Hough voting (for lines) is to detect lanes in images for autonomous navigation applications. (Courtesy of Minchae Lee.)

Applications: computing vanishing points. Another typical application of Hough voting (for lines) is to estimate vanishing lines (and thus vanishing points).

Generalized Hough transform [more in forthcoming lectures]. D. Ballard, Generalizing the Hough Transform to Detect Arbitrary Shapes, Pattern Recognition 13(2), 1981. Parameterize a shape by measuring the location of its parts relative to the shape centroid; given a set of measurements, cast a vote in the Hough (parameter) space. We can generalize the Hough voting scheme to find shapes that are much more complicated than a straight line. The general concept is to parameterize a shape in terms of the location of its parts with respect to the shape centroid. Given a set of measurements, we can cast votes in the Hough (parameter) space; the cell with the largest number of votes returns the probability that the shape has been found. This idea has been used in designing methods for object detection; it is used in object recognition, e.g., the implicit shape model (B. Leibe, A. Leonardis, and B. Schiele, Combined Object Categorization and Segmentation with an Implicit Shape Model, ECCV Workshop on Statistical Learning in Computer Vision, 2004). We will discuss this in detail in one of the upcoming lectures.

Lecture 9 Fitting and Matching: multi-model fitting. We now discuss the case of fitting data when the data can be explained by more than one model.

Fitting multiple models. Here is an example: given this distribution of 2D points, we want to (1) discover how many lines explain the points (say, 2) and (2) estimate the parameters of the lines that best fit the points, all of this in the presence of noise and outliers. There are many approaches to this problem: incremental fitting (sweeping through possible numbers of lines), expectation maximization (not covered in this class, but it uses the fit of a model to estimate how many lines should be fit, then uses that estimate to fit the model in an iterative fashion), and the Hough transform (voting for how many lines are needed while estimating their parameters).

Incremental line fitting: scan data points sequentially (using locality constraints). The idea of incremental line fitting is fairly straightforward. We fit a line to N points and compute the residual. Then we add a new point, re-fit the line, and re-compute the residual. If the residual ever goes over a threshold, we add another line to our model. We continue until all (or enough) points are accounted for. Perform the following loop: 1. Select N points and fit a line to the N points. 2. Compute the residual R_N. 3. Add a new point, re-fit the line and re-compute R_{N+1}. 4. Continue while the line fitting residual is small enough; when the residual exceeds a threshold, start fitting a new model (line). (Courtesy of unknown.)

Hough transform. Hough voting provides an appealing alternative to incremental fitting. The problem is essentially solved using the same procedure as discussed before. If data points are explained by multiple lines (two in this case), we will observe more than one cell containing a number of votes significantly higher than those due to noise. The figure shows an example where Hough voting is applied to the point distribution on the left-hand side. In the right figure, the n-axis and m-axis describe the Hough voting space, and the vertical axis (counter) reports the number of votes in each cell. We can clearly see two peaks (hence two lines) emerging from all the other peaks that are due to noise. The parameters associated with these peaks characterize the two lines that fit the points in the original space. Same pros and cons as before.

Lecture 9 Fitting and Matching: fitting helps matching! We will wrap up this lecture with how fitting (estimating models) interacts with matching (finding correspondences between images).

Fitting helps matching! (Image 1, Image 2.) A common task in vision is to match feature points between images. Each image in the figure shows a number of detected points obtained, for instance, using a feature detector; these are depicted as white points. Feature points can be matched across images using a similarity criterion such as the cross-correlation (see lecture 6) computed over a small window around the feature of interest. Features are matched (for instance, based on correlation).

Unfortunately, for a number of reasons that we already extensively discussed during lecture 6, this matching process is faulty and can produce a large number of mismatches (outliers), which are indicated in red in the figure (matches based on appearance only; green: good matches, red: bad matches). A technique such as RANSAC can help. Suppose that these images are related by a homographic transformation H; this is actually true in this case, since the scene we are observing is very far from the camera. Then we can run RANSAC to estimate the H that maps corresponding features from image 1 to image 2. Not only does RANSAC produce a robust estimate of H by rejecting the outliers, it also returns the list of outliers (i.e., the mismatches), which are then labeled as bad matches and rejected. This example shows how matching and fitting are very much interrelated problems.

Fitting helps matching! Using the estimated homography, the two images can then be warped and blended on top of each other.

Recognising Panoramas. M. Brown and D. G. Lowe, Recognising Panoramas, in Proceedings of the 9th International Conference on Computer Vision (ICCV 2003). This approach can be used to stitch images so as to form panoramic images, and it is currently commonly used in many commercial applications for panorama processing.

Next lecture: feature detectors and descriptors.

Backup slides for least squares methods. Please see the Linear algebra notes.

Least squares methods - fitting a line. We saw earlier that the least squares solution to Ax = b is x = (A^T A)^{−1} A^T b. This means that, given A and b, this is the value of x that comes closest to satisfying Ax = b with respect to the cost function ||Ax − b||². x may not be an exact solution when A is tall and full rank (meaning we have more equations than unknowns). Ax = b, more equations than unknowns: look for the solution x that minimizes ||Ax − b||² = (Ax − b)^T (Ax − b); setting the derivative of (Ax − b)^T (Ax − b) with respect to x to zero gives the LS solution x = (A^T A)^{−1} A^T b.

Least squares methods - fitting a line. Another way to look at the least squares solution is as a singular value decomposition (SVD) problem. The pseudo-inverse A⁺, with x = A⁺ b, can also be derived from the SVD: A⁺ = (A^T A)^{−1} A^T is the pseudo-inverse of A, and with A = U Σ V^T the SVD decomposition of A, the pseudo-inverse is A⁺ = V Σ⁺ U^T, where Σ⁺ is equal to 1/σ for all nonzero singular values σ and zero otherwise.

Least squares methods - fitting a homography. To fit a homography from (at least) 4 points, we find the least squares solution to A h = 0, where A and h are defined above. The solution will be the last right singular vector of A. From n ≥ 4 corresponding points: A h = 0, with h = [h_{1,1}, h_{1,2}, ..., h_{3,3}]^T.
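For completeness, here is a sketch of the homography fit just described. The slide leaves the construction of A implicit; the two rows per correspondence used below are the standard DLT construction (assumed here for illustration, without the usual point normalization).

```python
import numpy as np

def fit_homography_dlt(src, dst):
    """Fit a homography from n >= 4 correspondences by solving A h = 0 with SVD.

    src, dst: (n, 2) arrays of corresponding points. Each correspondence
    (x, y) -> (x', y') contributes two rows of A (standard DLT form); the
    solution h is the last right singular vector of A, reshaped to 3x3.
    """
    rows = []
    for (x, y), (xp, yp) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, xp * x, xp * y, xp])
        rows.append([0, 0, 0, -x, -y, -1, yp * x, yp * y, yp])
    A = np.asarray(rows)
    _, _, Vt = np.linalg.svd(A)
    h = Vt[-1]                      # last right singular vector of A
    return h.reshape(3, 3) / h[-1]  # H has 8 DOF: normalize so H[2, 2] = 1

# Sanity check: points related by a pure translation (2, 3)
src = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
dst = src + np.array([2.0, 3.0])
print(np.round(fit_homography_dlt(src, dst), 3))
```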

Hough transform - experiments. Here is an example of very noisy data. It does not seem like a line should fit most of the points in Cartesian space. Accordingly, in Hough space there does not seem to be a clear consensus for an appropriate model. However, there do seem to be a few peaks that could correspond to potential lines. These may correspond to lines that would fit some of the data points, but not all of them. (Left: features; right: votes.) Issue: spurious peaks due to uniform noise.


More information

Hierarchical clustering for gene expression data analysis

Hierarchical clustering for gene expression data analysis Herarchcal clusterng for gene expresson data analyss Gorgo Valentn e-mal: valentn@ds.unm.t Clusterng of Mcroarray Data. Clusterng of gene expresson profles (rows) => dscovery of co-regulated and functonally

More information

CSCI 104 Sorting Algorithms. Mark Redekopp David Kempe

CSCI 104 Sorting Algorithms. Mark Redekopp David Kempe CSCI 104 Sortng Algorthms Mark Redekopp Davd Kempe Algorthm Effcency SORTING 2 Sortng If we have an unordered lst, sequental search becomes our only choce If we wll perform a lot of searches t may be benefcal

More information

The Greedy Method. Outline and Reading. Change Money Problem. Greedy Algorithms. Applications of the Greedy Strategy. The Greedy Method Technique

The Greedy Method. Outline and Reading. Change Money Problem. Greedy Algorithms. Applications of the Greedy Strategy. The Greedy Method Technique //00 :0 AM Outlne and Readng The Greedy Method The Greedy Method Technque (secton.) Fractonal Knapsack Problem (secton..) Task Schedulng (secton..) Mnmum Spannng Trees (secton.) Change Money Problem Greedy

More information

An Optimal Algorithm for Prufer Codes *

An Optimal Algorithm for Prufer Codes * J. Software Engneerng & Applcatons, 2009, 2: 111-115 do:10.4236/jsea.2009.22016 Publshed Onlne July 2009 (www.scrp.org/journal/jsea) An Optmal Algorthm for Prufer Codes * Xaodong Wang 1, 2, Le Wang 3,

More information

Machine Learning. Topic 6: Clustering

Machine Learning. Topic 6: Clustering Machne Learnng Topc 6: lusterng lusterng Groupng data nto (hopefully useful) sets. Thngs on the left Thngs on the rght Applcatons of lusterng Hypothess Generaton lusters mght suggest natural groups. Hypothess

More information

TN348: Openlab Module - Colocalization

TN348: Openlab Module - Colocalization TN348: Openlab Module - Colocalzaton Topc The Colocalzaton module provdes the faclty to vsualze and quantfy colocalzaton between pars of mages. The Colocalzaton wndow contans a prevew of the two mages

More information

X- Chart Using ANOM Approach

X- Chart Using ANOM Approach ISSN 1684-8403 Journal of Statstcs Volume 17, 010, pp. 3-3 Abstract X- Chart Usng ANOM Approach Gullapall Chakravarth 1 and Chaluvad Venkateswara Rao Control lmts for ndvdual measurements (X) chart are

More information

Brave New World Pseudocode Reference

Brave New World Pseudocode Reference Brave New World Pseudocode Reference Pseudocode s a way to descrbe how to accomplsh tasks usng basc steps lke those a computer mght perform. In ths week s lab, you'll see how a form of pseudocode can be

More information

Optimizing Document Scoring for Query Retrieval

Optimizing Document Scoring for Query Retrieval Optmzng Document Scorng for Query Retreval Brent Ellwen baellwe@cs.stanford.edu Abstract The goal of ths project was to automate the process of tunng a document query engne. Specfcally, I used machne learnng

More information

Unsupervised Learning and Clustering

Unsupervised Learning and Clustering Unsupervsed Learnng and Clusterng Why consder unlabeled samples?. Collectng and labelng large set of samples s costly Gettng recorded speech s free, labelng s tme consumng 2. Classfer could be desgned

More information

Learning the Kernel Parameters in Kernel Minimum Distance Classifier

Learning the Kernel Parameters in Kernel Minimum Distance Classifier Learnng the Kernel Parameters n Kernel Mnmum Dstance Classfer Daoqang Zhang 1,, Songcan Chen and Zh-Hua Zhou 1* 1 Natonal Laboratory for Novel Software Technology Nanjng Unversty, Nanjng 193, Chna Department

More information

Steps for Computing the Dissimilarity, Entropy, Herfindahl-Hirschman and. Accessibility (Gravity with Competition) Indices

Steps for Computing the Dissimilarity, Entropy, Herfindahl-Hirschman and. Accessibility (Gravity with Competition) Indices Steps for Computng the Dssmlarty, Entropy, Herfndahl-Hrschman and Accessblty (Gravty wth Competton) Indces I. Dssmlarty Index Measurement: The followng formula can be used to measure the evenness between

More information

The Codesign Challenge

The Codesign Challenge ECE 4530 Codesgn Challenge Fall 2007 Hardware/Software Codesgn The Codesgn Challenge Objectves In the codesgn challenge, your task s to accelerate a gven software reference mplementaton as fast as possble.

More information

Unsupervised Learning

Unsupervised Learning Pattern Recognton Lecture 8 Outlne Introducton Unsupervsed Learnng Parametrc VS Non-Parametrc Approach Mxture of Denstes Maxmum-Lkelhood Estmates Clusterng Prof. Danel Yeung School of Computer Scence and

More information

An efficient method to build panoramic image mosaics

An efficient method to build panoramic image mosaics An effcent method to buld panoramc mage mosacs Pattern Recognton Letters vol. 4 003 Dae-Hyun Km Yong-In Yoon Jong-Soo Cho School of Electrcal Engneerng and Computer Scence Kyungpook Natonal Unv. Abstract

More information

Parallel Numerics. 1 Preconditioning & Iterative Solvers (From 2016)

Parallel Numerics. 1 Preconditioning & Iterative Solvers (From 2016) Technsche Unverstät München WSe 6/7 Insttut für Informatk Prof. Dr. Thomas Huckle Dpl.-Math. Benjamn Uekermann Parallel Numercs Exercse : Prevous Exam Questons Precondtonng & Iteratve Solvers (From 6)

More information

5 The Primal-Dual Method

5 The Primal-Dual Method 5 The Prmal-Dual Method Orgnally desgned as a method for solvng lnear programs, where t reduces weghted optmzaton problems to smpler combnatoral ones, the prmal-dual method (PDM) has receved much attenton

More information

Sorting Review. Sorting. Comparison Sorting. CSE 680 Prof. Roger Crawfis. Assumptions

Sorting Review. Sorting. Comparison Sorting. CSE 680 Prof. Roger Crawfis. Assumptions Sortng Revew Introducton to Algorthms Qucksort CSE 680 Prof. Roger Crawfs Inserton Sort T(n) = Θ(n 2 ) In-place Merge Sort T(n) = Θ(n lg(n)) Not n-place Selecton Sort (from homework) T(n) = Θ(n 2 ) In-place

More information

A Comparison and Evaluation of Three Different Pose Estimation Algorithms In Detecting Low Texture Manufactured Objects

A Comparison and Evaluation of Three Different Pose Estimation Algorithms In Detecting Low Texture Manufactured Objects Clemson Unversty TgerPrnts All Theses Theses 12-2011 A Comparson and Evaluaton of Three Dfferent Pose Estmaton Algorthms In Detectng Low Texture Manufactured Objects Robert Krener Clemson Unversty, rkrene@clemson.edu

More information

CS 231A Computer Vision Midterm

CS 231A Computer Vision Midterm CS 231A Computer Vson Mdterm Tuesday October 30, 2012 Set 1 Multple Choce (20 ponts) Each queston s worth 2 ponts. To dscourage random guessng, 1 pont wll be deducted for a wrong answer on multple choce

More information

Support Vector Machines

Support Vector Machines Support Vector Machnes Decson surface s a hyperplane (lne n 2D) n feature space (smlar to the Perceptron) Arguably, the most mportant recent dscovery n machne learnng In a nutshell: map the data to a predetermned

More information

Programming in Fortran 90 : 2017/2018

Programming in Fortran 90 : 2017/2018 Programmng n Fortran 90 : 2017/2018 Programmng n Fortran 90 : 2017/2018 Exercse 1 : Evaluaton of functon dependng on nput Wrte a program who evaluate the functon f (x,y) for any two user specfed values

More information

An Image Fusion Approach Based on Segmentation Region

An Image Fusion Approach Based on Segmentation Region Rong Wang, L-Qun Gao, Shu Yang, Yu-Hua Cha, and Yan-Chun Lu An Image Fuson Approach Based On Segmentaton Regon An Image Fuson Approach Based on Segmentaton Regon Rong Wang, L-Qun Gao, Shu Yang 3, Yu-Hua

More information

MOTION BLUR ESTIMATION AT CORNERS

MOTION BLUR ESTIMATION AT CORNERS Gacomo Boracch and Vncenzo Caglot Dpartmento d Elettronca e Informazone, Poltecnco d Mlano, Va Ponzo, 34/5-20133 MILANO boracch@elet.polm.t, caglot@elet.polm.t Keywords: Abstract: Pont Spread Functon Parameter

More information

12. Segmentation. Computer Engineering, i Sejong University. Dongil Han

12. Segmentation. Computer Engineering, i Sejong University. Dongil Han Computer Vson 1. Segmentaton Computer Engneerng, Sejong Unversty Dongl Han Image Segmentaton t Image segmentaton Subdvdes an mage nto ts consttuent regons or objects - After an mage has been segmented,

More information

Image Representation & Visualization Basic Imaging Algorithms Shape Representation and Analysis. outline

Image Representation & Visualization Basic Imaging Algorithms Shape Representation and Analysis. outline mage Vsualzaton mage Vsualzaton mage Representaton & Vsualzaton Basc magng Algorthms Shape Representaton and Analyss outlne mage Representaton & Vsualzaton Basc magng Algorthms Shape Representaton and

More information

AIMS Computer vision. AIMS Computer Vision. Outline. Outline.

AIMS Computer vision. AIMS Computer Vision. Outline. Outline. AIMS Computer Vson 1 Matchng, ndexng, and search 2 Object category detecton 3 Vsual geometry 1/2: Camera models and trangulaton 4 Vsual geometry 2/2: Reconstructon from multple vews AIMS Computer vson

More information

Exact solution, the Direct Linear Transfo. ct solution, the Direct Linear Transform

Exact solution, the Direct Linear Transfo. ct solution, the Direct Linear Transform Estmaton Basc questons We are gong to be nterested of solvng e.g. te followng estmaton problems: D omograpy. Gven a pont set n P and crespondng ponts n P, fnd te omograpy suc tat ( ) =. Camera projecton.

More information

Smoothing Spline ANOVA for variable screening

Smoothing Spline ANOVA for variable screening Smoothng Splne ANOVA for varable screenng a useful tool for metamodels tranng and mult-objectve optmzaton L. Rcco, E. Rgon, A. Turco Outlne RSM Introducton Possble couplng Test case MOO MOO wth Game Theory

More information

Concurrent Apriori Data Mining Algorithms

Concurrent Apriori Data Mining Algorithms Concurrent Apror Data Mnng Algorthms Vassl Halatchev Department of Electrcal Engneerng and Computer Scence York Unversty, Toronto October 8, 2015 Outlne Why t s mportant Introducton to Assocaton Rule Mnng

More information

Intra-Parametric Analysis of a Fuzzy MOLP

Intra-Parametric Analysis of a Fuzzy MOLP Intra-Parametrc Analyss of a Fuzzy MOLP a MIAO-LING WANG a Department of Industral Engneerng and Management a Mnghsn Insttute of Technology and Hsnchu Tawan, ROC b HSIAO-FAN WANG b Insttute of Industral

More information

FEATURE EXTRACTION. Dr. K.Vijayarekha. Associate Dean School of Electrical and Electronics Engineering SASTRA University, Thanjavur

FEATURE EXTRACTION. Dr. K.Vijayarekha. Associate Dean School of Electrical and Electronics Engineering SASTRA University, Thanjavur FEATURE EXTRACTION Dr. K.Vjayarekha Assocate Dean School of Electrcal and Electroncs Engneerng SASTRA Unversty, Thanjavur613 41 Jont Intatve of IITs and IISc Funded by MHRD Page 1 of 8 Table of Contents

More information