Fitting (LMedS, RANSAC)
Thursday, 23/03/2017
Antonis Argyros, e-mail: argyros@csd.uoc.gr

LMedS and RANSAC
What if we have very many outliers?
Least Median of Squares
r_i: residuals

Least Squares: minimize the sum of squared residuals, sum_{i=1..n} r_i^2 (best fit of ALL observations).
LMedS: minimize the median of the squared residuals, median_{i=1..n} r_i^2 (best fit of THE MAJORITY of observations).

Rousseeuw P.J. and Leroy A.M., Robust Regression and Outlier Detection, John Wiley and Sons Inc., New York, 1987.

Least Median of Squares: algorithm
1. Set r_min to infinity
2. Sample (randomly) the number s of points required to fit the model
3. Solve for model parameters h_cur using the samples
4. For all observations (points) compute the residuals r_i (i.e., the distance of each observation from what is predicted by the model)
5. Find the median residual r_cur
6. If r_cur < r_min then r_min := r_cur; h_best := h_cur
Repeat steps 2-6 until the best model is found with high confidence*
Return h_best
* i.e., r_min is below a threshold, or an upper bound of repetitions has been reached
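The steps above can be sketched in Python for line fitting (a hypothetical minimal example: the helper `fit_line`, the iteration count, and the seed are illustrative choices, not part of the slides):

```python
import random
import statistics

def fit_line(p1, p2):
    """Line a*x + b*y + c = 0 through two points, with a^2 + b^2 = 1."""
    (x1, y1), (x2, y2) = p1, p2
    a, b = y2 - y1, x1 - x2
    norm = (a * a + b * b) ** 0.5
    if norm == 0:
        return None            # degenerate sample: identical points
    return a / norm, b / norm, -(a * x1 + b * y1) / norm

def lmeds_line(points, n_iters=500, seed=0):
    """Least Median of Squares line fit: keep the hypothesis with the
    smallest median squared residual over ALL observations."""
    rng = random.Random(seed)
    best_model, best_med = None, float("inf")   # step 1: r_min := infinity
    for _ in range(n_iters):
        p1, p2 = rng.sample(points, 2)          # step 2: s = 2 points define a line
        model = fit_line(p1, p2)                # step 3: solve for h_cur
        if model is None:
            continue
        a, b, c = model
        # steps 4-5: residuals of all points, then their median
        med = statistics.median((a * x + b * y + c) ** 2 for x, y in points)
        if med < best_med:                      # step 6: keep the best model so far
            best_med, best_model = med, model
    return best_model, best_med
```

Because the score is a median, up to half of the points can lie arbitrarily far from the line without changing the result.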
The behavior of LMedS
[Figure: line fits with no outliers vs. with an outlier; Least Squares is dragged towards the outlier, LMedS is not]

How to choose parameters?
- Number of sampled points s: the minimum number needed to fit the model.
- Number of samples N: choose N so that, with probability p (e.g., p = 0.99), at least one set of s random samples is free from outliers (outlier ratio: e):

p = 1 - (1 - (1 - e)^s)^N
N = log(1 - p) / log(1 - (1 - e)^s)

N as a function of the proportion of outliers e (for p = 0.99):

s | 5% | 10% | 20% | 25% | 30% | 40% | 50%
2 |  2 |  3  |  5  |  6  |  7  |  11 |  17
3 |  3 |  4  |  7  |  9  |  11 |  19 |  35
4 |  3 |  5  |  9  |  13 |  17 |  34 |  72
5 |  4 |  6  |  12 |  17 |  26 |  57 |  146
6 |  4 |  7  |  16 |  24 |  37 |  97 |  293
7 |  4 |  8  |  20 |  33 |  54 | 163 |  588
8 |  5 |  9  |  26 |  44 |  78 | 272 | 1177

modified from M. Pollefeys
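The formula for N can be evaluated directly (the function name and signature are illustrative):

```python
import math

def num_samples(p, e, s):
    """Number of samples N so that, with probability p, at least one of
    the N size-s random samples is outlier-free, given outlier ratio e:
    N = log(1 - p) / log(1 - (1 - e)^s)."""
    return math.ceil(math.log(1 - p) / math.log(1 - (1 - e) ** s))
```

For example, with p = 0.99, 50% outliers and s = 2 points per sample, this reproduces the N = 17 entry of the table.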
Breakdown point of an estimator
Breakdown point: the amount (percentage) of outliers that can make the solution arbitrarily bad.
Least Squares: the breakdown point tends to 0%, since a single outlier can make the solution arbitrarily bad... a non-robust estimation algorithm!
Least Median of Squares: breakdown point = 50%. The median residual is not affected by up to 50% of outliers: a robust estimation technique.

RANSAC
RANdom SAmple Consensus
Find a partition π : I → {P, O} of the input I into inliers P and outliers O such that f(P, β) < δ, while minimizing |O| over partitions π; β are the model parameters, estimated from the inliers by least squares (β = (PᵀP)⁻¹Pᵀ y).

M. A. Fischler, R. C. Bolles. Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography. Comm. of the ACM, Vol 24, pp 381-395, 1981.
Source: Savarese
RANSAC
Algorithm:
1. Sample (randomly) the number of points required to fit the model
2. Solve for model parameters using the samples
3. Score by the fraction of inliers within a preset threshold of the model
Repeat 1-3 until the best model is found with high confidence

RANSAC: Line fitting example
The same algorithm, sampling the minimum number of points required to fit a line (#=2).
Illustration by Savarese
RANSAC: Line fitting example
[Figure: a sampled pair of points defines a candidate line; points within distance δ of it are counted as inliers, here N_I = 6]
RANSAC: Line fitting example
[Figure: a better candidate line scores more inliers, N_I = 14]

How to choose parameters?
- Number of samples N: choose N so that, with probability p (e.g., p = 0.99), at least one set of s random samples is free from outliers (outlier ratio: e):

p = 1 - (1 - (1 - e)^s)^N
N = log(1 - p) / log(1 - (1 - e)^s)

(see the table of N versus the proportion of outliers e, above; modified from M. Pollefeys)
- Number of sampled points s: the minimum number needed to fit the model.
- Distance threshold δ: choose δ so that a good point with noise is likely (e.g., with probability 0.95) to fall within δ of the model.
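The three steps can be sketched for line fitting (a hypothetical minimal example; function names, the iteration count, and the seed are illustrative):

```python
import random

def ransac_line(points, delta, n_iters=100, seed=0):
    """RANSAC line fit: score each hypothesis by its number of inliers,
    i.e. points within distance delta of the candidate line."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(n_iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)  # step 1: minimal sample (#=2)
        a, b = y2 - y1, x1 - x2                     # step 2: line a*x + b*y + c = 0
        norm = (a * a + b * b) ** 0.5
        if norm == 0:
            continue                                # degenerate sample
        a, b = a / norm, b / norm
        c = -(a * x1 + b * y1)
        # step 3: score by the consensus set within the preset threshold
        inliers = [(x, y) for x, y in points if abs(a * x + b * y + c) < delta]
        if len(inliers) > len(best_inliers):        # keep the largest consensus set
            best_model, best_inliers = (a, b, c), inliers
    return best_model, best_inliers
```

In a full pipeline the model would typically be refit by least squares on the final consensus set; that refinement step is omitted here.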
RANSAC conclusions
Pros:
- Applicable to a larger number of objective function parameters than the Hough transform
- Simple and general: applicable to many different problems
- Robust to outliers: works well in practice
Cons:
- Lots of parameters to tune
- Can't always get a good initialization of the model based on the minimum number of samples
- Sometimes too many iterations are required
- Can fail for extremely low inlier ratios
- We can often do better than brute-force sampling

LMedS vs RANSAC
Question: What is the difference between LMedS and RANSAC?
LMedS: seeks the narrowest stripe that contains 50% of the points.
RANSAC: seeks the most densely populated stripe of a user-defined size.
Particle Swarm Optimization (PSO)
Based on the social sciences.
Swarm intelligence: the collective behavior of distributed systems.
- Populations of simple entities that communicate locally among themselves and with their environment
- Simple, local behavioral rules, random to a small extent
- Lack of central control
- Intelligent global behavior that becomes visible only to an external observer
Particles: possible solutions of a problem.
Particle behavior: investigate the parameter space to find the optimal solution. Particles move according to:
- Inertia
- A cognitive component
- A social component
J. Kennedy and R. Eberhart. Particle swarm optimization. IEEE ICNN, 1995.

Canonical PSO
An objective function F(x_i) ∈ R evaluates hypotheses.
Each particle i carries a state P = {x, v, p}:
- x_i^t: its current position at generation t
- v_i^t: its current velocity at generation t
- p_i: the best hypothesis found by particle i (up to time t)
- p_g: the best hypothesis found by the whole population (up to time t)
PSO applies iterative updates known as generations and ends when some user-defined criterion is met.
Update equations
In every generation, velocities and positions are updated as:

v_i^t = K ( v_i^{t-1} + c1 r1 (p_i - x_i^{t-1}) + c2 r2 (p_g - x_i^{t-1}) )
x_i^t = x_i^{t-1} + v_i^t

where K is a constant constriction factor defined as

K = 2 / | 2 - ψ - sqrt(ψ² - 4ψ) |,  ψ = c1 + c2,  ψ > 4

- c1: cognitive component
- c2: social component
- r1, r2: random samples from U[0, 1]
p_i and p_g are updated accordingly after computing the objective function.

PSO for line fitting
- Assume particles randomly distributed in the space of line parameters (i.e., each particle represents a line equation)
- Each particle evaluates how good it is (a RANSAC-type of evaluation would fit)
- Each particle moves in the swarm based on the PSO update scheme
- Convergence leads to the estimation of the best line
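The update equations above can be turned into a generic minimizer sketch (a hypothetical minimal example; c1 = c2 = 2.05 is a common choice that makes ψ > 4, but all names and defaults here are illustrative, and line fitting would simply plug a residual-based objective into `f`):

```python
import random

def pso_minimize(f, dim, bounds, n_particles=30, n_gens=60, seed=0):
    """Canonical PSO with constriction factor K, minimizing f over a box."""
    c1 = c2 = 2.05                      # cognitive and social components
    psi = c1 + c2                       # psi > 4 is required for K
    K = 2.0 / abs(2.0 - psi - (psi * psi - 4.0 * psi) ** 0.5)
    rng = random.Random(seed)
    lo, hi = bounds
    # initialize positions randomly in the box, velocities at zero
    x = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [xi[:] for xi in x]         # p_i: per-particle best positions
    pbest_val = [f(xi) for xi in x]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # p_g: global best
    for _ in range(n_gens):             # generations
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()   # samples from U[0, 1]
                v[i][d] = K * (v[i][d]
                               + c1 * r1 * (pbest[i][d] - x[i][d])
                               + c2 * r2 * (gbest[d] - x[i][d]))
                x[i][d] += v[i][d]
            val = f(x[i])
            if val < pbest_val[i]:      # update personal best
                pbest[i], pbest_val[i] = x[i][:], val
                if val < gbest_val:     # update global best
                    gbest, gbest_val = x[i][:], val
    return gbest, gbest_val
```

The constriction factor damps the velocities so the swarm contracts around the best hypotheses instead of oscillating indefinitely.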
3D hand tracking based on RGB-D images
Observations O: the 3D structure of the skin-colored regions in each RGB-D frame.
Model: one hand (37 geometric primitives, 20 intrinsic + 6 extrinsic = 26 DoFs = h).
Hypothesis testing (E): rank each hypothesis h based on its compatibility E with the observations O.
Formulate an optimization problem to minimize, via PSO, the discrepancies between the hypothesis h and the observations O:

h* = argmin_h E(h, O)

Hypothesis formulation and optimization: Particle Swarm Optimization on a 27D parameter space.

Physical plausibility: respect anatomical constraints (hand dimensions, kinematic constraints of joints, penalize interpenetration of hand parts).
I. Oikonomidis, N. Kyriazis, A.A. Argyros, Efficient model-based 3D tracking of hand articulations using Kinect, BMVC 2011.
FORTH 3D Hand Tracking Library: https://cvrlcode.ics.forth.gr/handtracking
1st place award, CHALEARN Gesture Recognition Demonstration Competition, ICPR 2012, Tsukuba, Japan, Nov 2012.
Joint Tracking (JT)
- Hand + single rigid object, multicamera setup
- Hand + another hand, RGB-D camera
I. Oikonomidis, N. Kyriazis, A.A. Argyros, Full DOF tracking of a hand interacting with an object by modeling occlusions and physical constraints, ICCV 2011.
I. Oikonomidis, N. Kyriazis, A.A. Argyros, Tracking the articulated motion of two strongly interacting hands, CVPR 2012.

Experiments: disassembly task
- 2 articulated hands: 2 × 27 = 54 parameters
- 15 rigid objects: 15 × 7 = 105 parameters
- Problem dimensionality: a 159-D parameter space
N. Kyriazis and A.A. Argyros, "Scalable 3D Tracking of Multiple Interacting Objects", CVPR 2014.

Antonis Argyros, Computer Science Department, Univ. of Crete and Institute of Computer Science, FORTH, argyros@ics.forth.gr, http://users.ics.forth.gr/~argyros
Tracking a scene by tracking the actor
N. Kyriazis, A. A. Argyros, Physically plausible 3D scene tracking: The single actor hypothesis, ICCV 2013.

Current state
Release of the 3D Hand Tracker library (http://cvrlcode.ics.forth.gr/handtracking/) that now offers:
- Depth-based hand segmentation; no need for skin color
- Fine-grained access to the tracking pipeline steps through a Python API (https://github.com/forth-modelbasedtracker/handtracker)
- Support for two-hands and hand-object tracking
- Improved performance (30+ fps) on modern GPUs
- Fixes for bugs that caused problems with some NVidia drivers and GPUs
Different problems, same framework
Head pose estimation:
P. Padeleris, X. Zabulis and A.A. Argyros, Head pose estimation on depth data based on Particle Swarm Optimization, CVPRW-HAU3D 2012.
Full body tracking:
D. Michel, C. Panagiotakis, A.A. Argyros, Tracking the articulated motion of the human body based on two RGBD cameras, MVA 2014.
Simultaneous localization, mapping and moving object detection:
P. Panteleris, A.A. Argyros, Vision-based SLAM and moving objects tracking for the perceptual support of a smart walker platform, Workshop on Assistive Computer Vision and Robotics (ACVR 2014), in conjunction with ECCV 2014, Zurich, Switzerland, Sep. 12, 2014.
PSO vs RANSAC
PSO: model-driven / top-down. Assume particles (= model parameters), evaluate them and modify them; convergence to the solution is based on swarm intelligence. Focus on models.
RANSAC: data-driven / bottom-up. Convergence to the solution by randomly testing data samples. Focus on data.

Fitting multiple lines
- Voting strategies: Hough transform, RANSAC
- Other approaches: incremental line fitting, K-lines
Incremental line fitting
- Examine edge points in their order along an edge chain
- Fit a line to consecutive points: while the line fitting residual is small enough, continue adding points to the current line and refitting
- When the residual exceeds a threshold, break off the current line and start a new one with the remaining unassigned points
[Figures: the fitted line growing along the chain]
Incremental fitting pros and cons
Pros:
- Exploits locality
- Adaptively determines the number of lines
Cons:
- Needs sequential ordering of features
- Can't cope with occlusion
- Sensitive to noise and to the choice of threshold
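The grow-and-break loop can be sketched as follows (a hypothetical minimal example; using the RMS orthogonal distance to a total-least-squares line as the residual is an illustrative choice, not prescribed by the slides):

```python
def fit_residual(pts):
    """RMS orthogonal distance of pts to their total-least-squares line:
    the smaller eigenvalue of the 2x2 scatter matrix equals the sum of
    squared orthogonal residuals."""
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    sxx = sum((x - mx) ** 2 for x, _ in pts)
    syy = sum((y - my) ** 2 for _, y in pts)
    sxy = sum((x - mx) * (y - my) for x, y in pts)
    lam = 0.5 * (sxx + syy - ((sxx - syy) ** 2 + 4 * sxy ** 2) ** 0.5)
    return (max(lam, 0.0) / n) ** 0.5

def incremental_lines(chain, max_residual):
    """Walk an ordered edge chain; grow the current segment while the
    fitting residual stays small, otherwise break off and start a new one."""
    segments, current = [], []
    for pt in chain:
        candidate = current + [pt]
        if len(candidate) < 3 or fit_residual(candidate) < max_residual:
            current = candidate            # residual still small: keep growing
        else:
            segments.append(current)       # threshold exceeded: break off
            current = [pt]                 # start a new line with the rest
    if current:
        segments.append(current)
    return segments
```

On an L-shaped chain, the residual jumps as soon as points past the corner are added, which is exactly where the segment is broken.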
K-Lines
Initialize k lines:
- Option 1: randomly initialize k sets of line parameters
- Option 2: randomly partition the points into k sets and fit lines to them
Iterate until convergence:
- Assign each point to the nearest line
- Refit the parameters of each line from its assigned points

K-Lines example 1: initialization
K-Lines example 1, iteration 1: assignment to the nearest line; refitting
K-Lines example 2: initialization
K-Lines example 2, iteration 1: assignment to the nearest line; refitting
K-Lines example 2, iteration 2: assignment to the nearest line; refitting
K-Lines example 3, iteration 1: assignment to the nearest line; refitting
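The assign-and-refit iteration can be sketched as follows (a hypothetical minimal example; the total-least-squares refitting and the `init_labels` argument, which realizes initialization Option 2 with a given partition, are illustrative choices):

```python
import random

def point_line_dist(pt, line):
    a, b, c = line
    return abs(a * pt[0] + b * pt[1] + c)

def fit_tls(pts):
    """Total-least-squares line (a, b, c) with a^2 + b^2 = 1; the normal
    (a, b) is the scatter-matrix eigenvector of the smaller eigenvalue."""
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    sxx = sum((x - mx) ** 2 for x, _ in pts)
    syy = sum((y - my) ** 2 for _, y in pts)
    sxy = sum((x - mx) * (y - my) for x, y in pts)
    lam = 0.5 * (sxx + syy - ((sxx - syy) ** 2 + 4 * sxy ** 2) ** 0.5)
    if abs(sxy) > 1e-12:
        a, b = lam - syy, sxy
    elif sxx >= syy:
        a, b = 0.0, 1.0        # points spread along x: horizontal line
    else:
        a, b = 1.0, 0.0        # points spread along y: vertical line
    norm = (a * a + b * b) ** 0.5
    a, b = a / norm, b / norm
    return a, b, -(a * mx + b * my)

def k_lines(points, k, n_iters=20, init_labels=None, seed=0):
    """K-lines: alternate assigning each point to its nearest line and
    refitting each line to its assigned points (cf. k-means)."""
    rng = random.Random(seed)
    labels = init_labels[:] if init_labels else [rng.randrange(k) for _ in points]
    lines = [None] * k
    for _ in range(n_iters):
        for j in range(k):                      # refit each line
            members = [p for p, l in zip(points, labels) if l == j]
            if len(members) >= 2:
                lines[j] = fit_tls(members)
        labels = [min(range(k),                 # reassign to the nearest line
                      key=lambda j, p=p: point_line_dist(p, lines[j])
                      if lines[j] is not None else float("inf"))
                  for p in points]
    return lines, labels
```

As the cons above warn, the result depends on the initial partition; a poor initialization can leave the iteration in a local minimum.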
K-Lines pros and cons
Pros:
- Guaranteed to reduce the line fitting residual at each iteration
- Can cope with occlusion
Cons:
- Need to know k
- Can get stuck in local minima
- Sensitive to initialization