Three-Dimensional Sensors Lecture 6: Point-Cloud Registration


Radu Horaud, INRIA Grenoble Rhone-Alpes, France
Radu.Horaud@inria.fr
http://perception.inrialpes.fr/

Point-Cloud Registration Methods. The goal is to fuse data gathered with several 3D sensors or with a moving 3D sensor. Each sensor provides a point cloud in a sensor-centered coordinate frame, and the two point clouds overlap only partially. The data are corrupted both by noise and by outliers (gross errors). The two point clouds must be registered, i.e. represented in the same coordinate frame. The registration process requires point-to-point correspondences, which are difficult to establish.

Estimating the Transformation Between Two Point Clouds. We start by assuming that the two point clouds have an equal number of points and that the point-to-point correspondences are known in advance. Notations: each point cloud has $N$ points, $\{X_n\}_{n=1}^{N}$ and $\{Y_n\}_{n=1}^{N}$. $\bar{X}$ and $\bar{Y}$ are the centers of the two point clouds, and $\hat{X}_n = X_n - \bar{X}$ and $\hat{Y}_n = Y_n - \bar{Y}$ are the centered coordinates. The transformation between the two clouds is determined by a scale factor $s$, a rotation matrix $R$, and a translation vector $t$. A point in one cloud is the transformed of its corresponding point in the other cloud: $Y_n = sRX_n + t$.

Least-Squares Minimization. The transformation parameters can be determined by minimizing
$$E = \sum_{n=1}^{N} \| Y_n - sRX_n - t \|^2.$$
Let us expand the quadratic error $E$ (using $R^\top R = I$):
$$E = \sum_n (Y_n - sRX_n - t)^\top (Y_n - sRX_n - t) = s^2 \sum_n X_n^\top X_n + 2s \sum_n X_n^\top R^\top t - 2s \sum_n X_n^\top R^\top Y_n - 2 \sum_n Y_n^\top t + \sum_n Y_n^\top Y_n + N\, t^\top t.$$

The Optimal Translation. $t^* = \arg\min_t E$. The derivative of the error with respect to $t$ is
$$\frac{dE}{dt} = 2sR \sum_n X_n - 2 \sum_n Y_n + 2Nt.$$
Setting the derivative with respect to the translation vector equal to zero yields
$$t^* = \bar{Y} - sR\bar{X}.$$

A Simplified Criterion. Substituting the optimal translation into the error function, we obtain
$$E = \sum_{n=1}^{N} \| \hat{Y}_n - sR\hat{X}_n \|^2 = s^2 \sum_n \hat{X}_n^\top \hat{X}_n - 2s \sum_n \hat{X}_n^\top R^\top \hat{Y}_n + \sum_n \hat{Y}_n^\top \hat{Y}_n.$$

The Optimal Scale Factor. $s^* = \arg\min_s E$, which yields
$$s^* = \frac{\sum_n (R\hat{X}_n)^\top \hat{Y}_n}{\sum_n \hat{X}_n^\top \hat{X}_n}.$$
Using a symmetric treatment of the scale factor, [Horn 87] shows that the error
$$E = \sum_{n=1}^{N} \left\| \tfrac{1}{\sqrt{s}}\,\hat{Y}_n - \sqrt{s}\,R\hat{X}_n \right\|^2$$
yields the following solution:
$$s^* = \left( \frac{\sum_n \hat{Y}_n^\top \hat{Y}_n}{\sum_n \hat{X}_n^\top \hat{X}_n} \right)^{1/2}.$$

The Optimal Rotation. From the expansion of the error function we obtain
$$R^* = \arg\min_R E = \arg\max_R \sum_n \hat{X}_n^\top R^\top \hat{Y}_n,$$
and
$$Q = \sum_n \hat{Y}_n^\top R \hat{X}_n = \mathrm{tr}\Big(R \sum_n \hat{X}_n \hat{Y}_n^\top\Big) = \mathrm{tr}(R\, C_{XY}).$$
Solution: let $C_{XY} = U S V^\top$ be the singular value decomposition (SVD) of the data covariance matrix. The optimal rotation matrix is (see [ArunHuangBlostein 87] for more details)
$$R^* = V U^\top.$$
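To make the closed-form solution concrete, here is a minimal NumPy sketch that estimates $s$, $R$, and $t$ from known correspondences by centering the clouds, building $C_{XY}$, and taking its SVD. The function name and the reflection guard on $\det R$ are illustrative additions, not part of the lecture.

```python
# Sketch: closed-form similarity transform from known correspondences.
import numpy as np

def estimate_similarity(X, Y):
    """X, Y: (N, 3) arrays of corresponding points with Y_n ~ s R X_n + t."""
    Xc, Yc = X.mean(axis=0), Y.mean(axis=0)    # cloud centers
    Xh, Yh = X - Xc, Y - Yc                    # centered coordinates
    C = Xh.T @ Yh                              # C_XY = sum_n Xh_n Yh_n^T
    U, S, Vt = np.linalg.svd(C)
    R = Vt.T @ U.T                             # R = V U^T
    if np.linalg.det(R) < 0:                   # guard against a reflection (not in the slides)
        Vt[-1] *= -1
        R = Vt.T @ U.T
    s = np.sqrt((Yh ** 2).sum() / (Xh ** 2).sum())   # symmetric scale estimate [Horn 87]
    t = Yc - s * R @ Xc                        # optimal translation
    return s, R, t
```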

Point Registration. We now assume that the point-to-point correspondences are not known. The minimization criterion becomes
$$E = \sum_{m=1}^{M} \sum_{n=1}^{N} \alpha_{mn} \| Y_m - sRX_n - t \|^2.$$
Possible representations for the assignment variables $\alpha_{mn}$:
1. Binary assignment variables: $\alpha_{mn} = 1$ if $Y_m$ is in correspondence with $X_n$, and $\alpha_{mn} = 0$ otherwise.
2. Soft assignment variables: subject to the constraints $\sum_m \alpha_{mn} = \sum_n \alpha_{mn} = 1$ and $0 \le \alpha_{mn} \le 1$.
3. Probabilistic assignment variables: $\alpha_{mn} = P(z_m = n \mid Y_m)$, i.e. the posterior probability of assigning observation $Y_m$ to model point $X_n$.

The ICP Algorithm. This is the most widely used point-registration method: the iterative closest point (ICP) algorithm proposed by [BeslMcKay 92]:
1. Initialize the scale, rotation, and translation parameters.
2. Transform each point $X_n$ and search for its closest point, $k(n) = \arg\min_m \| Y_m - sRX_n - t \|^2$, and set $\alpha_{k(n)n} = 1$.
3. Estimate the parameters by minimizing $E = \sum_{n=1}^{N} \| Y_{k(n)} - sRX_n - t \|^2$.
4. Repeat steps 2 and 3 until convergence (the parameters do not change any more).
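A minimal ICP sketch following the four steps above, reusing the estimate_similarity() helper from the previous sketch; the brute-force closest-point search and the convergence test on the mean squared assignment error are illustrative choices, not the reference implementation.

```python
# Sketch: basic ICP loop with brute-force closest-point search.
import numpy as np

def icp(X, Y, iters=50, tol=1e-6):
    """X: (N, 3) source cloud, Y: (M, 3) target cloud."""
    s, R, t = 1.0, np.eye(3), np.zeros(3)             # step 1: initialization
    prev_err = np.inf
    for _ in range(iters):
        Xt = s * X @ R.T + t                          # transform each X_n
        d = np.linalg.norm(Y[:, None, :] - Xt[None, :, :], axis=2)  # (M, N) distances
        k = d.argmin(axis=0)                          # step 2: closest Y for each X_n
        s, R, t = estimate_similarity(X, Y[k])        # step 3: re-estimate parameters
        err = (d[k, np.arange(len(X))] ** 2).mean()   # current assignment error
        if abs(prev_err - err) < tol:                 # step 4: stop when error stabilizes
            break
        prev_err = err
    return s, R, t
```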

Drawbacks of ICP: It is very sensitive to the initial transformation parameters (it can get trapped in a local minimum). It is not robust to gross errors in the data. It cannot deal with anisotropic noise (it implicitly assumes the noise has the same variance along all coordinates).

Accelerating ICP. If the point clouds contain a large number of points, each iteration of the algorithm is rather slow. One can either subsample the data or, better, apply K-means clustering to each point cloud and use the cluster centers in ICP. Step 2 is the most time-consuming part of the algorithm and needs an efficient implementation of the nearest-point search. One can use k-d trees to structure one set of points as a binary tree and rapidly query this tree to retrieve the closest point, as in the sketch below. Many variants of ICP have been proposed in the vision/graphics literature.
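As an illustration of the nearest-neighbor acceleration, a k-d tree query with SciPy might look as follows; cKDTree is one common implementation, and the data here are synthetic.

```python
# Sketch: replacing the brute-force search of step 2 with a k-d tree query.
import numpy as np
from scipy.spatial import cKDTree

Y = np.random.rand(10000, 3)     # target cloud (illustrative data)
Xt = np.random.rand(8000, 3)     # transformed source cloud
tree = cKDTree(Y)                # build once: Y stays fixed during ICP
dist, k = tree.query(Xt)         # closest point in Y for every transformed X_n
```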

Probabilistic Point Registration. The minimization criterion is now generalized to
$$E = \sum_{m=1}^{M} \sum_{n=1}^{N} \alpha_{mn} \left( \| Y_m - sRX_n - t \|^2_{C_n} + \log |C_n| \right).$$
The posterior probabilities are defined by
$$\alpha_{mn} = \frac{ |C_n|^{-1/2} \exp\!\left( -\tfrac{1}{2} \| Y_m - sRX_n - t \|^2_{C_n} \right) }{ \sum_{k=1}^{N} |C_k|^{-1/2} \exp\!\left( -\tfrac{1}{2} \| Y_m - sRX_k - t \|^2_{C_k} \right) + \text{(outlier term)} },$$
where the additional term in the denominator accounts for the uniform outlier component (see the Discussion below). The 3D Mahalanobis distance is
$$\| a - b \|^2_C = (a - b)^\top C^{-1} (a - b).$$
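A hedged sketch of this posterior computation (the E-step) with per-point anisotropic covariances; the scalar `outlier` constant standing in for the uniform component, as well as the function name, are assumptions made for illustration.

```python
# Sketch: E-step posteriors with anisotropic covariances and an outlier term.
import numpy as np

def posteriors(X, Y, s, R, t, C, outlier=1e-3):
    """X: (N, 3) model points, Y: (M, 3) observations, C: (N, 3, 3) covariances.
    Returns alpha of shape (M, N)."""
    Xt = s * X @ R.T + t                                   # transformed model points
    Cinv = np.linalg.inv(C)                                # batch inverse, (N, 3, 3)
    dets = np.linalg.det(C)                                # |C_n|
    diff = Y[:, None, :] - Xt[None, :, :]                  # (M, N, 3) residuals
    maha = np.einsum('mni,nij,mnj->mn', diff, Cinv, diff)  # Mahalanobis distances
    lik = dets[None, :] ** -0.5 * np.exp(-0.5 * maha)      # unnormalized Gaussian terms
    return lik / (lik.sum(axis=1, keepdims=True) + outlier)  # outlier term in denominator
```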

Point Registration with Expectation-Maximization (EM):
1. Initialize the transformation parameters $s$, $R$, $t$ and the covariance matrices $\{C_n\}_{n=1}^{N}$.
2. Estimate the posterior probabilities $\{\alpha_{mn}\}_{m=1,\,n=1}^{M,\,N}$.
3. Estimate the transformation parameters $s$, $R$, $t$.
4. Estimate the covariance matrices.
5. Repeat steps 2, 3, and 4 until convergence.

Discussion. The EM algorithm for point registration uses a Gaussian mixture model (GMM) augmented with a uniform-distribution component that captures the outliers. The estimation of the transformation parameters does not need explicit point-to-point correspondences. However, such correspondences can easily be established: if $\alpha_{mk} > 0.5$, accept the assignment $Y_m \leftrightarrow X_k$, where $k = \arg\max_n \alpha_{mn}$. The algorithm is robust to initialization, to Gaussian noise, and to uniformly distributed gross errors (outliers); it may, however, be trapped in local minima. It is less efficient than ICP because of the computation of the probabilities $\alpha_{mn}$. See [Horaud et al 2011] for more details.
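As a small illustration of this acceptance rule, the following lines turn the posteriors from the E-step sketch above into hard correspondences; `alpha` is assumed to be the (M, N) array returned by that sketch.

```python
# Sketch: hard correspondences from posteriors (alpha: (M, N) array).
import numpy as np

best = alpha.argmax(axis=1)               # k = argmax_n alpha_mn for each observation
accepted = alpha.max(axis=1) > 0.5        # keep only confident assignments
pairs = [(m, best[m]) for m in np.where(accepted)[0]]  # (Y_m, X_k) correspondences
```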

An Example. The first set: 15 points (blue circles). The second set: 25 points, of which 15 are inliers corrupted with Gaussian noise (green squares) and 10 are outliers drawn from a uniform distribution (red squares). Initialization: $R = I$ and $t = 0$. Result: 12 point-to-point correspondences found correctly and 7 point-to-outlier assignments.

The 2nd Iteration

The 6th Iteration

The 35th and Final Iteration

Projects Using 3-D Sensors and Point Registration:
The Great Buddha project (University of Tokyo): http://www.cvl.iis.u-tokyo.ac.jp/greatbuddha.html
The KinectFusion project: http://research.microsoft.com/en-us/projects/surfacerecon/
The MixCam project: http://perception.inrialpes.fr/mixcam/index.html