Aalto University School of Electrical Engineering
AS Project in Automation and Systems Technology

Deformable Object Registration

Author: Antti Valkonen
Instructor: Joni Pajarinen

May 31, 2015

1 Introduction

The problem of point set registration is to find a mapping between a moving model point set M and a fixed scene point set S. The mapping may represent either a rigid or a non-rigid transformation. Depending on the application, the point sets can be raw data from a 3D scanner or extracted feature points.

Several approaches have been developed for point set registration, and the choice of method depends on the type of input data and on the transformation model. If the two point sets have a similar structure, simple iterative methods exist for rigid point set registration. Often, however, the point sets differ in size, and a one-to-one correspondence cannot be assumed. Probabilistic models have been proposed to allow fuzzy correspondences between point sets. The point sets can also be represented as mixtures of multivariate Gaussian distributions, turning the problem of matching discrete point sets into one of matching continuous distributions.

The goal of this project is to register objects from point cloud data. A model point set of the object is constructed beforehand and matched to a scene point set in which the object is (partially) visible. The construction of the point clouds is not part of the project. Three algorithms are used for point set registration: Iterative Closest Point, Coherent Point Drift, and Robust Point Set Registration Using Gaussian Mixture Models. Iterative Closest Point (ICP) is a simple iterative algorithm for point set registration; the classical ICP has shortcomings when dealing with imperfect data, but various modifications have been proposed to overcome them. The two other algorithms model the point sets as mixtures of Gaussians and solve the problem by fitting the mixture distributions.

As a result of this project, a C++ program was created to register objects. Implementing the registration algorithms themselves is not part of this project; their source code was provided by the authors of [6].

2 Background

2.1 Mixture Models

Mixture models are probabilistic models for representing data. A mixture model assumes that all observations (or measurements) are generated from a mixture of probability distributions with unknown parameters θ. The probability density function of such a model is

    p(x \mid \theta) = \sum_{k=1}^{K} \alpha_k \, p(x \mid \theta_k)    (1)

where:

    K = number of mixture components
    N = number of observations
    α = vector of all mixture weights, which sum to 1
    α_k = weight associated with mixture component k
    θ_k, k = 1..K = parameters of the distribution associated with component k
    θ = vector of all parameters θ_k
    p(x | θ_k) = probability density of the distribution with parameters θ_k

A Gaussian mixture model (GMM) is a mixture of K Gaussian densities with parameters μ_k, Σ_k. With Gaussian components, p(x | θ_k) is

    p(x \mid \mu_k, \Sigma_k) = \frac{1}{\sqrt{\det(2\pi\Sigma_k)}} \exp\left\{ -\tfrac{1}{2} (x - \mu_k)^{T} \Sigma_k^{-1} (x - \mu_k) \right\}    (2)

where Σ_k is the covariance matrix and μ_k the mean of mixture component k.

When fitting the distribution parameters θ to a given data set X, it is useful to view equation (1) as a function of θ. A likelihood function can be obtained by using the component weights as prior probabilities. The likelihood of the observations X given the parameters θ is

    p(X \mid \theta) = \prod_{i=1}^{N} p(x_i \mid \theta) = L(\theta \mid X)    (3)

The goal is to find the θ* that maximizes L:

    \theta^{*} = \arg\max_{\theta} L(\theta \mid X)    (4)

The difficulty of the problem depends on the form of p(x | θ). Often the log-likelihood is maximized instead, because it simplifies the computation without changing the maxima of the function. [4]
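As a concrete illustration of equations (1)–(3), the following is a minimal C++ sketch (not part of the project's code base) that evaluates a Gaussian mixture density and the log-likelihood of a data set. Spherical covariances Σ_k = σ_k²I are assumed here for brevity; this matches the simplification used later in Section 3.2.

    #include <cmath>
    #include <cstddef>
    #include <vector>

    // One spherical mixture component: weight alpha_k, mean mu_k, variance sigma_k^2.
    struct Component {
        double alpha;              // mixture weight alpha_k; all weights sum to 1
        std::vector<double> mu;    // d-dimensional mean mu_k
        double sigma2;             // isotropic variance, i.e. Sigma_k = sigma2 * I
    };

    // Equation (2) for a spherical covariance: N(x | mu_k, sigma2 * I).
    double gaussianDensity(const std::vector<double>& x, const Component& c) {
        const double kPi = 3.14159265358979323846;
        double sq = 0.0;
        for (std::size_t i = 0; i < x.size(); ++i) {
            const double diff = x[i] - c.mu[i];
            sq += diff * diff;
        }
        return std::pow(2.0 * kPi * c.sigma2, -0.5 * static_cast<double>(x.size()))
               * std::exp(-0.5 * sq / c.sigma2);
    }

    // Equation (1): p(x | theta) = sum_k alpha_k p(x | theta_k).
    double mixtureDensity(const std::vector<double>& x,
                          const std::vector<Component>& mixture) {
        double p = 0.0;
        for (const Component& c : mixture) p += c.alpha * gaussianDensity(x, c);
        return p;
    }

    // Equation (3) in log form: log L(theta | X) = sum_i log p(x_i | theta).
    double logLikelihood(const std::vector<std::vector<double>>& X,
                         const std::vector<Component>& mixture) {
        double ll = 0.0;
        for (const std::vector<double>& x : X) ll += std::log(mixtureDensity(x, mixture));
        return ll;
    }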

Expectation Maximization Algorithm

Expectation Maximization (EM) is an iterative method for finding the maximum likelihood of θ given X. EM assumes that the data set X is incomplete and that complete data Z = (X, Y) exist, where Y are hidden data points. A joint density

    p(z \mid \theta) = p(x, y \mid \theta) = p(y \mid x, \theta) \, p(x \mid \theta)    (5)

is also assumed. With equation (5), a likelihood function for the complete data Z is defined as

    L(\theta \mid Z) = p(X, Y \mid \theta)    (6)

Figure 1: Training a mixture of 10 Gaussians with the EM algorithm. In (a), all components are initialized with large variances and the same mean; in (b) and (c), the means start to converge towards the training data; (d) shows the converged solution. [2, p. 406]

The EM algorithm first computes the expected value of the log-likelihood with respect to the unobserved data points, given the observed data X and the parameters θ. This is called the E-step of the algorithm:

    Q(\theta, \theta^{(i-1)}) = E\big[\log p(X, Y \mid \theta) \,\big|\, X, \theta^{(i-1)}\big] = \int_{y \in \Upsilon} \log p(X, y \mid \theta) \, p(y \mid X, \theta^{(i-1)}) \, dy    (7)

where:

    X = known data points (constant)
    θ^(i-1) = parameters θ from the previous iteration (constant)
    Y = unobserved data points, a random variable governed by the distribution p(y | X, θ^(i-1))
    θ = the variable that is to be adjusted

The second step is to maximize the previously computed expectation:

    \theta^{(i)} = \arg\max_{\theta} Q(\theta, \theta^{(i-1)})    (8)

When maximizing the log-likelihood of a Gaussian mixture model, the complete-data log-likelihood (7) takes the form

    \sum_{n=1}^{N} \sum_{i=1}^{H} p^{\mathrm{old}}(i \mid x_n) \left\{ -\tfrac{1}{2} (x_n - m_i)^{T} \Sigma_i^{-1} (x_n - m_i) - \tfrac{1}{2} \log\det(2\pi\Sigma_i) + \log p(i) \right\}    (9)

and the computations needed for the E- and M-steps are as shown below.

E-step

    \gamma(z_{nk}) = \frac{\alpha_k \, \mathcal{N}(x_n \mid \mu_k, \Sigma_k)}{\sum_{j=1}^{K} \alpha_j \, \mathcal{N}(x_n \mid \mu_j, \Sigma_j)}

M-step

    \mu_k^{\mathrm{new}} = \frac{1}{N_k} \sum_{n=1}^{N} \gamma(z_{nk}) \, x_n

    \Sigma_k^{\mathrm{new}} = \frac{1}{N_k} \sum_{n=1}^{N} \gamma(z_{nk}) (x_n - \mu_k^{\mathrm{new}})(x_n - \mu_k^{\mathrm{new}})^{T}

    \alpha_k^{\mathrm{new}} = \frac{N_k}{N}, \quad \text{where } N_k = \sum_{n=1}^{N} \gamma(z_{nk})

The E-step and M-step are iterated until the algorithm converges. [2]
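To make the E- and M-steps above concrete, here is a minimal, self-contained C++ sketch of one EM iteration for a one-dimensional Gaussian mixture. It illustrates the update formulas only; it is not the implementation used in the project.

    #include <cmath>
    #include <cstddef>
    #include <vector>

    struct Gauss1D { double alpha, mu, var; };   // weight, mean, variance

    double normalPdf(double x, double mu, double var) {
        const double kPi = 3.14159265358979323846;
        return std::exp(-0.5 * (x - mu) * (x - mu) / var) / std::sqrt(2.0 * kPi * var);
    }

    // One EM iteration: the E-step computes the responsibilities gamma(z_nk),
    // the M-step re-estimates mu_k, Sigma_k and alpha_k from them.
    void emIteration(const std::vector<double>& X, std::vector<Gauss1D>& mix) {
        const std::size_t N = X.size(), K = mix.size();
        std::vector<std::vector<double>> gamma(N, std::vector<double>(K));

        // E-step: gamma(z_nk) = alpha_k N(x_n|mu_k) / sum_j alpha_j N(x_n|mu_j).
        for (std::size_t n = 0; n < N; ++n) {
            double denom = 0.0;
            for (std::size_t k = 0; k < K; ++k) {
                gamma[n][k] = mix[k].alpha * normalPdf(X[n], mix[k].mu, mix[k].var);
                denom += gamma[n][k];
            }
            for (std::size_t k = 0; k < K; ++k) gamma[n][k] /= denom;
        }

        // M-step: closed-form updates using N_k = sum_n gamma(z_nk).
        for (std::size_t k = 0; k < K; ++k) {
            double Nk = 0.0, muNew = 0.0;
            for (std::size_t n = 0; n < N; ++n) {
                Nk += gamma[n][k];
                muNew += gamma[n][k] * X[n];
            }
            muNew /= Nk;
            double varNew = 0.0;
            for (std::size_t n = 0; n < N; ++n)
                varNew += gamma[n][k] * (X[n] - muNew) * (X[n] - muNew);
            mix[k] = { Nk / static_cast<double>(N), muNew, varNew / Nk };
        }
    }

Repeating emIteration until the log-likelihood stops increasing reproduces the convergence behaviour shown in Figure 1.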

2.2 Transformation Models

The point set registration problem is to find a mapping T that maps the model points M so that an error measure between the point sets T(M) and S is minimized. The methods used in this project use a rigid transformation model and two types of non-rigid transformation model: thin plate splines and Gaussian radial basis functions.

2.2.1 Rigid Transformation

A rigid transformation can be considered as a combination of translation, rotation, and uniform scaling; the resulting transformation has 7 parameters. Assuming two pairwise point sets M and S, the translation can be removed by computing the centroid

    \mathrm{Center}(X) = \frac{1}{N} \sum_{n=1}^{N} x_n

of each point set and computing

    X_{\mathrm{centred}} = X - \mathrm{Center}(X)    (10)

To remove the scale difference, the scale of a (centred) point set is measured as the average distance of its points from the origin,

    \mathrm{size}(X) = \frac{1}{N} \sum_{n=1}^{N} \lVert x_n \rVert

and S is brought to the scale of M by computing

    S_{\mathrm{scaled}} = \frac{\mathrm{size}(M)}{\mathrm{size}(S)} \, S    (11)

To remove the relative rotation, the rotation matrix R can be found by minimizing the squared point-wise distances [1]. Usually, however, the point sets are non-pairwise, and the scale factor s, translation t and rotation R have to be found differently. Non-pairwise point set registration is considered in more detail in Section 3.

2.2.2 Non-Rigid Transformation

In addition to the rigid transformation parameters, a non-rigid transformation models the deformation between the point sets. Non-rigid transformations are often modelled using radial basis functions (RBFs). An RBF maps a point x to a location y using the function

    y = \sum_{i=1}^{p} w_i \, \phi(x, p_i)    (12)

where:

    w_i = i-th warping coefficient (a d-dimensional vector)
    p_i = i-th control point (a d-dimensional vector)
    φ(r) = kernel function
    r = Euclidean distance between the two points

Common choices of kernel function are thin plate splines (TPSs) and Gaussian radial basis functions (GRBFs). A TPS has the kernel function φ(r) = r² log(r) in two dimensions and φ(r) = r in three dimensions. A Gaussian kernel is defined as exp(−r²/(2σ²)); it has a free parameter σ, which sets the width of the kernel.

Figure 2: The top row represents an unknown deformation of an image where the mapping of the control points is known; the bottom row represents a thin plate spline deformation with the same control points.

Given two point sets x and y, a set of control points in x, and the corresponding set in y, the deformation between the point sets can be computed. This is illustrated in Figure 2. In point set registration, however, the locations of the control points are known in only one of the point sets, and the task is to find the mapping of each control point so that the error metric between the point sets is minimized. The non-rigid transformation therefore has d · p parameters in addition to the rigid case, where d is the dimension of the point set and p is the number of control points.

An important aspect of non-rigid mappings is the notion of bending energy. When computing a non-rigid point set registration, the algorithms should prefer transformations with little deformation and penalize those with larger or less credible deformation. The bending energy of a non-linear mapping with radial basis functions is defined as follows. Let Q be a matrix representing a set of p control points and K = {K_ij} the kernel matrix with K_ij = φ(p_i, p_j). Given the warping coefficients w, the bending energy of the deformation is

    E_{\mathrm{bending}} = \mathrm{trace}(w^{T} K w)    (13)

To regularize a non-linear transformation, a penalty term (λ/2) trace(w^T K w) can be added to the cost function. λ is a free parameter which dictates the amount of deformation allowed.
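Before moving on to the algorithms, here is a minimal C++ sketch of the RBF machinery in equations (12) and (13): warping a point with a TPS kernel and evaluating the bending energy trace(wᵀKw). The control points and coefficients are placeholders; in the project itself this machinery lives inside the gmmreg library.

    #include <cmath>
    #include <cstddef>
    #include <vector>

    using Point = std::vector<double>;   // one d-dimensional point

    double dist(const Point& a, const Point& b) {
        double s = 0.0;
        for (std::size_t i = 0; i < a.size(); ++i) s += (a[i] - b[i]) * (a[i] - b[i]);
        return std::sqrt(s);
    }

    // TPS kernel: r^2 log r in 2-D, r in 3-D (phi(0) = 0 in both cases).
    double tpsKernel(double r, std::size_t dim) {
        if (r == 0.0) return 0.0;
        return (dim == 2) ? r * r * std::log(r) : r;
    }

    // Equation (12): y = sum_i w_i phi(||x - p_i||).
    Point warp(const Point& x, const std::vector<Point>& ctrl,
               const std::vector<Point>& w) {
        Point y(x.size(), 0.0);
        for (std::size_t i = 0; i < ctrl.size(); ++i) {
            const double phi = tpsKernel(dist(x, ctrl[i]), x.size());
            for (std::size_t d = 0; d < x.size(); ++d) y[d] += w[i][d] * phi;
        }
        return y;
    }

    // Equation (13): E_bending = trace(w^T K w), with K_ij = phi(||p_i - p_j||).
    double bendingEnergy(const std::vector<Point>& ctrl,
                         const std::vector<Point>& w) {
        double e = 0.0;
        for (std::size_t i = 0; i < ctrl.size(); ++i)
            for (std::size_t j = 0; j < ctrl.size(); ++j) {
                const double kij = tpsKernel(dist(ctrl[i], ctrl[j]), ctrl[i].size());
                for (std::size_t d = 0; d < w[i].size(); ++d)
                    e += w[i][d] * kij * w[j][d];   // sums the diagonal of w^T K w
            }
        return e;
    }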

3 Algorithms

The registration algorithms used in this project are described in this section. Iterative Closest Point [3], the most basic algorithm, is used because of its simplicity in implementation and computation. The methods based on Gaussian mixture models, presented in [6] and [7], are described in Sections 3.2.1 and 3.2.2.

3.1 Iterative Closest Point

Iterative Closest Point (ICP) [3] is one of the best-known algorithms for point set registration. ICP automatically finds correspondences between two point sets and uses these correspondences to estimate the transformation.

Algorithm Outline

Different ICP versions find the corresponding points differently. Early versions find the points with the shortest point-to-point distance, whereas later versions use a point-to-surface distance, defined as the distance between the point m and the surface plane passing through the corresponding point s in the scene set. The ICP implementation used in this project uses point-to-point distances.

Until a stopping criterion is met:

1. Find a set of corresponding points between the point sets.
2. Estimate the rigid transformation T using the closed-form solution described in [1].
3. Apply the transformation T to the model point set.

3.2 Point Set Registration With Gaussian Mixture Models

To overcome the limitations of the Iterative Closest Point method, various probabilistic methods have been developed for the point set registration problem. Modelling the point sets as mixtures of Gaussians gives the algorithms flexibility: rather than comparing individual points between the scene and the model, the alignment is performed on probability distributions. This results in better performance, especially in the presence of noise and outliers. In the following two methods, the Gaussian distributions are slightly simplified versions of those described in Section 2.1: all mixture weights are equal, and the covariances of all member distributions are σ²I.

3.2.1 Robust Point Set Registration Using Gaussian Mixture Models

Jian et al. [6] presented a probabilistic framework for registering point sets with both rigid and various non-rigid transformation models. A Gaussian mixture model is constructed from both the scene and the model point set, and numerical optimization is used to minimize the L2 distance between the two GMMs. For non-rigid transformations, two deformation models were used: thin plate splines and Gaussian radial basis functions.

Cost Function

The L2 distance (d_L2) is used to measure the similarity between the mixture models. The L2 distance between two Gaussian mixtures has a closed-form expression, and so does its gradient; this allows a quasi-Newton algorithm to be used to minimize d_L2. The L2 distance between the two GMMs generated from the point sets is defined as

    d_{L2}(S, M, \theta) = \int (g - f)^2 \, dx = \int f^2 \, dx - 2 \int f g \, dx + \int g^2 \, dx    (14)

where:

    f = gmm(T(M, θ)) = the probability density of the Gaussian mixture generated from the model point set after it has undergone the transformation T with parameters θ
    g = gmm(S) = the probability density of the Gaussian mixture generated from the scene point set

Since the scene point set is fixed during the optimization, the term in g² is constant, and only the two terms involving f affect the shape of the cost function.
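The closed form of equation (14) rests on a standard identity for the integral of a product of two Gaussian densities, stated here for background ([6] gives the full derivation):

    \int_{\mathbb{R}^d} \mathcal{N}(x \mid \mu_1, \Sigma_1)\, \mathcal{N}(x \mid \mu_2, \Sigma_2)\, dx = \mathcal{N}(\mu_1 \mid \mu_2, \Sigma_1 + \Sigma_2)

With the spherical, equally weighted components of Section 3.2 (n transformed model points μ_i with weight 1/n, m scene points ν_j with weight 1/m, common variance σ²), the cross term of d_L2 therefore reduces to a double sum of Gaussian evaluations:

    \int f g \, dx = \frac{1}{nm} \sum_{i=1}^{n} \sum_{j=1}^{m} \mathcal{N}(\mu_i \mid \nu_j, 2\sigma^2 I)

and the term in f² has the analogous form over pairs of model points.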

Algorithm Outline

1. Construct Gaussian mixture models from the point sets M and S, set the initial parameters θ of the transformation T and the initial scale parameter σ.
2. Repeat steps 3–5 until a stopping criterion is met.
3. Minimize d_L2(S, M, θ) with a quasi-Newton optimization engine. For non-rigid transformation families, a regularization term must be added.
4. Update the parameters θ.
5. Decrease the scale parameter σ as an annealing step.

3.2.2 Coherent Point Drift

A similar point set registration algorithm called Coherent Point Drift (CPD) was described in [7]. While the method in Section 3.2.1 treats both point sets as mixtures of Gaussians, CPD treats the model point set as a Gaussian mixture and the scene point set as a random sample drawn from the distribution generated by the model point set. In CPD, outliers are modelled by a uniform distribution whose weight can be adjusted when running the algorithm. The method uses an Expectation Maximization framework to find a transformation T that maximizes the expectation of the scene points being generated by the Gaussian mixture constructed from T(M). The objective function (7) is a function of the transformation parameters θ and the variance σ, and the update equations depend on the type of transformation used. For non-rigid registration, Gaussian radial basis functions were used; the additional free parameters for non-rigid registration are therefore the weight λ of the bending energy and the variance of the Gaussian kernel.

4 Software

One result of this project is the software created for point cloud registration. While the code for the registration algorithms was provided, the created software is needed to handle data conversion between the different algorithms and to analyse and visualize the results. The software is written in C++ on the Windows 8.1 platform with Microsoft Visual Studio.

4.1 Libraries

Point Cloud Library (PCL) is an open-source library for point cloud processing. It contains algorithms for various computer vision tasks and is written in C++. In this project, the point sets are represented as pcl::PointCloud objects, and the visualization is done with functions within PCL. PCL has several dependencies, such as Boost, Eigen for linear algebra, and VTK and Qt for visualization.

Vision Numerics Library (VNL) is a library for the numerical computations of computer vision algorithms.

gmmreg api is a library that computes point set registration with Gaussian mixture models; it implements the methods described in Section 3.2. The original sources can be found in [5]. The original version takes a text file and a registration method as input arguments. The input text file contains the configuration parameters of the methods and the paths of the scene, model, and control point sets (which are also .txt files). The output of the original gmmreg api is a text file containing the transformed point set. The modified gmmreg api takes the input point sets, a control point set, and the registration method as arguments, and returns the transformed point set as a VNL matrix.
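Based on the description above, a call into the modified gmmreg api presumably looks something like the following. The function name and exact signature are hypothetical reconstructions; the report only states the inputs (point sets, control points, a method identifier) and that a VNL matrix holding the transformed point set is returned.

    #include <vnl/vnl_matrix.h>   // VNL matrix type from the VXL library
    #include <string>

    // Hypothetical prototype for the modified gmmreg api described above.
    vnl_matrix<double> RunGmmReg(const vnl_matrix<double>& model,
                                 const vnl_matrix<double>& scene,
                                 const vnl_matrix<double>& controlPoints,
                                 const std::string& method) {
        (void)scene; (void)controlPoints; (void)method;
        return model;   // stub: the real library returns T(model)
    }

    // Usage: each row of a matrix is one point (x, y, z).
    vnl_matrix<double> registerTpsL2(const vnl_matrix<double>& model,
                                     const vnl_matrix<double>& scene,
                                     const vnl_matrix<double>& ctrl) {
        return RunGmmReg(model, scene, ctrl, "TPS_L2");
    }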

4.2 Program Structure

The structure of the program is the following: six classes were created to conduct the point set registration, and the modified code for GMM registration was built as a static library and included in the project.

RegistrationManager

The class RegistrationManager has an instance of each of the other classes as a member. In the main function, a new RegistrationManager is created. Then, the member function RegistrationManager::InitalizePointClouds is called, which calls the member function PointLoader::Load for the point clouds that were given as arguments to the program. Next, the member function RegistrationManager::InitalizeAlgorithm is called, which creates a RegistrationAlgorithm object according to the input arguments passed to the program. The algorithm is then run with the function RegistrationManager::RunAlgorithm, which runs the created algorithm on the point clouds that were loaded earlier. Finally, RegistrationManager::VisualizeResults is called to visualize the results of the registration.

PointLoader

The class PointLoader is responsible for loading and saving the point clouds. PointLoader can load and save point clouds in the .pcd and .txt file formats.

RegistrationAlgorithm

The class RegistrationAlgorithm is a base class for the different types of registration algorithm objects. It has member functions for centering and downsampling the point clouds, and abstract functions Run and PreProcessPointClouds, whose implementations depend on the type of RegistrationAlgorithm.

GMM_algorithm

The class GMM_algorithm represents the RegistrationAlgorithm for the methods in the gmmreg api (EM_TPS, EM_GRBF, rigid, TPS_L2, and TPS_GRBF). Its implementation of PreProcessPointClouds is the following: the point clouds are downsampled and centered, a set of control points is generated (unless the rigid method is used), and VNL matrices are created from the PointCloud objects. The member function Run simply calls the gmmreg api with the VNL matrices, the control points, and the configuration file associated with the method.

ICP_algorithm

The class ICP_algorithm represents the RegistrationAlgorithm for the ICP method. The member function PreProcessPointClouds downsamples and centers the point clouds, and the member function Run runs the registration algorithm (which is implemented in the Point Cloud Library) and transforms the model cloud with the parameters that the ICP algorithm computed.

Visualizer

The class Visualizer takes care of visualizing the results using the pcl_visualizer class of the Point Cloud Library. The member function SetPointClouds sets a vector of point clouds on the stage, and Run simply runs the visualizer. The point clouds on stage can be toggled on and off by pressing the key that corresponds to the point cloud (e.g., pressing 1 toggles the first element of the vector passed to SetPointClouds).
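As a condensed sketch of what the pre-processing and ICP_algorithm::Run amount to in PCL terms: the voxel-grid filter below is an assumption (the report does not say which downsampling method is used), while the centroid, demeaning and ICP calls are standard PCL functionality.

    #include <pcl/point_types.h>
    #include <pcl/point_cloud.h>
    #include <pcl/common/centroid.h>
    #include <pcl/filters/voxel_grid.h>
    #include <pcl/registration/icp.h>

    using Cloud = pcl::PointCloud<pcl::PointXYZ>;

    // Downsample and centre a cloud, roughly what PreProcessPointClouds does.
    Cloud::Ptr preprocess(const Cloud::Ptr& in, float leaf) {
        Cloud::Ptr down(new Cloud), centred(new Cloud);

        pcl::VoxelGrid<pcl::PointXYZ> grid;      // assumed downsampling filter
        grid.setInputCloud(in);
        grid.setLeafSize(leaf, leaf, leaf);
        grid.filter(*down);

        Eigen::Vector4f centroid;
        pcl::compute3DCentroid(*down, centroid); // centre of mass of the cloud
        pcl::demeanPointCloud(*down, centroid, *centred);
        return centred;
    }

    // Rigid registration with PCL's ICP, as in ICP_algorithm::Run.
    Eigen::Matrix4f runIcp(const Cloud::Ptr& model, const Cloud::Ptr& scene) {
        pcl::IterativeClosestPoint<pcl::PointXYZ, pcl::PointXYZ> icp;
        icp.setInputSource(model);   // moving model point set M
        icp.setInputTarget(scene);   // fixed scene point set S
        Cloud aligned;
        icp.align(aligned);          // iterates correspondences + transform estimation
        return icp.getFinalTransformation();
    }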

Figure 3: Visualization of the model point sets m505, m503 and m506.

Figure 4: Visualization of the scene point sets yellsmall_mug, green_mug and yellow_mug.

4.3 Instructions For Running The Software

The program takes three input arguments: Scene, Model, and Method. The point sets can be in the .txt or .pcd file format. Valid methods are:

ICP: Iterative Closest Point, described in [3].

rigid: Rigid registration with GMMs, described in [6]; uses the L2 distance as the error measure. The corresponding configuration file is parameters_rigid.ini.

TPS_L2: Non-rigid registration with GMMs, described in [6]; uses thin plate spline deformations and the L2 distance. The corresponding configuration file is parameters_TPS_L2.ini.

GRBF_L2: Non-rigid registration with GMMs, described in [6]; uses Gaussian radial basis function deformations and the L2 distance. The corresponding configuration file is parameters_GRBF_L2.ini.

EM_GRBF: Non-rigid registration with GMMs, described in [7]; uses Gaussian radial basis function deformations and the EM framework for transformation optimization. The corresponding configuration file is parameters_EM_GRBF.ini.

The input files, configuration files, and data set files should be in the same directory as the executable. The following example commands execute the program and compute a non-rigid registration with the L2 distance and thin plate splines:

    ./deformable_object_registration scene.pcd model.pcd TPS_L2          (Linux)
    Deformable_object_registration.exe scene.pcd model.pcd TPS_L2        (Windows)

5 Experiments

This section demonstrates the performance of the registration methods discussed in Section 3. First, simple two-dimensional test cases are considered, followed by two test cases with real-world data.

Figure 5: The results of the Iterative Closest Point algorithm for rotated point sets. The results for a rotation of π/8 are shown on the first row, and the results for a rotation of π/4 on the second row.

5.1 Generated Experiments

In the first test, a two-dimensional point set was rotated, and the registration algorithms were used to find the transformation between the rotated and the original point set. Iterative Closest Point gave good results as long as the rotation was sufficiently small: with a rotation of π/8 the result was accurate, but the algorithm converged to a local minimum when the rotation was π/4 or larger. The results are presented in Figure 5. All the GMM methods produced accurate results for any given angle.

The second test was similar to the first, except that some of the points were removed from the scene point set. Iterative Closest Point behaved as in the first test: the correct rotation was found at a rotation angle of π/8, and the algorithm converged to a local minimum for larger angles. The rigid GMM method (the L2 distance method) gave accurate results at any rotation angle. The results of the non-rigid GMM methods depended on the input parameters given to the algorithms: when the bending energy of the deformation was not penalized enough, the algorithms produced overfitted transformations. An overfitted transformation is presented in Figure 6.
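For reference, the rotated test sets above can be generated along these lines (a sketch; the report does not include its test code). Rotating a copy of the point set gives the scene set for the first test, and removing a subset of the rotated points gives the second, partial-overlap test.

    #include <cmath>
    #include <utility>
    #include <vector>

    using Point2D = std::pair<double, double>;

    // Rotate a 2-D point set about the origin by `angle` radians, e.g. pi/8.
    std::vector<Point2D> rotate(const std::vector<Point2D>& pts, double angle) {
        const double c = std::cos(angle), s = std::sin(angle);
        std::vector<Point2D> out;
        out.reserve(pts.size());
        for (const Point2D& p : pts)
            out.push_back({ c * p.first - s * p.second,
                            s * p.first + c * p.second });
        return out;
    }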

Figure 6: An overfitted transformation, computed with the L2 method and thin plate splines. The registration gave better results when the bending energy of the transformation was penalized.

5.2 Experiments with real-world data

The data sets used in these experiments were created with a Microsoft Kinect. Model point sets were created by moving the Kinect sensor around the object, and scene point sets were created by extracting the object from a single range image taken with the Kinect. The model point sets were considerably larger, having around 10,000–50,000 points, while the scene point sets have around 2,000–3,000 points. Examples of model point sets are shown in Figure 3, and examples of scene point sets in Figure 4. When running the algorithms, the point sets were downsampled to roughly 1,000 points each.

For the first test, coffee cups with similar shapes were chosen: m503 as the model point set and yellsmall_mug as the scene point set. The point sets are presented in Figures 3 and 4. Iterative Closest Point did not work at all: the algorithm was trapped in a local minimum, and the transformed model point set remained very close to the original. The rigid GMM method found a reasonably good solution; the overall alignment was good, except that the handles of the mugs did not align. The authors of [6] point out that this local minimum issue arises when the point sets have symmetries or repeating patterns in their spatial structure. The non-rigid GMM methods worked as well as the rigid method when the regularization parameters were set to reasonable values; when the bending energy was not penalized, overfitting similar to Figure 6 occurred.

For the second and final test, mugs with considerably different shapes were chosen: m506 as the model point set and red2_mug as the scene point set. The results of ICP were similar to the previous case: ICP could not handle the large number of outliers, and the local minimum to which the algorithm converged was close to the identity transform. The rigid GMM method did not work well either; the different shapes of the coffee cups resulted in a solution where the bottom part of the model was fitted to the side of the scene point set. The results are shown in Figure 7. The non-rigid GMM methods worked better with these point sets: with the resulting transformation, the model point set was stretched to align reasonably well with the scene point set, except for the handle. The results are shown in Figure 8.

Figure 7: The rigid registration with GMMs was trapped in a local minimum when the data sets had considerably different shapes.

Figure 8: The non-rigid registration gave good results when aligning two coffee cups of different shapes.

    Work package                                                Time reserved   Time used
    1. Get an overview of the different methods                 20 h            20 h
    2. Pre-study of the techniques                              60 h            50 h
    3. Compiling and running the code                           50 h            105 h
    4. Running with the datasets provided by school             20 h            5 h
    5. Tweaking on the methods (parameters, approximations)     50 h            30 h
    6. Documentation                                            16 h            28 h
    Total                                                       216 h           238 h

Table 1: A summary of time use: the total time used for the work packages.

For the non-rigid methods, a set of parameters was found that worked reasonably well across the data set. A rather small set of control points (~100) was found to be sufficient. Even though this value gave good results within the data set provided, the required size of the control point set depends on the nature of the deformation, and choosing the number of control points is a trade-off between computing speed and accuracy. The deformation penalization coefficient λ produced overfitted results when close to zero, and close-to-rigid results when large.

6 Conclusions

Clearly, modelling the point sets with Gaussian mixture models makes the registration robust compared to the basic ICP. Registration computed on discrete point sets is sensitive to outliers and noise, and both can be modelled with mixture models. Iterative Closest Point, however, is relatively simple to implement and runs fast, and improved versions have been developed to overcome its problems; these versions are outside the scope of this project, and the results are based only on the basic implementation of ICP.

Point set registration with mixtures of Gaussians has some issues of its own. All the methods require some tuning of free parameters, and while a set of parameters might work well with one data set, it may not work as well with another, different type of data set. Another issue is speed: while ICP computes a result relatively quickly, the use of Gaussian mixtures requires extensive computation. The rigid GMM method had running times of 1–5 seconds, and the non-rigid methods took longer, when the data sets had around 1,000 points and around 100 control points were used. To make these methods faster, the data sets can be downsampled as a pre-processing step, which can work well in some cases, but it adds another free tunable parameter, and some precision is lost in the downsampling. Another consideration, mentioned in [6] and [7], is the use of the Fast Gauss Transform during the computation of the methods.

Two different deformation models were used to conduct non-rigid point set registration. Both GRBFs and TPSs provide a framework for modelling deformations via a set of control points. Personally, I could not see much difference in the results between the two. A GRBF has one more free parameter to tune, which in itself can make the tuning more difficult.

6.1 The Summary Of The Project

Time management was quite optimistic throughout the project, and the work packages introduced in the project plan could have been more meaningful. Programming and related tasks, for example, took twice as much time as planned: the work package was called "Compiling and running the [already provided] code", yet the final program consisted of six classes on top of the provided code. Table 1 summarizes the time use of the work packages. The total hours worked were recorded, but the division between the work packages is an estimate, since the work packages themselves were not defined accurately enough.
The registration methods used in the project were chosen quite early on, and the two chosen methods were quite similar. It could have been interesting to study and use methods more different from one another. On the other hand, the subject was quite difficult, and the methods used a lot of machine learning and linear algebra; understanding the ideas behind the chosen methods was already a great challenge.

The project plan states the following about the goal of the project: "The goal of the project is to make a robot recognize objects in its environment using state-of-the-art algorithms. Several approaches have been used to solve the problem, and in this project, different methods will be compared in regards of computing demands and performance." The methods were chosen, a piece of software was programmed, and point sets were registered, so in my opinion the project was successful.

References

[1] K. Somani Arun, Thomas S. Huang, and Steven D. Blostein. Least-squares fitting of two 3-D point sets. IEEE Transactions on Pattern Analysis and Machine Intelligence, 9(5):698–700, 1987.

[2] David Barber. Bayesian Reasoning and Machine Learning. Cambridge University Press, 2012.

[3] Paul J. Besl and Neil D. McKay. A method for registration of 3-D shapes. In Robotics-DL Tentative. International Society for Optics and Photonics, 1992.

[4] Jeff A. Bilmes et al. A gentle tutorial of the EM algorithm and its application to parameter estimation for Gaussian mixture and hidden Markov models. International Computer Science Institute, 4(510):126, 1998.

[5] Bing Jian and Baba C. Vemuri. Repository for the implementations of the robust point set registration algorithm.

[6] Bing Jian and Baba C. Vemuri. Robust point set registration using Gaussian mixture models. IEEE Transactions on Pattern Analysis and Machine Intelligence, 33(8):1633–1645, 2011.

[7] Andriy Myronenko and Xubo Song. Point set registration: Coherent point drift. IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(12):2262–2275, 2010.


More information

Dynamic Thresholding for Image Analysis

Dynamic Thresholding for Image Analysis Dynamic Thresholding for Image Analysis Statistical Consulting Report for Edward Chan Clean Energy Research Center University of British Columbia by Libo Lu Department of Statistics University of British

More information

Homework #4 Programming Assignment Due: 11:59 pm, November 4, 2018

Homework #4 Programming Assignment Due: 11:59 pm, November 4, 2018 CSCI 567, Fall 18 Haipeng Luo Homework #4 Programming Assignment Due: 11:59 pm, ovember 4, 2018 General instructions Your repository will have now a directory P4/. Please do not change the name of this

More information

Segmentation: Clustering, Graph Cut and EM

Segmentation: Clustering, Graph Cut and EM Segmentation: Clustering, Graph Cut and EM Ying Wu Electrical Engineering and Computer Science Northwestern University, Evanston, IL 60208 yingwu@northwestern.edu http://www.eecs.northwestern.edu/~yingwu

More information

Unsupervised Learning : Clustering

Unsupervised Learning : Clustering Unsupervised Learning : Clustering Things to be Addressed Traditional Learning Models. Cluster Analysis K-means Clustering Algorithm Drawbacks of traditional clustering algorithms. Clustering as a complex

More information

Supervised Learning for Image Segmentation

Supervised Learning for Image Segmentation Supervised Learning for Image Segmentation Raphael Meier 06.10.2016 Raphael Meier MIA 2016 06.10.2016 1 / 52 References A. Ng, Machine Learning lecture, Stanford University. A. Criminisi, J. Shotton, E.

More information

CSE 5243 INTRO. TO DATA MINING

CSE 5243 INTRO. TO DATA MINING CSE 5243 INTRO. TO DATA MINING Cluster Analysis: Basic Concepts and Methods Huan Sun, CSE@The Ohio State University 09/25/2017 Slides adapted from UIUC CS412, Fall 2017, by Prof. Jiawei Han 2 Chapter 10.

More information

Unsupervised Learning

Unsupervised Learning Networks for Pattern Recognition, 2014 Networks for Single Linkage K-Means Soft DBSCAN PCA Networks for Kohonen Maps Linear Vector Quantization Networks for Problems/Approaches in Machine Learning Supervised

More information

Methods for Intelligent Systems

Methods for Intelligent Systems Methods for Intelligent Systems Lecture Notes on Clustering (II) Davide Eynard eynard@elet.polimi.it Department of Electronics and Information Politecnico di Milano Davide Eynard - Lecture Notes on Clustering

More information

Expectation Maximization. Machine Learning 10701/15781 Carlos Guestrin Carnegie Mellon University

Expectation Maximization. Machine Learning 10701/15781 Carlos Guestrin Carnegie Mellon University Expectation Maximization Machine Learning 10701/15781 Carlos Guestrin Carnegie Mellon University April 10 th, 2006 1 Announcements Reminder: Project milestone due Wednesday beginning of class 2 Coordinate

More information

Grundlagen der Künstlichen Intelligenz

Grundlagen der Künstlichen Intelligenz Grundlagen der Künstlichen Intelligenz Unsupervised learning Daniel Hennes 29.01.2018 (WS 2017/18) University Stuttgart - IPVS - Machine Learning & Robotics 1 Today Supervised learning Regression (linear

More information

732A54/TDDE31 Big Data Analytics

732A54/TDDE31 Big Data Analytics 732A54/TDDE31 Big Data Analytics Lecture 10: Machine Learning with MapReduce Jose M. Peña IDA, Linköping University, Sweden 1/27 Contents MapReduce Framework Machine Learning with MapReduce Neural Networks

More information

Introduction to Machine Learning

Introduction to Machine Learning Department of Computer Science, University of Helsinki Autumn 2009, second term Session 8, November 27 th, 2009 1 2 3 Multiplicative Updates for L1-Regularized Linear and Logistic Last time I gave you

More information

Shape Contexts. Newton Petersen 4/25/2008

Shape Contexts. Newton Petersen 4/25/2008 Shape Contexts Newton Petersen 4/25/28 "Shape Matching and Object Recognition Using Shape Contexts", Belongie et al. PAMI April 22 Agenda Study Matlab code for computing shape context Look at limitations

More information