08 An Introduction to Dense Continuous Robotic Mapping
1 NAVARCH/EECS 568, ROB, Winter. An Introduction to Dense Continuous Robotic Mapping. Maani Ghaffari, March 14, 2018
2 Previously: Occupancy Grid Maps Pose SLAM graph and its associated dense occupancy grid map
3 Extension to 3D Occupancy Maps: OctoMap An Efficient Probabilistic 3D Mapping Framework Based on Octrees: Freiburg campus (left) and Freiburg building 079 (right) 3
4 Semantic OctoMap Given a semantically labeled point cloud, compute a semantic class label per voxel: Stanford 2D-3D-Semantics Dataset (left) and NYU Depth Dataset V2 (right) 4
5 Semantic OctoMap + Conditional Random Fields (CRFs) Conditional Random Fields-based octree map with a voxel resolution of 0.1m: Multi-class CRF-based OctoMap; Courtesy D. Lang, Semantic 3D Octree Maps based on Conditional Random Fields 5
6 Semantic OctoMap + Conditional Random Fields (CRFs) By using CRFs on top of octree maps we can account for local correlation between voxels in the map, relaxing the fundamental assumption of marginalized inference per voxel. Multi-class CRF-based OctoMap; Courtesy: D. Lang, Semantic 3D Octree Maps based on Conditional Random Fields 6
7 Q. Where do we get segmented point clouds? Labeled (segmented) point cloud: 2D image segmentation transferred to 3D; for example, see Fully Convolutional Networks for Semantic Segmentation, or SegNet. Directly labeling point clouds in 3D; for example, see PointNet. This is an active area; probably more works will appear as we talk... Maybe in a few years we can buy a camera with an embedded AI that produces such high-level measurements!
8 Labeled (Segmented) Point Cloud Segmented point clouds can be seen as high-level measurements or prior over a local area of the semantic map: Finding a good prior can be hard; Informative prior = better posterior or more efficient inference! Many advances in deep learning can be used for better initialization and prior over the state of desired random variables. 8
9 Motivation Can we build a model of the environment that takes into account structural and semantic correlation through joint inference; is continuous, so that queries can be made at any resolution; can deal with sparse measurements; and is computationally tractable?
10 Continuous Robotic Mapping Continuous mapping techniques are based on regression and classification methods (supervised learning), heavily studied in statistics and machine learning: K. P. Murphy, Machine Learning: A Probabilistic Perspective; C. Bishop, Pattern Recognition and Machine Learning; C. E. Rasmussen and C. K. I. Williams, Gaussian Processes for Machine Learning; B. Schölkopf and A. J. Smola, Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond. Supervised learning problems have been studied for more than a century in statistics; we can apply many of the available methods, with minor modifications, to the robotic mapping problem.
11 Continuous Robotic Mapping Supervised learning can be divided into regression and classification problems: regression is concerned with the prediction of continuous quantities; classification assigns an input pattern to an output class where outputs are discrete class labels. 11
12 Bayesian vs. Non-Bayesian Inference approaches: Bayesian inference is preferred, as it gives the posterior distribution as the output. Non-Bayesian inference relies on point estimation, where often no notion of uncertainty is available at prediction time.
13 Continuous Robotic Mapping We can use binary models for occupancy mapping and multi-class models for semantic mapping. Some of the available options: kernelized Bayesian logistic regression; Relevance Vector Machine (RVM): Bayesian, sparse, fast; Gaussian Processes (GPs): fully Bayesian, with exact inference in regression if a Gaussian likelihood is used, but slower, with cubic time complexity.
14 Today: Gaussian Processes An Introduction to Gaussian Processes 14
16 Gaussian Processes What is a Gaussian Process (GP)? GPs are nonparametric Bayesian regression techniques that employ statistical inference to learn dependencies between points in a data set. Definition (Gaussian process) A Gaussian process is a collection of random variables, any finite number of which have a joint Gaussian distribution. 15
17 Gaussian Processes What is a Gaussian Process? It is a generalization of the multivariate normal distribution: a GP places distributions over functions rather than vectors. It is a non-parametric kernel method.
18 Gaussian Processes Functions drawn at random from the GP prior (plot of output $f(x)$ versus input $x$).
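As a rough illustration of drawing functions from a GP prior, here is a minimal numpy sketch (not the lecture's code; the squared-exponential kernel, unit length-scale, and grid are illustrative assumptions):

```python
import numpy as np

def se_kernel(x1, x2, ell=1.0):
    # squared-exponential covariance: k(x, x') = exp(-(x - x')^2 / (2 ell^2))
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(0)
x = np.linspace(-5.0, 5.0, 100)                 # input grid
K = se_kernel(x, x) + 1e-8 * np.eye(x.size)     # small jitter for numerical stability
f = rng.multivariate_normal(np.zeros(x.size), K, size=3)  # three prior draws of f(x)
```

Each row of `f` is one random function evaluated on the grid; plotting the rows against `x` reproduces the kind of prior-sample figure shown on this slide.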
19 Gaussian Processes Functions drawn at random from the GP predictive posterior (plot of output $f(x)$ versus input $x$).
20 Gaussian Processes Observations are noisy due to sensor noise: $y = f(x) + \epsilon_y$, where $\epsilon_y \sim \mathcal{N}(0, \sigma_n^2)$. The joint distribution of the observed target values, $y$, and the function values (the latent variable), $f_*$, at the query points: $\begin{bmatrix} y \\ f_* \end{bmatrix} \sim \mathcal{N}\left(0, \begin{bmatrix} K(X,X) + \sigma_n^2 I_n & K(X, X_*) \\ K(X_*, X) & K(X_*, X_*) \end{bmatrix}\right)$
21 Gaussian Processes The joint distribution of the observed target values, $y$, and the function values (the latent variable), $f_*$, at the query points: $\begin{bmatrix} y \\ f_* \end{bmatrix} \sim \mathcal{N}\left(0, \begin{bmatrix} K(X,X) + \sigma_n^2 I_n & K(X, X_*) \\ K(X_*, X) & K(X_*, X_*) \end{bmatrix}\right)$. Here $k(\cdot,\cdot)$ is the covariance function; $x_*$ a query/test point; $y$ the targets (observations); $X$ the $d \times n$ design matrix of aggregated input vectors (training points); $\sigma_n^2$ the sensor noise variance; $I_n$ the identity matrix of size $n$; and $f_*$ the latent variable.
22 Gaussian Processes The predictive conditional distribution for a single query point, $f_* \mid X, y, x_* \sim \mathcal{N}(\mathbb{E}[f_*], \mathbb{V}[f_*])$: $\mathbb{E}[f_*] = k(X, x_*)^T [K(X,X) + \sigma_n^2 I_n]^{-1} y$ and $\mathbb{V}[f_*] = k(x_*, x_*) - k(X, x_*)^T [K(X,X) + \sigma_n^2 I_n]^{-1} k(X, x_*)$.
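These two predictive equations can be sketched in a few lines of numpy on a hypothetical 1-D dataset (the sine target, length-scale, and noise level are assumptions for illustration, not values from the lecture):

```python
import numpy as np

def se_kernel(A, B, ell=1.0):
    # squared-exponential covariance between two sets of 1-D inputs
    d = A[:, None] - B[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(1)
X = np.linspace(-3.0, 3.0, 20)                      # training inputs
sigma_n = 0.1                                       # sensor noise std
y = np.sin(X) + sigma_n * rng.standard_normal(X.size)

Xs = np.linspace(-3.0, 3.0, 50)                     # query points x_*
Ky = se_kernel(X, X) + sigma_n**2 * np.eye(X.size)  # K(X,X) + sigma_n^2 I_n
Ks = se_kernel(X, Xs)                               # k(X, x_*)

mean = Ks.T @ np.linalg.solve(Ky, y)                # E[f_*]
var = np.diag(se_kernel(Xs, Xs) - Ks.T @ np.linalg.solve(Ky, Ks))  # V[f_*]
```

`mean` and `var` give the posterior mean and marginal variance at each query point; in practice a Cholesky factorization of `Ky` is used instead of repeated `solve` calls.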
23 Gaussian Processes Examples of covariance functions $k(r)$ using unit length-scale (plot of covariance versus distance $r$): Squared Exponential, Matérn $\nu = 5/2$, and Sparse.
24 Gaussian Processes From left, Squared Exponential, Matérn ν = 5/2, and Sparse covariance functions. 23
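Sketches of these three covariance functions as functions of the distance r, assuming unit variance (the specific compactly supported "sparse" form below follows the common Melkumyan-and-Ramos-style kernel, which is an assumption; the lecture's exact sparse kernel may differ):

```python
import numpy as np

def squared_exponential(r, ell=1.0):
    return np.exp(-0.5 * (r / ell) ** 2)

def matern52(r, ell=1.0):
    # Matérn covariance with nu = 5/2
    s = np.sqrt(5.0) * np.abs(r) / ell
    return (1.0 + s + s**2 / 3.0) * np.exp(-s)

def sparse_cov(r, ell=1.0):
    # compactly supported kernel: exactly zero for r >= ell,
    # which makes the Gram matrix sparse
    r = np.abs(np.asarray(r, dtype=float))
    t = np.minimum(r / ell, 1.0)
    k = ((2.0 + np.cos(2.0 * np.pi * t)) / 3.0) * (1.0 - t) \
        + np.sin(2.0 * np.pi * t) / (2.0 * np.pi)
    return np.where(r < ell, k, 0.0)
```

All three equal 1 at r = 0 and decay with distance; only the sparse kernel reaches exactly zero at finite distance, which is what enables sparse linear algebra in large maps.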
25 Gaussian Processes: Model Selection Hyperparameters: the free parameters of the mean, covariance, and likelihood functions. Learning hyperparameters $\theta$: minimize the negative log marginal likelihood, where $\log p(y \mid X, \theta) = -\frac{1}{2} y^T (K(X,X) + \sigma_n^2 I_n)^{-1} y - \frac{1}{2} \log |K(X,X) + \sigma_n^2 I_n| - \frac{n}{2} \log 2\pi$.
26 Gaussian Processes: Model Selection In $\log p(y \mid X, \theta) = -\frac{1}{2} y^T (K(X,X) + \sigma_n^2 I_n)^{-1} y - \frac{1}{2} \log |K(X,X) + \sigma_n^2 I_n| - \frac{n}{2} \log 2\pi$: the term $-\frac{1}{2} y^T (K(X,X) + \sigma_n^2 I_n)^{-1} y$ is the data-fit; $-\frac{1}{2} \log |K(X,X) + \sigma_n^2 I_n|$ penalizes the model complexity; and $-\frac{n}{2} \log 2\pi$ is a constant.
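A sketch of evaluating this objective with a Cholesky factorization (a zero-mean GP with a squared-exponential kernel is assumed; the sign is flipped so the function returns the negative log marginal likelihood, to be minimized):

```python
import numpy as np

def neg_log_marginal_likelihood(X, y, ell, sigma_n):
    d = X[:, None] - X[None, :]
    K = np.exp(-0.5 * (d / ell) ** 2) + sigma_n**2 * np.eye(X.size)
    L = np.linalg.cholesky(K)                            # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # K^{-1} y
    data_fit = 0.5 * y @ alpha                           # 1/2 y^T K^{-1} y
    complexity = np.sum(np.log(np.diag(L)))              # 1/2 log |K|
    constant = 0.5 * X.size * np.log(2.0 * np.pi)
    return data_fit + complexity + constant

# hypothetical usage: pick the length-scale with the lowest objective on a grid
X = np.linspace(-3.0, 3.0, 25)
y = np.sin(X)
ells = [0.05, 1.0, 20.0]
best = min(ells, key=lambda ell: neg_log_marginal_likelihood(X, y, ell, 0.1))
```

The grid search above is only for illustration; in practice the objective is minimized with a gradient-based optimizer.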
27 Gaussian Processes Classification (GPC) Supervised classification: the problem of learning input-output mappings from a training dataset with discrete outputs (class labels). Binary GP classification: define class labels as $y \in \{\pm 1\}$.
28 Gaussian Processes Classification In GPC, inference is performed in two steps: first, compute the predictive distribution of the latent variable corresponding to a query case, $f_* \mid X, y, x_* \sim \mathcal{N}(\mathbb{E}[f_*], \mathbb{V}[f_*])$; then make a probabilistic prediction, $p(y_* = +1 \mid X, y, x_*)$, using a sigmoid function $\sigma(\cdot)$ that assigns class labels with a probability that increases monotonically with the latent mean.
29 A Direct Application Example: Semantic Mapping Gaussian Processes Semantic Map (GPSM) 28
30 Features of GPSM Gaussian Processes Semantic Map uses raw pixelated semantic measurements (class labels) in the map inference process. It generalizes the traditional occupied/unoccupied class assignment, i.e., from binary to multi-class classification. It infers the structural and semantic correlation from measurements rather than resorting to assumptions; it can therefore infer missing labels and deal with sparse measurements.
31 Features of GPSM is continuous, and queries can be made at any desired locations; therefore, the map can be inferred with any resolution. is agnostic to the input dimensions and can handle an arbitrary number of non-spatial dimensions. 30
32 Problem Statement Problem (Gaussian processes semantic map) Given a point cloud measurement that is (possibly partially) assigned with noisy semantic class labels, infer a semantic map representation of the point cloud as a Gaussian process. 31
37 Problem Formulation Map: an $n_m$-tuple random variable $(M^{[1]}, \ldots, M^{[n_m]})$ such that $m^{[i]} \sim \mathcal{N}(\mu^{[i]}, v^{[i]})$, $i \in \{1{:}n_m\}$. Spatial coordinates of the map: $\mathcal{X} \subset \mathbb{R}^3$. Semantic class labels: $\mathcal{C} = \{c^{[j]}\}_{j=1}^{n_c}$. Set of possible measurements: $\mathcal{Z} = \mathcal{X} \times \mathcal{C}$. Observation: an $n_z$-tuple random variable $(Z^{[1]}, \ldots, Z^{[n_z]})$ such that $z^{[k]} \in \mathcal{Z}$, $k \in \{1{:}n_z\}$, $z^{[k]} = (x^{[k]}, y^{[k]})$, $x^{[k]} \in \mathcal{X}$, and $y^{[k]} \in \mathcal{C}$.
38 Problem Formulation Training set: $\mathcal{D} = \{(x^{[i]}, y^{[i]})\}_{i=1}^{n_t}$, $\mathcal{D} \subset \mathcal{Z}$. Target vector: $y = \mathrm{vec}(y^{[1]}, \ldots, y^{[n_t]})$.
39 Problem Formulation Given observations $Z = z$, we wish to estimate $p(M = m \mid Z = z)$. Infer the map as a Gaussian process by defining the process as the function $y : \mathcal{X} \to \mathcal{M}$: $y(x) \sim \mathcal{GP}(f_m(x), k(x, x'))$. For any map point, $m^{[i]} = y(x^{[i]}) \sim \mathcal{N}(\mu^{[i]}, v^{[i]})$.
40 Problem Formulation We use one-vs.-rest multi-class classification, i.e., $n_c$ binary GP classifiers. Once the mean and variance of the latent $f_*$ are available, predict the averaged predictive probability of class $c^{[j]}$: $p(c^{[j]} = +1 \mid \mathcal{D}, x_*) = \int \sigma(u)\, \mathcal{N}(u \mid \mathbb{E}[f_*], \mathbb{V}[f_*])\, du$.
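When the sigmoid is the probit, Φ, this averaged predictive integral has the closed form Φ(E[f_*] / √(1 + V[f_*])). A sketch of the one-vs.-rest decision using that closed form (the three latent means and variances below are hypothetical, purely for illustration):

```python
import numpy as np
from math import erf, sqrt

def probit(u):
    # standard normal CDF, Phi(u)
    return 0.5 * (1.0 + erf(u / sqrt(2.0)))

def averaged_predictive(mu, var):
    # closed form of  ∫ Phi(u) N(u | mu, var) du  =  Phi(mu / sqrt(1 + var))
    return probit(mu / sqrt(1.0 + var))

# one-vs.-rest: run n_c binary classifiers, then pick the most probable class
mus = [1.2, -0.3, 0.4]      # hypothetical latent means E[f_*], one per class
vars_ = [0.5, 0.8, 1.0]     # hypothetical latent variances V[f_*], one per class
probs = [averaged_predictive(m, v) for m, v in zip(mus, vars_)]
label = int(np.argmax(probs))  # predicted class index
```

For other sigmoids (e.g. the logistic), the integral has no closed form and is typically approximated by quadrature or Monte Carlo.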
41 Two-dimensional Toy Example A two-dimensional toy example of GP multi-class classification using a synthetic dataset. 36
42 Results: Sparse and Missing Measurements Effects Image (top left), ground truth (top right), Semantic OctoMap (bottom left), GPSM (bottom right) 37
43 Results: Mapping under Noisy and Misclassified Labels Image (top left), ground truth (top right), Semantic OctoMap (bottom left), GPSM (bottom right) 38
44 Computational Complexity Analysis GPSM using the Fully Independent Training Conditional (FITC) approximation for large-scale inference scales as $O(n_t n_u^2)$, where $n_u$ is the number of inducing points and $n_u \ll n_t$. Semantic OctoMap uses an octree data structure and scales as $O(n_p \log n_n)$, where $n_p$ and $n_n$ are the number of points in the point cloud and the total number of nodes, respectively.
45 Potential Extensions Hierarchical structure for geometry and semantics; topological hierarchy; integration of object detection or instance segmentation together with scene flow estimation and SLAM for managing dynamic environments; map fusion in continuous or discrete form depending on the application; extending the input space using deep kernel learning; sparse kernels; ...
More informationLecture 27, April 24, Reading: See class website. Nonparametric regression and kernel smoothing. Structured sparse additive models (GroupSpAM)
School of Computer Science Probabilistic Graphical Models Structured Sparse Additive Models Junming Yin and Eric Xing Lecture 7, April 4, 013 Reading: See class website 1 Outline Nonparametric regression
More informationGlobal modelling of air pollution using multiple data sources
Global modelling of air pollution using multiple data sources Matthew Thomas M.L.Thomas@bath.ac.uk Supervised by Dr. Gavin Shaddick In collaboration with IHME and WHO June 14, 2016 1/ 1 MOTIVATION Air
More informationGenerative and discriminative classification techniques
Generative and discriminative classification techniques Machine Learning and Category Representation 013-014 Jakob Verbeek, December 13+0, 013 Course website: http://lear.inrialpes.fr/~verbeek/mlcr.13.14
More informationNon-Stationary Covariance Models for Discontinuous Functions as Applied to Aircraft Design Problems
Non-Stationary Covariance Models for Discontinuous Functions as Applied to Aircraft Design Problems Trent Lukaczyk December 1, 2012 1 Introduction 1.1 Application The NASA N+2 Supersonic Aircraft Project
More informationNonlinear State Estimation for Robotics and Computer Vision Applications: An Overview
Nonlinear State Estimation for Robotics and Computer Vision Applications: An Overview Arun Das 05/09/2017 Arun Das Waterloo Autonomous Vehicles Lab Introduction What s in a name? Arun Das Waterloo Autonomous
More informationAutomatic Tracking of Moving Objects in Video for Surveillance Applications
Automatic Tracking of Moving Objects in Video for Surveillance Applications Manjunath Narayana Committee: Dr. Donna Haverkamp (Chair) Dr. Arvin Agah Dr. James Miller Department of Electrical Engineering
More informationMachine Learning Lecture 3
Machine Learning Lecture 3 Probability Density Estimation II 19.10.2017 Bastian Leibe RWTH Aachen http://www.vision.rwth-aachen.de leibe@vision.rwth-aachen.de Announcements Exam dates We re in the process
More informationMachine Learning / Jan 27, 2010
Revisiting Logistic Regression & Naïve Bayes Aarti Singh Machine Learning 10-701/15-781 Jan 27, 2010 Generative and Discriminative Classifiers Training classifiers involves learning a mapping f: X -> Y,
More informationOne-Shot Learning with a Hierarchical Nonparametric Bayesian Model
One-Shot Learning with a Hierarchical Nonparametric Bayesian Model R. Salakhutdinov, J. Tenenbaum and A. Torralba MIT Technical Report, 2010 Presented by Esther Salazar Duke University June 10, 2011 E.
More informationNon-Bayesian Classifiers Part II: Linear Discriminants and Support Vector Machines
Non-Bayesian Classifiers Part II: Linear Discriminants and Support Vector Machines Selim Aksoy Department of Computer Engineering Bilkent University saksoy@cs.bilkent.edu.tr CS 551, Spring 2007 c 2007,
More informationCLASSIFICATION WITH RADIAL BASIS AND PROBABILISTIC NEURAL NETWORKS
CLASSIFICATION WITH RADIAL BASIS AND PROBABILISTIC NEURAL NETWORKS CHAPTER 4 CLASSIFICATION WITH RADIAL BASIS AND PROBABILISTIC NEURAL NETWORKS 4.1 Introduction Optical character recognition is one of
More informationThis chapter explains two techniques which are frequently used throughout
Chapter 2 Basic Techniques This chapter explains two techniques which are frequently used throughout this thesis. First, we will introduce the concept of particle filters. A particle filter is a recursive
More informationMachine Learning. Supervised Learning. Manfred Huber
Machine Learning Supervised Learning Manfred Huber 2015 1 Supervised Learning Supervised learning is learning where the training data contains the target output of the learning system. Training data D
More informationUsing the Forest to See the Trees: Context-based Object Recognition
Using the Forest to See the Trees: Context-based Object Recognition Bill Freeman Joint work with Antonio Torralba and Kevin Murphy Computer Science and Artificial Intelligence Laboratory MIT A computer
More informationCS8803: Statistical Techniques in Robotics Byron Boots. Predicting With Hilbert Space Embeddings
CS8803: Statistical Techniques in Robotics Byron Boots Predicting With Hilbert Space Embeddings 1 HSE: Gram/Kernel Matrices bc YX = 1 N bc XX = 1 N NX (y i )'(x i ) > = 1 N Y > X 2 R 1 1 i=1 NX i=1 '(x
More informationRobotics. Lecture 8: Simultaneous Localisation and Mapping (SLAM)
Robotics Lecture 8: Simultaneous Localisation and Mapping (SLAM) See course website http://www.doc.ic.ac.uk/~ajd/robotics/ for up to date information. Andrew Davison Department of Computing Imperial College
More informationCS 664 Flexible Templates. Daniel Huttenlocher
CS 664 Flexible Templates Daniel Huttenlocher Flexible Template Matching Pictorial structures Parts connected by springs and appearance models for each part Used for human bodies, faces Fischler&Elschlager,
More informationEstimating Human Pose in Images. Navraj Singh December 11, 2009
Estimating Human Pose in Images Navraj Singh December 11, 2009 Introduction This project attempts to improve the performance of an existing method of estimating the pose of humans in still images. Tasks
More informationProbabilistic Robotics
Probabilistic Robotics FastSLAM Sebastian Thrun (abridged and adapted by Rodrigo Ventura in Oct-2008) The SLAM Problem SLAM stands for simultaneous localization and mapping The task of building a map while
More information