Robotics. Lecture 7: Simultaneous Localisation and Mapping (SLAM)


Robotics Lecture 7: Simultaneous Localisation and Mapping (SLAM). See course website http://www.doc.ic.ac.uk/~ajd/robotics/ for up-to-date information. Andrew Davison, Department of Computing, Imperial College London.

Review: Practical 6 [Images: Locations 1-6.] Need repeatable spin and measurement. Recognising orientation as well would be computationally costly without invariant descriptors.

Simultaneous Localisation and Mapping One of the big successes of probabilistic robotics. A body with quantitative sensors moves through a previously unknown, static environment, mapping it and calculating its egomotion. When do we need SLAM? When a robot must be truly autonomous (no human input). When nothing is known in advance about the environment. When we can't place beacons (or even use GPS, e.g. indoors or underwater). And when the robot actually needs to know where it is. In SLAM we build a map incrementally, and localise with respect to that map as it grows and is gradually refined.

Features for SLAM Most SLAM algorithms make maps of natural scene features. Laser/sonar: line segments, 3D planes, corners, etc. Vision: salient point features, lines, textured surfaces. Features should be distinctive and recognisable from different viewpoints (for data association).

Propagating Uncertainty SLAM seems like a chicken-and-egg problem, but we can make progress if we assume the robot is the only thing that moves. Main assumption: the world is static. We extend probabilistic estimation from just the robot state (as in MCL) to the features of the map as well. In SLAM we store and update a joint distribution over the states of both the robot and the mapped world. New features are gradually discovered as the robot explores, so the dimension of this joint estimation problem will grow.

Simultaneous Localisation and Mapping (a) Robot start (zero uncertainty); first measurement of feature A.

Simultaneous Localisation and Mapping (b) Robot drives forwards (uncertainty grows).

Simultaneous Localisation and Mapping (c) Robot first observes B and C: they inherit its uncertainty.

Simultaneous Localisation and Mapping (d) Robot drives back towards start (uncertainty grows more).

Simultaneous Localisation and Mapping (e) Robot re-measures A; loop closure! Uncertainty shrinks.

Simultaneous Localisation and Mapping (f) Robot re-measures B; note that uncertainty of C also shrinks.

Simultaneous Localisation and Mapping: First Order Uncertainty Propagation

$$\hat{\mathbf{x}} = \begin{pmatrix} \hat{\mathbf{x}}_v \\ \hat{\mathbf{y}}_1 \\ \hat{\mathbf{y}}_2 \\ \vdots \end{pmatrix}, \qquad
P = \begin{pmatrix} P_{xx} & P_{xy_1} & P_{xy_2} & \cdots \\ P_{y_1x} & P_{y_1y_1} & P_{y_1y_2} & \cdots \\ P_{y_2x} & P_{y_2y_1} & P_{y_2y_2} & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix}$$

$\mathbf{x}_v$ is the robot state, e.g. $(x, y, \theta)$ in 2D; $\mathbf{y}_i$ is the state of feature $i$, e.g. $(X, Y)$ in 2D. The PDF over robot and map parameters is modelled as a single multivariate Gaussian, represented by this state vector and covariance matrix, so we can use the Extended Kalman Filter.
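To make the growth of this joint estimate concrete, here is a minimal Python sketch (illustrative only, not the lecture's implementation; the function name add_feature and the simplified cross-covariance initialisation are assumptions standing in for the full Jacobian-based EKF initialisation) showing how the state vector and covariance are augmented when a new feature is first observed:

```python
import numpy as np

# Joint state: robot (x, y, theta) followed by 2D features (X, Y), with one
# full covariance matrix over everything, as in the slide above.
x_hat = np.array([0.0, 0.0, 0.0])   # robot starts with zero uncertainty
P = np.zeros((3, 3))

def add_feature(x_hat, P, y_new, R_meas):
    """Augment the joint state with a newly observed feature.

    y_new  : first estimate of the feature position (X, Y)
    R_meas : 2x2 covariance of that first measurement

    The new feature inherits the robot's positional uncertainty through the
    cross-covariance terms (simplified here; a full EKF initialisation would
    use the Jacobians of the measurement model).
    """
    n = x_hat.size
    x_aug = np.concatenate([x_hat, y_new])
    P_aug = np.zeros((n + 2, n + 2))
    P_aug[:n, :n] = P
    P_aug[n:, n:] = P[:2, :2] + R_meas      # feature uncertainty
    P_aug[n:, :2] = P[:2, :2]               # feature-robot cross-covariance
    P_aug[:2, n:] = P[:2, :2]
    return x_aug, P_aug

# First measurement of feature A at (2, 1) with 5 cm standard deviation.
x_hat, P = add_feature(x_hat, P, np.array([2.0, 1.0]), 0.0025 * np.eye(2))
print(x_hat.shape, P.shape)   # (5,) (5, 5): the estimation problem has grown
```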

SLAM Using Active Vision Stereo active vision; 3-wheel robot base. Automatic fixated active mapping and measurement of arbitrary scene features. Sparse mapping.

SLAM Using Active Stereo Vision [Figure: results plotted in the x-z plane.]

SLAM with Ring of Sonars Newman, Leonard, Neira and Tardós, ICRA 2002

SLAM with a Single Camera Davison, ICCV 2003; Davison, Molton, Reid, Stasse, PAMI 2007.

Limits of Metric SLAM Purely metric probabilistic SLAM is limited to small domains due to: Poor computational scaling of probabilistic filters. Growth in uncertainty at large distances from the map origin makes representation of uncertainty inaccurate. Data association (matching features) gets hard at high uncertainty.

Large Scale Localisation and Mapping [Diagram: Local Metric, Place Recognition, Global Optimisation.] Practical modern solutions to large-scale mapping follow a metric/topological approach which approximates full metric SLAM. They need the following elements: local metric mapping, to estimate the trajectory and build local maps; place recognition, to perform loop closure or relocalise the robot when lost; and map optimisation/relaxation, to optimise the map when loops are closed.

Global Topological: Loop Closure Detection Angeli et al., IEEE Transactions on Robotics 2008.

Pure Topological SLAM Graph-based representation. Segmentation of the environment into linked, distinct places. Well suited to symbolic planning and navigation. Figure: Topological representation

Pure Topological SLAM Map defined as a graph of connected locations. Edges model relationships between locations (e.g. traversability, similarity).
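As a concrete illustration (the place names and edge fields below are invented, not from the lecture), a purely topological map reduces to a very small graph data structure:

```python
# A minimal sketch of a purely topological map: places are nodes, and edges
# record a relationship such as traversability or appearance similarity,
# with no metric coordinates at all.
places = ["kitchen", "corridor", "office", "lab"]

# Adjacency structure: edges annotated with traversability / similarity.
edges = {
    ("kitchen", "corridor"): {"traversable": True, "similarity": 0.2},
    ("corridor", "office"):  {"traversable": True, "similarity": 0.4},
    ("corridor", "lab"):     {"traversable": True, "similarity": 0.3},
}

def neighbours(place):
    """Places directly linked to `place` -- enough for symbolic planning."""
    return [b for (a, b) in edges if a == place] + \
           [a for (a, b) in edges if b == place]

print(neighbours("corridor"))   # ['office', 'lab', 'kitchen']
```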

Indoor Topological Map

Mixed Indoor / Outdoor Topological Map, Several Levels

Adding Metric Information to the Graph Edges The edges between linked nodes are annotated with relative motion information; this could come from local mapping or from purely incremental information such as odometry or visual odometry. We then apply a pose graph optimisation (relaxation) algorithm, which computes the set of node positions that is maximally probable given both the metric and topological constraints. Pose graph optimisation only has an effect when there are loops in the graph.
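As a sketch of what such an annotation might look like (the field names are assumptions, not from the lecture), an edge can store the relative motion measured by odometry between the two places, plus the variance accumulated along the way:

```python
from dataclasses import dataclass

@dataclass
class EdgeConstraint:
    """Relative motion measured while travelling from node j to node i."""
    d: float      # distance travelled (m)
    theta: float  # bearing of node i from node j, in j's frame (rad)
    phi: float    # change of heading between j and i (rad)
    var: float    # variance accumulated along the edge

# Example: travelling from place 1 to place 2, roughly 3 m almost straight ahead.
constraints = {(1, 2): EdgeConstraint(d=3.0, theta=0.05, phi=0.0, var=0.1)}
```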

Example Relaxation Algorithm

1. Estimate the position and variance of node $i$ from each neighboring node $j$:
$$(x_i)_j = x_j + d_{ji}\cos(\theta_{ji} + \theta_j), \qquad (y_i)_j = y_j + d_{ji}\sin(\theta_{ji} + \theta_j), \qquad (\theta_i)_j = \theta_j + \phi_{ji} \quad (1)$$
$$(v_i)_j = v_j + v_{ji} \quad (2)$$

2. Estimate the variance of node $i$ using the harmonic mean of the estimates from its neighbors ($n_i$ = number of neighbors of node $i$):
$$v_i = \frac{n_i}{\sum_j \frac{1}{(v_i)_j}} \quad (3)$$

3. Estimate the position of node $i$ as the mean of the estimates from its neighbors:
$$x_i = \frac{1}{n_i}\sum_j (x_i)_j \frac{v_i}{(v_i)_j}, \qquad y_i = \frac{1}{n_i}\sum_j (y_i)_j \frac{v_i}{(v_i)_j}, \qquad \theta_i = \arctan\frac{\sum_j \frac{\sin((\theta_i)_j)}{(v_i)_j}}{\sum_j \frac{\cos((\theta_i)_j)}{(v_i)_j}} \quad (4)$$
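A minimal Python sketch of one sweep of this relaxation, with variable names mirroring equations (1)-(4); the node values and edge constraints are invented for illustration, and node 0 is held fixed as the reference frame:

```python
import math

# Node estimates: id -> [x, y, theta, v]  (position, heading, variance).
nodes = {
    0: [0.0, 0.0, 0.0, 0.0],   # reference node, held fixed
    1: [3.0, 0.0, 0.0, 1.0],
    2: [6.0, 0.0, 0.0, 2.0],
}
# Directed edges (j, i) -> (d_ji, theta_ji, phi_ji, v_ji): relative motion from j to i.
edges = {
    (0, 1): (3.0, 0.00, 0.0, 1.0),   # odometry
    (1, 2): (3.0, 0.00, 0.0, 1.0),   # odometry (has drifted)
    (0, 2): (5.5, 0.10, 0.0, 0.5),   # loop-closure constraint, more precise
}

def relax(i):
    """One relaxation step for node i, following equations (1)-(4)."""
    est = []   # estimates of node i obtained from each neighbouring node j
    for (j, k), (d, th, ph, v_e) in edges.items():
        if k != i:
            continue
        xj, yj, thj, vj = nodes[j]
        est.append((xj + d * math.cos(th + thj),   # (x_i)_j      eq. (1)
                    yj + d * math.sin(th + thj),   # (y_i)_j
                    thj + ph,                      # (theta_i)_j
                    vj + v_e))                     # (v_i)_j      eq. (2)
    if not est:
        return
    n_i = len(est)
    v_i = n_i / sum(1.0 / v for (_, _, _, v) in est)             # eq. (3)
    x_i = sum(x * v_i / v for (x, _, _, v) in est) / n_i         # eq. (4)
    y_i = sum(y * v_i / v for (_, y, _, v) in est) / n_i
    th_i = math.atan2(sum(math.sin(t) / v for (_, _, t, v) in est),
                      sum(math.cos(t) / v for (_, _, t, v) in est))
    nodes[i] = [x_i, y_i, th_i, v_i]

for _ in range(20):          # iterate until the estimates settle
    for i in (1, 2):         # node 0 stays fixed as the anchor
        relax(i)
print(nodes)
```

In this toy graph the loop-closure edge (0, 2) has lower variance than the accumulated odometry, so after a few sweeps node 2 is pulled towards the loop-closure estimate, mirroring the behaviour in the map relaxation example below.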

Relaxation Algorithm (Duckett, 2000): Illustration The position and orientation of node j are obtained as the mean of the estimates from nodes i and k (i.e., by composing their positions with the corresponding relative displacements to node j).

Map Relaxation: Good Odometry, One Loop Closure

Simple Large-Scale SLAM: RATSLAM Milford and Wyeth, 2007. http://www.youtube.com/watch?v=-0xsui69yvs Very simple visual odometry gives rough trajectory. Simple visual place recognition provides many loop closures. Map relaxation/optimisation to build global map.

More Information about SLAM If you want to find out more about SLAM there is plenty of good information and open source software available online; e.g.: Visual/Monocular SLAM: SceneLib2, PTAM, ScaViSLAM Pose Graph Optimisation: g2o, Ceres Many others: www.openslam.org