Robotics. Lecture 8: Simultaneous Localisation and Mapping (SLAM)


See the course website http://www.doc.ic.ac.uk/~ajd/robotics/ for up-to-date information. Andrew Davison, Department of Computing, Imperial College London.

Review: Practical 7 (Locations 1-6). We need a repeatable spin and measurement. Recognising orientation too will be computationally costly without invariant descriptors.

Simultaneous Localisation and Mapping One of the big successes of probabilistic robotics: a body with quantitative sensors moves through a previously unknown, static environment, mapping it and calculating its egomotion. When do we need SLAM? When a robot must be truly autonomous (no human input); when nothing is known in advance about the environment; when we can't place beacons (or even use GPS, as indoors or underwater); and when the robot actually needs to know where it is.

Features for SLAM Most SLAM algorithms make maps of natural scene features. Laser/sonar: line segments, 3D planes, corners, etc. Vision: salient point features, lines, textured surfaces. Features should be distinctive and recognisable from different viewpoints (for data association).

Propagating Uncertainty SLAM seems like a chicken-and-egg problem, but we can make progress if we assume the robot is the only thing that moves. Main assumption: the world is static.

Simultaneous Localisation and Mapping (a) Robot starts (zero uncertainty); first measurement of feature A. (Figure shows features A, B and C.)

Simultaneous Localisation and Mapping (b) Robot drives forwards (uncertainty grows).

Simultaneous Localisation and Mapping (c) Robot makes first measurements of B and C.

Simultaneous Localisation and Mapping (d) Robot drives back towards start (uncertainty grows more)

Simultaneous Localisation and Mapping (e) Robot re-measures A; loop closure! Uncertainty shrinks.

Simultaneous Localisation and Mapping (f) Robot re-measures B; note that uncertainty of C also shrinks.

Simultaneous Localisation and Mapping: First Order Uncertainty Propagation

\[
\hat{\mathbf{x}} = \begin{pmatrix} \hat{\mathbf{x}}_v \\ \hat{\mathbf{y}}_1 \\ \hat{\mathbf{y}}_2 \\ \vdots \end{pmatrix}, \qquad
P = \begin{pmatrix}
P_{xx} & P_{xy_1} & P_{xy_2} & \cdots \\
P_{y_1 x} & P_{y_1 y_1} & P_{y_1 y_2} & \cdots \\
P_{y_2 x} & P_{y_2 y_1} & P_{y_2 y_2} & \cdots \\
\vdots & \vdots & \vdots & \ddots
\end{pmatrix}
\]

Here \(\mathbf{x}_v\) is the robot state, e.g. \((x, y, \theta)\) in 2D, and \(\mathbf{y}_i\) is a feature state, e.g. \((X, Y)\) in 2D. The PDF over robot and map parameters is modelled as a single multivariate Gaussian, represented by this state vector and covariance matrix, so we can use the Extended Kalman Filter.
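To make the joint representation concrete, here is a minimal numpy sketch of how the state vector and covariance matrix grow as new features are added. The dimensions and the cross-covariance initialisation are illustrative simplifications, not the full EKF-SLAM feature-initialisation derivation:

```python
import numpy as np

# Assumed dimensions: 2D robot state (x, y, theta) and 2D features (X, Y).
ROBOT_DIM = 3
FEATURE_DIM = 2

def init_state():
    """Robot starts at the origin with zero uncertainty; map is empty."""
    x = np.zeros(ROBOT_DIM)               # full state vector (robot only so far)
    P = np.zeros((ROBOT_DIM, ROBOT_DIM))  # full covariance matrix
    return x, P

def add_feature(x, P, y_new, R):
    """Append a newly observed feature (y_new, measurement covariance R).

    The feature is correlated with the robot, so its cross-covariance blocks
    are seeded from the robot's current positional covariance (a deliberate
    simplification: a real EKF would use the measurement Jacobians here).
    """
    n = x.size
    x = np.concatenate([x, y_new])
    P_new = np.zeros((n + FEATURE_DIM, n + FEATURE_DIM))
    P_new[:n, :n] = P
    robot_xy = P[:FEATURE_DIM, :FEATURE_DIM]
    P_new[n:, n:] = robot_xy + R          # feature inherits robot uncertainty
    P_new[n:, :FEATURE_DIM] = robot_xy    # cross-covariance with the robot
    P_new[:FEATURE_DIM, n:] = robot_xy
    return x, P_new

x, P = init_state()
x, P = add_feature(x, P, np.array([4.0, 1.0]), 0.01 * np.eye(2))
print(x.shape, P.shape)  # state grows to (5,), covariance to (5, 5)
```

Each new feature extends both the state vector and every row/column of P, which is one source of the poor computational scaling discussed later.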

SLAM Using Active Vision Stereo active vision; 3-wheel robot base. Automatic fixated active mapping and measurement of arbitrary scene features. Sparse mapping.

Limits of Metric SLAM Purely metric probabilistic SLAM is limited to small domains due to: Poor computational scaling of probabilistic filters. Growth in uncertainty at large distances from the map origin, which makes the representation of uncertainty inaccurate. Data association (matching features), which gets hard at high uncertainty.

Large Scale Localisation and Mapping Practical modern solutions to large-scale mapping follow a metric/topological approach. They need the following elements: Local metric mapping, to estimate the trajectory and possibly make local maps. Place recognition, to perform loop closure or relocalise the robot when lost. Map optimisation/relaxation, to optimise the map when loops are closed.

Global Topological: Loop Closure Detection Angeli et al., IEEE Transactions on Robotics 2008.
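As a toy illustration of appearance-based loop-closure detection (not Angeli et al.'s actual Bayesian filter), each visited place can be summarised as a bag-of-words histogram of quantised visual features, and a new view matched against stored places by cosine similarity; all names and the threshold below are assumptions:

```python
import math

def cosine(a, b):
    """Cosine similarity between two word-count histograms."""
    num = sum(x * y for x, y in zip(a, b))
    den = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return num / den if den else 0.0

def detect_loop_closure(new_hist, place_hists, threshold=0.8):
    """Return the index of the best-matching stored place, or None."""
    scores = [cosine(new_hist, h) for h in place_hists]
    best = max(range(len(scores)), key=scores.__getitem__)
    return best if scores[best] >= threshold else None

places = [[5, 0, 2, 1], [0, 4, 0, 3], [1, 1, 6, 0]]
print(detect_loop_closure([4, 1, 2, 1], places))  # matches place 0
```

Real systems add a vocabulary tree for speed and temporal/Bayesian filtering to reject spurious matches.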

Pure Topological SLAM Graph-based representation. Segmentation of the environment into linked distinct places. Well suited to symbolic planning and navigation. Figure: Topological representation.

Environment Model Map defined as a graph of connected locations. Edges model relationships between locations (e.g. traversability, similarity).
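A minimal sketch of such an environment model, with places as nodes and edges carrying relationship attributes (the attribute names here are illustrative):

```python
class TopologicalMap:
    """Map as a graph of connected locations; edges model relationships."""

    def __init__(self):
        self.edges = {}  # place -> {neighbour: edge attributes}

    def add_edge(self, a, b, traversable=True, similarity=0.0):
        # Undirected: store the relationship in both directions.
        attrs = {"traversable": traversable, "similarity": similarity}
        self.edges.setdefault(a, {})[b] = attrs
        self.edges.setdefault(b, {})[a] = attrs

    def neighbours(self, place):
        return list(self.edges.get(place, {}))

m = TopologicalMap()
m.add_edge("corridor", "office", similarity=0.3)
m.add_edge("corridor", "lab", similarity=0.5)
print(m.neighbours("corridor"))  # ['office', 'lab']
```

Planning on such a map reduces to graph search over the traversable edges.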

Indoor Topological Map

Mixed Indoor / Outdoor Topological Map, Several Levels

Adding Metric Information on the Edges Take advantage of odometry measurements from a wheeled robot to add relative-displacement information between nodes. Apply a simple graph-relaxation algorithm to compute accurate 2D absolute positions for the nodes.

Relaxation Algorithm

1. Estimate the position and variance of node \(i\) from each neighbouring node \(j\):
\[
(x_i)_j = x_j + d_{ji}\cos(\theta_{ji} + \theta_j), \quad
(y_i)_j = y_j + d_{ji}\sin(\theta_{ji} + \theta_j), \quad
(\theta_i)_j = \theta_j + \phi_{ji} \tag{1}
\]
\[
(v_i)_j = v_j + v_{ji} \tag{2}
\]

2. Estimate the variance of node \(i\) as the harmonic mean of the estimates from its neighbours (\(n_i\) = number of neighbours of node \(i\)):
\[
v_i = \frac{n_i}{\sum_j 1/(v_i)_j} \tag{3}
\]

3. Estimate the position of node \(i\) as the variance-weighted mean of the estimates from its neighbours:
\[
x_i = \frac{1}{n_i}\sum_j (x_i)_j \frac{v_i}{(v_i)_j}, \quad
y_i = \frac{1}{n_i}\sum_j (y_i)_j \frac{v_i}{(v_i)_j}, \quad
\theta_i = \arctan\frac{\sum_j \sin((\theta_i)_j)/(v_i)_j}{\sum_j \cos((\theta_i)_j)/(v_i)_j} \tag{4}
\]
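One relaxation step for a single node can be sketched directly from equations (1)-(4); the graph data structure below is an assumption, and a full implementation would sweep all nodes repeatedly until convergence:

```python
import math

# nodes: id -> {"x", "y", "theta", "v"}  (pose estimate and variance)
# links: (j, i) -> {"d", "theta_rel", "phi", "v"}  (odometry from j to i)
def relax_node(i, nodes, links):
    """Re-estimate node i from its neighbours, following eqs. (1)-(4)."""
    ests = []
    for (j, b), l in links.items():
        if b != i:
            continue
        nj = nodes[j]
        xi = nj["x"] + l["d"] * math.cos(l["theta_rel"] + nj["theta"])  # (1)
        yi = nj["y"] + l["d"] * math.sin(l["theta_rel"] + nj["theta"])
        ti = nj["theta"] + l["phi"]
        vi = nj["v"] + l["v"]                                           # (2)
        ests.append((xi, yi, ti, vi))
    n = len(ests)
    v_i = n / sum(1.0 / e[3] for e in ests)           # harmonic mean, (3)
    x_i = sum(e[0] * v_i / e[3] for e in ests) / n    # weighted means, (4)
    y_i = sum(e[1] * v_i / e[3] for e in ests) / n
    t_i = math.atan2(sum(math.sin(e[2]) / e[3] for e in ests),
                     sum(math.cos(e[2]) / e[3] for e in ests))
    return {"x": x_i, "y": y_i, "theta": t_i, "v": v_i}
```

Note `atan2` is used for the angle so the quadrant is handled correctly, and low-variance neighbours dominate the weighted mean.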

Relaxation Algorithm (Duckett, 2000): Illustration The position and orientation of node j are obtained as the mean of the positions obtained from nodes i and k (i.e., by composing their positions with the corresponding relative displacements to node j).

Map Relaxation: Good Odometry, One Loop Closure

Simple Large-Scale SLAM: RatSLAM (Milford and Wyeth, 2007). http://www.youtube.com/watch?v=-0xsui69yvs Very simple visual odometry gives a rough trajectory. Simple visual place recognition provides many loop closures. Map relaxation/optimisation builds the global map.