MULTI-MODAL MAPPING Robotics Day, 31 Mar 2017 Frank Mascarich, Shehryar Khattak, Tung Dang

[Overview diagram: Perception — cameras, ToF cameras, LiDAR, IMU, and application-specific sensors feed localization, mapping, and autonomy.]

Robotic Perception

Robotic Perception. M3U (Multi-Modal Mapping Unit): synchronization, extendable, multi-modal, multi-camera, application-specific sensing.

M3U

Camera Systems. Monocular camera systems and stereo camera systems are used to correlate position against the static environment and require precise calibration. Modalities: visible light, IR, UV.

Alternative Spectra

Navigation Sensors. Inertial sensors: accelerometers, gyroscopes, magnetometers (digital compass). Pressure sensors: barometric pressure for altitude sensing, airspeed measurements.

LiDAR: time-of-flight measurement, multiple beams, different fields of view.

Application-Specific Sensing: Radiation Mapping. Two pin-diode detectors are mounted with a 45 cm stereo baseline. The MCU integrates radiation pulses over time and sends the data to the MPU; the main processor tracks the position of the robot and annotates its map with the radiation data from the MCU. The radiation data is used to localize sources in an unknown environment (sketched below). Applications: security, environmental disaster monitoring, decommissioning nuclear facilities.
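A minimal sketch of this pulse-integration and map-annotation idea. All interfaces, names, and values below are assumptions for illustration, not the M3U firmware:

```python
# Hypothetical sketch: MCU-side pulse integration over a trailing time window,
# and MPU-side annotation of map cells with the resulting count rate.
from collections import defaultdict

def count_rate(pulse_timestamps, window_s=1.0, now_s=0.0):
    """Counts per second over the trailing window (simple pulse integration)."""
    recent = [t for t in pulse_timestamps if now_s - window_s <= t <= now_s]
    return len(recent) / window_s

# Map annotation: accumulate rates per discretized (x, y) cell.
radiation_map = defaultdict(list)

def annotate(radiation_map, pose_xy, rate_cps, cell_size=0.5):
    cell = (round(pose_xy[0] / cell_size), round(pose_xy[1] / cell_size))
    radiation_map[cell].append(rate_cps)

pulses = [9.01, 9.35, 9.80, 9.95]  # assumed pulse arrival times [s]
annotate(radiation_map, (2.3, 4.1), count_rate(pulses, now_s=10.0))
```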

Radiation Detection and Mapping

Synchronization. IMU and camera data is transmitted over USB, so data arrival is unpredictable. Synchronization can be done in software or in hardware; hardware synchronization allows precise triggering. A software fallback is sketched below.
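A minimal sketch of software synchronization, assuming timestamped streams: for each camera frame, find the IMU sample with the closest timestamp. With hardware triggering this search is unnecessary, since both streams share a trigger clock.

```python
# Nearest-timestamp matching between an IMU stream and camera frames.
import bisect

def nearest_imu_sample(imu_stamps, frame_stamp):
    """Return the index of the IMU timestamp closest to the camera frame stamp."""
    i = bisect.bisect_left(imu_stamps, frame_stamp)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(imu_stamps)]
    return min(candidates, key=lambda j: abs(imu_stamps[j] - frame_stamp))

imu_stamps = [0.000, 0.005, 0.010, 0.015, 0.020]   # 200 Hz IMU (assumed)
frame_stamp = 0.012                                # one camera frame
print(nearest_imu_sample(imu_stamps, frame_stamp)) # -> 2 (t = 0.010 s)
```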

Visual Odometry. Visual odometry is the process of estimating the position and orientation of a camera by analyzing a sequence of camera images. Pipeline: acquire images → undistort → detect features → track features → estimate motion (sketched below).
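A minimal monocular sketch of this pipeline using OpenCV. The intrinsics K and distortion coefficients are assumed placeholder calibration values; img1 and img2 are consecutive grayscale frames:

```python
# Monocular VO step: undistort -> detect -> track -> estimate motion.
import cv2
import numpy as np

K = np.array([[700.0, 0, 320.0], [0, 700.0, 240.0], [0, 0, 1]])  # assumed intrinsics
dist = np.zeros(5)                                               # assumed no distortion

def vo_step(img1, img2):
    """img1, img2: consecutive grayscale uint8 frames. Returns R, t between them."""
    img1 = cv2.undistort(img1, K, dist)
    img2 = cv2.undistort(img2, K, dist)
    pts1 = cv2.goodFeaturesToTrack(img1, maxCorners=500,
                                   qualityLevel=0.01, minDistance=7)
    pts2, status, _ = cv2.calcOpticalFlowPyrLK(img1, img2, pts1, None)
    pts1, pts2 = pts1[status.ravel() == 1], pts2[status.ravel() == 1]
    E, inliers = cv2.findEssentialMat(pts1, pts2, K,
                                      method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
    return R, t  # rotation and translation; monocular t is only up to scale
```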

Visual Odometry. Disparity is the distance by which objects shift from one image to the other; disparity is greater for objects that are closer. This allows stereo cameras to estimate depth without motion.
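As a worked example of the depth-from-disparity relation Z = f·B/d for a rectified stereo pair (focal length f in pixels, baseline B in meters, disparity d in pixels; all numbers below are assumed):

```python
# Depth from disparity: Z = f * B / d.
f_px = 700.0         # assumed focal length [px]
baseline_m = 0.12    # assumed stereo baseline [m]
disparity_px = 15.0  # measured shift of a feature between left and right image
depth_m = f_px * baseline_m / disparity_px
print(f"depth = {depth_m:.2f} m")  # closer objects -> larger disparity
```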

Visual-Inertial Odometry. Visual odometry performance depends on the quality of the scene and on the computational capability of the system. Inertial sensors provide acceleration and angular velocity at high rates, and motion is estimated by integrating them. Integration is prone to drift, so it is not viable over longer periods, but it is effective over short time frames.
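A minimal dead-reckoning sketch of that integration, assuming gravity-compensated accelerations already expressed in the world frame; any small sensor bias is integrated twice here, which is exactly why inertial-only estimates drift:

```python
# Inertial dead reckoning: integrate acceleration to velocity, then position.
import numpy as np

def integrate_imu(accel_world, dt, v0=np.zeros(3), p0=np.zeros(3)):
    """accel_world: (N, 3) accelerations [m/s^2] sampled at a fixed rate 1/dt."""
    v, p = v0.copy(), p0.copy()
    for a in accel_world:
        v = v + a * dt   # first integration: velocity
        p = p + v * dt   # second integration: position
    return v, p

# 200 Hz IMU (assumed), constant 0.1 m/s^2 forward acceleration for 1 s:
a = np.tile([0.1, 0.0, 0.0], (200, 1))
v, p = integrate_imu(a, dt=1.0 / 200)
print(v[0], p[0])  # ~0.1 m/s and ~0.05 m
```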

Visual-Inertial Odometry. How do we combine visual and inertial measurements in a mathematical and probabilistic way? Candidate fusion schemes include taking an average, taking a weighted average (using sensor statistics), or using an Extended Kalman Filter (EKF). The EKF works in a two-step manner. Prediction step: propagates the system model and predicts the state of the robot. Correction step: corrects the predicted state and updates the covariance of the state parameters. The implemented EKF propagates a linearized model using IMU measurements as inputs and corrects the prediction using vision measurements. Estimated robot state: position, velocity, orientation.
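A schematic predict/correct pair matching the two-step description above; f, h, F, H, Q, and R are assumed placeholders for the IMU propagation model, the vision measurement model, their Jacobians, and the noise covariances:

```python
# Generic EKF skeleton: IMU-driven prediction, vision-driven correction.
import numpy as np

def ekf_predict(x, P, f, F, Q, u):
    x_pred = f(x, u)          # propagate state with IMU input u
    P_pred = F @ P @ F.T + Q  # propagate covariance
    return x_pred, P_pred

def ekf_correct(x_pred, P_pred, z, h, H, R):
    y = z - h(x_pred)                      # innovation from vision measurement z
    S = H @ P_pred @ H.T + R               # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
    x = x_pred + K @ y                     # corrected state
    P = (np.eye(len(x)) - K @ H) @ P_pred  # updated covariance
    return x, P
```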

Visual-Inertial Odometry. What can we do with this VI pipeline? Odometry and feature depth information enable us to explore and map unknown environments. Localization and mapping together address the famous chicken-and-egg problem in robotics known as Simultaneous Localization and Mapping (SLAM): Where am I? What's around me?

Visual-Inertial Odometry limitations: odometry is prone to drift; it cannot perform in visually degraded environments; mapping is limited by the camera's field of view.

Laser range sensors: Velodyne, Hokuyo, Realsense, Picoflexx.

Characteristic        Picoflexx, Realsense    Velodyne, Hokuyo
Size                  -                       +
Weight                -                       +
FOV                   -                       +
Range                 -                       +
Dense point cloud     +                       -

ICP (Iterative Closest Point)
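As a reference point for this slide, here is a generic point-to-point ICP sketch (closest-point matching plus the SVD/Kabsch alignment step). This is the textbook variant, not the geometrically stable version developed in the talk:

```python
# Point-to-point ICP: iterate correspondence search and rigid alignment.
import numpy as np
from scipy.spatial import cKDTree

def icp(source, target, iters=20):
    """source, target: (N, 3) and (M, 3) point clouds. Returns accumulated R, t."""
    R, t = np.eye(3), np.zeros(3)
    tree = cKDTree(target)
    src = source.copy()
    for _ in range(iters):
        _, idx = tree.query(src)             # closest-point correspondences
        matched = target[idx]
        mu_s, mu_t = src.mean(axis=0), matched.mean(axis=0)
        H = (src - mu_s).T @ (matched - mu_t)
        U, _, Vt = np.linalg.svd(H)
        R_step = Vt.T @ U.T
        if np.linalg.det(R_step) < 0:        # guard against reflections
            Vt[-1] *= -1
            R_step = Vt.T @ U.T
        t_step = mu_t - R_step @ mu_s
        src = src @ R_step.T + t_step        # apply the incremental transform
        R, t = R_step @ R, R_step @ t + t_step
    return R, t
```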

Multi-modal approach

Visual-Inertial-Odometry-enhanced Geometrically Stable ICP

Thank you!