Aerial Robotic Autonomous Exploration & Mapping in Degraded Visual Environments. Kostas Alexis Autonomous Robots Lab, University of Nevada, Reno

Motivation Aerial robotic operation in GPS-denied Degraded Visual Environments (DVE) Ability to autonomously explore, monitor and map in conditions of complete darkness, haze and other DVEs. Extend the operational abilities of aerial robots. Critical step for industrial, inspection & maintenance, safety and security applications. Indicative application domains: Nuclear Site Decommissioning Oil & Gas Industry Inspection Infrastructure Inspection Surveillance, Security Monitoring


Research Directions Multi-modal sensor fusion to enable GPS-denied localization and mapping in DVEs. A path planning ensemble for autonomous exploration in DVEs or optimized inspection. An active perception approach for reliable autonomous operation subject to the challenges of visually-degraded environments. Pure volumetric exploration is not sufficient: the selected viewpoints and their order heavily influence the localization of the robot.

Multi-Modal Mapping Unit Multi-modal: Visual (LED-synchronized or Near Infrared), Inertial, 3D Time-of-Flight. Future extensions: sound, RADAR. Hardware time-synchronization of visual-inertial data. Software synchronization of VI-depth data. An Intel NUC5i7RYH is responsible for the localization and mapping pipeline.

Multi-Modal Mapping Unit Multi-modal fusion baselines and new research direction: Baseline #1: Visual-Inertial enhanced Geometrically Stable ICP. Baseline #2: Visual-Inertial / Depth odometry estimates loosely-coupled through an EKF. Research Direction: Multi-Modal Features. Research on top of L. O. Vasconcelos, E. R. Nascimento, M. F. M. Campos, "KVD: Scale Invariant Keypoints by Combining Visual and Depth Data".
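Baseline #2 loosely couples the visual-inertial and depth odometry estimates through an EKF. The core operation behind such loose coupling is the information-form fusion of two independent Gaussian estimates; the sketch below is a minimal illustration of that operation, not the lab's actual pipeline, and the numeric covariances in the example are made up.

```python
import numpy as np

def fuse_estimates(x_a, P_a, x_b, P_b):
    """Information-form fusion of two independent Gaussian estimates.

    x_a, x_b : mean vectors of the two odometry estimates
    P_a, P_b : their covariance matrices
    Returns the fused mean and covariance (equivalent to one EKF update
    that treats one estimate as a direct measurement of the state).
    """
    I_a = np.linalg.inv(P_a)            # information matrices
    I_b = np.linalg.inv(P_b)
    P = np.linalg.inv(I_a + I_b)        # fused covariance
    x = P @ (I_a @ x_a + I_b @ x_b)     # covariance-weighted mean
    return x, P

# Toy example: two 2D position estimates; the first is more confident,
# so the fused mean lands closer to it.
x, P = fuse_estimates(np.array([1.0, 0.0]), np.eye(2) * 0.01,
                      np.array([1.2, 0.1]), np.eye(2) * 0.04)
```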


Towards a Complete Planning Ensemble A complete path planning ensemble addressing the problems of optimized coverage and robust autonomous exploration. Key challenge of exploration in DVEs: sensor data ill-conditioning is to be expected. Localizability-aware exploration is key. Active perception and belief propagation methods provide the theoretical grounds for our approach. Future direction: curiosity-driven exploration.


Uncertainty-aware Exploration & Mapping Problem Definition The overall problem is that of exploring an unknown bounded 3D volume V_E ⊂ R^3, while aiming to minimize the localization and mapping uncertainty as evaluated through a metric over the probabilistic belief of the robot pose and the tracked landmarks. Problem 1 (Volumetric Exploration): Given a bounded volume V_E, find a collision-free path σ starting at an initial configuration ξ_init ∈ Ξ that leads to identifying the free and occupied parts V_free^E and V_occ^E when being executed, such that there does not exist any collision-free configuration from which any piece of V_E \ {V_free^E, V_occ^E} could be perceived. Problem 2 (Belief Uncertainty-aware Planning): Given a volume V_M ⊆ V_E, find a collision-free path σ_M starting at an initial configuration ξ_0 ∈ Ξ and ending in a configuration ξ_final ∈ Ξ that aims to improve the robot's localization and mapping confidence by following paths of optimized expected robot pose and tracked landmarks covariance. The combined problem addresses both objectives.

Robot Configuration

Uncertainty-aware Exploration & Mapping Receding Horizon Exploration and Mapping Planner Two-step / nested-optimization path planning paradigm

rhemplanner - Exploration Step

rhemplanner - Exploration Step Exploration gain with probabilistic re-observation: aiming to maximize newly explored space and to re-observe space with decreased confidence of being mapped as occupied.
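The exact gain formula is not given on the slide; the toy sketch below only illustrates the idea of combining a reward for unmapped voxels with a re-observation reward for voxels of low occupancy confidence. The dictionary-based voxel map, the weights and the 0.7 confidence threshold are all assumptions for illustration.

```python
def exploration_gain(visible_voxels, p_occ, conf_threshold=0.7,
                     w_unmapped=1.0, w_reobserve=0.5):
    """Toy volumetric exploration gain with probabilistic re-observation.

    visible_voxels : voxels expected to be visible from a candidate viewpoint
    p_occ          : dict mapping voxel -> occupancy probability (absent = unmapped)
    """
    gain = 0.0
    for v in visible_voxels:
        p = p_occ.get(v, -1.0)
        if p < 0:                              # never observed: full exploration reward
            gain += w_unmapped
        else:
            confidence = max(p, 1.0 - p)       # how certain is the map about this voxel?
            if confidence < conf_threshold:    # uncertain: reward re-observation
                gain += w_reobserve * (conf_threshold - confidence)
    return gain

# One uncertain voxel (p = 0.55), one confidently mapped (p = 0.95), one unmapped.
g = exploration_gain([(0, 0, 0), (1, 0, 0), (2, 0, 0)],
                     {(0, 0, 0): 0.55, (1, 0, 0): 0.95})
```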

rhemplanner - Uncertainty-aware Step

rhemplanner - Uncertainty-aware Step The robot performs onboard localization and mapping; in our experiments it performs visual-inertial localization. The assumptions are: pose, features/landmarks and their uncertainties are estimated, and dense volumetric mapping takes place. To estimate its pose, the robot relies on tracking landmarks with its sensor systems. The system performs odometry in an EKF fashion, with the overall filter state comprising the robot pose and the tracked landmarks.

rhemplanner - Uncertainty-aware Step Belief propagation: in order to identify the paths that minimize the robot uncertainty, a mechanism to propagate the robot's belief about its pose and the tracked features has to be established. Using M. Bloesch, S. Omari, M. Hutter, R. Siegwart, "ROVIO: Robust Visual Inertial Odometry Using a Direct EKF-Based Approach", IROS 2015.

rhemplanner - Uncertainty-aware Step Propagate the robot's belief about its pose and the tracked landmarks: prediction (state propagation) step.
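The prediction step follows the textbook EKF propagation equations; the sketch below is generic, not the actual ROVIO implementation, and the process model f, Jacobian F and noise Q in the example are placeholders (a 1D constant-velocity model).

```python
import numpy as np

def ekf_predict(x, P, f, F, Q):
    """Generic EKF prediction (state propagation) step.

    x : state mean (robot pose plus tracked landmark parameters)
    P : state covariance
    f : process model, x_{k+1} = f(x_k)
    F : Jacobian of f evaluated at x
    Q : process noise covariance
    """
    x_pred = f(x)
    P_pred = F @ P @ F.T + Q   # uncertainty grows under the motion model
    return x_pred, P_pred

# Placeholder 1D constant-velocity model: state = [position, velocity], dt = 1.
F = np.array([[1.0, 1.0],
              [0.0, 1.0]])
x_pred, P_pred = ekf_predict(np.array([0.0, 1.0]), np.eye(2),
                             lambda x: F @ x, F, 0.01 * np.eye(2))
```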

rhemplanner - Uncertainty-aware Step Propagate the robot's belief about its pose and the tracked landmarks: filter update step. Identify which landmarks are expected to be visible given the next pose and the known map. By stacking the measurement terms for all visible landmarks, the standard EKF update step is directly performed to derive the new estimate of the robot's belief for its state and the tracked features. Compute the propagated belief covariance matrix.
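The stacked update is the standard EKF correction; the sketch below is generic rather than ROVIO's actual bearing-vector formulation, and the measurement model h, Jacobian H and noise R in the example (a direct 1D position measurement) are placeholders.

```python
import numpy as np

def ekf_update(x, P, z, h, H, R):
    """Standard EKF update with measurements stacked over all visible landmarks.

    z : stacked measurement vector (e.g. observations of visible landmarks)
    h : stacked measurement model evaluated at the state
    H : stacked measurement Jacobian
    R : stacked measurement noise covariance
    """
    y = z - h(x)                              # innovation
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P      # propagated belief covariance
    return x_new, P_new

# Placeholder 1D example: prior at 0 with unit variance, measurement of 1
# with unit variance; the update splits the difference.
x_new, P_new = ekf_update(np.array([0.0]), np.eye(1), np.array([1.0]),
                          lambda x: x, np.eye(1), np.eye(1))
```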


rhemplanner - Uncertainty-aware Step Uncertainty optimization: to be able to derive which path minimizes the robot uncertainty about its pose and the tracked landmarks, a metric of how small the covariance ellipsoid is has to be defined. What metric?

rhemplanner - Uncertainty-aware Step Uncertainty optimization: to derive which path minimizes the robot's uncertainty about its pose and the tracked landmarks, a metric of how small the covariance ellipsoid is has to be defined. D-optimality metric: broadly, maximize the determinant of the information matrix X^T X of the design. This criterion results in maximizing the differential Shannon information content of the parameter estimates.
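One common way to evaluate D-optimality on a propagated belief covariance is via its log-determinant; a minimal sketch, where the n-th-root normalization is one conventional choice (keeping the metric comparable across state dimensions), not necessarily the exact form used in rhemplanner.

```python
import numpy as np

def d_optimality(cov):
    """D-optimality-style metric over a belief covariance matrix.

    Maximizing the determinant of the information matrix is equivalent to
    minimizing the determinant of the covariance. slogdet avoids the
    overflow/underflow of computing det() directly on large matrices;
    the n-th root gives the geometric mean of the covariance eigenvalues.
    """
    n = cov.shape[0]
    _, logdet = np.linalg.slogdet(cov)
    return float(np.exp(logdet / n))

# Smaller value = tighter covariance ellipsoid = preferred path.
metric = d_optimality(np.diag([4.0, 1.0]))
```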

rhemplanner Algorithm First Planning Step:

    ξ_0 ← current vehicle configuration
    Initialize T^E with ξ_0
    g_best^E ← 0                          // set best exploration gain to zero
    n_best^E ← n_0(ξ_0)                   // set best exploration node to root
    N_T^E ← number of nodes in T^E
    while N_T^E < N_max^E or g_best^E == 0 do
        Incrementally build T^E by adding n_new^E(ξ_new^E)
        N_T^E ← N_T^E + 1
        if ExplorationGain(n_new^E) > g_best^E then
            n_best^E ← n_new^E
            g_best^E ← ExplorationGain(n_new^E)
        if N_T^E > N_TOT^E then
            Terminate planning
    σ_RH^E, ξ_RH ← ExtractBestPathSegment(n_best^E)
    S_ξRH ← LocalSet(ξ_RH)

rhemplanner Algorithm E Propagate robot belief along σ RH a 1 // number of admissible paths Second Planning Step g M a BeliefGain(σ E RH ) g M M best g a // straight path belief gain σ M best M σ RH while N M T < N M max or V T M = S ξrh do Incrementally build T M M by adding n new ξ new Propagate robot belief from current to planned vertex if ξ new S ξrh then Add new vertex n M new at ξ RH and connect a a + 1 σ M α ExtractBranch(n M new ) g α M BeliefGain(σ a M ) if g M α < g M best then return σ M σ M σ a M g M best g a M

rhemplanner Complexity Analysis


rhemplanner Evaluation (Experimental)

Uncertainty-aware Exploration & Mapping Kostas Alexis, Autonomous Robots Lab, 05.03.2016

Open Source contributions

Open Source Code:
- Structural Inspection Planner: https://github.com/ethz-asl/structuralinspectionplanner
- Next-Best-View Planner: https://github.com/ethz-asl/nbvplanner
- Receding Horizon Exploration and Mapping Planner: https://github.com/unr-arl/rhemplanner

Associated Datasets:
- Structural Inspection Planner: https://github.com/ethz-asl/structuralinspectionplanner/wiki/example-results
- Next-Best-View Planner: https://github.com/ethz-asl/nbvplanner/wiki/example-results
- Receding Horizon Exploration and Mapping Planner: https://github.com/unr-arl/icra-2017-datasets
- Solar-powered UAV Sensing & Mapping: http://projects.asl.ethz.ch/datasets/doku.php?id=fsr2015
- Change Detection Dataset: http://changedetectiondataset.wikispaces.com/

Thank you! Please ask your questions!