Lecture 13: Visual-Inertial Fusion. Davide Scaramuzza

Course Evaluation. Please fill in the evaluation form you received by email! Provide feedback on: the exercises (good and bad), the course (good and bad), and how to improve.

Lab Exercise 6 - Today: room ETH HG E 33.1, from 14:15 to 16:00. Work description: Bundle Adjustment.

Outline: Introduction; IMU model and camera-IMU system; Different paradigms: filtering, maximum a posteriori estimation, fix-lag smoothing.

What is an IMU? An Inertial Measurement Unit measures angular velocity and linear acceleration.

What is an IMU? Different categories: mechanical, optical, MEMS. For mobile robots, MEMS IMUs are the usual choice: cheap, power-efficient, lightweight, and solid-state.

MEMS Accelerometer. A spring-like structure connects the device to a seismic mass moving in a capacitive divider. The capacitive divider converts the displacement of the seismic mass into an electric signal. Damping is created by the gas sealed in the device.
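
For intuition, a sketch of the steady-state relation (my summary, not from the slide): with spring constant k and seismic mass M, the displacement is proportional to the acceleration, and the parallel-plate capacitance converts that displacement into an electrical signal.

```latex
% Sketch (assumed ideal spring-mass accelerometer, not from the slide):
% at steady state the spring force balances the inertial force on the
% seismic mass M, so the displacement x is proportional to the
% acceleration a; the capacitive divider senses x through the varying
% plate gap d0 - x.
\[
  kx = Ma \;\Longrightarrow\; a = \frac{k}{M}\,x,
  \qquad
  C(x) = \frac{\varepsilon A}{d_0 - x}
\]
```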

MEMS Gyroscopes. MEMS gyroscopes measure the Coriolis forces acting on vibrating structures (tuning forks, vibrating wheels, or resonant solids). Their working principle is similar to the halteres of a fly: halteres are small structures of some two-winged insects, such as flies, that are flapped rapidly and function as gyroscopes, informing the insect about the rotation of its body during flight.

Why IMU? Monocular vision is scale-ambiguous, and pure vision is not robust enough: it fails under low texture, high dynamic range, and high-speed motion. Robustness is a critical issue; in the Tesla accident, the Autopilot sensors on the Model S failed to distinguish a white tractor-trailer crossing the highway against a bright sky.

Why vision? Pure IMU integration leads to large drift (especially with cheap IMUs); we will see this mathematically later. Intuition: integrating angular velocity to get orientation gives an error proportional to t; double-integrating acceleration to get position means a bias in acceleration produces a position error proportional to t². Worse, the actual position error also depends on the orientation error. Smartphone accelerometers: http://www.vectornav.com/support/library/imu-and-ins
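
To make these growth rates concrete, here is a minimal numeric sketch (my own illustration; the bias values are arbitrary placeholders) of how a constant gyro bias and a constant accelerometer bias drift the integrated estimates:

```python
import numpy as np

dt = 0.01                      # 100 Hz IMU
t = np.arange(0, 60, dt)       # one minute of integration

b_gyro = np.deg2rad(0.5)       # 0.5 deg/s constant gyro bias (placeholder)
b_acc = 0.05                   # 0.05 m/s^2 constant accel bias (placeholder)
g = 9.81

# Orientation error from integrating the gyro bias: grows linearly with t.
theta_err = b_gyro * t

# Position error from double-integrating the accel bias: ~ 0.5*b*t^2.
pos_err_bias = 0.5 * b_acc * t**2

# Worse: a tilt error of theta leaks ~ g*sin(theta) of gravity into the
# measured acceleration, so position error also grows with orientation error.
pos_err_tilt = np.cumsum(np.cumsum(g * np.sin(theta_err)) * dt) * dt

print(f"after {t[-1]:.0f} s: orientation error {np.rad2deg(theta_err[-1]):.1f} deg")
print(f"position error from accel bias alone: {pos_err_bias[-1]:.1f} m")
print(f"position error from gravity leakage:  {pos_err_tilt[-1]:.1f} m")
```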

Why visual-inertial fusion? Summary: IMU and vision are complementary. Visual sensor: precise during non-aggressive motion, and rich in information for other purposes; but limited output rate (~100 Hz), scale ambiguity in a monocular setup, and lack of robustness. Inertial sensor: robust, with a high output rate (~1,000 Hz); but large relative uncertainty at low acceleration/angular velocity, and ambiguity between gravity and acceleration. In common: state estimation based on visual and/or inertial sensors is dead reckoning, which suffers from drift over time (solution: loop detection and loop closure).

Outline: Introduction; IMU model and camera-IMU system; Different paradigms: filtering, maximum a posteriori estimation, fix-lag smoothing.

IMU model: Measurement Model. The IMU measures angular velocity and acceleration in the body frame:

$ {}_B\tilde\omega_{WB}(t) = {}_B\omega_{WB}(t) + b^g(t) + n^g(t) $
$ {}_B\tilde a_{WB}(t) = R_{BW}(t)\,\big({}_W a_{WB}(t) - {}_W g\big) + b^a(t) + n^a(t) $

where the tilde quantities are the measurements, b and n are bias and noise, and the superscript g stands for gyroscope and a for accelerometer. Notation: the left subscript is the reference frame in which the quantity is expressed; the right subscripts {Q}{Frame1}{Frame2} read as: Q of Frame2 with respect to Frame1. Noises are all in the body frame.

IMU model: Noise Property. Additive Gaussian white noise: $E[n(t)] = 0$, $E[n(t_1)\,n(t_2)] = \sigma^2\,\delta(t_1 - t_2)$. Bias: $\dot b^g(t) = n_{b^g}(t)$, $\dot b^a(t) = n_{b^a}(t)$, i.e., the derivative of the bias is white Gaussian noise (so-called random walk). In discrete time, with $w_k \sim N(0,1)$: $n_k = \sigma_d\, w_k$ with $\sigma_d = \sigma / \sqrt{\Delta t}$, and $b_k = b_{k-1} + \sigma_{bd}\, w_k$ with $\sigma_{bd} = \sigma_b \sqrt{\Delta t}$. The biases are usually estimated together with the other states: they can change every time the IMU is started, and can change due to temperature, mechanical pressure, etc. Trawny, Nikolas, and Stergios I. Roumeliotis, "Indirect Kalman filter for 3D attitude estimation." https://github.com/ethz-asl/kalibr/wiki/imu-noise-model
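
A minimal simulation of this discrete-time model (my sketch, following the Kalibr wiki convention cited above; the sigma values are placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)

dt = 0.005                  # 200 Hz IMU
n_steps = 20000             # 100 s of data
sigma_g = 1.7e-4            # gyro white-noise density [rad/s/sqrt(Hz)] (placeholder)
sigma_bg = 2.0e-5           # gyro bias random-walk density [rad/s^2/sqrt(Hz)] (placeholder)

# Discrete-time standard deviations, as in the Kalibr IMU noise model:
sigma_gd = sigma_g / np.sqrt(dt)      # white noise:  n[k] = sigma_gd * w[k]
sigma_bgd = sigma_bg * np.sqrt(dt)    # random walk:  b[k] = b[k-1] + sigma_bgd * w[k]

bias = np.cumsum(sigma_bgd * rng.standard_normal(n_steps))
noise = sigma_gd * rng.standard_normal(n_steps)

omega_true = 0.1                        # constant true rate [rad/s]
omega_meas = omega_true + bias + noise  # simulated gyro readings

print(f"final bias: {bias[-1]:.2e} rad/s")
print(f"mean measured rate: {omega_meas.mean():.4f} rad/s")
```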

IMU model: Integration. {t} denotes the {B}ody frame at time t:

$ {}_W p(t_2) = {}_W p(t_1) + {}_W v(t_1)\,(t_2 - t_1) + \iint_{t_1}^{t_2} \big( R_{Wt}(t)\,({}_t\tilde a(t) - b^a(t)) + {}_W g \big)\, dt^2 $

The result depends on the initial position and velocity; the rotation R(t) is computed from the gyroscope. Trawny, Nikolas, and Stergios I. Roumeliotis, "Indirect Kalman filter for 3D attitude estimation."
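
As an illustration, a minimal planar dead-reckoning loop implementing the equation above (my sketch with synthetic data; a real implementation would subtract estimated biases and use proper SO(3) integration):

```python
import numpy as np

def dead_reckon(omegas, accs, dt, p0, v0, theta0=0.0, g=np.array([0.0, -9.81])):
    """Planar (2D) IMU dead reckoning: integrate the gyro for heading,
    rotate body-frame accelerometer readings into the world frame,
    add gravity, then integrate twice for velocity and position."""
    p, v, theta = p0.copy(), v0.copy(), theta0
    for omega, a_body in zip(omegas, accs):
        theta += omega * dt                      # R(t) from the gyroscope
        c, s = np.cos(theta), np.sin(theta)
        R = np.array([[c, -s], [s, c]])          # body -> world
        a_world = R @ a_body + g                 # specific force + gravity
        v = v + a_world * dt
        p = p + v * dt
    return p, v, theta

# Toy usage: 1 s of hovering (the accelerometer measures +g in the body frame).
dt, n = 0.01, 100
omegas = np.zeros(n)
accs = np.tile(np.array([0.0, 9.81]), (n, 1))
p, v, theta = dead_reckon(omegas, accs, dt, np.zeros(2), np.zeros(2))
print(p, v)   # stays near zero, as expected
```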

Camera-IMU System. There can be multiple cameras.

Outline: Introduction; IMU model and camera-IMU system; Different paradigms: closed-form solution, filtering, fix-lag smoothing, maximum a posteriori estimation.

Closed-form Solution (intuitive idea). From vision, the absolute pose x is known only up to a scale s, thus $x = s\tilde x$. From the IMU: $x = x_0 + v_0(t_1 - t_0) + \iint_{t_0}^{t_1} a(t)\, dt^2$. Equating the two: $s\tilde x = x_0 + v_0(t_1 - t_0) + \iint_{t_0}^{t_1} a(t)\, dt^2$. As shown in [Martinelli 14], for 6DoF motion both s and $v_0$ can be determined in closed form from a single feature observation and 3 views; $x_0$ can be set to 0. Martinelli, "Closed-form solution of visual-inertial structure from motion," International Journal of Computer Vision, 2014.

Closed-form Solution. With $x_0 = 0$, each view gives one equation:

$ s\tilde x_1 = v_0(t_1 - t_0) + \iint_{t_0}^{t_1} a(t)\, dt^2 $
$ s\tilde x_2 = v_0(t_2 - t_0) + \iint_{t_0}^{t_2} a(t)\, dt^2 $

which can be stacked into a linear system in the unknowns s and $v_0$:

$ \begin{bmatrix} \tilde x_1 & -(t_1 - t_0) \\ \tilde x_2 & -(t_2 - t_0) \end{bmatrix} \begin{bmatrix} s \\ v_0 \end{bmatrix} = \begin{bmatrix} \iint_{t_0}^{t_1} a(t)\, dt^2 \\ \iint_{t_0}^{t_2} a(t)\, dt^2 \end{bmatrix} $

Martinelli, "Closed-form solution of visual-inertial structure from motion," International Journal of Computer Vision, 2014.
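
A toy 1D instance of this linear system, solved with numpy (my illustration; the up-to-scale positions are generated from a synthetic ground truth):

```python
import numpy as np

# Ground truth (1D toy): scale, initial velocity, constant acceleration.
s_true, v0_true, a = 2.0, 0.5, 0.3
t0, t1, t2 = 0.0, 1.0, 2.0

def double_int(a, t0, t):          # double integral of constant a
    return 0.5 * a * (t - t0)**2

# True metric positions, and the up-to-scale vision estimates x_tilde = x / s.
x1 = v0_true * (t1 - t0) + double_int(a, t0, t1)
x2 = v0_true * (t2 - t0) + double_int(a, t0, t2)
x1_tilde, x2_tilde = x1 / s_true, x2 / s_true

# Stack the two equations  s*x_tilde_i - v0*(t_i - t0) = double_int  and solve.
A = np.array([[x1_tilde, -(t1 - t0)],
              [x2_tilde, -(t2 - t0)]])
b = np.array([double_int(a, t0, t1), double_int(a, t0, t2)])
s_est, v0_est = np.linalg.solve(A, b)
print(s_est, v0_est)   # recovers 2.0 and 0.5
```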

Different paradigms. Loosely coupled: use the output of the individual systems; estimate the states separately from visual and inertial data, then combine the separate state estimates. Tightly coupled: use the internal states and make use of the raw measurements (feature positions, IMU readings); examples: using the IMU for guided feature matching, or minimizing the reprojection error and the IMU error together. Tightly coupled approaches are more accurate but require more implementation effort.

Different paradigms. [Block diagram: images → feature extraction → feature matching → (feature correspondences) → motion estimation → position and attitude; IMU measurements → IMU integration → position, attitude, and velocity. A loosely coupled system fuses the two outputs; a tightly coupled system fuses the raw feature correspondences and IMU measurements. Tightly coupled: more accurate, more implementation effort.]

Different paradigms:
- Filtering: estimates only the most recent states (e.g., extended Kalman filter); one linearization; accumulation of linearization errors; Gaussian approximation of marginalized states; fastest.
- Fix-lag smoothing: optimizes a window of states by nonlinear least-squares, marginalizing older states; re-linearizes; still accumulates linearization errors and makes a Gaussian approximation of the marginalized states; fast.
- Maximum-a-posteriori (MAP) estimation: optimizes all states by nonlinear least-squares; re-linearizes; exploits sparse matrices; highest accuracy; slow.

Filtering: Visual-Inertial Formulation. System states:

Tightly coupled: $X = [\,{}_W p;\ q_{WB};\ {}_W v;\ b^a;\ b^g;\ {}_W L_1;\ {}_W L_2;\ \ldots;\ {}_W L_K\,]$
Loosely coupled: $X = [\,{}_W p;\ q_{WB};\ {}_W v;\ b^a;\ b^g\,]$

Process model: from the IMU; integration of the IMU states (rotation, position, velocity), plus propagation of the IMU noise, needed for calculating the Kalman filter gain.
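
A minimal 1D sketch of the resulting predict/update structure (my illustration, not a full VIO filter: toy state, toy covariances, hypothetical values throughout):

```python
import numpy as np

# Toy 1D state [position, velocity]; the IMU provides acceleration for
# the process model, vision provides absolute position measurements.
dt = 0.01
x = np.zeros(2)                     # state estimate
P = np.eye(2)                       # state covariance
F = np.array([[1, dt], [0, 1]])     # process Jacobian
B = np.array([0.5 * dt**2, dt])     # how acceleration enters the state
Q = np.diag([1e-6, 1e-4])           # process noise (propagated IMU noise)
H = np.array([[1.0, 0.0]])          # vision measures position only
R_meas = np.array([[1e-2]])         # measurement noise

def predict(x, P, acc):
    x = F @ x + B * acc             # integrate the IMU reading
    P = F @ P @ F.T + Q             # propagate uncertainty (needed for the gain)
    return x, P

def update(x, P, z):
    S = H @ P @ H.T + R_meas
    K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
    x = x + (K @ (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = predict(x, P, acc=0.2)
x, P = update(x, P, z=np.array([0.001]))
print(x)
```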

Filtering: ROVIO. Uses pixel intensities as measurements. Bloesch, Michael, et al., "Robust visual inertial odometry using a direct EKF-based approach."

Filtering: Potential Problems. Wrong linearization point: the linearization depends on the current state estimates, which may be erroneous; linearizing around different values of the same variable leads to estimator inconsistency (wrong observability/covariance estimation). Wrong covariance or initial states: intuitively, wrong weights for measurements and prediction, so the filter may be overconfident or underconfident. Explosion of the number of states: each 3D point adds 3 variables.

Filtering: MSCKF (Multi-State Constraint Kalman Filter), used in Google Tango. Key idea: keep a window of recent states and incorporate visual observations without including the point positions in the state vector. Li, Mingyang, and Anastasios I. Mourikis, "High-precision, consistent EKF-based visual-inertial odometry."
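
The key trick can be sketched in a few lines: given a linearized visual residual r ≈ H_x δx + H_f δf + n for one feature, project it onto the left nullspace of H_f so that the result no longer involves the feature error δf (my sketch with random toy Jacobians):

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(1)

# One feature seen in 4 camera poses: 8 residual rows (u, v per view).
# H_x: Jacobian w.r.t. the window of camera states, H_f: w.r.t. the 3D point.
H_x = rng.standard_normal((8, 12))   # toy sizes, e.g., 4 poses x 3 DoF each
H_f = rng.standard_normal((8, 3))
r = rng.standard_normal(8)           # stacked reprojection residuals

# Left nullspace of H_f: columns N with N^T @ H_f = 0 (here 8 - 3 = 5 columns).
N = null_space(H_f.T)

# The projected residual no longer depends on the feature position error,
# so the point never has to be added to the filter state.
r_proj = N.T @ r
H_proj = N.T @ H_x
print(np.abs(N.T @ H_f).max())       # ~0: feature term eliminated
print(H_proj.shape)                  # (5, 12): usable as an EKF update
```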

Filtering: Google Tango

Optimization-based Approaches (MAP, fix-lag smoothing). Fusion is solved as a nonlinear optimization problem, with increased accuracy over filtering methods. Let X = {x_1, ..., x_N} be the robot states, L = {l_1, ...} the 3D points, and Z = {z_1, ..., z_M} the feature observations, with process model $x_k = f(x_{k-1})$ and measurement model $z_i = h(x_{k_i}, l_{j_i})$. Then

$ \{X^*, L^*\} = \arg\max_{\{X,L\}} P(X, L \mid Z) = \arg\min_{\{X,L\}} \sum_{k=1}^{N} \| f(x_{k-1}) - x_k \|^2 + \sum_{i=1}^{M} \| h(x_{k_i}, l_{j_i}) - z_i \|^2 $

Forster, Carlone, Dellaert, Scaramuzza, "IMU Preintegration on Manifold for Efficient Visual-Inertial Maximum-a-Posteriori Estimation," Robotics: Science and Systems'15, Best Paper Award Finalist.
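
A toy 1D instance of this MAP problem (my illustration, not the paper's pipeline: one landmark, relative-position measurements, and a prior residual anchoring x_0):

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(2)

# Toy 1D instance: N robot states moving at known velocity, one landmark.
N, dt, v = 5, 1.0, 1.0
x_true = np.arange(N) * v * dt                        # true robot positions
l_true = 2.5                                          # true landmark position
z = l_true - x_true + 0.01 * rng.standard_normal(N)   # relative measurements

def residuals(theta):
    x, l = theta[:N], theta[N]
    r_prior = [x[0] - 0.0]                     # anchor x_0 (fixes the gauge)
    r_proc = (x[:-1] + v * dt) - x[1:]         # process terms f(x_{k-1}) - x_k
    r_meas = (l - x) - z                       # measurement terms h(x_k, l) - z_k
    return np.concatenate([r_prior, r_proc, r_meas])

sol = least_squares(residuals, x0=np.zeros(N + 1))
print(sol.x)   # states near 0, 1, 2, 3, 4 and landmark near 2.5
```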

Fix-lag smoothing: OKVIS

Optimization-based Approaches (MAP, fix-lag smoothing). Fusion is solved as a nonlinear optimization problem with increased accuracy over filtering methods; the cost combines IMU residuals and reprojection residuals. Forster, Carlone, Dellaert, Scaramuzza, "IMU Preintegration on Manifold for Efficient Visual-Inertial Maximum-a-Posteriori Estimation," Robotics: Science and Systems'15, Best Paper Award Finalist.
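
A minimal sketch of the preintegration idea from the cited paper: accumulate relative increments ΔR, Δv, Δp from raw IMU samples between two keyframes, independently of the global pose (my sketch; bias and noise terms omitted for brevity):

```python
import numpy as np

def exp_so3(phi):
    """Rodrigues' formula: rotation vector -> rotation matrix."""
    angle = np.linalg.norm(phi)
    if angle < 1e-9:
        return np.eye(3)
    k = phi / angle
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

def preintegrate(omegas, accs, dt):
    """Accumulate the relative motion increments between two keyframes.
    dR, dv, dp depend only on the IMU samples (and the bias, omitted here),
    not on the global pose -- so they need not be recomputed each time the
    optimizer re-linearizes the keyframe states."""
    dR, dv, dp = np.eye(3), np.zeros(3), np.zeros(3)
    for w, a in zip(omegas, accs):
        dp = dp + dv * dt + 0.5 * (dR @ a) * dt**2
        dv = dv + (dR @ a) * dt
        dR = dR @ exp_so3(w * dt)
    return dR, dv, dp

# Toy usage: 100 samples of pure yaw rotation with zero specific force.
dt, n = 0.005, 100
dR, dv, dp = preintegrate([np.array([0, 0, 0.5])] * n, [np.zeros(3)] * n, dt)
print(np.degrees(np.arctan2(dR[1, 0], dR[0, 0])))  # ~14.3 deg of yaw
```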

MAP: SVO + IMU Preintegration. Open source; accuracy: 0.1% of the travel distance. [Plot: accuracy of the proposed method vs. Google Tango and OKVIS.] Forster, Carlone, Dellaert, Scaramuzza, "IMU Preintegration on Manifold for Efficient Visual-Inertial Maximum-a-Posteriori Estimation," Robotics: Science and Systems'15 and IEEE TRO'16, Best Paper Award Finalist.

MAP: SVO + IMU Preintegration.

Further topics. Rotation parameterization: rotation is tricky to deal with; options include Euler angles, rotation matrices, quaternions, and SO(3). Consistency of filtering and fix-lag smoothing: linearizing around different values of the same variable may lead to errors.
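
As an example of a singularity-free parameterization, a minimal quaternion integration of gyro measurements (my sketch; Hamilton convention with [w, x, y, z] ordering assumed):

```python
import numpy as np

def quat_mult(q, r):
    """Hamilton product of quaternions [w, x, y, z]."""
    w0, x0, y0, z0 = q
    w1, x1, y1, z1 = r
    return np.array([
        w0*w1 - x0*x1 - y0*y1 - z0*z1,
        w0*x1 + x0*w1 + y0*z1 - z0*y1,
        w0*y1 - x0*z1 + y0*w1 + z0*x1,
        w0*z1 + x0*y1 - y0*x1 + z0*w1,
    ])

def integrate_gyro(q, omega, dt):
    """Propagate orientation by one gyro sample via the exponential map."""
    angle = np.linalg.norm(omega) * dt
    if angle < 1e-12:
        return q
    axis = omega / np.linalg.norm(omega)
    dq = np.concatenate([[np.cos(angle / 2)], np.sin(angle / 2) * axis])
    q = quat_mult(q, dq)
    return q / np.linalg.norm(q)   # renormalize to stay on the unit sphere

q = np.array([1.0, 0, 0, 0])       # identity orientation
for _ in range(100):               # 1 s of 90 deg/s yaw at 100 Hz
    q = integrate_gyro(q, np.array([0, 0, np.pi / 2]), 0.01)
print(q)                           # ~[cos(45 deg), 0, 0, sin(45 deg)]
```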