Camera and Inertial Sensor Fusion


For FIRST Robotics 2018
David Zhang (david.chao.zhang@gmail.com)
January 6, 2018, Version 4.1

My Background: Ph.D. in Physics, Penn State University. Research scientist at SRI International for 17 years, working on computer vision hardware, algorithms, and system integration. Served as Co-Chair of the Montgomery Science and Invention Convention for 4 years. Trustee of the Montgomery Township Education Foundation. First-year mentor for Team 1403.

In robotic applications, various sensors are needed:
1. Cameras to detect features and to recognize background and objects
2. Range sensors to measure the 3D depth of objects in the environment
3. Inertial sensors to estimate the local pose and position of the robot
4. Positioning sensors to locate the robot in the global 3D environment

Cameras: Collect 2D images of a 3D scene. Detect local and global features. Sense static and moving targets from x-y-t image sequences. Identify, detect, and recognize background and objects. https://en.wikipedia.org/wiki/Camera

Cameras: Visible, NIR, SWIR, MWIR, LWIR, UV
Wavelength bands, from high frequency to low frequency: Visible 0.4–0.7 µm, NIR 0.7–0.9 µm, SWIR 0.9–2 µm, MWIR 2–5 µm, LWIR 8–12 µm.
https://www.techedu.com/thermal-imagers-for-electrical-hvac/flir-c2-edu/

Cameras: SWIR example. In the SWIR image, I look like a 70-year-old man with gray hair.

SRI International Proprietary
Example images of the same scene in the Visible, SWIR, and LWIR bands.
https://www.techedu.com/thermal-imagers-for-electrical-hvac/flir-c2-edu/

EO-IR Fusion Example: fusing Visible and LWIR imagery produces night vision in which one can see road lines, see car blinkers, and read signs.

Visible Cameras
Sensor type (photon-to-electron conversion; the technologies differ in how they transport charge):
- CCD (charge-coupled device): less noise
- CMOS (complementary metal-oxide-semiconductor): less expensive, lower power
Shutter mode:
- Global shutter (most CCDs): exposes the entire image area simultaneously
- Rolling shutter via progressive scanning (most CMOS sensors, including most cell phones): exposes line by line, from top to bottom
Sensor pattern: Bayer pattern (G, R, B, G color-filter mosaic)

Visible Cameras, Progressive Scanning: Rolling Shutter Artifacts
Example extracted from "Why Do Cameras Do This? (Rolling Shutter Explained)", Smarter Every Day 172: https://www.youtube.com/watch?v=dnvtmmllnoe

Visible Cameras, Sensors for Robots
Wide-FOV lenses need barrel distortion correction. With optical center (x_c, y_c) and r(i,j) the distance from the center to pixel (i,j), the corrected coordinates are:
x'(i,j) = x_c + (x(i,j) - x_c) * (1 + k1*r^2 + k2*r^4)
y'(i,j) = y_c + (y(i,j) - y_c) * (1 + k1*r^2 + k2*r^4)
https://www.mathworks.com/help/vision/single-camera-calibration.html
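As a concrete sketch of the radial correction (a hypothetical helper; the coefficients k1 and k2 below are illustrative, not from a real calibration):

```python
def radial_correct(x, y, xc, yc, k1, k2):
    """Map a pixel through the two-coefficient radial model: a point at
    distance r from the optical center (xc, yc) is scaled by the factor
    (1 + k1*r^2 + k2*r^4) about that center."""
    dx, dy = x - xc, y - yc
    r2 = dx * dx + dy * dy                  # r^2 for this pixel
    factor = 1.0 + k1 * r2 + k2 * r2 * r2   # 1 + k1*r^2 + k2*r^4
    return xc + dx * factor, yc + dy * factor

# The optical center is a fixed point of the mapping.
center = radial_correct(320.0, 240.0, 320.0, 240.0, k1=-1e-7, k2=0.0)
```

In practice k1, k2, and the optical center come from a calibration tool such as the MATLAB single-camera calibration workflow linked above.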

Fixed-Pattern Noise in SWIR, MWIR, LWIR
- Use an internal shutter as an equivalent external blackbody for non-uniformity correction: corrects the per-pixel offset (1-point correction). https://pdfs.semanticscholar.org/50d5/d81fc5e5fc1f2f8bad54150b2646ad28f0ba.pdf
- Use external heat sources for non-uniformity correction: corrects per-pixel offset and gain (2-point correction). http://ieeexplore.ieee.org/document/5876937/
- Scene-based non-uniformity correction based on local statistics. https://www.osapublishing.org/josaa/abstract.cfm?uri=josaa-25-6-1444 http://ieeexplore.ieee.org/abstract/document/4107169/
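A minimal sketch of the 2-point correction idea, assuming two uniform reference frames (`dark`, `warm`) captured at known temperatures; images are flattened to plain lists, and all names and values are hypothetical:

```python
def two_point_nuc(raw, dark, warm, t_dark, t_warm):
    """Per-pixel two-point non-uniformity correction: estimate each pixel's
    gain and offset from frames of two uniform blackbody sources at
    temperatures t_dark and t_warm, then map raw counts to temperature units."""
    corrected = []
    for r, d, w in zip(raw, dark, warm):
        gain = (t_warm - t_dark) / (w - d)   # counts-to-units slope, per pixel
        corrected.append(t_dark + gain * (r - d))
    return corrected

# Two pixels with different gains and offsets, both viewing a 30-degree
# scene, come out identical after correction.
out = two_point_nuc([160.0, 140.0], [140.0, 110.0], [180.0, 170.0], 20.0, 40.0)
```

The 1-point (shutter) variant fixes only the per-pixel offset, i.e. it subtracts the dark frame without estimating a gain.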

Inertial Sensors: Provide the robot's acceleration, angular velocities, and heading information.
Inertial Navigation System (INS):
- An INS is based on measurements from an IMU.
- It provides the position, velocity, and heading of the robot.
- It estimates the current pose and position by dead reckoning from the previously known state.
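The dead-reckoning idea can be illustrated in one dimension (a simplified sketch; a real INS mechanization also rotates body-frame accelerations into the navigation frame and subtracts gravity):

```python
def dead_reckon(p0, v0, accels, dt):
    """Integrate acceleration samples (m/s^2) taken every dt seconds to
    track velocity and position from a known starting state, using
    simple Euler integration."""
    p, v = p0, v0
    for a in accels:
        v += a * dt   # velocity is the integral of acceleration
        p += v * dt   # position is the integral of velocity
    return p, v

# Four samples of 1 m/s^2 at 2 Hz from rest.
p, v = dead_reckon(0.0, 0.0, [1.0, 1.0, 1.0, 1.0], dt=0.5)
```

Because each state is built on the previous one, any sensor bias is integrated too, which is why pure inertial navigation drifts over time.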

Inertial Sensors: Inertial Measurement Unit (IMU)
- Accelerometer: acceleration, including gravity
- Gyroscope: 3-D angular rotational velocity
- Magnetometer: 3-D Earth magnetic field
Technologies: MEMS (micro-electro-mechanical systems): cheap, low accuracy. Laser gyroscopes: expensive, high accuracy.
Example vendors and boards: VectorNav, InvenSense, Xsens, Honeywell, SparkFun 9-DOF, Adafruit 9-DOF.

Inertial Sensors
- Gyroscope: measures angular velocity, in degrees/sec
- Accelerometer: measures linear acceleration, in m/s^2
- Magnetometer: measures magnetic field strength, in µT (microtesla) or Gauss; 1 Gauss = 100 µT
Intrinsic rotations on the robot: roll about x, pitch about y, yaw about z, combined as R = R_z(yaw) * R_y(pitch) * R_x(roll).
* 3-D Euler angle rotations are not commutative.
* 4-D quaternions are used to represent rotations.
Why do we need all 3 devices?
https://en.wikipedia.org/wiki/Euler_angles#Intrinsic_rotations
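Non-commutativity is easy to demonstrate numerically (a self-contained sketch using plain 3x3 matrices):

```python
import math

def rot_x(a):
    """Rotation about the x axis (roll) by angle a in radians."""
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_z(a):
    """Rotation about the z axis (yaw) by angle a in radians."""
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(A, B):
    """3x3 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# A 90-degree roll followed by a 90-degree yaw is not the same rotation
# as yaw followed by roll: the two products differ.
q = math.pi / 2
roll_then_yaw = matmul(rot_z(q), rot_x(q))  # right-to-left: x applied first
yaw_then_roll = matmul(rot_x(q), rot_z(q))
```

This order dependence is why a convention like R_z * R_y * R_x must be fixed, and why quaternions are the preferred representation for composing rotations.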

Inertial Sensors (figure from http://stanford.edu/class/ee267/lectures/lecture9.pdf)

Inertial Sensors: gyroscope output exhibits DC drift and noise (figure from http://stanford.edu/class/ee267/lectures/lecture9.pdf)

Inertial Sensors: integrating gyro measurements over time. The measured gyro data contain a bias, so the integrated (estimated) orientation drifts away from the ground-truth orientation and the error grows with time. (Figure from http://stanford.edu/class/ee267/lectures/lecture9.pdf)

Inertial Sensors: Accelerometers
Measure linear acceleration plus gravity, with noise: ã = a(g) + a(l) + η, where η ~ N(0, σ²acc).
Angle from the accelerometer (body frame relative to world frame): θ̃acc = tan⁻¹(ãx / ãy).
Problem: noise.
http://stanford.edu/class/ee267/lectures/lecture9.pdf

Inertial Sensors: Drift
Angle from the gyroscope via integration (first-order Taylor series): θ̃(t + Δt) = θ̃(t) + Δt · ω̃gyro(t). Problem: the gyro bias is integrated along with the signal, so the angle drifts.
Angle from the accelerometer: θ̃acc = tan⁻¹(ãx / ãy). Problem: noise.
http://stanford.edu/class/ee267/lectures/lecture9.pdf

Inertial Sensors: Sensor Fusion
Combine accelerometer and gyro measurements:
- Remove drift from the gyro via a high-pass filter.
- Remove noise from the accelerometer via a low-pass filter.
Simple weighted average (complementary filter), with gyro weight α:
θ(t + Δt) = α · (θ(t) + Δt · ω̃gyro(t)) + (1 − α) · tan⁻¹(ãx / ãy)
This cannot fix the drift problem in the yaw direction.
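A minimal sketch of this weighted average; the gyro weight α and the tan⁻¹(ãx/ãy) tilt convention follow the slide, while the dt and α values are illustrative:

```python
import math

def complementary_filter(theta, gyro_rate, ax, ay, dt, alpha=0.98):
    """Blend the integrated gyro angle (trusted short-term, but drifts)
    with the accelerometer tilt angle (noisy, but drift-free)."""
    theta_gyro = theta + gyro_rate * dt   # integrate angular velocity
    theta_acc = math.atan2(ax, ay)        # tilt from the gravity direction
    return alpha * theta_gyro + (1.0 - alpha) * theta_acc

# At rest with gravity along +y, an initial angle error decays toward zero
# instead of persisting as it would under gyro integration alone.
theta = 0.2
for _ in range(200):
    theta = complementary_filter(theta, gyro_rate=0.0, ax=0.0, ay=1.0, dt=0.01)
```

Because gravity lies along the yaw axis, the accelerometer term carries no yaw information, which is why this filter still drifts in yaw.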

Inertial Sensors: Sensor Fusion, Adding a Magnetometer
- Fusing only the accelerometer and gyro cannot correct the DC bias in yaw.
- The magnetometer provides a reference direction from the Earth's magnetic field to correct the yaw bias.
- Unfortunately, inside a building or next to metal, the Earth's magnetic field is distorted, so the magnetometer fails to correct the yaw bias.
- On a robot, cameras can be used as a visual pose reference to correct the IMU bias.
- A DCM (direction cosine matrix) filter or a Kalman filter is used for the sensor fusion.

Inertial Sensors: Sensor Fusion Algorithms and Libraries
- Madgwick's and Mahony's DCM filters in quaternion form: http://x-io.co.uk/open-source-imu-and-ahrs-algorithms/ https://github.com/arduino-libraries/madgwickahrs
- Kalman filter: http://www.olliw.eu/2013/imu-data-fusing/ http://ieeexplore.ieee.org/document/4637877/ https://github.com/danicomo/9dof-orientation-estimation https://github.com/sunsided/frdm-kl25z-marg-fusion
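To give a flavor of the Kalman-filter approach discussed at the links above, here is a common two-state formulation (tilt angle plus gyro bias). The structure follows the widely used angle/bias filter; the noise variances are illustrative defaults, not values taken from any of the cited libraries:

```python
def kalman_step(angle, bias, P, gyro_rate, angle_meas, dt,
                q_angle=0.001, q_bias=0.003, r_meas=0.03):
    """One predict/update cycle: predict the angle from the bias-corrected
    gyro rate, then correct both angle and bias with an absolute angle
    measurement (from an accelerometer, magnetometer, or camera)."""
    p00, p01, p10, p11 = P[0][0], P[0][1], P[1][0], P[1][1]
    # Predict: integrate the gyro and grow the covariance.
    angle += dt * (gyro_rate - bias)
    p00 += dt * (dt * p11 - p01 - p10 + q_angle)
    p01 -= dt * p11
    p10 -= dt * p11
    p11 += q_bias * dt
    # Update: weigh the measurement residual by the Kalman gain.
    S = p00 + r_meas
    K0, K1 = p00 / S, p10 / S
    y = angle_meas - angle
    angle += K0 * y
    bias += K1 * y
    P = [[p00 - K0 * p00, p01 - K0 * p01],
         [p10 - K1 * p00, p11 - K1 * p01]]
    return angle, bias, P

# Feeding a constant gyro reading of 0.1 rad/s while the measured angle
# stays at zero, the filter learns that the 0.1 is pure bias.
angle, bias, P = 0.0, 0.0, [[1.0, 0.0], [0.0, 1.0]]
for _ in range(2000):
    angle, bias, P = kalman_step(angle, bias, P, 0.1, 0.0, 0.01)
```

Unlike the fixed-weight complementary filter, the gains here adapt to the covariance, and the bias state is exactly what a camera-derived yaw reference can observe and remove.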

Inertial Sensors and Cameras: IMU coordinate conventions w.r.t. camera systems
- IMU body frame: x, y, z axes fixed to the device (often marked relative to the cable).
- World orientation: NED (North-East-Down) system, with z pointing down.
- Camera coordinate system: image origin (0,0) at the top-left corner.
- ECEF: Earth-Centered, Earth-Fixed global frame.

Sensor Fusion and Autonomous Navigation of Robots: Visual-Inertial Odometry via an error-state EKF
Cameras feed gyro-guided relative pose estimation, which produces inlier feature-track measurements across frames (t−4, t−2, t). The error-state extended Kalman filter fuses these measurements with IMU mechanization, and optionally with GPS, a geo-landmark library, and a digital elevation map, producing error estimates (corrections to the inertial system) and a corrected navigation solution.

Sensor Fusion and Autonomous Navigation of Robots: hardware
1. Two cameras (stereo)
2. IMU
3. PC/104 embedded computer (https://en.wikipedia.org/wiki/PC/104)

Sensor Fusion and Autonomous Navigation of Robots: accumulated point cloud.

Sensor Fusion and Autonomous Navigation of Robots: feature matching using the OpenCV library. https://docs.opencv.org/3.3.0/dc/dc3/tutorial_py_matcher.html
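The brute-force matcher in that OpenCV tutorial compares binary descriptors (e.g. ORB) by Hamming distance and keeps matches that pass Lowe's ratio test; here is a dependency-free sketch of that matching step on toy two-byte descriptors (all values hypothetical):

```python
def hamming(d1, d2):
    """Hamming distance between two binary descriptors (bytes objects)."""
    return sum(bin(a ^ b).count("1") for a, b in zip(d1, d2))

def match_descriptors(query, train, ratio=0.75):
    """For each query descriptor, find its nearest and second-nearest
    neighbors in train; keep the match only when the best distance is
    clearly smaller than the second best (Lowe's ratio test)."""
    matches = []
    for qi, q in enumerate(query):
        dists = sorted((hamming(q, t), ti) for ti, t in enumerate(train))
        if len(dists) >= 2 and dists[0][0] < ratio * dists[1][0]:
            matches.append((qi, dists[0][1], dists[0][0]))  # (query, train, dist)
    return matches

# Query descriptor 0 differs from train descriptor 1 by a single bit,
# but from train descriptor 0 by all 16 bits, so it matches train 1.
query = [bytes([0b10101010, 0b11110000])]
train = [bytes([0b01010101, 0b00001111]), bytes([0b10101010, 0b11110001])]
found = match_descriptors(query, train)
```

The ratio test rejects ambiguous correspondences, which is what keeps only inlier feature tracks for the pose estimation stage.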

Videos: Sensor Fusion and Autonomous Navigation of Robots, Augmented Reality Demo
- The inserted virtual objects are placed precisely and appear stable in the driver's view.
- Virtual helicopters and real objects (pedestrians and the bush) correctly occlude each other based on estimated depth maps.

Selected References
- Multiple View Geometry in Computer Vision, Richard Hartley and Andrew Zisserman
- An Invitation to 3-D Vision, Yi Ma, Stefano Soatto, Jana Košecká, S. Shankar Sastry
- Visual Odometry System Using Multiple Stereo Cameras and Inertial Measurement Unit, Taragay Oskiper, Zhiwei Zhu, Supun Samarasekera, Rakesh Kumar
- Ten-fold Improvement in Visual Odometry Using Landmark Matching, Zhiwei Zhu, Taragay Oskiper, Supun Samarasekera, Rakesh Kumar
- https://github.com/openslam/awesome-slam-list
- https://github.com/tzutalin/awesome-visual-slam