Camera and Inertial Sensor Fusion
For FIRST Robotics 2018
David Zhang, david.chao.zhang@gmail.com
January 6, 2018, Version 4.1
My Background
Ph.D. in Physics, Penn State University.
Research scientist at SRI International for 17 years, working on computer vision hardware, algorithms, and system integration.
Served as co-chair of the Montgomery Science and Invention Convention for 4 years.
Trustee of the Montgomery Township Education Foundation.
First-year mentor for Team 1403.
In robotics applications, various sensors are needed:
1. Cameras to detect features and recognize background and objects
2. Range sensors to detect object 3D depth in the environment
3. Inertial sensors to detect the local pose and position of the robot
4. Positioning sensors to locate the robot in the global 3D environment
Cameras:
Collect 2D images of a 3D scene.
Detect local and global features.
Sense static and moving targets from x-y-t image sequences.
Identify, detect, and recognize background and objects.
https://en.wikipedia.org/wiki/camera
Cameras: Visible, NIR, SWIR, MWIR, LWIR, UV
Wavelength bands (high frequency to low frequency):
Visible 0.4-0.7 um, NIR 0.7-0.9 um, SWIR 0.9-2 um, MWIR 2-5 um, LWIR 8-12 um
https://www.techedu.com/thermal-imagers-for-electrical-hvac/flir-c2-edu/
Cameras: SWIR example
(Figure: in the SWIR image, I look like a 70-year-old man with gray hair.)
SRI International Proprietary
(Figure: the same scene imaged in Visible, SWIR, and LWIR, shown against the 0.4-12 um wavelength scale.)
https://www.techedu.com/thermal-imagers-for-electrical-hvac/flir-c2-edu/
EO-IR Fusion Example
(Figure: visible, LWIR, and EO-IR fused night-vision views -- the fused view lets you see road lines, see car blinkers, and read signs.)
Visible Cameras
Sensor type (photon-to-electron conversion; the two differ in charge transport):
- CCD (charge-coupled device): less noise
- CMOS (complementary metal-oxide-semiconductor): less expensive, lower power
Shutter mode:
- Global shutter (most CCDs): exposes the entire image area simultaneously
- Progressive scanning (most CMOS, including most cell phones): scans line by line from top to bottom
Sensor pattern: Bayer pattern (G, R, B, G)
Visible Cameras
Progressive scanning: rolling-shutter artifacts.
Example extracted from "Why Do Cameras Do This? (Rolling Shutter Explained)," Smarter Every Day 172: https://www.youtube.com/watch?v=dnvtmmllnoe
Visible Cameras
Sensors for robots -- WFOV: barrel distortion correction
x'(i,j) = x_c + (x(i,j) - x_c) * (1 + k1*r^2 + k2*r^4)
y'(i,j) = y_c + (y(i,j) - y_c) * (1 + k1*r^2 + k2*r^4)
where (x_c, y_c) is the optical center and r(i,j) is the distance from the center to pixel (i,j).
https://www.mathworks.com/help/vision/single-camera-calibration.html
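As a sketch of the radial correction above, here is a minimal NumPy version of the per-pixel model; the optical center and the coefficients k1, k2 below are made-up example values (real ones come from camera calibration, e.g. OpenCV's cv2.calibrateCamera):

```python
import numpy as np

def correct_radial(pts, center, k1, k2):
    """Apply the slide's radial model: p' = c + (p - c)*(1 + k1*r^2 + k2*r^4).
    k1, k2, and the center are illustration values, not a real calibration."""
    pts = np.asarray(pts, dtype=float)
    c = np.asarray(center, dtype=float)
    d = pts - c                                   # offset from optical center
    r2 = np.sum(d * d, axis=-1, keepdims=True)    # r^2 for each pixel
    return c + d * (1.0 + k1 * r2 + k2 * r2 * r2)

center = (320.0, 240.0)
print(correct_radial([(320.0, 240.0), (420.0, 240.0)], center, k1=1e-7, k2=0.0))
# the center pixel is unchanged; off-center pixels shift radially
```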
Fixed-pattern noise in SWIR, MWIR, LWIR
- Use an internal shutter as an equivalent external blackbody for non-uniformity correction: corrects pixel offset (1-point correction). https://pdfs.semanticscholar.org/50d5/d81fc5e5fc1f2f8bad54150b2646ad28f0ba.pdf
- Use external heat sources for non-uniformity correction: corrects pixel offset and gain (2-point correction). http://ieeexplore.ieee.org/document/5876937/
- Scene-based non-uniformity correction based on local statistics. https://www.osapublishing.org/josaa/abstract.cfm?uri=josaa-25-6-1444 http://ieeexplore.ieee.org/abstract/document/4107169/
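The 2-point correction above can be sketched numerically: simulate a detector with per-pixel gain and offset, calibrate against two uniform sources, and check that the fixed-pattern noise vanishes. All gains, offsets, and flux levels below are made-up illustration values:

```python
import numpy as np

# Simulated detector with per-pixel gain/offset nonuniformity (made-up values).
rng = np.random.default_rng(0)
gain_true = rng.normal(1.0, 0.05, (4, 4))
offset_true = rng.normal(0.0, 2.0, (4, 4))

def sensor(flux):
    """Each pixel responds with its own gain and offset -> fixed-pattern noise."""
    return gain_true * flux + offset_true

# Two-point correction: image two uniform (blackbody) sources of known flux,
# then solve per-pixel gain and offset from the two calibration frames.
f1, f2 = 20.0, 80.0                  # hypothetical calibration flux levels
y1, y2 = sensor(f1), sensor(f2)
g = (f2 - f1) / (y2 - y1)            # per-pixel gain correction
corrected = g * (sensor(50.0) - y1) + f1
print(np.allclose(corrected, 50.0))  # fixed-pattern noise fully removed -> True
```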
Inertial Sensors:
Provide the robot's acceleration, angular velocities, and heading information.
Inertial Navigation System (INS):
An INS is based on measurements from an IMU.
It provides the position, velocity, and heading of the robot, estimating the current pose and position from the previously known state.
Inertial Sensors
Inertial Measurement Unit (IMU):
- Accelerometer (acceleration, gravity)
- Gyroscope (3D angular rotational velocity)
- Magnetometer (3D Earth magnetic field)
Technologies:
- Micro-Electro-Mechanical Systems (MEMS): cheap, low accuracy
- Laser gyroscope: expensive, high accuracy
Vendors and boards: VectorNav, InvenSense, Xsens, Honeywell, SparkFun 9-DOF, Adafruit 9-DOF
Inertial Sensors
- Gyroscope measures angular velocity in degrees/sec.
- Accelerometer measures linear acceleration in m/s^2.
- Magnetometer measures magnetic field strength in uT (microtesla) or gauss; 1 gauss = 100 uT.
Why do we need 3 devices?
Intrinsic rotation on the robot: roll (about x), pitch (about y), yaw (about z), composed as R = R_z(yaw) * R_y(pitch) * R_x(roll).
* 3D Euler-angle rotations are not commutative.
* 4D quaternions are used to represent rotations.
https://en.wikipedia.org/wiki/euler_angles#intrinsic_rotations
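A quick numerical check of the non-commutativity claim, using the standard single-axis rotation matrices (the angles are arbitrary examples):

```python
import numpy as np

def Rx(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def Ry(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def Rz(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

roll, pitch, yaw = 0.1, 0.2, 0.3     # arbitrary example angles (radians)
A = Rz(yaw) @ Ry(pitch) @ Rx(roll)   # one composition order
B = Rx(roll) @ Ry(pitch) @ Rz(yaw)   # same angles, opposite order
print(np.allclose(A, B))             # False: 3-D rotations do not commute
```

This is why the order (roll, then pitch, then yaw, or any other convention) must always be stated alongside Euler angles, and why quaternions are preferred for composing rotations.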
Inertial Sensors
(Figures in this and the following slides are from http://stanford.edu/class/ee267/lectures/lecture9.pdf)
Inertial Sensors
(Figure: gyro measurement exhibiting DC drift noise; from http://stanford.edu/class/ee267/lectures/lecture9.pdf)
Inertial Sensors
(Figure: integrating biased gyro measurements over time -- starting from the true angle, the estimated orientation drifts away from the ground-truth orientation, and the error grows with integration time; from http://stanford.edu/class/ee267/lectures/lecture9.pdf)
Inertial Sensors
Accelerometers measure linear acceleration: a~ = a(g) + a(l) + eta, with noise eta ~ N(0, sigma_acc^2).
Angle from the accelerometer: theta_acc = tan^-1(a~_x / a~_y).
Problem: noise.
(Diagram: IMU with body axes X(body), Y(body) tilted relative to world axes X(world), Y(world).)
http://stanford.edu/class/ee267/lectures/lecture9.pdf
Inertial Sensors
Angle measurement from gyroscope and accelerometer:
- Gyro integration (first-order Taylor expansion): theta(t + Dt) = theta(t) + Dt * omega~_gyro. Problem: drift.
- Angle from the accelerometer: theta_acc(t) = tan^-1(a~_x / a~_y). Problem: noise.
http://stanford.edu/class/ee267/lectures/lecture9.pdf
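A minimal simulation of the drift problem: integrate a gyro with a made-up 0.5 deg/s bias while the robot stands still, and watch the heading error accumulate.

```python
import numpy as np

# Hypothetical numbers: robot standing still (true rate 0), gyro with a
# 0.5 deg/s bias plus white noise, sampled at 100 Hz for 10 s.
rng = np.random.default_rng(1)
dt, n = 0.01, 1000
omega = 0.0 + 0.5 + rng.normal(0.0, 0.1, n)   # biased, noisy rate in deg/s

# theta(t + dt) = theta(t) + dt * omega_gyro   (first-order integration)
theta = np.cumsum(omega) * dt
print(theta[-1])  # roughly 5 deg of heading error from 10 s of standing still
```

The zero-mean noise largely averages out under integration; it is the constant bias that grows linearly into the drift shown in the figure.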
Inertial Sensors
Sensor fusion: combine accelerometer and gyro measurements.
- Remove drift from the gyro via a high-pass filter.
- Remove noise from the accelerometer via a low-pass filter.
Simple weighted average (complementary filter), with gyro weight alpha:
theta(t + Dt) = alpha * (theta(t) + Dt * omega~_gyro) + (1 - alpha) * tan^-1(a~_x / a~_y)
This cannot fix the drift problem in the yaw direction.
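The weighted average above can be sketched as a complementary filter. The gyro bias, noise levels, and alpha below are made-up illustration values:

```python
import numpy as np

def complementary_filter(gyro, acc_angle, dt, alpha=0.98):
    """theta <- alpha*(theta + dt*omega_gyro) + (1-alpha)*theta_acc:
    high-pass the gyro (kills drift), low-pass the accelerometer (kills noise)."""
    theta = acc_angle[0]
    out = np.empty(len(gyro))
    for i, (omega, th_acc) in enumerate(zip(gyro, acc_angle)):
        theta = alpha * (theta + dt * omega) + (1.0 - alpha) * th_acc
        out[i] = theta
    return out

# Stationary robot (true angle 0): the gyro has a 0.5 deg/s bias, and the
# accelerometer tilt angle is noisy around 0 -- both are hypothetical values.
rng = np.random.default_rng(2)
n, dt = 2000, 0.01
gyro = 0.5 + rng.normal(0.0, 0.1, n)          # biased rate, deg/s
acc_angle = rng.normal(0.0, 1.0, n)           # noisy tilt, deg
theta = complementary_filter(gyro, acc_angle, dt)
print(abs(np.cumsum(gyro)[-1] * dt))          # pure integration: ~10 deg drift
print(abs(theta[-1]))                         # fused estimate stays near 0
```

The accelerometer term keeps pulling the estimate back to the gravity reference, so the bias no longer accumulates; but gravity carries no yaw information, which is why yaw still drifts.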
Inertial Sensors
Sensor fusion: adding the magnetometer.
- Fusing the accelerometer and gyro cannot correct the DC bias in yaw.
- The magnetometer provides a reference direction from the Earth's magnetic field to correct the DC bias in yaw.
- Unfortunately, inside a building or next to metal, the Earth's field is disturbed, so the magnetometer fails to correct the yaw bias.
- On robots, cameras are used as a visual pose reference to correct the IMU bias.
- A DCM or Kalman filter is used for the sensor fusion.
Inertial Sensors
Sensor fusion algorithms and libraries:
Madgwick's and Mahony's DCM filters in quaternion form:
http://x-io.co.uk/open-source-imu-and-ahrs-algorithms/
https://github.com/arduino-libraries/madgwickahrs
Kalman filter:
http://www.olliw.eu/2013/imu-data-fusing/
http://ieeexplore.ieee.org/document/4637877/
https://github.com/danicomo/9dof-orientation-estimation
https://github.com/sunsided/frdm-kl25z-marg-fusion
Inertial Sensors and Cameras
IMU coordinate conventions w.r.t. camera systems:
- NED (North-East-Down) system for world orientation: z points down.
- Camera coordinate system: origin (0,0) in the image, x and y image axes.
- ECEF (Earth-Centered, Earth-Fixed) for global position.
Sensor Fusion and Autonomous Navigation of Robots
Visual-inertial odometry via an error-state EKF:
- IMU mechanization produces the navigation solution.
- Cameras provide gyro-guided relative pose estimation from inlier feature-track measurements (e.g., frames t-4, t-2, t).
- An error-state extended Kalman filter combines these with GPS, a geo-landmark library, and a digital elevation map, and feeds error estimates (corrections to the inertial system) back into the corrected navigation solution.
Sensor Fusion and Autonomous Navigation of Robots
Hardware:
1. Two cameras (stereo)
2. IMU
3. PC/104 (https://en.wikipedia.org/wiki/pc/104)
Sensor Fusion and Autonomous Navigation of Robots
(Figure: accumulated point cloud.)
Sensor Fusion and Autonomous Navigation of Robots
Feature matching using the OpenCV library:
https://docs.opencv.org/3.3.0/dc/dc3/tutorial_py_matcher.html
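A minimal sketch of the matching idea behind that tutorial: brute-force nearest-neighbor search with Lowe's ratio test, written in plain NumPy as a stand-in for cv2.BFMatcher; the descriptors are synthetic.

```python
import numpy as np

def match_descriptors(d1, d2, ratio=0.75):
    """Brute-force nearest-neighbor matching with Lowe's ratio test -- a
    NumPy sketch of what cv2.BFMatcher.knnMatch(k=2) does in the tutorial."""
    matches = []
    for i, d in enumerate(d1):
        dists = np.linalg.norm(d2 - d, axis=1)
        best, second = np.argsort(dists)[:2]
        if dists[best] < ratio * dists[second]:  # best clearly beats runner-up
            matches.append((i, int(best)))
    return matches

# Synthetic descriptors: the second set is the first re-observed with noise,
# as if the same features were detected in the next stereo frame.
rng = np.random.default_rng(3)
feats = rng.normal(size=(5, 32))
noisy = feats + rng.normal(0.0, 0.01, feats.shape)
print(match_descriptors(feats, noisy))  # [(0, 0), (1, 1), (2, 2), (3, 3), (4, 4)]
```

The ratio test is what produces the "inlier feature tracks" consumed by the visual-inertial pipeline: ambiguous matches, where the runner-up is nearly as close as the best candidate, are discarded.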
Videos: Sensor Fusion and Autonomous Navigation of Robots
Augmented reality demo:
- The inserted virtual objects are placed precisely and appear stable in the driver's view.
- Virtual helicopters and real objects (pedestrians and the bush) correctly occlude each other based on estimated depth maps.
Selected References
- Multiple View Geometry in Computer Vision, Richard Hartley and Andrew Zisserman
- An Invitation to 3-D Vision, Yi Ma, Stefano Soatto, Jana Kosecka, and S. Shankar Sastry
- "Visual Odometry System Using Multiple Stereo Cameras and Inertial Measurement Unit," Taragay Oskiper, Zhiwei Zhu, Supun Samarasekera, Rakesh Kumar
- "Ten-fold Improvement in Visual Odometry Using Landmark Matching," Zhiwei Zhu, Taragay Oskiper, Supun Samarasekera, Rakesh Kumar
- https://github.com/openslam/awesome-slam-list
- https://github.com/tzutalin/awesome-visual-slam