Synchronization aspects of sensor and data fusion in a research multi-sensor-system


Synchronization aspects of sensor and data fusion in a research multi-sensor-system
MCG 2016, Vichy, France, 5th International Conference on Machine Control & Guidance, October 5, 2016
Jens-André Paffenholz, Johannes Bureick, Dmitri Diener, Johannes Link
Geodätisches Institut, Leibniz Universität Hannover

Motivation
What is common to multi-sensor-systems (MSS)?
Superior goal: efficient data capturing of the environment
Object capturing sensors: w.l.o.g. a laser scanner
Referencing sensors: 3D positioning and navigation sensors
Use the benefits of each enlisted sensor
What is essential for the MSS?
1) Availability of a proper time reference for the acquired sensor data
2) Mutual spatial relation of each enlisted sensor

Selected (laser scanner based) multi-sensor-systems in the research community
[Keller & Sternberg, 2013], [Paffenholz and Harmening, 2014], [Nüchter et al., 2013], [ARTIS, research project@gih, 2016], [Heinz et al., 2015]
Remark: Commercial MSS are in the portfolio of nearly every manufacturer of geodetic equipment, e.g. Leica, Riegl, Trimble

Outline
The research multi-sensor-system
Synchronization (temporal referencing): common time reference by means of radio communication and GPS time
Spatial referencing (registration): 6 dof calibration for a common spatial reference
Data fusion: Kalman filtering for trajectory estimation
Conclusion & Outlook

The research multi-sensor-system (RMSS)
Small-scale vehicle (L x W x H: 74 x 25 x 25 cm)
Accelerated by means of a stepper motor
Driving autonomously on a rail in an indoor laboratory

RMSS Referencing sensors
Absolute position information: prisms tracked by a robotic total station or a laser tracker
TinkerForge IMU Brick 2.0 [www.tinkerforge.com, 2016]:
Acceleration, magnetic field, angular velocity resolution: 14 bit, 16 bit, 16 bit
Heading, roll, pitch resolution: 0.0625° steps
Quaternion resolution: 16 bit
Sampling rate: 100 Hz
Weight: 12 g
Current consumption: 415 mW (83 mA @ 5 V)

RMSS Object capturing sensors
Profile laser scanner Hokuyo UST-10LX [www.robotshop.com, 2016]:
Range: 0.06 to 10 m
Repeatability: 30 mm
Absolute uncertainty: 40 mm
Scan angle: 270°
Angle resolution: 0.25°
Scan frequency: 40 profiles/s
Raspberry Pi camera module with wide-angle lens [www.conrad.de, 2016]:
Resolution: 5 MP (2592 x 1944 px)
Frame rate: 15 fps
Field of view: 122° x 89.5°

RMSS Control unit and coding
Single-board computer of type Raspberry Pi (RPI) 2 Model B running Ubuntu 14.04 LTS (Trusty Tahr) [www.ros.org, 7/2016]

Synchronization (temporal referencing)
Common time scale for different data sources
Latency due to imperfect synchronization
Problem with the RPI: drift of the built-in clock
Solution: external clock source, either radio communication (DCF-77) or GPS time [blog.remibergsma.com]

Establishment of the temporal reference
Methods of synchronization:
a) Clock-controlled registration
b) Event-driven registration
c) Event-based registration
(Figure: measurement values of sensors 1 to 3 plotted over trigger time for each method)

External clock sources
Radio communication: DCF-77
GPS time, introduced via the GPIO pins of the RPI
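Once an external reference is available, the drift of the RPI's built-in clock can be modelled and removed. As a minimal sketch (not the authors' implementation), assuming pairs of local timestamps and matching GPS/DCF-77 reference times have been collected, a linear clock model (offset and rate) can be fitted by least squares and applied to subsequent sensor timestamps:

```python
# Sketch: estimate and remove the linear drift of a local clock against an
# external reference (e.g. GPS time via a pulse-per-second signal on the GPIO
# pins). Function names and the linear model are illustrative assumptions.

def fit_clock_model(local_t, ref_t):
    """Least-squares fit of ref = offset + rate * local from paired timestamps."""
    n = len(local_t)
    mean_l = sum(local_t) / n
    mean_r = sum(ref_t) / n
    cov = sum((l - mean_l) * (r - mean_r) for l, r in zip(local_t, ref_t))
    var = sum((l - mean_l) ** 2 for l in local_t)
    rate = cov / var
    offset = mean_r - rate * mean_l
    return offset, rate

def to_reference_time(local_stamp, offset, rate):
    """Map a sensor timestamp from the drifting local clock to reference time."""
    return offset + rate * local_stamp

# Example: local clock runs 50 ppm fast and starts 0.2 s behind the reference.
local = [0.0, 1.00005, 2.0001, 3.00015]
ref = [0.2, 1.2, 2.2, 3.2]
offset, rate = fit_clock_model(local, ref)
```

In practice the pairs would come from timestamping the PPS edge on the GPIO pin against the RPI's system clock; the fitted model then converts all sensor timestamps to a common time scale.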

Spatial referencing (registration)
Definition of the RMSS coordinate system (body frame, BF)
Sensor coordinate systems: external tracking (prism), IMU, laser scanner
System calibration to obtain the spatial reference
Pose: position and orientation estimation of the individual sensors in the BF

Spatial referencing (registration)
Approach for the pose estimation of the laser scanner in the BF:
Laser scanner (LS)
Reference sensor of superior accuracy: laser tracker (LT)
Reference geometry (RG)
Approach according to Strübing & Neumann (2013)

Pose estimation of the laser scanner in the BF
Functional relation (Gauß-Helmert model):

[x_BF, y_BF, z_BF]^T = [t_x, t_y, t_z]^T + R_x(ω) R_y(φ) R_z(κ) [x_LS, y_LS, z_LS]^T
w = n_x x_BF + n_y y_BF + n_z z_BF - d

Input:
Laser scanner measurements [x_LS, y_LS, z_LS] to the RG planes as observations
Estimated plane parameters n_x, n_y, n_z, d of the RG planes, measured with the reference sensor (here: the laser tracker)
Output:
6 dof: translations t_x, t_y, t_z and rotations ω, φ, κ of the laser scanner
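The functional relation above can be evaluated numerically. A minimal sketch (illustrative, not the authors' code) of the forward model: transform a scanner point into the body frame with the 6 dof parameters and compute the plane misclosure w, which the adjustment drives towards zero for all points on the reference geometry:

```python
# Forward model of the slide's Gauss-Helmert relation. Angles are in radians
# here; the slide's gon values convert via rad = gon * pi / 200.
import math

def rx(a):
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def ry(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rz(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def to_body_frame(p_ls, t, omega, phi, kappa):
    """x_BF = t + R_x(omega) R_y(phi) R_z(kappa) x_LS"""
    R = matmul(matmul(rx(omega), ry(phi)), rz(kappa))
    return [t[i] + sum(R[i][j] * p_ls[j] for j in range(3)) for i in range(3)]

def plane_misclosure(p_bf, n, d):
    """w = n . x_BF - d; zero if the transformed point lies on the RG plane."""
    return sum(ni * pi for ni, pi in zip(n, p_bf)) - d
```

In the actual calibration these misclosures, together with the laser tracker's plane parameters, form the condition equations of the Gauß-Helmert adjustment that estimates the six parameters.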

Spatial referencing (registration)
Reference geometry (plane) configuration:
All parameters are sensitive to slight changes in the measured data
Tricky for the translation across the scanning plane (parameter x)
Spatial distribution of the planes
Selected view of the plane configuration (figure)

Spatial referencing (registration)
Results of the pose estimation
Misclosure vector w of the adjustment, plotted in [mm] over the plane / point number (figure)
Numerical results for the parameters:

Parameter    x        σ_x
x [mm]       710.52   0.85
y [mm]       96.79    0.23
z [mm]       63.73    0.16
ω [gon]      -0.198   0.037
φ [gon]      -1.798   0.086
κ [gon]      +0.660   0.104

Data fusion of the referencing sensors within a recursive state-space filter: Kalman filter
State vector: position, orientation and velocity of the RMSS
Observation vector: IMU, tacheometer and pseudo observations (pseudo observable: tacheometer heading)
(Figures: x (along track) [m], y (across track) [m] and azimuthal orientation κ [gon] over the time since the start of the RMSS [s]; measured, predicted and filtered values; κ derived from the IMU and supported by the tacheometer position)
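The predict/filter cycle behind the measured, predicted and filtered values in the plots can be illustrated with a minimal 1D constant-velocity Kalman filter for the along-track position, updated with tacheometer-style position observations. All numbers (time step, noise levels) are illustrative assumptions, not values from the presentation:

```python
# 1D constant-velocity Kalman filter sketch: state [position, velocity],
# position observations z. Predict with F = [[1, dt], [0, 1]], then update
# with H = [1, 0]. Pure-Python, no external dependencies.

def kalman_1d(z_list, dt=0.1, q=1e-3, r=1e-2):
    x = [0.0, 0.0]                      # state: position, velocity
    P = [[1.0, 0.0], [0.0, 1.0]]        # state covariance
    filtered = []
    for z in z_list:
        # predict: x = F x, P = F P F^T + Q (Q = diag(q, q))
        x = [x[0] + dt * x[1], x[1]]
        P = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
              P[0][1] + dt * P[1][1]],
             [P[1][0] + dt * P[1][1],
              P[1][1] + q]]
        # update with position observation z (measurement noise r)
        s = P[0][0] + r                  # innovation variance
        k0, k1 = P[0][0] / s, P[1][0] / s
        innov = z - x[0]
        x = [x[0] + k0 * innov, x[1] + k1 * innov]
        P = [[(1 - k0) * P[0][0], (1 - k0) * P[0][1]],
             [P[1][0] - k1 * P[0][0], P[1][1] - k1 * P[0][1]]]
        filtered.append(x[0])
    return filtered
```

The RMSS filter works on a larger state (3D position, orientation, velocity) and fuses IMU, tacheometer and pseudo observations, but the predict/update structure is the same.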

Resulting 3D point cloud of the indoor laboratory situation captured by the RMSS
Colour scheme: Z-value from blue to green
Rail highlighted in red
Deflected pixel area close to the rails in yellow

Conclusion & Outlook
Conclusion:
Synchronization of different sensor data
Spatial referencing: pose estimation of the laser scanner
Data fusion within a Kalman filter
Outlook:
Improvement of the ROS node implementation
In-depth analysis of the stratum-0 time server implementation on the RPI

Thank you for your attention.
Synchronization aspects of sensor and data fusion in a research multi-sensor-system
Dr.-Ing. Jens-André Paffenholz, Johannes Bureick, M.Sc., Dmitri Diener, B.Sc., Johannes Link, B.Eng.
Geodätisches Institut, Leibniz Universität Hannover
paffenholz@gih.uni-hannover.de, www.gih.uni-hannover.de