Ford Campus Vision and Laser Data Set
Gaurav Pandey*, Ryan Eustice* and James McBride**
*University of Michigan, Ann Arbor, MI
**Ford Motor Company Research, Dearborn, MI

Abstract

This paper describes a dataset collected by an autonomous ground vehicle testbed based upon a modified Ford F-250 pickup truck (Fig. 1). The vehicle is outfitted with a professional (Applanix POS-LV) and a consumer-grade (Xsens MTi-G) inertial measurement unit (IMU), along with a 3D Velodyne lidar scanner, two push-broom forward-looking Riegl lidars, and a Point Grey Ladybug3 omnidirectional camera system. Here we present the time-registered data from these sensors, collected while driving the vehicle around the Ford Research campus and downtown Dearborn, Michigan during August-November. The vehicle path trajectory in this dataset contains several large- and small-scale loop closures, which makes it useful for testing various state-of-the-art computer vision and SLAM (Simultaneous Localization and Mapping) algorithms.

Fig. 1. Test vehicle showing sensor arrangement.

I. INTRODUCTION

The University of Michigan and Ford Motor Company have been collaborating on autonomous ground vehicle research since the 2007 DARPA Urban Grand Challenge, and through this continued collaboration have developed an autonomous ground vehicle testbed based upon a modified Ford F-250 pickup truck. This vehicle was one of the finalists in the 2007 DARPA Urban Grand Challenge and demonstrated the capability to navigate a mock urban environment [1], which included moving targets, intermittently blocked pathways, and regions of denied global positioning system (GPS) reception. This motivated us to collect large-scale visual and inertial data of real-world urban environments, which might be useful in generating rich textured 3D maps of the environment for navigation purposes. Here we present the dataset collected by this vehicle while driving in and around the Ford research campus and downtown Dearborn, Michigan. The dataset includes various small and large loop closure events, ranging from feature-rich downtown areas to featureless empty parking lots. The most significant aspect of this dataset is the precise co-registration of the laser data with the omnidirectional camera imagery, thereby adding visual information to the structure of the environment obtained from the laser data. This fused vision data, along with the odometry information, constitutes an appropriate framework for mobile robot platforms to enhance various state-of-the-art computer vision and robotics algorithms. We hope that this dataset will be very useful to the robotics and vision community and will provide new research opportunities in terms of using the image and laser data together, along with the odometry information.

II. SENSORS

We used a modified Ford F-250 pickup truck as our base platform. Although the large size of the vehicle might seem a hindrance in the urban driving environment, it has proved useful because it allows strategic placement of the different sensors. Moreover, the large space at the back of the truck was sufficient to install eight quad-core processors along with their cooling mechanism. The vehicle was integrated with the following perception and navigation sensors:

A. Perception Sensors

1) Velodyne HDL-64E lidar: The HDL-64E has two blocks of lasers, each consisting of 32 laser diodes aligned vertically, resulting in an effective 26.8-degree vertical field of view. The entire unit can spin about its vertical axis at speeds up to 900 rpm (15 Hz) to provide a full 360-degree azimuthal field of view. The sensor has a maximum range of 120 m and captures about 1 million range points per second. We captured our dataset with
the laser spinning at 10 Hz. More details about the sensor can be obtained from [2].

2) Point Grey Ladybug3 omnidirectional camera: The Point Grey Ladybug3 (LB3) is a high-resolution omnidirectional camera system. It has six 2-Megapixel (1600×1200) cameras, with five CCDs positioned in a horizontal ring and one positioned vertically, enabling the system to collect video from more than 80% of the full sphere. The camera can capture images at multiple resolutions and multiple frame rates, and it also supports hardware JPEG compression on the camera head. We collected our dataset at half resolution (i.e. 1600×600) and 8 fps in raw (uncompressed) format. The details of the sensor are available from the manufacturer's website [3].

3) Riegl LMS-Q120 lidar: Two Riegl LMS-Q120 lidars, each having an 80-degree FOV with very fine (0.02 degrees per step) resolution, are installed at the front of the vehicle [4].

B. Navigation Sensors

1) Applanix POS-LV 420 INS with Trimble GPS: The POS-LV is a compact, fully integrated, turnkey position and orientation system, utilizing integrated inertial technology to generate stable, reliable and repeatable positioning solutions for land-based vehicle applications. It provides position, orientation, velocity, angular rate and acceleration at a rate of 100 Hz [5].

2) Xsens MTi-G: The MTi-G is a miniature, low-weight 6DOF Attitude and Heading Reference System (AHRS). It contains accelerometers, gyroscopes and magnetometers in 3D, an integrated GPS receiver, a static pressure sensor and a temperature sensor. Its internal low-power signal processor provides real-time, drift-free 3D orientation as well as calibrated 3D acceleration, 3D rate of turn, 3D earth-magnetic field, 3D position and 3D velocity data at a maximum update rate of 120 Hz [6].

All the sensors are distributed across the eight quad-core processors installed at the back of the truck in order to minimize the latency in data capture.
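As a quick sanity check on the Velodyne figures above, the per-revolution point count follows directly from the capture rate and the spin rate used for this dataset; a trivial sketch:

```python
# Velodyne HDL-64E figures quoted above: ~1 million range points per second,
# spun at 10 Hz for this dataset (the sensor supports up to 15 Hz).
points_per_second = 1_000_000
spin_rate_hz = 10

points_per_scan = points_per_second // spin_rate_hz
print(points_per_scan)  # ~100,000 points per full 360-degree revolution
```

This agrees with the 80,000-100,000 points per scan reported for the SCANS files later in the paper.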
As sensor data is received by the host computers, it is timestamped using the host computer clock, along with the timestamp associated with the hardware clock of the device. This sensor data is then packaged into a Lightweight Communications and Marshalling (LCM) [7] packet and transmitted over the network using multicast UDP. The transmitted data is captured by a logging process, which listens for such packets from all the sensor drivers and stores them on disk in a single log file. The data recorded in this log file is timestamped again, which allows a synchronous playback of the data later on.

III. SENSOR CALIBRATION

All the sensors (perceptual and navigational) are rigidly fixed to the vehicle and are therefore related to each other by fixed transformations. These transformations, which allow the reprojection of any point from one coordinate frame to another, were calculated for each sensor. The coordinate frames of the two navigation sensors (Applanix and MTi-G) coincide; this frame is called the body frame of the vehicle, and all other coordinate frames are defined with respect to it. Taking the body frame as the fixed reference frame, the relative transformation of each remaining sensor was calculated either manually or automatically. Here we use Smith, Self and Cheeseman's [8] notation to represent the 6DOF pose of a sensor coordinate frame. The calibration procedure and the relative transformations between the different coordinate frames are described below.

1) Relative transformation between the Velodyne laser scanner and the body frame (X_bl): This transformation is obtained from the vehicle CAD model. We have a very accurate CAD model of the vehicle, which allows us to precisely obtain the position and orientation of the Velodyne laser scanner with respect to the fixed body frame. The transformation thus obtained from the CAD model is: X_bl = [2.4 m, 0.01 m, 2.3 m, 180°, 0°, °].
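The 6DOF poses such as X_bl above are [x, y, z, roll, pitch, yaw] vectors. As an illustration of how such a pose maps to a rigid-body transform, here is a dependency-free Python sketch (our own helper, not part of the dataset's MATLAB scripts; we assume the common ZYX Euler convention, so the dataset's own scripts remain the authority on angle ordering):

```python
import math

def pose_to_matrix(pose):
    """Convert a 6DOF pose [x, y, z, roll, pitch, yaw] (angles in degrees)
    into a 4x4 homogeneous transform, assuming R = Rz(yaw)*Ry(pitch)*Rx(roll)."""
    x, y, z = pose[0], pose[1], pose[2]
    r, p, h = (math.radians(a) for a in pose[3:6])
    cr, sr = math.cos(r), math.sin(r)
    cp, sp = math.cos(p), math.sin(p)
    ch, sh = math.cos(h), math.sin(h)
    R = [
        [ch * cp, ch * sp * sr - sh * cr, ch * sp * cr + sh * sr],
        [sh * cp, sh * sp * sr + ch * cr, sh * sp * cr - ch * sr],
        [-sp,     cp * sr,                cp * cr],
    ]
    # Append the translation column and the homogeneous bottom row.
    return [R[0] + [x], R[1] + [y], R[2] + [z], [0.0, 0.0, 0.0, 1.0]]
```

For example, a 180-degree roll (as in X_bl) flips the y and z axes while leaving x unchanged, which is why the Velodyne frame points "down and backward" relative to the body frame.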
2) Relative transformation between the Velodyne laser scanner and the Ladybug3 camera head (X_hl): This transformation allows us to project any 3D point in the laser reference frame into the camera head's frame, and thereby into the corresponding camera image. The calibration procedure requires a checkerboard pattern to be viewed simultaneously from both sensors. The 3D points lying on the checkerboard pattern in the laser reference frame, together with the normal to the checkerboard plane in the camera reference frame (obtained from the image using Zhang's method), constrain the rigid-body transformation between the laser and camera reference frames. We can formulate this constraint as a non-linear optimization problem to solve for the optimal rigid-body transformation. The details of the procedure can be found in [9]. The transformation obtained using this method is: X_hl = [0.34 m, 0.01 m, 0.42 m, 0.02°, 0.03°, °].

3) Relative transformation between the Ladybug3 camera head and the body frame (X_bh): Once we have X_bl and X_hl, the transformation X_bh can be calculated using Smith et al.'s [8] compounding operations:

X_bh = X_bl ⊕ (⊖ X_hl)    (1)
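In matrix form, the compounding operator ⊕ in (1) is a product of homogeneous transforms and the inverse operator ⊖ is the rigid-transform inverse (Rᵀ, -Rᵀt). A small dependency-free sketch (our own helpers; the 4×4 matrices below are illustrative stand-ins built from the translations quoted in this section, with rotations chosen for the example rather than taken from the calibrated values):

```python
def mat_mul(A, B):
    """4x4 homogeneous transform product: the compounding operator (+)."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def mat_inv_rigid(T):
    """Inverse of a rigid transform (the (-) operator): R^T and -R^T t."""
    Rt = [[T[j][i] for j in range(3)] for i in range(3)]           # transpose R
    ti = [-sum(Rt[i][j] * T[j][3] for j in range(3)) for i in range(3)]
    return [Rt[0] + [ti[0]], Rt[1] + [ti[1]], Rt[2] + [ti[2]], [0, 0, 0, 1]]

# Illustrative stand-ins for X_bl and X_hl (translations from this section;
# rotations are example choices, not the dataset's calibrated angles):
T_bl = [[1, 0, 0, 2.40], [0, -1, 0, 0.01], [0, 0, -1, 2.30], [0, 0, 0, 1]]
T_hl = [[1, 0, 0, 0.34], [0, 1, 0, 0.01], [0, 0, 1, 0.42], [0, 0, 0, 1]]

# Equation (1): X_bh = X_bl (+) ((-) X_hl)
T_bh = mat_mul(T_bl, mat_inv_rigid(T_hl))
print([round(T_bh[i][3], 2) for i in range(3)])
```

Even with these example rotations, the x and z translation components come out near the 2.06 m and 2.72 m reported for X_bh, illustrating how (1) chains the two calibrations.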
The transformation thus obtained is: X_bh = [2.06 m, 0 m, 2.72 m, 180°, 0.02°, 0.8°].

Fig. 2. The directory structure containing the dataset (rectangular blocks in the original figure represent folders):

Main Directory
├── Pose-Applanix.log
├── Pose-Mtig.log
├── Timestamp.log
├── Gps.log
├── PARAM.mat
├── LCM/       (LCM 00.log, LCM 01.log, ...)
├── SCANS/     (Scan1.mat, Scan2.mat, Scan3.mat, ...)
├── IMAGES/    (Cam 0, Cam 1, Cam 2, Cam 3, Cam 4, FULL)
└── VELODYNE/  (velodyne data 00.pcap, velodyne data 01.pcap, ...)

Fig. 3. The trajectory of the vehicle in one trial around downtown Dearborn. Here we have plotted the GPS data coming from the Trimble on a Google map.

IV. DATA COLLECTION

The data was collected around the Ford research campus and the downtown area of Dearborn, Michigan, which we will refer to as the test environment in the rest of the paper. It is an ideal dataset representing an urban environment and will be very helpful to researchers working on autonomous navigation in unstructured urban environments. We drove the test vehicle, mounted with all the sensors, around the test environment several times, covering different areas. We refer to each data collection exercise as a trial. In every trial we collected the data keeping in mind the requirements of state-of-the-art SLAM algorithms, thereby covering several small and large loops in our dataset. A sample trajectory of the vehicle in one of the trials around downtown Dearborn is shown in Fig. 3.

The unprocessed data consists of three main files for each trial. One file contains the raw images of the omnidirectional camera, captured at 8 fps, and another contains the timestamps of each image, measured in microseconds since 00:00:00 Jan 1, 1970 UTC. The third file contains the data coming from the remaining (perceptual/navigational) sensors; it stores the data in an LCM log file format similar to that described in [10]. The unprocessed data consists of raw, spherically distorted images from the omnidirectional camera and raw point clouds that do not account for the vehicle motion.
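The microsecond Unix timestamps used for the image file convert to human-readable UTC by scaling to seconds; a minimal sketch (the sample value below is an arbitrary illustration, not taken from the dataset):

```python
from datetime import datetime, timezone

def usec_to_utc(ts_usec):
    """Convert a Unix timestamp in microseconds since 00:00:00 Jan 1, 1970 UTC
    (the convention used throughout this dataset) to a UTC datetime."""
    return datetime.fromtimestamp(ts_usec / 1_000_000, tz=timezone.utc)

# Arbitrary example value:
print(usec_to_utc(1_250_000_000_000_000))  # 2009-08-11 14:13:20+00:00
```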
Here we present a set of processed data with MATLAB scripts that give users easy access to the dataset. The processed data is organized in folders, and the directory structure is shown in Fig. 2. The main files and folders are described below:

1) LCM: This folder contains the LCM log file corresponding to each trial. Each LCM log file contains the raw 3D point cloud from the Velodyne laser scanner, the lidar data from the Riegl LMS-Q120s, and the navigational data from the navigation sensors described in Section II. We provide software to play back this log file and visualize the data in an interactive graphical user interface.

2) Timestamp.log: This file contains the Unix timestamp, measured in microseconds since 00:00:00 Jan 1, 1970 UTC, of each image captured by the omnidirectional camera during one trial.

3) Pose-Applanix.log: This file contains the 6DOF pose of the vehicle, along with the timestamp, provided by the Applanix POS-LV 420 INS during one trial.

4) Pose-Mtig.log: This file contains the 6DOF pose of the vehicle, along with the timestamp, provided by the Xsens MTi-G during one trial.

5) Gps.log: This file contains the GPS data (latitude/longitude/altitude) of the vehicle provided by the Trimble GPS during one trial.

6) IMAGES: This folder contains the undistorted images captured from the omnidirectional camera system during one trial. The folder is further divided into sub-folders containing the images from each individual camera of the omnidirectional camera system. It also contains a folder named FULL, which holds the undistorted images stacked together in one file, as shown in Fig. ??.

7) PARAM.mat: This contains the intrinsic and extrinsic parameters of the omnidirectional camera system. It is a (1 × 5) array of structures with
the following fields:

a) PARAM.K: the (3 × 3) matrix of the intrinsic parameters of the camera.

b) PARAM.R, PARAM.t: the (3 × 3) rotation matrix and (3 × 1) translation vector, respectively, which transform the 3D point cloud from the laser reference frame to the camera reference frame.

c) PARAM.MappingMatrix: the mapping between the distorted and undistorted image pixels. This mapping is provided by the manufacturer of the Ladybug3 camera system and corrects the spherical distortion in the image. A pair of distorted and undistorted images from the camera is shown in Fig. 4.

Fig. 4. The left panel shows the spherically distorted image obtained from the Ladybug camera. The right panel shows the corresponding undistorted image obtained after applying the transformation provided by the manufacturer.

8) SCANS: This folder contains the 3D scans from the Velodyne laser scanner, motion compensated using the vehicle pose provided by the Applanix POS-LV 420 INS. Each scan file in this folder is a MATLAB file (.mat) which can be easily loaded into the MATLAB workspace. The structure of an individual scan, once loaded in MATLAB, is shown below:

a) Scan.XYZ: a (3 × N) array of the motion-compensated 3D point cloud represented in the Velodyne's reference frame (described in Section III). Here N is the number of points per scan, which is typically 80,000-100,000.

b) Scan.timestamp_laser: the Unix timestamp, measured in microseconds since 00:00:00 Jan 1, 1970 UTC, of the scan captured by the Velodyne laser scanner.

c) Scan.timestamp_camera: the Unix timestamp, measured in microseconds since 00:00:00 Jan 1, 1970 UTC, of the closest image captured by the omnidirectional camera.

d) Scan.image_index: the index of the image which is closest in time to this scan.

e) Scan.X_wv: the 6DOF pose of the vehicle in the world reference frame when the scan was captured.

f) Scan.Cam: a (1 × 5) array of structures corresponding to each camera of the omnidirectional camera system.
The format of this structure is given below:

i) Scan.Cam.points_index: a (1 × m) array of the indices of the 3D points (in the laser reference frame) that lie in the field of view of the camera.

ii) Scan.Cam.xyz: a (3 × m) array of 3D points (m < N) in the camera reference frame that lie in the field of view of the camera.

iii) Scan.Cam.pixels: a (2 × m) array of the pixel coordinates corresponding to the 3D points projected onto the camera image.

9) VELODYNE: This folder contains several .pcap files holding the raw 3D point cloud from the Velodyne laser scanner, in a format which can be played back at a desired frame rate using our playback tool. This allows the user to quickly browse through the data corresponding to a single trial.

Fig. 5. The top panel is a perspective view of the Velodyne lidar range data, color-coded by height above the estimated ground plane. The bottom panel shows the above-ground-plane range data projected into the corresponding image from the Ladybug cameras.

We have provided MATLAB scripts to load and visualize the dataset in the MATLAB workspace. We have tried to keep the data format simple so that it can be easily used by other researchers in their work. We have also provided some visualization scripts (written in MATLAB) with this dataset that allow the reprojection of any scan onto the corresponding omnidirectional
imagery, as shown in Fig. 5. We have also provided a C visualization tool that uses OpenGL to render the textured point cloud, as shown in Fig. 6.

Fig. 6. Screenshot of the viewer showing the textured point cloud.

V. SUMMARY

We have presented a time-registered vision and navigational dataset of unstructured outdoor environments. We believe that this dataset will be highly useful to the robotics community, especially to those working on the problem of autonomous navigation of ground vehicles in an a priori unknown environment. The dataset can be used as a benchmark for testing various state-of-the-art computer vision and robotics algorithms such as SLAM, ICP (Iterative Closest Point), and 3D object detection/recognition. Thus it is a useful dataset which not only allows researchers to test existing algorithms, but also provides a new perspective on these problems in terms of utilizing the image (2D) and laser (3D) data together.

ACKNOWLEDGMENTS

We are very thankful to the Ford Motor Company for taking up this research initiative and providing us with all the hardware needed for data collection. We are also thankful to Prof. Silvio Savarese of the University of Michigan for his continuous feedback throughout the data collection process.

REFERENCES

[1] J. R. McBride, J. C. Ivan, D. S. Rhode, J. D. Rupp, M. Y. Rupp, J. D. Higgins, D. D. Turner, and R. M. Eustice, "A perspective on emerging automotive safety applications, derived from lessons learned through participation in the DARPA Grand Challenges," Journal of Field Robotics, 25(10), 2008, pp.
[2] Velodyne, "Velodyne's HDL-64E: A high definition lidar sensor for 3D applications," October.
[3] Point Grey. (2009) Spherical vision products: Ladybug3. [Online]. Available:
[4] RIEGL. Mobile scanning. [Online]. Available:
[5] Applanix. Land, POS LV. [Online]. Available:
[6] Xsens. MTi-G: a GPS aided MEMS based inertial measurement unit (IMU) and static pressure sensor. [Online]. Available:
[7] A. Huang, E. Olson, and D. Moore, "Lightweight communications and marshalling." [Online]. Available:
[8] R. Smith, M. Self, and P. Cheeseman, "A stochastic map for uncertain spatial relationships," in Proceedings of the 4th International Symposium on Robotics Research. Cambridge, MA, USA: MIT Press, 1988, pp.
[9] G. Pandey, J. McBride, S. Savarese, and R. Eustice, "Extrinsic calibration of a 3D laser scanner and an omnidirectional camera," in 7th IFAC Symposium on Intelligent Autonomous Vehicles, 2010, accepted, to appear.
[10] A. S. Huang, M. Antone, E. Olson, L. Fletcher, D. Moore, S. Teller, and J. Leonard, "A high-rate, heterogeneous data set from the DARPA Urban Challenge," International Journal of Robotics Research, 2010, under review.
Graph-based SLAM (Simultaneous Localization And Mapping) for Bridge Inspection Using UAV (Unmanned Aerial Vehicle) Taekjun Oh 1), Sungwook Jung 2), Seungwon Song 3), and Hyun Myung 4) 1), 2), 3), 4) Urban
More informationAn Overview of Applanix.
An Overview of Applanix The Company The Industry Leader in Developing Aided Inertial Technology Founded on Canadian Aerospace and Defense Industry Expertise Providing Precise Position and Orientation Systems
More informationMobile Robotics. Mathematics, Models, and Methods. HI Cambridge. Alonzo Kelly. Carnegie Mellon University UNIVERSITY PRESS
Mobile Robotics Mathematics, Models, and Methods Alonzo Kelly Carnegie Mellon University HI Cambridge UNIVERSITY PRESS Contents Preface page xiii 1 Introduction 1 1.1 Applications of Mobile Robots 2 1.2
More informationROAD-SCANNER COMPACT APPLICATION FIELDS MAIN FEATURES
ROAD-SCANNER COMPACT Mobile Mapping System by GEXCEL & SITECO collaboration A smaller mobile system for asset management and cartography suited for ZOLLER & FRÖHLICH PROFILER 9012 laser scanner. 2 + 3
More information3D Simultaneous Localization and Mapping and Navigation Planning for Mobile Robots in Complex Environments
3D Simultaneous Localization and Mapping and Navigation Planning for Mobile Robots in Complex Environments Sven Behnke University of Bonn, Germany Computer Science Institute VI Autonomous Intelligent Systems
More informationPattern Recognition for Autonomous. Pattern Recognition for Autonomous. Driving. Freie Universität t Berlin. Raul Rojas
Pattern Recognition for Autonomous Pattern Recognition for Autonomous Driving Raul Rojas Freie Universität t Berlin FU Berlin Berlin 3d model from Berlin Partner Freie Universitaet Berlin Outline of the
More informationLecture: Autonomous micro aerial vehicles
Lecture: Autonomous micro aerial vehicles Friedrich Fraundorfer Remote Sensing Technology TU München 1/41 Autonomous operation@eth Zürich Start 2/41 Autonomous operation@eth Zürich 3/41 Outline MAV system
More informationUse of Image aided Navigation for UAV Navigation and Target Geolocation in Urban and GPS denied Environments
Use of Image aided Navigation for UAV Navigation and Target Geolocation in Urban and GPS denied Environments Precision Strike Technology Symposium Alison K. Brown, Ph.D. NAVSYS Corporation, Colorado Phone:
More informationSimplified Orientation Determination in Ski Jumping using Inertial Sensor Data
Simplified Orientation Determination in Ski Jumping using Inertial Sensor Data B.H. Groh 1, N. Weeger 1, F. Warschun 2, B.M. Eskofier 1 1 Digital Sports Group, Pattern Recognition Lab University of Erlangen-Nürnberg
More informationConstruction and Calibration of a Low-Cost 3D Laser Scanner with 360º Field of View for Mobile Robots
Construction and Calibration of a Low-Cost 3D Laser Scanner with 360º Field of View for Mobile Robots Jorge L. Martínez, Jesús Morales, Antonio, J. Reina, Anthony Mandow, Alejandro Pequeño-Boter*, and
More informationDRC A Multi-Camera System on PC-Cluster for Real-time 3-D Tracking. Viboon Sangveraphunsiri*, Kritsana Uttamang, and Pongsakon Pedpunsri
The 23 rd Conference of the Mechanical Engineering Network of Thailand November 4 7, 2009, Chiang Mai A Multi-Camera System on PC-Cluster for Real-time 3-D Tracking Viboon Sangveraphunsiri*, Kritsana Uttamang,
More informationLIDAR and Panoramic Camera Extrinsic Calibration Approach Using a Pattern Plane
LIDAR and Panoramic Camera Extrinsic Calibration Approach Using a Pattern Plane Angel-Iván García-Moreno, José-Joel Gonzalez-Barbosa, Francisco-Javier Ornelas-Rodriguez, Juan B. Hurtado-Ramos, and Marco-Neri
More informationMonocular Visual-Inertial SLAM. Shaojie Shen Assistant Professor, HKUST Director, HKUST-DJI Joint Innovation Laboratory
Monocular Visual-Inertial SLAM Shaojie Shen Assistant Professor, HKUST Director, HKUST-DJI Joint Innovation Laboratory Why Monocular? Minimum structural requirements Widely available sensors Applications:
More informationIwane Mobile Mapping System
Iwane Mobile Mapping System Geo-Imaging Mobile Mapping Solution Iwane Mobile Mapping System (IMMS) is high-efficient, easyto-use, end-to-end solution that provides tremendous flexibility in collecting,
More informationnavigation Isaac Skog
Foot-mounted zerovelocity aided inertial navigation Isaac Skog skog@kth.se Course Outline 1. Foot-mounted inertial navigation a. Basic idea b. Pros and cons 2. Inertial navigation a. The inertial sensors
More informationImage Augmented Laser Scan Matching for Indoor Dead Reckoning
Image Augmented Laser Scan Matching for Indoor Dead Reckoning Nikhil Naikal, John Kua, George Chen, and Avideh Zakhor Abstract Most existing approaches to indoor localization focus on using either cameras
More informationAutonomous Navigation in Complex Indoor and Outdoor Environments with Micro Aerial Vehicles
Autonomous Navigation in Complex Indoor and Outdoor Environments with Micro Aerial Vehicles Shaojie Shen Dept. of Electrical and Systems Engineering & GRASP Lab, University of Pennsylvania Committee: Daniel
More informationAutonomous Mobile Robot Design
Autonomous Mobile Robot Design Topic: EKF-based SLAM Dr. Kostas Alexis (CSE) These slides have partially relied on the course of C. Stachniss, Robot Mapping - WS 2013/14 Autonomous Robot Challenges Where
More informationA Repository Of Sensor Data For Autonomous Driving Research
A Repository Of Sensor Data For Autonomous Driving Research Michael Shneier, Tommy Chang, Tsai Hong, Gerry Cheok, Harry Scott, Steve Legowik, Alan Lytle National Institute of Standards and Technology 100
More informationFlexible Calibration of a Portable Structured Light System through Surface Plane
Vol. 34, No. 11 ACTA AUTOMATICA SINICA November, 2008 Flexible Calibration of a Portable Structured Light System through Surface Plane GAO Wei 1 WANG Liang 1 HU Zhan-Yi 1 Abstract For a portable structured
More informationA Collection of Outdoor Robotic Datasets with centimeter-accuracy Ground Truth
myjournal manuscript No. (will be inserted by the editor) A Collection of Outdoor Robotic Datasets with centimeter-accuracy Ground Truth Jose-Luis Blanco Francisco-Angel Moreno Javier Gonzalez Received:
More informationInertial Systems. Ekinox Series TACTICAL GRADE MEMS. Motion Sensing & Navigation IMU AHRS MRU INS VG
Ekinox Series TACTICAL GRADE MEMS Inertial Systems IMU AHRS MRU INS VG ITAR Free 0.05 RMS Motion Sensing & Navigation AEROSPACE GROUND MARINE Ekinox Series R&D specialists usually compromise between high
More informationSensory Augmentation for Increased Awareness of Driving Environment
Sensory Augmentation for Increased Awareness of Driving Environment Pranay Agrawal John M. Dolan Dec. 12, 2014 Technologies for Safe and Efficient Transportation (T-SET) UTC The Robotics Institute Carnegie
More informationDistributed Vision-Aided Cooperative Navigation Based on Three-View Geometry
Distributed Vision-Aided Cooperative Navigation Based on hree-view Geometry Vadim Indelman, Pini Gurfil Distributed Space Systems Lab, Aerospace Engineering, echnion Ehud Rivlin Computer Science, echnion
More informationHigh-precision, consistent EKF-based visual-inertial odometry
High-precision, consistent EKF-based visual-inertial odometry Mingyang Li and Anastasios I. Mourikis, IJRR 2013 Ao Li Introduction What is visual-inertial odometry (VIO)? The problem of motion tracking
More informationA Comparison of Laser Scanners for Mobile Mapping Applications
A Comparison of Laser Scanners for Mobile Mapping Applications Craig Glennie 1, Jerry Dueitt 2 1 Department of Civil & Environmental Engineering The University of Houston 3605 Cullen Boulevard, Room 2008
More informationVisual Localization in Fused Image and Laser Range Data
Visual Localization in Fused Image and Laser Range Data Nicholas Carlevaris-Bianco, Anush Mohan Dept. Electrical Eng. & Computer Science University of Michigan Ann Arbor, Michigan 48109 Email: {carlevar,anushm}@umich.edu
More informationFederica Zampa Sineco SpA V. le Isonzo, 14/1, Milan, 20135, Italy
LYNX MOBILE MAPPER TM : THE NEW SURVEY TECHNOLOGY Federica Zampa Sineco SpA V. le Isonzo, 14/1, Milan, 20135, Italy federica.zampa@sineco.co.it Dario Conforti Optech Incorporated 300 Interchange Way, Vaughan,
More informationAided-inertial for GPS-denied Navigation and Mapping
Aided-inertial for GPS-denied Navigation and Mapping Erik Lithopoulos Applanix Corporation 85 Leek Crescent, Richmond Ontario, Canada L4B 3B3 elithopoulos@applanix.com ABSTRACT This paper describes the
More informationEstimation of Altitude and Vertical Velocity for Multirotor Aerial Vehicle using Kalman Filter
Estimation of Altitude and Vertical Velocity for Multirotor Aerial Vehicle using Kalman Filter Przemys law G asior, Stanis law Gardecki, Jaros law Gośliński and Wojciech Giernacki Poznan University of
More informationSensor Integration and Image Georeferencing for Airborne 3D Mapping Applications
Sensor Integration and Image Georeferencing for Airborne 3D Mapping Applications By Sameh Nassar and Naser El-Sheimy University of Calgary, Canada Contents Background INS/GPS Integration & Direct Georeferencing
More informationA Whole New World of Mapping and Sensing: Uses from Asset to Management to In Vehicle Sensing for Collision Avoidance
A Whole New World of Mapping and Sensing: Uses from Asset to Management to In Vehicle Sensing for Collision Avoidance Charles Toth, Dorota A Grejner-Brzezinska, Carla Bailo and Joanna Pinkerton Satellite
More informationCreating Affordable and Reliable Autonomous Vehicle Systems
Creating Affordable and Reliable Autonomous Vehicle Systems Shaoshan Liu shaoshan.liu@perceptin.io Autonomous Driving Localization Most crucial task of autonomous driving Solutions: GNSS but withvariations,
More informationIP-S2 HD HD IP-S2. 3D Mobile Mapping System. 3D Mobile Mapping System
HD HD 3D Mobile Mapping System 3D Mobile Mapping System Capture Geo-referenced, Time-Stamped Point Clouds and Imagery 3D Scanning of Roadside Features 360º Camera for Spherical Image Capture Dual Frequency
More informationReal world data collecting tools. Company Introduction. C2L equipment.co,.ltd
Company Introduction C2L equipment.co,.ltd www.c2l-equipment.com Company Status Business Areas Company name C2L equipment.co,.ltd Address CEO Unit 603, Samwhan HIPEX B, Pangyo Station RD 230, Bundang,
More informationarxiv: v1 [cs.cv] 14 Apr 2018
LiDAR and Camera Calibration using Motion Estimated by Sensor Fusion Odometry Ryoichi Ishikawa 1, Takeshi Oishi 1 and Katsushi Ikeuchi 2 arxiv:184.5178v1 [cs.cv] 14 Apr 218 Abstract In this paper, we propose
More informationRobotics (Kinematics) Winter 1393 Bonab University
Robotics () Winter 1393 Bonab University : most basic study of how mechanical systems behave Introduction Need to understand the mechanical behavior for: Design Control Both: Manipulators, Mobile Robots
More informationTightly-Integrated Visual and Inertial Navigation for Pinpoint Landing on Rugged Terrains
Tightly-Integrated Visual and Inertial Navigation for Pinpoint Landing on Rugged Terrains PhD student: Jeff DELAUNE ONERA Director: Guy LE BESNERAIS ONERA Advisors: Jean-Loup FARGES Clément BOURDARIAS
More informationInertial Navigation Static Calibration
INTL JOURNAL OF ELECTRONICS AND TELECOMMUNICATIONS, 2018, VOL. 64, NO. 2, PP. 243 248 Manuscript received December 2, 2017; revised April, 2018. DOI: 10.24425/119518 Inertial Navigation Static Calibration
More informationTurning an Automated System into an Autonomous system using Model-Based Design Autonomous Tech Conference 2018
Turning an Automated System into an Autonomous system using Model-Based Design Autonomous Tech Conference 2018 Asaf Moses Systematics Ltd., Technical Product Manager aviasafm@systematics.co.il 1 Autonomous
More informationSatellite and Inertial Navigation and Positioning System
Satellite and Inertial Navigation and Positioning System Project Proposal By: Luke Pfister Dan Monroe Project Advisors: Dr. In Soo Ahn Dr. Yufeng Lu EE 451 Senior Capstone Project December 10, 2009 PROJECT
More informationUsing 3D Laser Range Data for SLAM in Outdoor Environments
Using 3D Laser Range Data for SLAM in Outdoor Environments Christian Brenneke, Oliver Wulf, Bernardo Wagner Institute for Systems Engineering, University of Hannover, Germany [brenneke, wulf, wagner]@rts.uni-hannover.de
More informationContinuous 3D Scan-Matching with a Spinning 2D Laser
Continuous 3D Scan-Matching with a Spinning 2D Laser Michael Bosse and Robert Zlot Abstract Scan-matching is a technique that can be used for building accurate maps and estimating vehicle motion by comparing
More informationOverview of the Trimble TX5 Laser Scanner
Overview of the Trimble TX5 Laser Scanner Trimble TX5 Revolutionary and versatile scanning solution Compact / Lightweight Efficient Economical Ease of Use Small and Compact Smallest and most compact 3D
More informationRigorous Scan Data Adjustment for kinematic LIDAR systems
Rigorous Scan Data Adjustment for kinematic LIDAR systems Paul Swatschina Riegl Laser Measurement Systems ELMF Amsterdam, The Netherlands 13 November 2013 www.riegl.com Contents why kinematic scan data
More informationDevelopment of a MEMs-Based IMU Unit
Development of a MEMs-Based IMU Unit Başaran Bahadır Koçer, Vasfi Emre Ömürlü, Erhan Akdoğan, Celâl Sami Tüfekçi Department of Mechatronics Engineering Yildiz Technical University Turkey, Istanbul Abstract
More informationSensor Fusion: Potential, Challenges and Applications. Presented by KVH Industries and Geodetics, Inc. December 2016
Sensor Fusion: Potential, Challenges and Applications Presented by KVH Industries and Geodetics, Inc. December 2016 1 KVH Industries Overview Innovative technology company 600 employees worldwide Focused
More information